| ML20206H100 | |
|---|---|
| Site: | Shoreham (Long Island Lighting Company) |
| Issue date: | 04/01/1987 |
| From: | Monaghan J., Hunton & Williams; Long Island Lighting Co. |
| To: | Atomic Safety and Licensing Board Panel |
| References: | CON-#287-3099 OL-5, NUDOCS 8704150258 |
| Download: | ML20206H100 (143) |
LILCO, April 1, 1987

UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION

Before the Atomic Safety and Licensing Board

In the Matter of                        )
                                        )
LONG ISLAND LIGHTING COMPANY            )    Docket No. 50-322-OL-5
                                        )    (EP Exercise)
(Shoreham Nuclear Power Station,        )
Unit 1)                                 )
LILCO'S RESPONSE TO SUFFOLK COUNTY'S MOTION TO STRIKE PORTIONS OF LILCO'S TESTIMONY ON CONTENTION EX 50

On March 27, 1987, Suffolk County filed its "Suffolk County Motion To Strike Portions Of LILCO's Testimony On Contention EX 50" (hereinafter "Motion"). LILCO opposes Suffolk County's Motion for the reasons detailed below.
I. LILCO's Testimony Comparing The Shoreham Post-Exercise Assessment With FEMA Post-Exercise Assessments For Other Facilities In FEMA Region II Is Admissible

A. Prior Board Rulings Acknowledge The Relevance Of Other Exercises

Inaccurately citing a single Laurenson Board Order (Order Limiting Scope of Submissions (June 10, 1983)), Suffolk County disingenuously argues that prior Board rulings support its claim that evidence about other exercises is not relevant.1/ In fact, this very Board has ruled that "the experience gained in other situations [exercises] will be useful in reviewing the Shoreham exercise." Memorandum and Order (Ruling on LILCO's Motions to Compel New York State to Answer LILCO's First Set of Interrogatories and for a Protective Order) (Dec. 19, 1986) at 6.

In granting LILCO's Motion to Compel New York State to produce information concerning other emergency plan exercises for nuclear power plants in New York State, this Board was presented with the same arguments now being raised by Suffolk County.2/ There, it was argued that since each exercise involved different plans,

1/ Suffolk County's manner of reliance on one particular ruling of the Laurenson Board is as inapposite as its decision to ignore relevant rulings of this Licensing Board. The order cited involved the issue of whether to admit for litigation various modules of the Shoreham offsite plan adapting it for hypothetical sponsorship by other governmental entities than Suffolk County (e.g., NRC, DOE, FEMA), or whether to limit the litigation to the version showing LILCO as the implementing organization (at least in the absence of a specific agreement by such other agencies to sponsor the plan). The decision cited was limited to this one issue, and the Board's limitation of the plan "modules" to be admitted for litigation was premised on its reluctance to speculate on governmental participation in that plan in the absence of specific indications of such participation. Order Limiting Scope Of Submissions (June 10, 1983) at 3. Thus, the Board's ruling focused on which Shoreham plan would be litigated, not whether the plans or planning activities of other nuclear power plants could be relevant to the litigation of admitted contentions.

Similarly, Suffolk County's reliance on the Laurenson Board's testimony rulings on Contention 92 (State emergency plan), Motion at 6, is misplaced. In Contention 92, the issue was whether a New York State plan existed which included Shoreham. In striking a portion of LILCO's testimony on that contention, the Board ruled that whether or not a New York State plan existed for all other plants in New York was not relevant to resolving the issue before the Board. Contrary to Suffolk County's suggestions, Motion at 6, the Laurenson Board often admitted testimony on other emergency plans. For example, in Contention 65, which dealt with evacuation time estimates, LILCO introduced testimony on evacuation time estimates at other nuclear power plants. See Testimony of Cordaro, et al., on Contention 65 at 45-46, ff. Tr. 2337. Intervenors' motion to strike that testimony (Suffolk County Motion to Strike Portions of LILCO Testimony on Contentions 25, 23, and 65, at 8-9 (Nov. 28, 1983)) was denied and the testimony admitted. Tr. 1298 (Laurenson) (Dec. 12, 1983). Over Intervenors' objections the Laurenson Board also admitted testimony on independent backup power supplies for sirens and the types of sirens at other plants (Tr. 5561-62 (Laurenson)) and on the credibility of LILCO as compared to other utilities (Tr. 9676-78 (Kline)).

Recent Board decisions in the "-03" proceeding confirm that information about other power plants may be relevant to the Shoreham proceeding. Memorandum Memorializing Ruling on Motion to Compel Response to LILCO's Interrogatories and to Produce Documents (March 17, 1987) (holding that information about monitoring and decontamination of evacuees and vehicles at other nuclear facilities was relevant and discoverable); Memorandum and Order (Ruling on LILCO's March 18, 1987 Motion to Compel) (March 25, 1987) (holding that whether relocation or reception centers at other nuclear power plants have been required to obtain an EIS or SPDES permit was relevant and discoverable).
2/ While the major thrust of the argument concerning LILCO's Motion to Compel was the relevance of other FEMA exercises to the issues raised by Contentions 15 and (footnote continued)
scenarios, resources, personnel, geography, local conditions, and the participation, or lack thereof, of State and local governments, no comparisons could be made. State of New York's Opposition to LILCO's Motion to Compel (Dec. 4, 1986) at 5-6.
Neither there nor here is the comparative value of other emergency exercises affected by geographic, personnel, or plan differences or by the absence of State and local participation in an exercise. The issue of whether the Shoreham Post-Exercise Assessment demonstrates that LILCO's training program is fundamentally flawed should not be addressed in a vacuum; a comparison of other exercise assessments is necessary in order to address that issue most intelligently. See Memorandum and Order (Dec. 19, 1986) at 4-6.
B. Other Exercises Are Clearly Relevant To A Determination Of Whether The February 13 Exercise Demonstrated A Fundamental Flaw In The Training Program
Suffolk County argues that other exercises are irrelevant to the issues raised by Contention EX 50 and therefore moves to strike testimony comparing the Shoreham Post-Exercise Assessment with other post-exercise assessments in FEMA Region II and testimony concerning a content analysis of the post-exercise assessment for the Indian Point Compensating Exercise of August 24-25, 1983.
First, Suffolk County argues that other exercise data are irrelevant because each of the other exercises is unique: they all assertedly involve different plants, plans, scenarios, exercise objectives, personnel, mobilization procedures, geography, local conditions, levels of State and local government participation, training regimens, and criteria used by FEMA to evaluate performance.3/ Motion at 2-5. However, the obvious
(footnote continued) 16, the relevance of that information to Contention EX 50 was clearly raised. See LILCO's Motion To Compel New York State To Respond To LILCO's First Set Of Interrogatories and Request For Production Of Documents, and Request For Expedited Response And Disposition (Nov. 24, 1986) at 6, 8.
3/ These factors are virtually identical to the factors rejected by this Board as not dispositive when New York State sought to prevent discovery of other exercises. See
differences between plants and exercises do not eliminate the value of comparing FEMA's assessments of the Shoreham exercise with its assessment of exercises at other nuclear plants in FEMA Region II. FEMA employs uniform standards for evaluating exercises; thus meaningful comparisons can be made between the post-exercise assessments. This is particularly true where, as here, the post-exercise assessments were done by the same FEMA Region, essentially the same group of professionals evaluated the exercises, and the same FEMA coordinators approved the scope of the exercise scenarios and objectives.4/

Second, Suffolk County alleges that the testimony compares "apples and oranges" because LILCO's testimony does not consider that the LERO training program was subject to Intervenors' careful scrutiny and that training programs evaluated in other exercises were not. Motion at 4, n.6. Suffolk County misses the point. LILCO's comparisons are based on FEMA's evaluation of each exercise and what that evaluation may say about the training of emergency workers. By relying on FEMA's evaluations of each exercise, LILCO relies on a relatively uniform data base that provides a comprehensive and cohesive picture of exercise events collected by a group of professional observers
who were to evaluate the events at numerous exercises using basically the same criteria.
(footnote continued)
Memorandum and Order (Dec. 19, 1986) at 3, 5-6; State of New York Opposition to LILCO's Motion to Compel (Dec. 4, 1986) at 5-6.
4/ Suffolk County's attempt to divorce consideration of the Shoreham Exercise from the relevant contextual backdrop of comparable Region II exercises is particularly meritless since three of the bases on which Suffolk County seeks to preclude such a comparison - the absence of State or local government participation (Motion at 3, n.3 and 4, n.5 and n.6) - are entirely within Intervenors' control. Indeed, this Board has previously rejected the notion that Suffolk County's refusal to participate in emergency planning provides any basis for treating Shoreham as a special case. See Memorandum and Order (Dec. 19, 1986) at 3-6.
Third, Suffolk County claims that the evolution of criteria used by FEMA to evaluate performance precludes a comparison of post-exercise assessments. Motion at 4. It is true some aspects of FEMA's evaluation criteria have evolved during the last six years; however, this neither guts the comparative value of other post-exercise assessments nor makes those assessments irrelevant. Rather, the issue is how to correlate the earlier criteria with the current definitions of deficiencies and ARCAs.
LILCO's witnesses, one of whom has had substantial experience in preparing FEMA findings (see Professional Qualifications of Mary E. Goodkind at 1), have explained their method of correlating the evaluation criteria over time. LILCO's Testimony on Contention EX 50, at 28-29. Suffolk County's remedy is to challenge, on cross-examination, the method by which these relevant comparisons were performed, not to seek to exclude them.
Finally, Suffolk County threatens that admitting LILCO's testimony on post-exercise assessments will lengthen significantly the adjudication of Contention EX 50.
Motion at 3, n.1. Time and the merits of Intervenors' cross-examination will tell on this
issue. However, this is not a basis for denying the admission of relevant evidence. See 10 C.F.R. § 2.743(c). Moreover, LILCO disagrees with Suffolk County's assertion that it must explore "the circumstances underlying each of LILCO's 16 other exercises... in order to determine whether the FEMA Post-Exercise Assessment Report for each comports with the gloss placed upon the Report by LILCO's consultants." Motion at 3, n.1.
A detailed exploration of the 11 criteria listed by Suffolk County in its motion is not required to determine whether LILCO's testimony appropriately reflects the results of a given assessment. The testimony makes relatively straightforward comparisons. It compares the number of deficiencies and ARCAs for 17 post-exercise assessments, including Shoreham's; it lists the number of training citations found by FEMA in 12 categories of emergency worker functions; and it counts the number of positive and
negative critical incidents recorded by FEMA in the Indian Point Compensating Exercise Report. All of those comparisons are readily ascertainable without inquiry into differences in plans, mobilization procedures, local conditions, geography, training regimens, or levels of state and local government participation.
In each of these areas, the adequate response to Suffolk County's argument is that the County will have an opportunity, on cross-examination, to explore the relevance and degree of probative value of LILCO's data and analyses. That is the appropriate procedure, rather than arbitrary threshold exclusion of material placing the Shoreham Exercise into the context of other exercises supervised and evaluated over a coherent time frame according to uniform regulatory guidance by a consistent group of professionals.
C. LILCO's Testimony On Other Exercises Is Within The Scope Of Contention EX 50

Suffolk County also maintains that comparisons of post-exercise assessments are "beyond the scope of the contention and should be stricken." Motion at 5. Suffolk County's argument rests on its own, intuitively implausible, assertion that Contention EX 50, all 18 pages of it (with consolidated subparts), is "narrowly drawn"5/ and on the fact that Contention EX 50 does not specifically refer to other exercises. Motion at 5. The Contention, however, alleges that:
5/
Suffolk County is not the ultimate arbiter of how the contentions should be interpreted. In any event, when it suits Suffolk County's purpose, it has urged that contentions should be interpreted broadly. For example, LILCO argued that portions of Suffolk County's testimony on Contentions EX 40 and 41 should be stricken as beyond the scope of those contentions. LILCO's Motion to Strike Portions of Suffolk County's Testimony on Contention EX 40 (March 5, 1987) at 4 (effect of road impediments on traffic guide mobilization beyond contention); LILCO's Motion to Strike Portions of Suffolk County's Testimony on Contention EX 41 (March 5, 1987) at 3-4 (effect of mobilization of road crews on LERO's ability to convert a stretch of evacuation roadway into one-way flow beyond contention). Intervenors argued that the contentions should be construed broadly and that they be permitted to offer background information to put the issues in context and thereby demonstrate a fundamental flaw.
Tr. 526-527 (Zahnleuter), 529-530, 535, 537 (Miller). Generally, the Board denied LILCO's motion and declined to apply a strict construction approach to interpreting the contentions.
Tr. 1002-1005 (Frye).
Each "deficiency" and each "ARCA" identified by FEMA, plus each additional error committed during the exercise and identified in other contentions, provides a basis for the Governments' allegation that the exercise results demonstrate a fundamental flaw in LILCO's training program.
These allegations are broad, not narrow. They cannot be resolved in a vacuum, and other post-exercise assessments provide a valid, relevant context in which to assess the findings in the Shoreham Post-Exercise Assessment.
Moreover, Contention EX 50 itself provides more specific support for including such comparisons within the scope of the contention. It states: "in its Report FEMA identified a significant number of training deficiencies."
Analyses of other post-exercise assessments are clearly useful in ascertaining what should be considered a "significant" number of training deficiencies. LILCO's Testimony at 8-10, 23, 25, 27-32, 62 and Attachments C and D seeks to do precisely that.
D. Suffolk County Knew That LILCO's Testimony Would Include Comparisons Of FEMA's Post-Exercise Assessments

Suffolk County's claim that it is surprised by LILCO's use of other exercises in its Contention EX 50 testimony cannot be taken seriously. Motion at 12-13. LILCO disclosed its intention in both deposition and interrogatory answers. LILCO's Response to Suffolk County, State of New York and Town of Southampton Second Set of Interrogatories to LILCO (Jan. 5, 1987) stated, in answer to Suffolk County Interrogatory 1(e),
that "[a]t present, LILCO intends to rely on information from other exercises in its testimony on Contentions EX 15, 16, 21 and 50."
Mary Goodkind, the sponsor of the "other exercise" testimony whom Suffolk County chose not to reference in its motion, repeatedly stated that it was her intention to compare other FEMA post-exercise assessments with the Shoreham Post-Exercise Assessment. Deposition of Mary Goodkind
(Jan. 30, 1987) at 78-82, 95-101 (Attachment A):
Q.
Have you at this time performed any research or analysis which relates to Contention 50 -- or may relate to Contention 50 in the testimony you may offer?
A.
I have accumulated some past information regarding other exercises in which I have been an evaluator and/or controller.

* * *
Q.
What is the purpose of this task that you are performing in terms of accumulating this material?
A.
I anticipate my testimony may make some comparison to what was observed at the Shoreham exercise and what I observed at exercises I had attended in the past.
* * *

Q.
Are you able to tell me the exercises that you are gathering these materials - from which you are gathering materials?
A.
I have gathered materials from certain past exercises. Again, these are generally the exercises that I attended. There were four in Region II, at Ginna, Indian Point, Oyster Creek, James Fitzpatrick, all in 1982.
I have some follow-on information from Indian Point in subsequent exercises.
I have some information on exercises that I attended in Region V, D.C. Cook and LaSalle Station, and Region VII, Oconee and Vogtle in 1986.
And then two Quad Cities exercises, 1985 and 1986. I don't have the 1985 material, but I have 1986.6/
Tr. at 78-80.
Moreover, Ms. Goodkind specifically stated that her testimony would include comparisons of training citations in other exercise assessments.
6/ All of these materials were produced. See Letter to Michael S. Miller from Jessine A. Monaghan (Feb. 13, 1987) (Attachment E). The exercises in FEMA Region II were included in Ms. Goodkind's analysis. See LILCO Testimony on EX 50 at 27-28.
Q.
Is it possible for you at this time to tell me what you believe the thrust of your testimony will be on Contention EX 50?
A.
Yes. I think my testimony will demonstrate that based on the materials I have reviewed that the training program was quite effective,... and I think my testimony will draw some comparisons between this exercise and other exercises.
Q.
Ms. Goodkind, let me ask you a general question in that regard. Why do you consider it relevant what has been done at other exercises from a training standpoint?
A.
... We are talking particularly about training, and based on my sampling of the eight exercises in which I was an evaluator, I have seen by looking back through the assessments that FEMA has recommended additional training at every one of those exercises.
Now, the question that I may be looking at is are the citations that have been made by FEMA at the Shoreham exercise [to] a need for training above the norm of what FEMA has recommended at other exercises.
That might be, for example, one of the types of comparisons that might be made.
Tr. at 95-97.
Ms. Goodkind's intention to offer precisely the type of testimony found at pages 27-32 of LILCO's Testimony on Contention EX 50 could not be clearer.
Discovery of these comparative analyses has already taken place.
Ms. Goodkind's deposition is replete with discussion of comparative analyses, and copies of the exercise documents she had reviewed at the time of her deposition were produced.
See Letter to Michael S. Miller from Jessine A. Monaghan (Feb. 13, 1987) (Attachment E). The only remaining documents are various underlying work papers, which LILCO will produce by close of business tomorrow, April 2.
The deposition discovery of Mr. Behr or Mr. Millioti requested by Suffolk County, see Motion at 12, will not produce additional and materially relevant information beyond that available from the documents and will only be burdensome. Indeed, Mr. Millioti's work is not material or relevant; it
is not relied on for this testimony. These analyses were performed primarily by Ms. Goodkind with assistance from Mr. Behr.
II. Suffolk County Has Already Been Provided With Adequate Information To Assess The Content Analysis Of The Shoreham Post-Exercise Assessment

A. LILCO Appropriately Claimed Work Product Privilege For The Content Analysis

During the deposition of Elliott Pursell on January 20, 1987, Suffolk County requested production of the content analysis then being prepared by Mr. Pursell at the direction of LILCO's attorneys and in preparation for litigation. On February 9, 1987, LILCO objected to producing that content analysis because it was work product.
Suffolk County's appropriate remedy, if it required that analysis for "testimony that was to be prepared and filed by Suffolk County on Contention EX 50," was to file a timely motion to compel discovery. Motion at 7. But Suffolk County did not. Rather, it waited silently for nearly two months until after LILCO had filed its testimony, and then moved to strike that testimony, or in the alternative, to obtain further discovery.
There is no good cause for further discovery and no basis on which to strike this relevant evidence.
First, Suffolk County has received discovery on Mr. Pursell's content analysis work. Suffolk County questioned Mr. Pursell extensively at his deposition about the content analysis he was preparing using the critical incident analysis technique, how such an analysis was performed, what was measured, and what the results could show.
Deposition of Elliott D. Pursell (Jan. 20, 1987) at 34-46, 115-22, 125-30, 140-42 (Attachment B). The following examples from Mr. Pursell's deposition illustrate that Suffolk County had full and fair opportunity for discovery concerning the content analysis.
Q.
Critical incident techniques and content analysis are not terms that I'm familiar with, Mr. Pursell, so I'm trying to get an understanding for, if one were to do a content analysis of the FEMA report, what would you be doing? What would you be looking for?
A.
Positive, neutral and negative statements, specific actions, behavioral actions that were taken by LERO personnel during the FEMA exercise.
Q.
Could you give me an example, say in terms of the LERO traffic guides, of how you would apply such a content analysis?
A.
Yes. For example, with the impediment, the fuel truck impediment, the traffic guide called in for assistance and also requested additional cones.
That was an action. It was an observable action that was recorded and it was a positive action. So it would be categorized as positive.
Tr. 34-35.
A.
Well, let me say this. The critical incident technique is a type of content analysis. It's a way of analyzing the content of written material.
Q.
And you are using this technique, the critical incident technique, in your content analysis of the FEMA report.
Is that correct?
A.
Correct.
Q.
So, in other words, it is fair to say that you're not going to be doing a content analysis of the entire report, but only those portions of the report which are critical, or at least more important than other sections?
A.
Well, no. We will go through the entire report. But, for example, in the report, in the FEMA report, obviously, there are sentences or statements that are merely statements of fact that are neither positive, nor negative, which obviously are not part of that analysis. But they are reviewed, but they are not part of - in terms of final numbers, they are not counted.
Q.
Let me ask you, first of all, Mr. Pursell, generally, what is the purpose of the content analysis?
A.
To attempt to take narrative data and quantify it.
Tr. 37-38.
Q.
At the present time, can you tell me any specific critical incident examples which you would intend to rely upon in your testimony?
A.
Well, yes. Let me give you one example I can think of.
With respect to the fuel truck impediment, the route spotter was sent out, which was a correct action. In fact, the impediment was identified. The fire department was notified. Hess Oil Company was notified.
The traffic was appropriately rerouted. The traffic coordinator called and asked for additional assistance and cones, and the traffic coordinator verified with the traffic engineer that, in fact, the traffic was being rerouted correctly and equipment to remove the impediment was, in fact, dispatched to the scene.
So that is one example of where there are a series of positive, effective behaviors.
Q.
Now, are you aware of the fact that there was a delay in the response time by LERO to the fuel truck simulated impediment?
A.
Yes.
Q.
Do you factor that delay at all into your content analysis?
A.
Yes.
Q.
Do you consider that an example of a negative behavior to the situation presented?
A.
Yes.
Q.
Does the content analysis, or will your content analysis reflect both the negative and the positive aspects of LERO's behavior during the exercise?
A.
Exactly.
Tr. 120-21.
Q.
Mr. Pursell, in these content analyses that you perform, is it the case that if you have more positive examples of positive behavior than negative examples of negative behavior, that the conclusion follows that you have an adequate training program?
Is it simply a weighing process? Is that how it works?
A.
Well, it's further than that. It's providing - what it's doing is simply taking narrative information such as this or any other type of narrative information, and putting it in a quantitative format, numerical format.
But from the statement that you just made, if it is more positive than negative, if it's 51 percent positive and 49 percent negative, no, that does not indicate -
Q.
Does not indicate an adequate training program?
A.
In other words, you're asking about the majority. Is that correct?
Q.
Well, let me keep going with this. Do you need to establish an acceptable performance standard in the context of preparing your content analysis, so, for example, 70 percent would be considered acceptable, 69 percent would be considered not acceptable?
A.
You could, yes.
Q.
Would you be doing that, or are you doing that in the context of the content analysis for the LERO training program?
A.
At this point, I do not know.
Q.
So at this point, you have not established such a pass/fail standard.
A.
No.
Q.
Is it fair to say that in the preparation of your content analysis, that you, in quantifying the narrative, establish the numbers of positives and then weigh that against the number of negatives?
A.
That would be the process that we would go through? Is that what you are asking?
Q.
Yes.
A.
Yes.
Q.
Now, in that process, are all positives and all negatives equal in terms of the weight that you attribute to those factors?
A.
Yes.
Q.
So, for example, if the right equipment were dispatched to the fuel truck impediment and that's a positive, that would be equal in weight to the negative, which would be that the response time was delayed by LERO during the exercise.
Is that correct?
A.
That's correct.
Q.
And in all cases, the weight attributed is the same as simply counting the numbers.
A.
Correct.
Tr. 125-27.
Not only did Mr. Pursell explain in his deposition the critical incident technique method of content analysis, how such an analysis was performed, what was measured, and what the results could show, LILCO produced with its testimony any other information necessary to an understanding of the content analysis. Mr. Pursell's prefiled testimony describes, in detail, the process by which the analysis was conducted; and the
criteria used for the analysis are provided as Attachment E to the testimony.7/ Suffolk County already has in its possession ample information to determine how the content analysis was performed and how LILCO's witnesses reached the conclusions set forth in their testimony.
Nonetheless, LILCO is providing with this response all remaining documents related to the content analyses referenced in the testimony. Those additional documents consist of the Shoreham Post-Exercise Assessment and the Indian Point Assessment marked with checks and Xs to show each positive and negative finding and a tabulation counting the number of positive and negative incidents.
Further discovery is not needed in view of the extensive information already available to Suffolk County and should not in any event be granted after the fact given the County's failure to pursue timely remedies (if it thought the underlying materials as important as it now claims).
Any delay in these proceedings to conduct discovery on the content analysis or on the critical incident analysis technique is not merited.8/

7/
Contrary to Suffolk County's allegation, Attachment E is not a few self-selected examples from the content analysis. Motion at 8. Rather, as Mr. Pursell states in his testimony at 22-23, Attachment E contains the criteria used by the groups doing the critical incident analyses to classify incidents.
8/
Suffolk County's argument that its degree of knowledge about Mr. Pursell's content analysis hindered preparation of its direct testimony does not parse. Suffolk County's direct testimony is (or should be) based primarily on its theories and calculations of facts it has observed in support of its contention, rather than focusing on LILCO's inherently responsive testimony. Mr. Pursell's work is germane to Suffolk County primarily as to possible rebuttal testimony, if good cause can be shown for it. To the extent that Suffolk County needs the material to prepare to cross-examine Mr. Pursell, it is obvious that the County has already had most of it long since and that it will have had the remainder for ample time before Contention EX 50 is reached.
III. The NUREG/CR-3524 Analysis Is Relevant And All Documents Related To That Analysis Have Been Produced

A. No Information Was Withheld From Suffolk County During Discovery

Suffolk County alleges that LILCO's witnesses withheld information about the NUREG/CR-3524 analysis during their depositions in early January and February, 1987.
In fact, at that time neither Dr. Lindell nor Dr. Mileti was considering the NUREG/CR-3524 analysis, and only well after their depositions was an analysis of the Shoreham Post-Exercise Assessment using the criteria in NUREG/CR-3524 considered and performed. Even so, Suffolk County was on notice that Dr. Mileti's testimony would focus on the effectiveness of LERO as an organization in responding to an emergency.
See Deposition of Dennis S. Mileti (Jan. 8, 1987) at 117-123, 147-149 (Attachment C).
Suffolk County's motion cuts too broad a swath when it seeks to strike Drs. Mileti and Lindell's testimony at questions 10, 11, and 13. That testimony does not concern the NUREG/CR-3524 analysis of the Shoreham Post-Exercise Assessment; it is limited to a discussion of the factors used to analyze organizational effectiveness, particularly as those factors are discussed in NUREG/CR-3524. See LILCO Testimony at Q. 10, 11, and 13. NUREG/CR-3524 is part of the literature with which Drs. Lindell and Mileti, as experts on emergency behavior and organizational behavior, are familiar. Such testimony is relevant and admissible.
B. Suffolk County Has All Documentary Information On The NUREG/CR-3524 Analysis

Suffolk County's apparent ignorance of NUREG/CR-3524 is not LILCO's fault.
NUREG/CR-3524 is a publicly available document prepared by Dr. Mileti and others and published by the Nuclear Regulatory Commission. Dr. Mileti's academic vita, provided to Suffolk County months ago, makes clear reference to NUREG/CR-3524. See, e.g.,
Professional Qualifications of Dennis S. Mileti at 7 (generally admitted at Tr. 274-275).
Although it is publicly available, LILCO provides with this response a copy of NUREG/CR-3524 as a courtesy to Suffolk County.
Suffolk County also complains that it requires further documentary and deposition discovery "to define terms, to fully discuss the variables used in their study, or to explain clearly what the analysis is or how it was performed." Motion at 14. Such discovery is unnecessary; NUREG/CR-3524 itself explains these terms. See NUREG/CR-3524 at 3-1 to 3-5, C-1 to C-2.
The application of these terms or factors to the Shoreham Post-Exercise Assessment is described at length in the very testimony that Suffolk County seeks to strike. See LILCO's Testimony on Contention EX 50, at 18-20. Finally, the analysis itself is presented as Attachment F to LILCO's testimony. That attachment shows how each of the objectives in the Shoreham Post-Exercise Assessment was evaluated with respect to each of the NUREG/CR-3524 factors. In short, Suffolk County already has all of the relevant documents. Nonetheless, the only remaining documents -- the notes used to prepare the charts in Attachment F -- are being provided to Suffolk County with this response.
The application of the NUREG/CR-3524 factors is straightforward and uncomplicated. Suffolk County has all of the materials relied on by LILCO's witnesses and underlying the analysis; no further discovery is appropriate, nor would it be fruitful.
Suffolk County's motion to strike or, in the alternative, for further discovery should be denied. The testimony is relevant, and Suffolk County has available to it all the pertinent documents and materials to understand and cross-examine on the analysis.
V.
Suffolk County's Latest Attempt To Delay The Hearings Should Be Rejected
Suffolk County claims finally that it requires additional discovery on the "other exercises" analysis, the content analysis, and the NUREG/CR-3524 analysis. Motion at 12-14. As discussed above, such additional discovery is not warranted. Suffolk County has long had extensive primary and secondary source information from depositions, documents, or both, concerning the types of analyses used in LILCO's testimony. With respect to any remaining information, LILCO has provided or, within a day of this response, will provide all remaining documents, papers, and materials related to the three analyses requested in the Motion. Those materials, the deposition discovery already had by Suffolk County, and LILCO's testimony itself provide Suffolk County with the information it claims it requires to conduct cross-examination. No additional discovery is warranted.
Suffolk County's request for a two-week break in the hearing schedule should not, under any circumstances, be granted. Motion at 14. First, Suffolk County has the documents and, at least since the end of January, has had available deposition discovery, in which Ms. Goodkind testified that she would be analyzing and comparing other post-exercise assessments with the Shoreham Assessment (Deposition of Mary Goodkind (Jan. 20, 1987) at 78-81, 99-100, 116-118 (Attachment A)); Mr. Pursell testified extensively about the content analysis (Deposition of Elliott D. Pursell (Jan. 30, 1987) at 34-46, 115-122, 125-130, 140-142 (Attachment B)); and Dr. Mileti testified that his analyses would focus on the effectiveness of LERO as an organization (Deposition of Dennis S. Mileti (Jan. 8, 1987) at 117-123, 147-149). Second, to the best of LILCO's knowledge, Mr. Miller, Suffolk County's counsel on Contention EX 50 (training), will not be directly involved in the upcoming proceedings concerning Contentions EX 47, 36, 22.a, 49, or the ENC/Rumor Control Contentions EX 38 and 39, and therefore may devote his attention to preparation for Contention EX 50. Based on the recent pace of the hearings, it is unlikely that testimony on Contention EX 50 will be heard within the next two weeks. Assuming that Suffolk County has analyzed the information already in its possession, there is clearly more than ample time to evaluate the additional documents being provided and to prepare cross-examination.
Suffolk County's motion is little more than the latest step in an extravagant delaying strategy designed to lull this Board into gradually allowing the scope of this proceeding to spread and its pace to slow until it rivals the original 1983-85 litigation on the Shoreham plan. The Board must be vigilant against this campaign to sabotage the Commission's directive in CLI-86-11, issued nearly 10 months ago, to conduct an expedited proceeding on the exercise.9/ It is, after all, only a review of a 12-hour exercise which is being conducted, not a de novo re-examination of an extensively litigated plan. Intervenors' demand for yet further burdensome discovery and hearing delays, when they already have had, long since, access to virtually all of the relevant information and have slept on whatever rights they had with respect to other information earlier, simply should not be countenanced.
VI.
Conclusion
For the reasons stated above, LILCO opposes Suffolk County's Motion to Strike and respectfully requests that the Motion be denied.
Respectfully submitted,

Donald P. Irwin
Jessine A. Monaghan
Thomas E. Knauer
Hunton & Williams
707 East Main Street
Richmond, VA 23212
(804) 788-8200

DATED: April 1, 1987

9/ The predicate for CLI-86-11, the judicial decision that initially made exercises available for litigation -- Union of Concerned Scientists v. NRC, 735 F.2d 1437 (D.C. Cir. 1984) -- specifically contemplated expedited litigation. 735 F.2d at 1448. Indeed, the majority rejected the fear of dissenting Judge MacKinnon that "holding hearings on emergency exercise evaluations could delay licensing proceedings 'up to two years in cases where the Commission's decisions are appealed to the courts,'" id. note 21, on the basis that Judge MacKinnon "ignores the potential for time savings by use of expedited procedures." Id. LILCO submits that Intervenors' goal for this proceeding, and its most immediate danger, is that of becoming Judge MacKinnon's nightmare.
LILCO, April 1, 1987

CERTIFICATE OF SERVICE

In the Matter of
LONG ISLAND LIGHTING COMPANY
(Shoreham Nuclear Power Station, Unit 1)
Docket No. 50-322-OL-5

I hereby certify that copies of LILCO'S RESPONSE TO SUFFOLK COUNTY'S MOTION TO STRIKE PORTIONS OF LILCO'S TESTIMONY ON CONTENTION EX 50 were served this date upon the following by hand as indicated by an asterisk, by Federal Express as indicated by two asterisks, or by first-class mail, postage prepaid.
John H. Frye, III, Chairman
Atomic Safety and Licensing Board
U.S. Nuclear Regulatory Commission
East-West Towers
4350 East-West Hwy.
Bethesda, MD 20814

Dr. Oscar H. Paris
Atomic Safety and Licensing Board
U.S. Nuclear Regulatory Commission
East-West Towers
4350 East-West Hwy.
Bethesda, MD 20814

Mr. Frederick J. Shon
Atomic Safety and Licensing Board
U.S. Nuclear Regulatory Commission
East-West Towers, Rm. 430
4350 East-West Hwy.
Bethesda, MD 20814

Atomic Safety and Licensing Board Panel
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Atomic Safety and Licensing Appeal Board Panel
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Secretary of the Commission
Attention: Docketing and Service Section
U.S. Nuclear Regulatory Commission
1717 H Street, N.W.
Washington, D.C. 20555

Oreste Russ Pirfo, Esq.
Edwin J. Reis, Esq.
U.S. Nuclear Regulatory Commission
7735 Old Georgetown Road (to mailroom)
Bethesda, MD 20814

Herbert H. Brown, Esq.
Lawrence Coe Lanpher, Esq.
Karla J. Letsche, Esq.
Kirkpatrick & Lockhart
South Lobby - 9th Floor
1800 M Street, N.W.
Washington, D.C. 20036-5891

Fabian G. Palomino, Esq.
Richard J. Zahnleuter, Esq.
Special Counsel to the Governor
Executive Chamber, Room 229
State Capitol
Albany, New York 12224

Mary Gundrum, Esq.
Assistant Attorney General
120 Broadway, Third Floor, Room 3-116
New York, New York 10271

Spence W. Perry, Esq.
William R. Cumming, Esq.
Federal Emergency Management Agency
500 C Street, S.W., Room 840
Washington, D.C. 20472

Ms. Nora Bredes
Executive Coordinator
Shoreham Opponents' Coalition
195 East Main Street
Smithtown, New York 11787

Gerald C. Crotty, Esq.
Counsel to the Governor
Executive Chamber
State Capitol
Albany, New York 12224

Mr. Jay Dunkleberger
New York State Energy Office
Agency Building 2
Empire State Plaza
Albany, New York 12223

Stephen B. Latham, Esq.**
Twomey, Latham & Shea
33 West Second Street
P.O. Box 298
Riverhead, New York 11901

Martin Bradley Ashare, Esq.
Eugene R. Kelly, Esq.
Suffolk County Attorney
H. Lee Dennison Building
Veterans Memorial Highway
Hauppauge, New York 11787

Mr. Philip McIntire
Federal Emergency Management Agency
26 Federal Plaza
New York, New York 10278

Dr. Monroe Schneider
North Shore Committee
P.O. Box 231
Wading River, NY 11792

Jonathan D. Feinberg, Esq.
New York State Department of Public Service, Staff Counsel
Three Rockefeller Plaza
Albany, New York 12223

Jessine A. Monaghan
Hunton & Williams
707 East Main Street
P.O. Box 1535
Richmond, Virginia 23212

DATED: April 1, 1987
ATTACHMENT A
UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION
BEFORE THE ATOMIC SAFETY AND LICENSING BOARD

In the Matter of:                         Docket No. 50-322-OL-5
LONG ISLAND LIGHTING COMPANY              (EP Exercise)
(Shoreham Nuclear Power Station,          (ASLBP No. 86-533-01-OL)
Unit 1)

DEPOSITION OF MARY GOODKIND

New York, New York
Friday, January 30, 1987

Ace-Federal Reporters, Inc.
Washington, D.C.
(202) 347-3700    Nationwide Coverage 800-336-6646
78

Q    Have you, at this time, performed any research or analysis which relates to Contention 50 -- or may relate to Contention 50 in the testimony you may offer?
A    I have accumulated some past information regarding other exercises in which I have been an evaluator and/or controller.
Q    Can you tell me the kinds of information you have accumulated?
A    I have obtained copies of executive summaries of exercise assessments for a number of stations where I was an observer. That is primarily what I have done. I have reviewed again FEMA guidance memoranda.
Q    Okay. Let me go back to this accumulation of information. For the exercises that you have attended as an evaluator or controller, you have gathered together the executive summaries for those exercises?
A    Yes. Various materials. Typically, executive summaries. In some cases gotten the scenario or the objective. That is primarily what I have accumulated.
Q    What is the purpose of this task that you are performing in terms of accumulating this material?
A    I anticipate my testimony may make some comparison
79

to what was observed at the Shoreham exercise. What I have observed at exercises I had attended in the past.
Q    In that regard, how would exercise scenarios help in the making of that comparison?
A    Well, it seems to me that at the Shoreham exercise, events moved very quickly at the onset of the exercise. In other words, it would seem to me that the scenario moved quickly into general emergency, and if you criticize people for their performance, I think you need to keep some kind of view of realities and ask yourself are some of the problems that we are seeing artifacts due to the scenario or, as we discussed previously, may be due to simulation.
It is just trying to gather some information to make the best assessment that you can, what kind of performance was demonstrated.
So, I have looked back at other exercises to see how much time transpired, for instance, between an alert stage, site area emergency, general emergency.
Q    At this point other than accumulating the various materials from other exercises you attended, have you begun the process of making a comparison?
A    No, not really.
80

Q    Are you performing, or have you performed, any other research or analyses in connection with Contention 50 other than what you have just described?
A    No.
Q    And I gather you have prepared no reports at this time?
A    That is true.
Q    Are you able to tell me the exercises that you are gathering these materials -- from which you are gathering these materials?
A    I have gathered materials from certain past exercises. Again, these are generally the exercises that I attended. There were four in Region II, at Ginna, Indian Point, Oyster Creek, James Fitzpatrick, all in 1982. I have some follow-on information from Indian Point in subsequent exercises.
I have some information on exercises that I attended in Region V, D.C. Cooke and LaSalle Station, and in Region VII, Oconee and Vogel in 1986. And then two Quad Cities exercises, 1985 and 1986. I don't have the 1985 material, but I have 1986.
Q    Do you recall the years of the D.C. Cooke and
81

LaSalle exercises?
A    I believe it is 1982.
Q    And is it fair to say you are primarily focusing on gathering the material from those parts of the exercises you had some responsibility for evaluating?
A    For instance, I mentioned I was gathering executive summaries, primarily. Information on how the exercise was designed. In most cases my role as an evaluator was at either state or county EOC, so that encompassed quite a number of functions that were being directed out of the EOC.
Q    Have you ever had responsibility for evaluating any FEMA exercise, any field activities?
A    Any what?
Q    Any field activities?
A    Yes.
Q    What would these field activities have been?
A    Evaluating radiological teams, field teams.
Q    Anything else?
A    No, I think that is the only time I have been assigned out in the field, and that was only on one occasion.
82

Q    At this time, are you aware of any research projects or analyses which are being performed, or have been performed, by other LILCO witnesses on Contention 50?
A    No.
Q    You were not at the February 13th exercise, is that correct?
A    I was not.
Q    What is your degree of familiarity with the LILCO training program at this point in time?
A    At the present time I am only familiar with the frequency of the training, the way the staff are rotated through the training, and the types of materials in general that are used. I know there are audio-visual materials, and work books instruction.
Q    Have you seen the actual materials at this time?
A    No.
Q    Do you intend to review any of the LERO training materials prior to rendering your testimony?
A    Not in any depth. There is a citation to one specific section of procedures within the contention. I plan to look at that.
Q    Do you recall which procedure you are referring to?
95

THE WITNESS:  I think I would prefer not to.
MR. MILLER:  Let's take a break.
(Whereupon, a recess was taken at 4:05 p.m., to reconvene at 4:20 p.m., this same day.)
BY MR. MILLER:  (Continuing)
Q    Ms. Goodkind, other than your accumulation of materials from other exercises, at this time do you anticipate performing any other kinds of research or analyses in connection with Contention 50?
A    I don't know for sure what kind of comparisons or research I may get into. I haven't thought of anything other than the things I have mentioned at this time.
Q    Do you have a copy of the Contentions with you?
A    Yes.
Q    I ask you to turn to Contention 50, which begins on page 87. I think we have established that at this time you anticipate testifying on all of Contention 50, including all the subsumed contentions within that contention, correct?
A    Yes, I believe that is correct.
Q    Is it possible for you at this time to tell me what you believe the thrust of your testimony will be
96

on Contention 50?
A    Yes. I think my testimony will demonstrate that based on the materials I have reviewed that the training program was quite effective, and the training in the exercise showed many instances where activities were well executed, and it showed that there has been effective training in most instances, and I think my testimony will draw some comparisons between this exercise and other exercises.
Q    Ms. Goodkind, let me ask you a general question in that regard. Why do you consider it relevant what has been done at other exercises from a training standpoint?
A    Well, FEMA observers were the ones who graded this exercise, and I feel that most FEMA evaluators do a conscientious and thorough job of evaluating, and their opinions generally are valuable to me what they see and comment on training.
I am attempting to make some comparison to factors in this exercise and factors in other exercises that FEMA has evaluated because there are some differences in the Shoreham exercise, and I think there may be some comparisons that would be valuable between what FEMA observed here, and what FEMA has said at other exercises.
97

We are talking particularly about training, and based on my sampling of the eight exercises in which I was an evaluator, I have seen by looking back through the assessments that FEMA has recommended additional training at every one of those exercises.
Now, the question that I may be looking at is are the citations that have been made by FEMA at the Shoreham exercise a need for training above the norm of what FEMA has recommended at other exercises. That might be, for example, one of the types of comparisons that might be made.
Q    Are you saying that if FEMA always recommends additional training, that the fact that at the Shoreham exercise FEMA also recommended additional training doesn't mean as much in terms of conclusions that could or should be drawn?
A    Well, I think there has been some interest in whether the LERO participants is better than one would expect, or worse than one would expect. It seems to me that this is one of the things that we are looking at, and in order to answer that question, it may be interesting to look at emergency personnel in general, and how do they
98

perform after training.
It is usual that people demonstrate that they need additional training.
Q    To your knowledge, there has never been an example or instance of a FEMA-graded exercise where FEMA did not conclude that some additional training is needed?
A    Well, I went back to my experience where I was an evaluator, and based on the documents that I recited earlier, that I had looked at executive summaries and so on, it is my recollection that when I looked through all of those that FEMA cited the need for more training at each of those exercises.
Q    Now, are you aware as to whether any of those exercises involved training issues which rose to the level as being characterized as a deficiency by FEMA?
A    I don't know the answer to that offhand.
Q    Would you agree with me that at the Shoreham exercise, at least two training related issues, that is traffic impediments and bus drivers, rose to the level of being characterized as deficiencies by FEMA?
A    I think that two -- if I can rephrase, I think it is the same thing you said. I believe that two of the
99

deficiencies noted by FEMA have training aspects.
Q    If the other eight exercises you participated in as an evaluator, of those other eight, none of the training issues raised by FEMA rose to the level of being a deficiency, would that lead you to draw any conclusions about the adequacy of the LERO training program at Shoreham? Given FEMA's characterization of these two issues as deficiencies at the Shoreham exercise?
A    I don't think it is necessarily one-to-one comparison. I feel that the deficiencies that were noted at Shoreham may have involved something other than training. And training is broad, and it is something that contributed to the deficiency. It might have been a combination of equipment and training, or a combination of procedures and training, and given the subjectiveness also of these kind of evaluations, I know for certain that something that occurs during one exercise sometimes is graded positively, and you could have the same action take place at another location, and another evaluator would grade it another way.
So, given the subjective nature of these things, I don't think you can compare totally numbers to numbers, and
100

letters of deficiencies to deficiencies, without looking at more detail.
Q    If that is the case, Ms. Goodkind, why is it you are making this comparison?
A    Well, I am not going to compare strictly on numbers at Shoreham, and numbers at other facilities. But I am saying that looking at other exercise evaluations may contribute something to my testimony.
Q    And at this time you are just not sure what, if anything, that comparison will contribute, is that correct?
A    Well, I have not started drafting my testimony, so it is difficult for me to know right now what role that kind of comparison would play.
Q    Do you think it is fair for one to say that in the instance of a deficiency or an area requiring corrective action, or for that matter an area requiring improvement noted by FEMA, constitutes in some way a training problem?
A    No, I don't think one could say that.
Q    Other than your opinion that you believe you will testify as to the adequacy of the LERO training program,
101

is there anything else you can tell me about what you anticipate your testimony will be on Contention 50?
A    I think my testimony will be discussion of the many instances in which the LERO organization demonstrated the effectiveness of their training.
Q    And would that be based primarily again upon the FEMA Report?
A    As one of the major inputs.
Q    Can you give me an example?
A    Of what?
Q    Of an instance of where LERO performed well from a training perspective, in your opinion?
A    Yes. I think at the LERO EOC, you see that FEMA makes numerous citations of things that work very well at the EOC. I think they use the word, 'excellent,' in there. They talk about implementation of procedures.
I think if you look through the summary discussion you see both the words, 'excellent,' and, 'well trained.'
Q    Is it your opinion that those areas where LERO performed well indicate that the LERO training program succeeded and is adequate?
A    Well, as we discussed before, training is
ATTACHMENT B
y
.. g iD.H.)
n I
I H
Id V
i 1
m.
UNITED STATES OF AMERICA NUCLEAR REGULATORY CO!O!ISSION BEFORE THE ATOMIC SAFETY AND LICENSING BOARD l
.E l
- - - - - - - - - - - - - - - -x In the Matter of:
Decket No. 50-322-OL-5 LONG ISLAND LIGHTING CO?!PANY (EP Exercise
(
(Shcreham Nuclear Power Station, (ASLBP No. 56-533-01-OLi l
"n t 1;
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _x i
i I
3
-^
DEPOSITION OF ELLIOTT PURSELL 4
'4 :
l l
e l
l l
Washington, D.
C.
l
)
?
Tuesday, January 20, 1957 l
~-
l l
l ace-FEDER.-\\L REPORTERS,.I.NC. -
i i:~:c~.nr Re.crtm I
i4, Ner:h Critci $cee:
l Wasn:nzten. D.C. 2:0.11
( _, s..,.. -_: 3, -w l
Nationwide Ccveru3e 800-336-6o4e I,
i
r
!/
34 l C*,
C A
Correct.
0 gwar C
Now, would you tell me what is a content analys.
3 A
Well, specifically, what I'm referring to here.
4 the application of a technique known as the critical 5
incident technique, which would be a content analysis of 6
the FEMA report.
T C
Critical incident techniques and content analysi 8
are not terms that I'm familiar with, Mr. Purse 11, so I'r 9
trying to get an understanding for, if one were to do a 10 content analysis of the FEMA repo rt, what weald you be 11 doing?
What would you be looking for?
12 A
Positive, neutral and negative statements, 13 specific actions, behavioral actions that were taken by 14 LEPC personnel during the FEMA exercise.
15 C
Could you give me an example, say in terms of th-16 LEPO traffic guides, of how you would apply such a conter 17 analysis?
le A
Yes.
For example, with the impediment, the fuel 19 track impediment, the traffic guide called in for 20 assistance and also requested additional cones.
21 That was an action.
It was an observable action 22 that was recorded and'it was a positive action. So it woJ1
_i_ C C.,, "~.,
,,.7
'r,~.,~--c.
Tg M
m, o
g0
~~
wsf ce categorized as positive, fc C
Scw the matters yea just mentioned, a traffi:
guide called in and requested additional cones, 3
that is nc 4
reflected in the FEMA report, is it?
5 A
I do not recall.
6 C
Let me back up for a second.
Has it been decided that a content analysis of the FEMA report will be e
pe rf ormed?
9 A
We're planning to do that, yes.
10 C
Will you be doing that?
11 A
Not alone, no.
12 O
Will you and your staf f in New Bern, North 13 Carolina be dcing that?
14 A
No.
Actually, our current plans are that Chuck 15 taverio and Cennis Eshr and myself will be doing it.
16 C
Have you started the process of performing this 17 content analysis?
19 A
Yes.
19 0
Did you start doing that yesterday?
20 A
Fight.
21 C
Approximately how Icng do yea think it will take 1
22 you to ec=clete this task?
i
- 7 7
. _ m
- 7 1I
)(N.17 M D~
01 CI 3f i
3-A We did not start until very, very late in the day yesterday.
We just got to the very beginning, into the 3
first few pages of the FEMA report. It will probably take 4
at least another fall day for as to complete that analysis.
5 Q
to you have a timetable at this point as to when 6
you anticipate completing this content analysis?
A Yes.
We set a date yesterday.
I do not recal:
e the specific date, but I'm fairly certain it was sometime 9
within the next two weeks.
10 0
Will this content analysis look at any docarent 11 other than the FEMA report?
12 A
In other words -
rephrase the question.
13 0
Well, the way I understand what you've told me is 14 that you're performing a content analysis of the FEMA 15 report.
16 A
Right.
17 O
As you're aware, there are a number of documents 18 that were generated, for example, by LERO, not contained 19 within the FEMA report, maybe not reflected within the FEMA 20 report.
21 Will your content analysis look at those other 22 kinds of dec.:ments or wil-1 it focas just upon the FEMA
.s..-
r,
~- y!C.i M I_ 1D,i'.
n.,
i' L n.-
T..,-
.h ~ "_t
01 C d '-
re7 rt and the statements made in the FEMA report; A
At this time, our plans are limited to the FEMA reDort.
~
4 O
The application of the critical incident techni 5
that you mentioned earlier, is that different frc= a 6
content analysis?
A Nc.
g C
They're one and the same?
9 A
Well, let me say this.
The critical incident le technicae is a type of content analysis. It's a way of 11 analyzing the centent of written material.
12 O
And you're using this technique, the critical 13 incident technique, in your content analysis of the FEMA 14 report.
15 Is that correct?
16 A
Correct.
l' O
So, in other words, is it fair to say that you're 18 not going to be doing a centent analysis of the entire 19 report, but only those portions of the repcrt whic.h i
are i
20 critical, or at least more irportant i
i l
than other sections?
21 A
Well, no.
We will go through the entire repcrt.
5 l
22 But, for example, in the repcrt, in the FEMA report,
- C; ""
3S obvicasly, there are sentences er statements that are merely statements of fact that are neither posit:ve, nor 3
negative, which obviously are net part of that analysis.
4 Bat they are reviewed, but they are not part of -- in terms 3
of final numbers, they are not counted.
6 0
What will be the purpose of this content analysis that you, Mr. Behr and Mr.
Daveric will be perform.ing?
e MS. MONAGHAN:
I'm going to object to that 9
question on the grounds that i t constitutes work product.
20 The ase of that content analysis and how it will ce used 11 constitates attorney work product.
12 I instruct the Witness not to answer the question.
13 Perhaps you can rephrase your inquiry, Mr. Miller.
14 BY MR. MILLER:
15 C
Let me ask you, first of all, Mr. Purse 11, 16 generally, what is the purpose of a content analysis?
17 A
To attempt to take narrative data and quantify it.
18 C
And have you performed content analyses before?
19 A
Yes, I have.
20 0
Have you ever performed a content analysis in 21 connection with a FEMA assessment report of an off-site 22 response to a nuclear power plant exercise?
s.-- r. -..,s 7,
n _._,_ n.
7.,-
- i 4
.i L.,
,.~
.....s--
w s,
h L4-6 A._ 4 w
pc:,.
39 A
No.
YcC'>~
c Give me, if you wocid, without even mentionin; 3
specific companies, the last content analysis -- describe 4
for me the last content analysis you performed.
5 A
The last one that I performed?
6 C
Or the first one, if you prefer.
7 (Laughter.)
e A
The first one would have been when I first started 9
working in industry, which was in 1974.
The most recent 10 one was approximately nine months ago.
11 C
Now, in the most recent one, did you have a f o rmal 12 report of a company's performance that you analyzed to see 13 the results of that performance?
14 A
It was a performance report, yes.
15 0
Okay.
And once you had prepared yoJr content 16 analysis, how did you use it?
Wh a t was it ased for?
17 A
Well, it was again to try and take narrative data 35 and turn it into quantitative terms, or numbers, rather, to 19 Icok at a report from a numerical standpoint rather than 20 just frcm a narrative standpoint.
21 C
- s it fair to say that the purpose, then, was te 22 more readily understand what performance evaluation had
'c-
..( C".C r r E.'s."s :
7 - p-n T
v..\\. 6-,Ed gp
. L 1\\E4
g C; 4C been made of that company's personnel?
y A
Pight.
3 0
It almost allows you to, again, focus on a 4
pass / fail criterion of, for example, 70 percent, or 5
whatever other number you pick.
6
!s that A
You could do that with that, yes.
g C
Do you intend at this time, Mr. Purse 11, to use 9
the content analysis you're performing for this FEMA report 10 cf the Shoreha.T plant in the testimony you'll re 11 submitting?
12 MS. MONAGHAN:
I'm going to object to that 13 question on the grounds that it constitutes inquiry into 14 work product.
I instruct the Witness not to answer.
15 BY MF. MILLER:
16 O
Mr. Purse 11, to the extent that you have begun the 17 process of preparing this content analysis, and I know it's 18 preliminary at this point, can you at this point summarize 19 the breakdown between positive and negative statements that 20 you've ascertained?
21 A
We've not even done enough of the FEMA report to 22 make any reliable estimate.
Yesterday was simply a A.CE-1-EDEL'.L r,iEPORh
0 g 01 41
- planning, Y' ward,
an initial beginning for that analysis.
7 C
And you think that you, Mr. Sehr and Mr. Caverio working together will require one more full day to --
3 4
A A full day, yes.
e 0
-- to complete this content analysis?
6 A
Pight.
7 C
You mentioned earlier, Mr. Pursell, that in connection with the deposition you have computed some e
9 general statistical i n f o rma t i on.
I think some of those are 10 ones you've already given to me, I assume.
11 Is that correct?
12 A
Correct.
13 0
Other than the approximately 97 percent of 14 observed objectives that you say were judged not to have 15 any deficiencies, 91 percent of objectives which you say 16 were met, or partly met, and the approximately 72 percent 17 of objectives which were fully met, have you computed any 18 other statistical breakdowns regarding the February 13th 19 exercise?
20 MS. MONAGHAN:
I object to the question on the 21 grounds that it inquires into work product in preparation 22 for his testit.ony, ultimately.
I'll 2nstruct the Wi tness 2 c_r_FcD u7.'.i PCT " TCTS I 'C.
1 u.u an N
p 01 42 i
not to answer.
pa te i
MR. MILLER:  Well, that's not an appropriate instruction, Ms. Monaghan, when my question is has he computed any. I'm not asking him at this time the substance of what he has done, but I'm asking if he has performed any task related to an area of these statistical breakdowns that he has mentioned. That's just not an appropriate instruction.
MS. MONAGHAN:  Could I have the question read back, please?
MR. MILLER:  I'll be glad to rephrase it.
BY MR. MILLER:
Q   My question is, and I'm not going to repeat, Mr. Pursell, my examples that you've previously given to me, have you performed any other statistical analyses or computations other than those already revealed to me today during the course of this deposition?
A   Yes, I have.
Q   In the analyses that you have performed, have they been analyses based upon the FEMA report of the February 13th exercise?
A   Yes, that's correct.
Q   Any documents other than the FEMA report?
A   No.
Q   Now the analyses, at this time unspecified, which you have performed, do you consider those complete analyses?
A   No.
Q   You're still in the process of working on those analyses?
A   Yes.
Q   Would you generally describe for me the analyses that you are in the process of performing?
MS. MONAGHAN:  I'm going to object to that on the grounds of work product.
MR. MILLER:  Are you instructing the Witness not to answer?
MS. MONAGHAN:  Yes. Do not answer the question.
BY MR. MILLER:
Q   Do you intend, Mr. Pursell, to rely upon these analyses in the course of presenting testimony in proceedings regarding the February 13th exercise?
A   Yes, I do.
Q   Do you anticipate that you will formalize these
analyses or memorialize these analyses in writing?
A   I would anticipate that, yes.
Q   And is it fair to characterize these analyses as statistical analyses based upon the FEMA report?
A   Yes.
Q   Have you performed any correlational analyses based upon the FEMA report?
A   Correlational analyses?
Q   Yes.
A   No.
Q   Have you employed validation techniques in the context of your review of the FEMA report?
A   No, I have not.
Q   Have you performed any reliability analyses in the context of your review of the FEMA report?
A   No, I have not.
Q   Have you conducted any comparison test in the context of your review of the FEMA report?
A   Would you define -- what do you mean by "comparison test"?
Q   I'm just using words that you used earlier.
A   Okay, like T-test, chi-square test. No.
Q   You didn't think I really understood what I was asking you, did you?
(Laughter.)
Mr. Pursell, now you've mentioned that you're performing or in the process of preparing a content analysis of the FEMA report and that we have some unspecified statistical analyses which you're performing based upon the FEMA report.
Are there any other analyses or are you performing any other research regarding the FEMA report or your retention as an expert on LILCO's behalf in these proceedings at this time?
MS. MONAGHAN:  I'm going to object to that as work product. I'll instruct the Witness not to answer the question.
MR. MILLER:  Ms. Monaghan, again, my question at this time is is he performing anything other than, or preparing anything other than these statistical analyses or the content analysis which we've discussed? That's a yes or no question, and it's not appropriate to instruct him not to answer that sort of question.
MS. MONAGHAN:  If that's the question, you may answer that specific question, Mr. Pursell.
THE WITNESS:  It's not really a yes or no question because discussions as to what analyses may or may not be done have not been completed.
BY MR. MILLER:
Q   Okay. My question for right now -- we'll probably come to that but my question for right now is are you presently preparing or performing any analysis or research of any kind with respect to your retention by LILCO in this proceeding other than the content analysis that you've mentioned and these unspecified statistical analyses?
A   No.
Q   Are you involved in discussions which may result in your preparing or conducting research or further analyses with respect to your retention by LILCO as an expert in this proceeding?
A   Certainly.
Q   Would you state for me, Mr. Pursell, any other analyses or research which you believe could be appropriate from the standpoint of rendering testimony in this case with respect to the February 13th exercise?
A   Let me say that at this point, I cannot.
[Intervening pages omitted; a partially legible line reads "... reviewed ... in preparation for this deposition."]
Q   Could you tell me what you believe will be the gist of your testimony with respect to Contention 50?
A   I believe that in the FEMA report, that you can find positive examples of positive behaviors which contradict some of the allegations made in Contention 50.
Q   Now, do you believe that you'll be able to find examples of behavior which support any of the allegations made in Contention 50?
A   Yes.
Q   Can you think of any examples at this time?
A   Not without looking -- I'd need to have the contentions and the report to be able to go through it. I cannot think of any examples.
Q   In describing what you believe your testimony will consist of on Contention 50, I'll characterize it in terms of you believe there are positive examples of positive behavior which contradict some of the allegations in Contention 50. Is that correct?
A   That's correct.
Q   Is that, in essence, what you believe will be the
results of the content analysis which is now being prepared on Contention 50 in the FEMA report?
A   Yes.
Q   To your knowledge, will the extent of your testimony on Contention 50 be to discuss the results of the content analysis which is being performed based upon the FEMA report?
A   Repeat the question again.
Q   To your knowledge, will the extent of your testimony on Contention 50 be related to the content analysis which is presently being performed from the FEMA report?
A   It will be related to that, yes.
Q   And you believe that would be the extent of your testimony on this contention?
A   Just the content analysis?
Q   Yes.
A   No.
Q   What else do you believe you would be testifying about with respect to Contention 50?
A   Well, first of all, with respect to Contention 50, I've already mentioned some percentage numbers earlier.
The content analysis, specific critical incidents, and at this point, I will have to say anything yet to be identified. But I would not limit it to the content analysis.
Q   Mr. Pursell, at this time, is it fair to say that the testimony you are likely to render can be rendered without being knowledgeable about the LERO training program itself?
A   Well, I guess there's a two-part answer to that question. First, I think an evaluation of the FEMA report can be made in absence of a review of the training program. However, I plan to review the training program.
Q   Now, is it fair to say that the content analysis could be performed in the absence of any review of the LERO training program?
A   Absolutely.
Q   And is it fair to say that the percentage numbers or statistical analyses you've referred to earlier today could be performed in the absence of any review of the LERO training program?
A   Yes, that's correct.
Q   And is it fair to say that the specific critical
incidents which you have indicated you're preparing could be made in the absence of any review of the LERO training program?
A   Yes.
Q   Notwithstanding that, Mr. Pursell, I gather you do intend to review and become more knowledgeable about the LERO training program prior to your testifying at the Shoreham proceedings.
A   That's correct.
Q   Now, what is the reason for that?
A   So that I will be able to then see if any other types of analyses can be done, which is, as I indicated earlier, it would not be limited to what I just said, that in view of the training program, there may be other types of analyses that could be done.
Q   When do you believe you will begin familiarizing yourself more with the content, if you will, of the LERO training program? Is that a process you've already started?
A   No.
Q   When do you believe you will start that process?
A   I haven't -- we haven't set a date yet.
Q   Do you have any idea at this time as to how long that process may take you?
A   I doubt that it would take a great deal of time because, as I said, I would like to look at the training program to see if it provides any other opportunities for analysis, not to become an expert with respect to the training program.
Q   Is it fair to say that if you become knowledgeable about the methodology and conceptually how the training program is structured, you could then make judgments about whether further analyses could be or should be performed?
A   I would expect so, yes.
Q   Mr. Pursell, going back to this mention of specific critical incidents, we touched on this earlier. Are these critical incidents different from the content analysis which is being performed?
A   No, it's the same.
Q   Okay. When you gave me your list and you had critical incidents separate from the preparation of the content analysis, you did not mean to imply they're different.
A   I meant content examples, in terms of you asked -- I thought the question was what would my testimony consist of? And it would consist of the critical incident technique, which is the content analysis, and some content examples.
Q   At the present time, can you tell me any specific critical incident examples which you would intend to rely upon in your testimony?
A   Well, yes. Let me give you one example I can think of.
With respect to the fuel truck impediment, the route spotter was sent out, which was a correct action. In fact, the impediment was identified. The fire department was notified. Mess Oil Company was notified. The traffic was appropriately rerouted. The traffic coordinator called and asked for additional assistance and cones, and the traffic coordinator verified with the traffic engineer that, in fact, the traffic was being rerouted correctly and equipment to remove the impediment was, in fact, dispatched to the scene.
So that is one example of where there are a series of positive, effective behaviors.
Q   Now, are you aware of the fact that there was a delay in the response time by LERO to the fuel truck simulated impediment?
A   Yes.
Q   Do you factor that delay at all into your content analysis?
A   Yes.
Q   Do you consider that an example of a negative behavior to the situation presented?
A   Yes.
Q   Does the content analysis, or will your content analysis reflect both the negative and the positive aspects of LERO's behavior during the exercise?
A   Exactly.
Q   Now, with respect to your testimony on Contention 50, Mr. Pursell, at this time we've discussed the content analysis and the percentage numbers that you have already stated on the record. And you've indicated that there are still things that could be identified in the future, but at this time, you presently have no knowledge as to what those things may be.
Is there anything further you can tell me about what you believe your testimony on Contention 50 will consist of, other than as previously stated?
[Several intervening transcript pages are illegible in this copy. The legible fragments include: "... unanticipated and unrehearsed situations occurred."; "... asking because your response earlier to ..."; and "A   Right. I understand."]
Q   Why would you have to prove that by way of statistical methodology? What if the evaluator simply came out and said after the fact, I was biased and favored one side over the other? Then the content analysis would be useless, wouldn't it?
A   If a person says that they are biased, saying that you're biased and what their recordings are may or may not be the same thing.
Q   But bias may also go to the area of which areas were even examined in the first place during the evaluation. Correct?
A   Pardon?
Q   Bias of the evaluator may also impact which areas were examined during the evaluation.
A   Which areas were examined.
Q   What aspects of the training program were looked at and which aspects were ignored, bias can influence that sort of decision, couldn't it?
A   I suppose that's possible.
Q   Wouldn't that have an impact also on the validity of the content analysis?
A   Not if the -- first of all, you have to remember the people that were selected to be observed were randomly selected.
Q   Are you talking about with respect to Shoreham or are you talking about with respect to typical content analyses?
A   I'm talking about with respect to the people who were observed in the exercise.
Q   Is it your understanding that FEMA randomly selected those persons observed at the exercise?
A   That's my understanding.
Q   Is it your understanding that that random selection was based upon statistical methodology of random selection?
A   Yes, that there was no bias introduced into that observation process.
Q   If it, in fact, turned out that FEMA did not employ statistically sound random sampling techniques in choosing those persons evaluated during the exercise, would that impact the validity of the content analysis that you're preparing?
A   It should not affect it significantly. The only
way it would affect it is if you knew for a fact that you specifically picked certain individuals and specifically designed it. You would have to specifically alter the results. Otherwise, any degree of randomness that is going to be entered into it is going to be a positive factor in lending credibility to the results.
Q   Is the content analysis less valid if statistically sound random sampling techniques were not employed in the course of evaluating the LERO personnel?
A   There's no measurement technique for what we're talking about. In other words, it's purely a matter of subjective opinion. There's no way I can really answer you by saying, yes, there's a cut-off point at which --
But what I am saying is that in order to say that this was biased, okay, at the beginning, that this observer was biased or that the people selected were biased, that must be statistically proven in order to show that it has an impact. You can't just assume that or say that. That's the point that I'm trying to make.
Q   Okay. But understand, my questions were all hypotheticals.
A   Yes.
ATTACHMENT C

UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION
BEFORE THE ATOMIC SAFETY AND LICENSING BOARD
------------------------------------x
In the Matter of:                          Docket No. 50-322-OL-5
LONG ISLAND LIGHTING COMPANY               (EP Exercise)
(Shoreham Nuclear Power Station,           (ASLBP No. 86-533-01-OL)
Unit 1)
------------------------------------x

DEPOSITION OF DENNIS MILETI

Hauppauge, New York
Thursday, January 8, 1987

ACE-FEDERAL REPORTERS, INC.
Stenotype Reporters
444 North Capitol Street, Washington, D.C. 20001
(202) 347-3700    Nationwide Coverage 800-336-6646
the training for those activities worked, and that the purposes of exercises -- in this case, this exercise -- is to reveal problems and there were some that were revealed, and they can be corrected.
Q   Whether or not some of those problems can be corrected, would you agree that the problems that arose demonstrated some problems with training?
A   On the surface, and I can't say because I haven't dug deeply into some of the things that were observed, but as I recollect somebody didn't know about when they were supposed to look at their badges, or whatever it was they used to see what kind of radiation, and I think that can be addressed through training.
And, as I recollect, there was a person or two who didn't identify or drive their whole bus route, and I think that that is a problem that can be taken care of through training.
Those are the couple that come to mind. I think on the sum, the exercise revealed that the LILCO training does work.
Q   I take it though with respect to the two examples that you provided, better training might have
alleviated those two problems?
A   Well, I don't know. I haven't dug into those particular cases. I don't even know that I will dig into those particular cases. I can't even say that those people were trained.
But, I think the purpose of an exercise is to see if an organization can achieve its goals, and I think the LILCO exercise illustrates that the LERO organization was basically able to achieve its goals of emergency management. I think that it illustrates that the training was successful.
Q   Have you been asked to make any recommendations to improve the training of LERO personnel in any respects?
A   No, I have not.
Q   Do you intend to do so?
A   I have no idea. If an idea came to mind, I certainly would share it. But, I don't even know if I would have an idea regarding training.
I think the training program is obviously adequate because of what happened in the exercise.
Q   You testified on training back in 1984, didn't you?
A   Yes, I did.
Q   Okay. And, I believe that you were on a panel with a number of other people; is that correct?
A   I don't ever remember testifying alone, so I'm sure that I was on a panel with other people, yes.
Q   Do you know who you are going to be testifying with on this particular contention?
A   It's possible that I've been told. I know who the attorney from Hunton & Williams is that's handling this issue. I don't know who else is on the panel.
But that doesn't mean that somebody didn't mention that to me.
Q   Do you intend to get into the specifics of the training that has been conducted to date and why that's adequate?
A   I'm not sure. I don't think so.
Q   Have you conducted an analysis of the training that has been done to date?
A   No, I have not.
Q   Is that -- is the training that has been done to date anything you can comment on other than by looking at the results of the exercise?
A   Yes.
Q   Okay. And, how is that?
A   Well, I've been on a tour of the new LILCO training center and I've seen some of the classrooms and the layout and the gist of how that building and how the activities in it are supposed to be organized and coordinated, certainly not in any detail. Again, I've only been thinking about this for a few days.
And, so I have seen some of the things. And, then I have seen prior versions -- I'm presuming changes have been made; I don't know that for a fact -- of training manuals, of training modules or whatever they were called and commented on them in detail; and although I don't think I've ever observed the physical act of training going on, I certainly have seen most of the, in fact all, training modules that LILCO had and developed and was planning on using up until a couple of years ago.
Q   Do you intend to give any testimony on the training manuals that have been developed or the training classes that have been conducted?
A   That hadn't crossed my mind, but that's not a bad idea. Maybe I should consider it.
Q   Well, have you been asked to do that?
A   No. I haven't been asked to prepare any testimony yet, I mean to talk to anyone about what I might do or not do. I haven't really thought much, except considering looking at again the bottom line, what makes for an efficient or effective emergency response. And if the evidence for that is there, one would have to presume the training was adequate since the purpose of having the whole training program is to achieve that goal.
Q   Let me refer you to Page 89. In the middle of the page there is a paragraph which states: Every sentence of a LILCO training deficiency revealed during the exercise is not described at length in this contention because they are so numerous; virtually every error made by a LILCO player during the exercise involved to some degree the failure of the LILCO training program to prepare personnel adequately to perform necessary actions.
The paragraph goes on. Do you see that?
A   Yes.
Q   Do you agree with that statement?
A   No, absolutely not.
Q   In what way don't you agree with it?
A   I don't agree with it, because I think what's at point here is the functioning of an organization, not the functioning of individual members within the organization.
Q   Well, isn't it true that in order to test the training of an organization you have to test the individuals?
A   It is true that if you took individuals away there would be no organization. But, no, the answer to your question is, you test the organization and see if it can achieve its goals and accomplish what it was set up to do.
And it may be the case that an individual falls down. But what is relevant is that the job the organization was set up to do is completed.
Q   Putting aside whether or not the job is completed, isn't it true though that failures by particular LILCO players means that they were not properly trained?
A   That's theoretically possible. It's also possible that a failure could have been due to other things. For example, a stomachache. There are many reasons why somebody might fall down on the job.
Q   Do you intend to do any analyses to determine which problems were caused by stomachaches and which were caused by improper training?
A   I assure you, I would be real surprised to find that I'm doing that. I can't say it's not possible, but the probability is so low we might as well dismiss it. What is relevant is whether the organization can do its job. That is what happens in emergencies. Emergencies are responded to by organizations.
Q   What is the basis for your testimony that the organization overall responded well?
A   I think the basis for that testimony is on the basis of what I've seen so far it appears to me that people's public health and safety would have been maximized to the extent that it's possible to do in emergency planning in this exercise; that is, people who needed to have evacuated would have evacuated, and exposure to radiation would have been minimized.
Q   Well, without hypothesizing what people would have done, what is it about the results of the exercise that leads you to believe that overall the exercise was a success?
A   Without looking in detail, there have been several things. First, the comments offered by FEMA in its assessment. I think the comments offered by FEMA in its assessment
That example was a thousand times more valid than any statistical sample we could have selected simply because as goes the Bank of America and Chase Manhattan Bank other banks follow.
Q   Well, in the situation of emergency response where you don't have those dominant organizations, in the case I posited you have 50 ambulette drivers, none of whom could be equated to the Bank of America versus some savings and loan here. Now, given that type of population, isn't it true that from a sample of one you can't generalize to a sample of 50?
A   No, nor could you generalize from a sample of 49 to a population of 50. If you had exercised 49 ambulette drivers, you don't have any more of a statistical basis for making a conclusion about Number 50 who wasn't exercised than if you exercised one ambulette driver.
Q   But, don't you have a more valid scientific basis for drawing a conclusion than if you --
A   Absolutely not. You either have a statistically valid representative probability sample or you do not, period. And there is no mincing words here. You either have it or you don't. The car has either got gas in it or
it doesn't.
And in some regards, you are even running greater risks if you think or pretend you have a statistically valid sample sometimes when you don't.
And you are presuming, which I disagree with, that an exercise needs to test individuals as the unit of analysis. And I disagree. It is to test organizations as the unit of analysis, because organizations respond to emergencies. Individuals do not respond.
Q   Well, if an organization is made up of 50 people and you only test one of those people how can you draw a valid conclusion about the organization?
A   Well, let's take this exercise as an example.
Q   Good.
A   The exercise is made up of an organization that has a whole lot more people in it than 50 and a whole lot more than 50 were exercised. Hundreds upon hundreds of people were exercised.
But what's relevant is that key functions, tasks, components of the organization, or the organizational system, be exercised to see if it can be mobilized in terms of people, if it can be mobilized in terms of equipment, if the
e' I
149 1
goals of that particular function can be achieved.
And, no, 2
you don't have a statistical basis for generalizing to 50 3
from one, nor would you have a statistical basis for 4
generalizing from 49 to 50.
5 How many people do you need to include in order s
to make a prudent judgment?
That would vary by task and l
7 from function to function and not based on statistical I
8 sampling.
9 Q
Well, using that criterion then, wouldn't you 10 have a better basis to make a prudent judgment on the 11 capability of ambulette drivers, let's say the universe of ambulette drivers is 50, is you looked at more than one?
13 A
If you looked at more than one you could make 14 conclusions about more than one.
15 0
You could make a better judgment?
16 A
- don't know that it would be a better fudgrent.
17 The peint, the :udgment you want to make is not a statistical 18 basis for generalizing to the population of ambulette drivers 19 but whether or not that emergency function can be fulfilled.
20 0
And it's your testimony that ybu can make that 21 judgment simply by looking at one individual performing a 22 particular task?
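[The arithmetic behind the witness's distinction can be made concrete. The following sketch is an editorial illustration, not part of the record; the function name, the 95 percent confidence level, and the zero-failure assumption are all choices made for this example. It computes, for a population of 50 drivers, the largest number of incapable drivers still consistent with observing no failures, assuming the sample had been a true simple random sample. The contrast between n = 1 and n = 49 shows how strongly the inference depends on that probability-sample assumption, which is precisely what the witness says an exercise does not provide.]

```python
from math import comb

def max_consistent_failures(pop: int, n: int, alpha: float = 0.05) -> int:
    """Largest number k of 'incapable' members of a population of size pop
    that is still consistent, at significance level alpha, with drawing a
    simple random sample of n members and observing zero failures."""
    best = 0
    for k in range(pop - n + 1):
        # Probability of seeing zero failures in the sample when exactly
        # k members of the population are incapable (hypergeometric).
        p = comb(pop - k, n) / comb(pop, n)
        if p >= alpha:
            best = k
    return best

# Population of 50 ambulette drivers, zero failures observed in the sample:
print(max_consistent_failures(50, 1))   # 47: a sample of one rules out almost nothing
print(max_consistent_failures(50, 49))  # 0: a *random* sample of 49 rules out any failure
```

[The bounds above apply only under random selection; for a nonrandom sample, as the witness argues, neither value licenses a population inference.]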
ATTACHMENT D
NUREG/CR-3524
ORNL-6010
Dist. Category RX

Energy Division

ORGANIZATIONAL INTERFACE IN REACTOR EMERGENCY PLANNING AND RESPONSE

J. H. Sorensen, Oak Ridge National Laboratory
E. D. Copenhaver, Oak Ridge National Laboratory
D. S. Mileti, Colorado State University
M. V. Adler, Oak Ridge National Laboratory

Manuscript Completed - September 1983
Date of Issue - May 1984

Prepared for
Division of Facility Operations
Office of Nuclear Regulatory Research
U.S. Nuclear Regulatory Commission
Washington, DC 20555
Under Interagency Agreement DOE 40 550 75
NRC Fin No. B0491

Prepared by the
Oak Ridge National Laboratory
Oak Ridge, Tennessee 37831
operated by Martin Marietta Energy Systems, Inc.
for the U.S. DEPARTMENT OF ENERGY
under Contract No. DE-AC05-84OR21400
CONTENTS

                                                                Page
ABSTRACT                                                           v
EXECUTIVE SUMMARY                                                vii
1. INTERFACE IN EMERGENCY PLANNING FOR NUCLEAR POWER PLANTS      1-1
   1.1 Research Issue                                            1-1
   1.2 Research Objectives                                       1-1
   1.3 Research Framework                                        1-1
   1.4 Research Methods                                          1-3
2. PLAN REVIEWS                                                  2-1
   2.1 Sample Selection                                          2-1
   2.2 Data Collection                                           2-1
   2.3 Findings by Functional Response Tasks                     2-3
   2.4 General Findings                                          2-6
3. CASE STUDIES                                                  3-1
   3.1 Case Study One                                            3-1
   3.2 Case Study Two                                            3-4
4. TEST EXERCISE                                                 4-1
   4.1 Interfaces During the Exercise                            4-1
   4.2 The View from the Utility EOF/TSC                         4-2
   4.3 The View from the State EOC                               4-3
   4.4 The View from Local EOCs                                  4-3
5. FINDINGS AND THEIR IMPLICATIONS                               5-1
   5.1 Findings from the Plan Reviews                            5-1
   5.2 Findings from the Case Studies
   5.3 Findings from the Test Exercise                           5-2
   5.4 Implications for Emergency Planning Regulations           5-2
APPENDIX A - Dynamics of Interface: Organizations During
             Emergencies                                         A-1
APPENDIX B - Some Potential Problems in Emergency Response       B-1
APPENDIX C - Discussion Topics on Organizational Interface       C-1
APPENDIX D - Test Exercise Evaluation Topics                     D-1
APPENDIX E - Data from the Case Studies and Exercise             E-1
APPENDIX F - Description of the Exercise                         F-1
REFERENCES                                                       R-1
ABSTRACT

The purpose of this research was to determine if existing regulations have led to effective interfaces between utilities and offsite organizations in emergency planning and response. Findings suggest that regulations have provided the necessary framework for achieving adequate interfaces. That interface has been achieved is demonstrated by comprehensive response plans and good cohesiveness among organizations involved in emergency response. Interface problems identified in the research can be reduced by better implementation of existing regulations rather than by revision of existing ones.
EXECUTIVE SUMMARY

The purpose of this research was to determine if existing regulations resulted in adequate interface between utilities and offsite organizations in emergency planning and response. To address this question, we attempted to assess two elements of emergency management which could be used to measure the level of interface: the comprehensiveness and cohesiveness of planning and response. Comprehensiveness of planning was determined by a detailed review of emergency plans. Comprehensiveness of response was determined by the evaluation of a test exercise. To assess cohesiveness, we identified, from a review of relevant literature, a set of factors associated with cohesive response. First, two case studies were used to measure the presence of these factors in planning. Second, a test exercise was observed to measure the presence of these factors in response.

FINDINGS FROM THE PLAN REVIEWS

The two purposes of the review of the emergency plans were to determine whether these plans would constrain a comprehensive response to an emergency and to determine if planning activities were reasonably coordinated. In general, the review suggested that there is adequate interface between utilities and offsite organizations in the planning process and that this interaction has led to well-coordinated planning documents. Although this conclusion applies in general, we also noted several areas in which interface during the planning process can be improved.

FINDINGS FROM THE CASE STUDIES

The case studies attempted to measure and assess the presence of factors that help promote cohesive response efforts both within and between organizations. We found that, internally, organizations are quite cohesive. Cohesion tends to break down, however, in relationships across organizational response networks. Organizations demonstrated flexibility in their response systems, which increases the effectiveness of response.

FINDINGS FROM THE TEST EXERCISE

The test exercise was used to determine if comprehensive planning led to a coordinated response and to assess how cohesiveness may change during a simulated response. Overall, the actual response that we observed was not as well coordinated as the planned response. This discrepancy was due mainly to problems in implementing procedures and not to inadequate plans and procedures. In addition, we found that internal cohesiveness within an organization increased during the exercise but that interorganizational cohesiveness decreased.

CONCLUSIONS

Findings suggest that implementation of existing regulations has led to comprehensive planning efforts. Minor improvements to plans can be made, but they fall within the scope of existing rules, regulations, and implementation guides. On these grounds, no regulatory changes are warranted.

Findings suggest that regulations will lead to fairly comprehensive responses to an emergency. Evidence from the case studies and the test exercise suggests that problems are due to poor execution of emergency plans and to lack of resources. Existing regulations explicitly deal with these problems, which can be reduced by better enforcement of the regulations.

More difficult and abstract to assess is the level of cohesiveness in and between emergency organizations. Our work revealed that cohesiveness as measured in the planning process is strong within the organizations but weaker between organizations. The difference was even more pronounced for cohesiveness in emergency response. The major reason for lack of cohesiveness was poor communication and a lack of knowledge about whom to communicate with. This is made more problematic by a lack of legitimacy among organizations; that is, some organizations do not have confidence in others or do not believe they play an important role.

Overall, our research suggests that the problems experienced at the Three Mile Island Nuclear Plant with respect to planning and coordination will not occur in another emergency unless existing regulations are not followed or existing plans are not properly implemented. However, mechanisms for ensuring that plans are properly implemented are part of the existing regulations.
1. INTERFACE IN EMERGENCY PLANNING FOR NUCLEAR POWER PLANTS

1.1 RESEARCH ISSUE

One of the many "human factors" issues in nuclear power plants (NPPs) emphasized by the accident at the Three Mile Island (TMI) Plant was the problem associated with organizational conflict, communication breakdown, lack of planning, and lack of coordination.* While the uncertainty and the characteristics of the accident helped in creating these problems, so did the nature of the organizational interfaces both prior to and during the emergency. As a result, the U.S. Nuclear Regulatory Commission (NRC) and the Federal Emergency Management Agency (FEMA) issued revised rules and implementation guidelines on emergency planning which set forth requirements intended to assure adequate interfaces between the utility and offsite organizations.**,† The purpose of the research reported here was to evaluate whether existing regulations result in adequate interfaces among organizations involved with emergency response. Where appropriate, recommendations for improved interface are offered.

1.2 RESEARCH OBJECTIVES

In addressing the general topic of evaluating interfaces among response organizations, three general research objectives were advanced:

(1) to assess how utilities and power plant organizations interact with offsite organizations,
(2) to determine if utility plans are consistent with local, state, and federal plans, and
(3) to determine if emergency planning efforts will result in a comprehensive and cohesive response should an accident occur.

1.3 RESEARCH FRAMEWORK

Figure 1 illustrates a general model of interactions in emergency preparedness and response which guided this research. The key element

----------
*President's Commission on the Accident at Three Mile Island, The Need for Change: The Legacy of TMI, U.S. Government Printing Office, Washington, D.C., 1979.
**U.S. Nuclear Regulatory Commission/Federal Emergency Management Agency, Criteria for Preparation and Evaluation of Radiological Emergency Response Plans and Preparedness in Support of Nuclear Power Plants, NUREG-0654/FEMA-REP-1, Rev. 1, 1980.
†U.S. Nuclear Regulatory Commission, "Emergency Planning; Final Regulations," Federal Register, Vol. 45, No. 162, 1980, 55402-55418.
[Figure 1 (ORNL Dwg. 84-10289), "A Descriptive Model of the Relationships Between Emergency Planning, Response Network, and Response Effectiveness," appears here. The diagram did not survive reproduction; it links organizational characteristics and preparedness planning to relations and flexibility within and between organizations, to the response network, and ultimately to response comprehensiveness and cohesiveness.]
in the model is the process by which interaction takes place. As depicted in the middle box, planning and response are characterized by

(1) intraorganizational relationships, that is, the set of interactions that take place among members of an organization,
(2) interorganizational relationships, that is, the set of interactions between organizations and their members,
(3) intraorganizational flexibility, or the ability of individual organizations to change in response to new environments or circumstances,
(4) interorganizational flexibility, or the ability of organizations to change relationships with other organizations, and
(5) response networks, or the pattern of interactions among all organizations in the response effort.

The nature of the relationships, flexibility, and network is determined to a large extent by the characteristics of each organization and by the planning efforts that take place before or between emergencies. In turn, the organizational process has a direct effect on response comprehensiveness and cohesiveness. In this research, comprehensiveness and cohesiveness were used as indicators of effective interface.

This model also helps to depict the relationship between planning and response. As such, it is possible to have comprehensive planning but not a comprehensive response, or vice versa. Thus, in order to determine the adequacy of regulations, it is necessary to systematically assess organizational relationships, flexibility, and networks for emergency response and planning.

1.4 RESEARCH METHODS

Studying interactions during NPP accidents is limited by the lack of serious accidents for which empirical studies can be conducted. Given this problem, a research design was formulated to study interactions as they are thought to occur in an emergency and to attempt to understand how the interactions may differ during an actual emergency.

The first task of the research team was to document interaction by reviewing sets of emergency plans for a sample of reactor sites. The second task was to review literature on organizational interaction during disasters and other emergencies. Based on this review, a set of conditions or factors that help explain cohesive response was developed. The third task was to conduct interviews with representatives of all emergency response organizations at a given site. The purpose of the interviews was to measure the presence of factors that are associated with cohesive response. Finally, a test exercise at an NPP was used to examine interactions under a simulated emergency. Pre-exercise interviews were held with organizations to ascertain the perceived nature of interactions. Overt behavior during the exercise was then observed. Exercise participants were subsequently interviewed about their experiences during the exercise to again determine the presence and absence of factors associated with a cohesive response network.
2. PLAN REVIEWS

2.1 Sample Selection

Resource limits prevented studying all NPP sites in the U.S. A nonrandom sample of sites was chosen to meet the following criteria:

(1) Geographic coverage
    - one site in each state with an operable reactor
    - one site in each NRC region
    - one site in each FEMA region

(2) Multiple jurisdictions
    - range in number of local political jurisdictions
    - sites with multiple states in the emergency planning zone

(3) Nuclear technology
    - reactor types covered
    - different vendors covered

(4) Population in EPZ
    - rural and urban sites
    - population range

Our chosen sample consisted of 23 sites that met the above goals. We were successful in collecting data on 17 of those sites, and sets of emergency plans were analyzed for each. The mix of reactor sites represents a full range of variation in factors important to consider when evaluating probable emergency responses by organizations. As a result, the purposive sample selected provides a strong, albeit nonstatistical, basis for generalizing findings.
2.2 Data Collection

As the first step in evaluating linkages among organizations, it was necessary to determine, on the basis of information in the plans, which organizations performed specified tasks and their assumptions about which tasks other organizations were to perform. Functional emergency response tasks for a radiological emergency have been summarized in Table 1; both key decision points and routine tasks are identified. This functional description was used to index how each organizational emergency response plan (utility, state government, local government) assigned responsibility for each task. The purpose of this procedure was to verify that all plans written for a particular plant were consistent in organizational assignments, not to evaluate the adequacy of the plans.

To the extent possible, the three principles specified in the literature review as necessary to help ensure effectiveness of emergency response will also be addressed: (1) work and role definition, (2) interorganizational network integration, and (3) maintenance of organizational and interorganizational flexibility. Work and role definition
[Table 1, summarizing functional emergency response tasks, appears here. The tabular material did not survive reproduction; as described in Section 2.3, it identifies key decisions and routine tasks for each functional response category: detection and warning, communications and control, accident assessment, protective action, emergency relief, recovery, and maintaining emergency preparedness.]
includes identifying the emergency response tasks to be done and who is to do them. Interorganizational network integration occurs when the plans of more than one organization are melded and when decision criteria are present regarding shift of responsibility/authority and level of participation in each emergency response task. Maintenance of organizational and interorganizational flexibility reflects the ability to change decision-making processes and operational procedures in order to respond quickly to new circumstances; this need for flexibility is sometimes contradictory to the earlier requirements for well-defined roles and authority.

Analysis of the indexes prepared for each of the plans from the 17 NPPs allows us to draw generalizations about interface in relation to each general category of functional response and about emergency planning in general.
2.3 Findings By Functional Response Tasks

Each of the major functional response tasks (see Table 1) is reviewed below.

Detection and Warning

1. Key Decisions - There is uniform agreement that the utilities must make the initial event classification to start the emergency response process both onsite and offsite. In each of the 17 plans, the responsibility for the sounding of a public warning is clearly specified.

2. Routine Tasks - Likewise, there is uniform agreement that the utility must give notification to state or local officials to activate offsite emergency plans. Initial accident monitoring is a utility responsibility until the offsite emergency plans are activated; therefore, no conflicts were noted in this area.

Communications and Control

1. Key Decisions - There seems to be little problem in the activation of utility and state emergency operation centers, but establishing shifts between state and local governments in command and authority remains the thorniest issue to define in written emergency plans. Most of the uncertainty may come from the authority to make decisions versus the authority to implement decisions once they are made; for example, often a decision is made at the state level but carried out at the local level. It is sometimes not clear if only the state can initiate action or if the local government can set some plans in action before decisions are made at the state level. There appears to be a need for more integration of activities in state and local plans and more specification of authority for segments of the integrated activities. For example, the organization charts appearing in each plan showing who has primary and secondary responsibilities for tasks in that organization's plan (not overall responsibility) sometimes appear to conflict with the texts of plans that do try to address interorganization responsibilities. The plans could also be more specific as to how interactions with voluntary organizations (such as the Red Cross) will be handled. However, more specification of authority between organizations may be an inhibiting factor in maintenance of organizational and interorganizational flexibility.

2. Routine Tasks - Communications appear to be well defined with regard to communication networks and the equipment available for transmittal. What is less clear is exactly what information is to be communicated and what interactions are necessary in addition to the state, local, and public warning procedures, which appear clear-cut and apportioned to more than one organizational level.

Ambiguities in exercising authority may stem from some decisions being made at the state level but carried out at the local level. Where and how the local organizations can make decisions normally reserved for state organizations, and how state disaster and emergency service organizations coordinate overlapping responsibilities with state departments of health, are examples of this type of problem. Some emergency plans are clearer than others on lines of responsibilities and when they shift from one organizational unit to another. Some plans also address what happens if utility and state governments disagree but do not resolve the differences. The issue, once again, may be how specifically interorganizational authority can be spelled out without precluding flexibility.
Accident Assessment

1. Key Decisions - The placement of radiation monitoring equipment is straightforward and not likely to cause conflict. Decisions about plant conditions and estimating population exposure are less clearly defined; this can cause confusion. One possible conflict identified in one set of emergency plans examined concerned possible disagreement between the state organization and the utility regarding classification of the accident. The utility plan states that the plant management will maintain the classification it believes correct and will act accordingly even if the state organization classifies the situation as more severe.

2. Routine Tasks - There seems to be fairly uniform agreement about which organization exercises authority and implements most of the accident assessment tasks themselves. When more than one organization is involved in accident assessment, it is sometimes unclear how they will coordinate activities with each other.

Protective Action

1. Key Decisions - The authority and responsibility for key decisions about sheltering, site evacuation, general evacuations, and radioprotective drug utilization appear to be well defined. The general evacuation decision is an example of one that can be made by the local organization if the need is deemed immediate and the necessary state organization is not available, thus retaining some interorganizational flexibility.

2. Routine Tasks - There is reasonable agreement about organizational responsibilities in carrying out the protective action tasks, and no major conflicts are anticipated in this area.
Emergency Relief

1. Key Decisions - It appears that the local organizations make more decisions in the area of emergency relief than in any other segment of our breakdown of functional emergency response tasks. Once the state has made the decision to recommend evacuation (under Protective Action), there is agreement in about 16 of 17 sets of plans on who is to make decisions about maintaining emergency facilities. One ambiguity was found where the state could take over this function from local government if it so chose.

2. Routine Tasks - In most cases, it is clear that the local organizations retain primary responsibility for most of these tasks, with the states offering assistance where needed. Specification of tasks and interorganizational integration appear to be well defined in the written plans.

Recovery

1. Key Decisions - The decision on reentry to contaminated zones is always the state's decision offsite and the utility's onsite. Little ambiguity exists in this area.

2. Routine Tasks - Again, the state plays the primary role in the recovery tasks offsite, and most of these tasks are carried out by the state organizations, often assisted by local organizations. The primary reason for the lack of ambiguity lies in the minimal overlap of functions between organizational levels.

Maintaining Emergency Preparedness

1. Key Decisions - No major conflicts exist in the plans regarding the training and equipment needed. What is less clear is how the equipment will be acquired.

2. Routine Tasks - All organizational levels accept that they have significant responsibility in training and testing personnel, in maintaining and testing equipment, and in evaluating the effectiveness of their efforts.
2.4 GENERAL FINDINGS

From this analysis, we were able to assess five potential problem areas:

1. We could identify functional emergency response tasks not covered by any organization.
2. We could determine if overlapping authority existed for response tasks.
3. We could identify faulty assumptions about the responsibilities of other organizations.
4. We could reveal conflicts in authority and responsibilities.
5. We could assess the level of awareness possessed by one organizational level of the entire response system.

In the first problem area, major gaps concerning coverage of each of the functional response tasks were not found. If organizations carry out what they acknowledge in the plans, a comprehensive response will be achieved. In the second problem area, we found a great amount of overlapping responsibilities for certain emergency tasks. Plans in general, however, do not specify the boundaries of shared responsibilities nor present a mechanism for coordination of efforts. This could be problematic if communications during an emergency are not effective. In the third area, we found that occasionally one organizational level assumes that some other organization is going to perform a task while that other organization does not present information in its plan about doing that task. Although this was not common, it could create problems in an emergency. In the fourth area, conflicts in authority and decision making were not evident except in one case where two different organizations claimed to be in charge of the same task. Thus, on paper, clearly defined lines of authority exist. In the fifth area, we found that most plans attempted to reflect the functioning of other organizations and the structure of the entire response effort. In many cases, organizations could go further to improve this type of information in their plans.

Overall, the review of emergency plans suggested that utilities are becoming reasonably well coordinated with offsite organizations in their planning efforts. The review also suggests that the weakest area of coordination is between the Federal government (NRC, FEMA, others) and all other organizational levels. This, we suspect, can be attributed to the fact that Federal agencies have not developed response plans commensurate in scope and finality with others. Consequently, utilities do not have a good picture of the details of Federal involvement, and this is reflected in the plans.

In addition, the reviews solidly demonstrate that planning has led to comprehensive response mechanisms. These findings demonstrate adequate interactions in the planning process. However, it was demonstrated that actual response may differ from that outlined in the plan for numerous reasons. Potential problems that might cause this difference in behavior have been listed in Appendix B. Accordingly, the next two sections describe activities that were undertaken to assess the adequacy of interaction in emergency response settings and the cohesiveness of both planning and response.
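[The first three of the five problem areas above amount to a mechanical cross-indexing of who claims which task and who assumes it of others. A minimal sketch of such a check follows; the task names, organization labels, and data layout are invented for illustration, as the report does not describe its actual coding instrument.]

```python
from collections import defaultdict

# Each plan maps to the tasks it claims responsibility for, plus the tasks
# it assumes *other* organizations will perform (hypothetical data).
plans = {
    "utility": {"claims": {"event classification", "initial monitoring"},
                "assumes_others": {"public warning"}},
    "state":   {"claims": {"public warning", "reentry decision"},
                "assumes_others": set()},
    "local":   {"claims": {"public warning"},   # overlaps with the state plan
                "assumes_others": {"sheltering decision"}},
}

all_tasks = {"event classification", "initial monitoring", "public warning",
             "reentry decision", "sheltering decision", "emergency relief"}

claimed = defaultdict(list)
for org, plan in plans.items():
    for task in plan["claims"]:
        claimed[task].append(org)

# Problem area 1: tasks no plan covers
uncovered = sorted(all_tasks - set(claimed))
# Problem area 2: tasks claimed by more than one organization
overlaps = {t: orgs for t, orgs in claimed.items() if len(orgs) > 1}
# Problem area 3: tasks assumed of others but claimed by no one
faulty = sorted(task for plan in plans.values()
                for task in plan["assumes_others"] if task not in claimed)

print(uncovered)  # ['emergency relief', 'sheltering decision']
print(overlaps)   # {'public warning': ['state', 'local']}
print(faulty)     # ['sheltering decision']
```

[Run against real plan indexes, an empty `uncovered` and `faulty` and an annotated `overlaps` would correspond to the report's finding of no major gaps but widespread shared responsibility.]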
3. CASE STUDIES

Two case studies (one at a relatively new plant and one at a well-established one) were undertaken to gain greater insight into interface. The studies sought to identify (1) the degree to which emergency response organizations possessed characteristics of a cohesive system and (2) site-specific factors which may influence interface but which would not be identified in reviews of emergency plans.

Information was collected for the case studies through discussions with representatives of major organizations involved with emergency response. These included the utility and agencies of state and local governments. A checklist of topics was developed to measure the existence of factors related to effective response (see Table A-1). Two groups of factors, those measuring organizational relationships and interorganizational networks, were selected on the basis of the literature review (Appendix A). Several measures of flexibility were developed. Table 2 provides examples of values for each factor associated with a cohesive response. The checklist used for the interviews is given in Appendix C.

3.1 CASE STUDY ONE

3.1.1 Evaluation of Cohesiveness

Discussions were held with utility personnel, state organizations, affected local organizations, and federal regional offices. Five state organizations were represented, including those responsible for emergency management, radiological health, human resources, and transportation. Local organizations included were county emergency management, police departments, the Red Cross, and another volunteer organization connected with communications and radiological health services. From these discussions, we could code each organization's level of effective and cohesive emergency abilities according to the factors presented in Table 2. This provided us with a measure of cohesiveness for each separate organization and for each organizational level. In addition, it allowed us to make judgments about how well the entire system rated on each of the factors. Appendix E provides a summary of data on which the following conclusions are based.

3.1.1.1 System Cohesiveness

(1) Intraorganizational Relationships. Overall, the emergency response system demonstrated a high degree of internal cohesiveness. This means that organizations, by themselves, possess characteristics that will make them effective in an emergency. Most had clearly defined roles in an emergency that were clearly understood. Second, in all but one case, it was clear who was in command. Third, division of responsibilities among personnel within the organizations was well understood. Fourth, organizations had mechanisms for setting priorities (or already knew the priorities) in emergency response. Fifth, eight organizations
Table 2. Organizational factors used to evaluate cohesiveness

Factors                    Values Associated with Cohesiveness

Organizational Relationships
  Role definition          Clearly defined responsibilities
  Authority                Clearly defined powers and authority hierarchy
  Territory                Clearly limited boundaries of authority
  Priority setting         Understood mechanism for setting priorities
  Normativeness            Similarity between normal and emergency
                           responsibilities
  Legitimacy               Responsibilities are viewed as significant
  Communications ability   Ease and clarity of access and information
  Knowledge                Level of understanding about responsibilities

Intra- and Interorganizational Flexibility
  Formalization            Ability to deviate from written procedures
  Adaptability             Ability to respond to new situations
  Control                  Ability to exercise and retain authority

Interorganizational Network
  Domain                   Clearly defined division of responsibility
  Dispute resolution       Mechanism for negotiating differences
  Legitimacy of roles      Acceptance by other organizations
  Resource adequacy        Sufficient resources to perform role
  Autonomy                 Ability to relinquish for good of system
  Communications ability   High level of linkages between organizations
  Authority                Network hierarchies are clearly established
  Interaction clarity      Organizations know whom to interact with
  Knowledge                Functioning of the system is understood
i 3-3 have emergency responsibilities that are similar to the normal duties.
Sixth, all demonstrated a sense of importance in their emergency response roles.
Seventh, in all but one case, communications within the organizations were at least adequate.
Finally, most organizations exhibited a good sense of knowledge about their emergency roles.
(2) Flexibility.
The systems demonstrated a high level of flexibility, that is, the ability to respond to contingencies and unanticipated situations. Only one organization had somewhat formalized procedures that would constrain flexibility.
Most showed some degree of adaptability.
Finally, the organizations exhibited the type of control over their functioning in an emergency which allows for changing internal priorities.
(3) Interorganizational Network.
The cohesiveness of the system as a whole was adequate, but the network was not as well integrated as the individual organizations.
The domains of responsibility, that is, the way in which the responsibilities of organizations are specified, were fairly well established.
Few, however, had mechanisms established for resolving any disputes or conflicts that might occur.
Most organizations felt accepted as an important part of the network. Resources to carry out emergency functions were identified as a potential problem area for some.
One of the weaker areas concerned communications ability, where a lack of adequate linkages and hardware emerged as the dominant problem.
Another problem area concerned clarity of interaction.
Many organizations did not demonstrate a good sense of knowing with whom they would be working in an emergency, as was also the case regarding their knowledge of the functions of other organizations.
The response network showed fairly clear lines of authority, and organizations displayed signs of willingness to cooperate with all involved in the response effort.
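The factor-coding approach behind Table 2 lends itself to a simple illustration. The report does not publish its actual coding scheme or rating scale, so in the sketch below the factor names come from Table 2 but the 1-to-3 scale, the example organization data, and the index function are hypothetical:

```python
# Hypothetical sketch of coding an organization on the Table 2 factors.
# Scale assumed here: 1 = weak, 2 = adequate, 3 = strong (not from the report).

INTRA_FACTORS = ["role_definition", "authority_hierarchy", "territory",
                 "priority_setting", "normativeness", "legitimacy",
                 "communications", "knowledge"]
INTER_FACTORS = ["domain", "dispute_resolution", "legitimacy_of_roles",
                 "resource_adequacy", "autonomy", "communications",
                 "authority", "interaction_clarity", "knowledge"]

def cohesiveness_index(ratings):
    """Mean rating over whatever factors were coded (1.0 to 3.0)."""
    return sum(ratings.values()) / len(ratings)

# Example: an organization coded strong internally but weaker on the
# interorganizational network, as observed in case study one.
utility_intra = {f: 3 for f in INTRA_FACTORS}
utility_inter = {f: 3 for f in INTER_FACTORS}
utility_inter["communications"] = 1       # linkage/hardware problems
utility_inter["interaction_clarity"] = 2  # unclear whom to work with
utility_inter["dispute_resolution"] = 1   # no mechanism observed

print(round(cohesiveness_index(utility_intra), 2))  # 3.0
print(round(cohesiveness_index(utility_inter), 2))  # 2.44
```

A simple mean like this reproduces the report's qualitative pattern: high intraorganizational scores alongside a lower network score.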
3.1.1.2 Organizational Cohesiveness

(1) Utility.
The NPP organization for emergency response demonstrated a reasonably effective structure.
It was very cohesive internally and demonstrated flexibility.
Some minor problems were evident in the way the utility interfaced with other organizations.
Potential problem areas include communications, clarity of interactions, and ability to resolve disputes.
(2) State.
Organizations within the state showed varying levels of effectiveness in their structure for response.
Several rated well on almost every factor; one was notably poor.
Problem areas reflected those of the system in general.
(3) Local.
Organizations at the local level showed good, although not outstanding, response structures.
They were internally cohesive but were less adept at interacting with other organizational levels.
Communications appeared to be a major problem area.
3.1.2 Observational Notes

Overall, mixed appraisals of state-utility interfaces were received, although the stronger response was that of good working relationships.
The major link between the state and utility is through interactions between the emergency organization and utility corporate emergency planners (often not located at the actual plant site).
This precludes much interaction between the state and onsite personnel.
The state radiological health organization felt a lack of qualified personnel for radiological emergencies.
This lack can lead to a questioning of technical ability; such a problem can affect working relationships between the state government and the utility. Most felt one great value of the exercise was the opportunity to meet and work with the people who would be responding to an emergency.
Local government has a minimal amount of interaction with the utility regarding planning and response other than through the annual test exercises.
In the past, some of the agencies have had difficulty with initial notification procedures and feel the procedures are too complex.
Another major area of concern for local government is the lack of understanding the public has concerning sirens and emergency response.
More utility cooperation concerning public education is needed, particularly in reaching seasonal populations and in improving a utility-disseminated brochure on emergency preparedness and evacuation plans.
A general overall feeling conveyed by local organizations is that the utility is not highly concerned about working with them except to meet regulatory requirements.
We had difficulty in assessing utility/Federal interactions because they were not readily observable.
This difficulty suggests that greater attention should be placed on planning Federal involvement in emergency response.
3.2 CASE STUDY TWO

3.2.1 Evaluation of Cohesiveness
The same techniques for assessing cohesiveness were used in case study two.
In this study a smaller number of organizations played key roles in emergency response.
The state emergency management organization has overall responsibility for response and is assisted by a separate health department, which is responsible for radiological assessment.
Although the primary responsibility for implementing response lies with local government, the state activates and maintains control over local organizations because of multiple jurisdictional involvement. All contacts between the utility and local government, whether for planning or response, are channeled through the state.
The state emergency planning organization assumes responsibility for coordinating all state and local planning with the utility.
In this context, the cohesiveness of the system and of organizations is reviewed in Sects. 3.2.1.1 and 3.2.1.2 with respect to the factors in Table 2.
3.2.1.1 System Cohesiveness

(1) Intraorganizational Relationships.
The internal cohesiveness of emergency response organizations is extremely high. Almost uniformly, we coded organizations high on each dimension measured.
In only one instance did an organization display a problem, and this was a lack of response priorities.
(2) Flexibility. As with internal relationships, the organizations rated well on flexibility.
In every case, organizations showed evidence of being able to deviate from formal procedures, adapt to new situations, and maintain internal control.
(3) Interorganizational Response Network.
The cohesiveness of the entire network displayed greater variability, with problems in three particular areas.
First, the drive for autonomy, or the desire to maintain control over responsibilities, could be detrimental to response. Second, communication abilities are not well demonstrated in some cases. Third, lines of authority for offsite response have not been well defined.
Other factors that were problems in case one did not emerge as problems in this system.
These include interaction clarity, knowledge of the functioning of the entire response system, and resource availability.
3.2.1.2 Organizational Cohesiveness

Unlike case one, no great variations between organizations were observed.
The utility demonstrated cohesiveness in nearly every factor assessed. At the state and local level, some organizations rated marginally better than others, but no single organization consistently measured poorly.
Overall, the system reflected effective response planning.
3.2.2 Observational Notes

The organizations interviewed saw no problems in getting adequate resources for an emergency response and clearly understood with which other organizations they would be working.
In many cases, there was frequent contact among the agencies.
The state emergency response organization has a somewhat military-oriented structure, and some of its staff, as well as the local emergency response directors, are former military officers.
The emphasis of some of these persons on the control aspects may be a reflection of both the plan and their own backgrounds.
Other agencies tended to stress coordination.
Turf battles were considered by those interviewed as more likely to occur because of personality conflicts than because of plan deficiencies.
The state and local staff persons believed they had a clear understanding of what their own and other agencies' responsibilities are and thought that their work was well accepted.
Several stated that commitment to emergency planning by their upper management prompts acceptance and support throughout the rest of the organization.
The personnel interviewed anticipated that during an emergency there might be some communications problems.
These included mechanical problems, too few telephones, lack of direct contact between the local government and the utility, problems with interpreting technical language, a somewhat cumbersome formal message system in the state emergency operations center (EOC), and possible personality conflicts.
A difference in time zones within the state was identified as a point of possible confusion.
Both the state EOC located at the state capital and the state EOC located close to the plant operate on the time zone of the capital.
The local government response organization would operate on local time.
Actual interfaces between different levels of an organization had not been tested through exercises, although the set of plans described the assignment for key decisions and routine response tasks.
Some examples of potential problems were discussed. Although information on control is given in the plans, a city or local government might not accept direction from the county or state in actual practice.
Enforcing control authority was expected to be difficult.
Another expressed concern was that the state radiological health department is dependent on the utility for atmospheric-release projections, and there are no independent checks on the utility's projections.
In summary, weak points agreed upon by all included data communications (the utility hopes to install an automatic system), control of field teams from different agencies, recovery and reentry operations, notification of the public in rural areas, sheltering arrangements (since a nongovernmental agency is providing them), depth of technical ability and training for monitoring teams, adequacy of staffing for monitoring teams, ability for timely response in a quickly developing incident since monitoring teams have several hours of travel time to the site, and inadequate allowance for human nature and unanticipated response in the plan.
4. TEST EXERCISE

In conjunction with one case study, observation of a test exercise was used as a means of collecting further data about interface.
This simulated situation provided a useful surrogate for examining interaction in an actual emergency.
It is assumed that the interface problems revealed by the exercise would reflect general deficiencies in existing regulations as well as illustrate the problems corrected by those same regulations.
Findings were derived from observations of activities and interactions in the utility technical support center (TSC) and emergency operations facility (EOF) and in the state/county EOCs.
No Federal agencies participated in the exercise except as reviewers.
Participants in the exercise representing the major emergency organizations were debriefed, utilizing a checklist of topics designed to complement that developed for the case studies (see Appendix D for the test exercise checklist). A comparison of the exercise results with those of the case study offers insight on how organizational interactions may change in an actual emergency.
Appendix F provides information on the nature and scope of the exercise.
4.1 INTERFACES DURING THE EXERCISE

By comparing data from the case study with those from the exercise, we can systematically observe potential differences between interface as reflected by planning activity and planned response and interface in actual response in an exercise.
Again, we distinguish between cohesiveness of the system and that of organizations.
4.1.1 System Cohesiveness

Differences in cohesiveness between planning and exercise response reflected the general patterns emphasized in the case study.
Organizations, in general, displayed greater internal cohesion and similar levels of flexibility, but there was poorer cohesion between organizations.
(1) Intraorganizational Relationships.
Organizations demonstrated improved cohesiveness in the exercise from what was observed in the case study.
Responsibilities were better defined, and people within organizations had a clearer delineation of roles and an improved sense of legitimacy. Communications, identified as a potential problem in the case study, were an even greater problem in the exercise.
(2) Flexibility.
As in the case study, organizations demonstrated adequate flexibility.
Overall, ability to respond to new situations proved higher, although ability to deviate from written procedures decreased slightly.
(3) Interorganizational Response Network.
Problems identified in the case study were exacerbated in the exercise.
Factors that did not pose problems in case studies either remained adequate or were rated more effective in the exercise.
Across organizations, three factors proved extremely problematic.
First, interorganizational communications constrained effective interaction.
This was heightened by a lack of interaction clarity and by the absence of any observable means of resolving differences.
In addition, the exercise revealed some problems of legitimacy among organizations that were not evident in the case study.
On the other hand, the exercise revealed a higher level of knowledge about what other organizations do than was measured in the case study.
Furthermore, division of responsibilities remained adequately defined.
This suggests that organizations are adequately prepared and know what to do but have some difficulty in coordinating those efforts with others in the emergency response system.
4.1.2 Organizational Cohesiveness

When the cohesiveness of a specific organization in the test exercise is compared with that of the same organization in the case study, several patterns emerge.
First, the utility displayed less cohesiveness in the exercise.
This is mainly attributed to lower cohesiveness ratings on interorganizational factors.
Second, the same trend and cause are observed for all state organizations.
Third, an opposite trend is found for local government.
Organizations at this level demonstrated increased cohesiveness during the exercise. One possible explanation is that organizations with little experience actually performing emergency tasks show lower cohesiveness in response than in planning. On the other hand, those with more practical experience demonstrate more cohesiveness in behavior in comparison to planning.
This underscores the necessity of simulated response to develop cohesive organizational interaction.
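The planning-versus-exercise comparison described above can be sketched as a simple diff over factor ratings. The ratings and the example organization below are illustrative assumptions, not data from this study:

```python
# Hypothetical sketch of the Sect. 4.1.2 comparison: flag factors whose
# rating dropped between the case-study (planning) coding and the
# exercise (response) coding. All values are illustrative.

def degraded_factors(planning, exercise):
    """Return, sorted, the factors rated lower in the exercise than in planning."""
    return sorted(f for f in planning
                  if exercise.get(f, planning[f]) < planning[f])

# Example state organization: interorganizational factors worsen in the
# exercise while knowledge of other organizations improves.
state_planning = {"communications": 2, "interaction_clarity": 2,
                  "dispute_resolution": 2, "knowledge": 2}
state_exercise = {"communications": 1, "interaction_clarity": 1,
                  "dispute_resolution": 1, "knowledge": 3}

print(degraded_factors(state_planning, state_exercise))
# ['communications', 'dispute_resolution', 'interaction_clarity']
```

Run per organization, such a diff makes the report's pattern explicit: the utility and state organizations show degraded interorganizational factors, while local organizations show none.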
4.2 THE VIEW FROM THE UTILITY EOF/TSC

The major problem observed onsite was communications with offsite organizations.
Difficulties with the offsite notification procedures started immediately with technical problems with phone lines.
This was made more severe when the person designated as the state warning point, reached by using automated dialing to a predetermined list of persons to be notified, gave instructions to call someone else.
This caused delays in notification and also undermined the utility's confidence in offsite organizations.
The personnel communicating the information on the standard notification form had difficulties in delivering messages and could not provide explanations or more detailed information.
Information updates were quite untimely.
Once the EOF was activated, better communications ensued because of more direct and continuous lines of contact.
Communications seemed to break down, however, during the transition of communications and other functions from the TSC to the EOF.
Some difficulties existed regarding the coordination of protective actions between the utility and state. Although the site emergency coordinator made a recommendation at 9:45 a.m., following the 9:00 a.m.
General Emergency, to activate the warning system, disagreement over evacuation led to a delay in that recommendation until 11:08 a.m.
Consequently, evacuation was not ordered by the state until 11:55 a.m.
This points to a need for more timely coordination between the state and utility on protective action decisionmaking.
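The delays in this timeline can be quantified directly from the times reported above (the event labels below are ours, not from the exercise log):

```python
from datetime import datetime

# Protective-action timeline from the text: General Emergency at 9:00 a.m.,
# initial warning-system recommendation at 9:45 a.m., final recommendation
# at 11:08 a.m., state evacuation order at 11:55 a.m.
FMT = "%H:%M"
events = {
    "general_emergency": datetime.strptime("09:00", FMT),
    "initial_recommendation": datetime.strptime("09:45", FMT),
    "final_recommendation": datetime.strptime("11:08", FMT),
    "evacuation_ordered": datetime.strptime("11:55", FMT),
}

def minutes_between(a, b):
    """Elapsed minutes from event a to event b."""
    return int((events[b] - events[a]).total_seconds() // 60)

print(minutes_between("general_emergency", "initial_recommendation"))    # 45
print(minutes_between("initial_recommendation", "final_recommendation")) # 83
print(minutes_between("general_emergency", "evacuation_ordered"))        # 175
```

The arithmetic makes the coordination cost concrete: nearly three hours elapsed between the General Emergency declaration and the evacuation order.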
4.3 THE VIEW FROM THE STATE EOC

The state's greatest difficulty was gathering enough information to make an informed decision regarding evacuations.
A State of General Emergency was declared at 9:00 a.m., and the evacuation decision did not become effective until 11:55 a.m.
Part of the decision-making confusion stemmed from the utility declaration of a general emergency with a recommendation for no protective action.
The state radiological emergency team was unable to get more than a minimum of information transmitted from the utility for up to 1 h before and after the utility declared a general emergency and needed more data regarding plant status, projected doses, and other problems.
Though the state and the utility had standardized dose conversion factors, isopleths, etc.,
some confusion remained regarding population doses versus individual doses and in some field measurements.
A smoother flow of information from the utility to the state EOC would have been helpful, as would quicker assessments and recommendations from the state radiological team to the state EOC director.
Once the evacuation decision was declared, the simulation of evacuation was accomplished quickly, though both the emergency medical services and the Red Cross stated they would prefer more lead time for evacuation.
The major events as posted at the state EOC log board are summarized in Appendix F, Table F-1.
4.4 THE VIEW FROM LOCAL EOCs

The local county EOCs experienced their greatest difficulties in communications as well, particularly those with the utility.
The infrequency of incoming information and their inability to open better communication links with the utility were frustrating.
The local EOCs would like to have a utility representative at the local EOCs or at least one person per county in the utility EOF designated to provide information directly to the local EOCs during emergency test exercises.
Since the county units maintain ongoing relationships with the local and state units dealing with different types of emergencies, it is only the utility which does not have frequent contact and therefore has fewer information channels through which to communicate.
For example, the local EOCs were not advised of the first radiation release for 2 h.
Because communications were slow, there were times when contradictory information was received and it was hard to verify what was happening.
Particularly when the utility EOF had to be evacuated, communications were lost for a significant time period at a vital point in the test exercise.
All local organizations would have welcomed more frequent updates of information, and better radio communication links and equipment are needed, although all thought the amateur radio network instituted for the first time was helpful.
The shelter program worked well, but the Red Cross could have used earlier notification that evacuation was imminent.
Lack of communication between state and local units about a decision to distribute protective drugs (KI) to emergency workers created some problems on the county level.
The county EOC was unsure what situation prompted this action and felt that the local health personnel should at least be informed of the decision, if not allowed to participate in making such decisions.
Better radio communications are still needed, even though the amateur radio network which was used in this exercise for the first time was extremely beneficial.
However, the state radio van and even the phone line to the utility did not work part of the time during the test exercise.
Some concern was expressed about the initial message on the Emergency Broadcast System which stated that the governor had taken over emergency operations before the public had been told that there was any problem at the power plant.
This, however, may have been a function of the brevity of the overall test exercise; everything had to be done in a highly compressed time frame.
5. FINDINGS AND THEIR IMPLICATIONS

In this section, we present the conclusions from each of the three research tasks and discuss the implication of the findings for regulatory change.
Each task is discussed and a final set of implications is presented.
5.1 FINDINGS FROM THE PLAN REVIEWS

The two purposes of the plan reviews were to determine whether plans would constrain a comprehensive response to an emergency and to determine if planning activities were reasonably coordinated.
In general, the review of emergency plans suggested that there is adequate interface between utilities and offsite organizations in the planning process and that this interaction has led to well-coordinated planning documents.
Although this conclusion is generally true, we also noted several areas in which interface during the planning process can be improved.
Specifically, we found that
(1) All major response functions are covered by the plans.
(2) Few conflicts about responsibilities exist.
Those that do can be alleviated through further communication between the utility and the appropriate offsite organization in the planning process.
(3) Plans do a good job of describing the responsibilities of other organizations in the emergency response system.
Several plans reviewed could be improved by providing more details about the roles of other organizations.
(4) One area that could be improved concerns situations where organizations share responsibilities on the same response task.
Mechanisms for dividing or coordinating the shared responsibilities are lacking in the plans.
(5) A second area requiring improvement concerns the level of detail in plans about the responsibilities of Federal agencies.
Although the utility plan clearly specifies the role of, and coordination with, NRC, the level of information about other Federal agency involvement does not provide an adequate picture of Federal responsibilities.
5.2 FINDINGS FROM THE CASE STUDIES

The case studies attempted to measure and assess the presence of factors that help promote cohesive response efforts both within and between organizations.
We found that, internally (intraorganizationally), organizations are quite cohesive. Cohesion tends to be weaker in relationships across organizational response networks. Organizations demonstrated flexibility in their response systems, which will help increase the effectiveness of response.
Specific findings include the following:
(1) The major barrier to effective interface among organizations is communication ability and hardware.
(2) Response cohesiveness is constrained, in addition, by uncertainty over who should be communicating with whom and by a lack of knowledge within an organization about the roles of other organizations.
(3) Response organizations showed some variability in their overall levels of cohesion as measured by the research.
Key organizations such as the utility, however, rated high on an index of cohesiveness.
(4) Local organizations have the weakest interfaces with the utility and must rely on the state for information and guidance.
(5) Individual personalities within organizations play a strong role in facilitating or preventing interaction in the planning process.
This will vary from site to site and with changes in personnel within organizations.
(6) Offsite organizations are constrained in their interfaces with utilities by a lack of equipment and trained personnel.
This is particularly evident in radiological monitoring and assessment.
5.3 FINDINGS FROM THE TEST EXERCISE

The test exercise was used to determine if comprehensive planning led to a coordinated response and to assess how cohesiveness may change during a simulated response.
Overall, we found that the observed response was not as well coordinated as the planned response.
This was due mainly to problems in implementing procedures, not to inadequate plans and procedures.
In addition, we found that internal organization cohesiveness increased during the exercise but that interorganizational cohesiveness decreased.
Specific findings include the following:
(1) Communications difficulties created the major problems in achieving effective interfaces.
This was exacerbated by confusion over the proper lines and contents of communications.
(2) Legitimacy posed an interface problem for some organizations.
The utility's attitudes toward offsite organizations with respect to their technical ability and resources were a constraint on interaction.
Furthermore, offsite organizations did not fully trust utility personnel regarding communications about plant status and protective action recommendations.
(3) The offsite notification procedure is sound in principle, but poor implementation created problems.
5.4 IMPLICATIONS FOR EMERGENCY PLANNING REGULATIONS
The purpose of this research was to determine if existing regulations resulted in adequate interface between utilities and offsite organizations in emergency planning and response.
To address this question, we attempted to assess two elements of emergency management that could be used to measure the level of interface:
the comprehensiveness and cohesiveness of planning and response.
Comprehensiveness of planning was determined by a detailed review of emergency plans.
Comprehensiveness of response was determined by the evaluation of a test exercise and case studies of two response networks.
To assess cohesiveness, we identified, from reviewing relevant literature, a set of factors associated with cohesive response.
The two case studies were used to measure the presence of these factors in planning, and a test exercise was observed to measure the presence of these factors in response.
Findings suggest that implementation of existing regulations has led to comprehensive planning efforts.
Minor improvements to plans can be made, but they fall within the scope of existing rules, regulations and implementation guides.
On these grounds, no regulatory changes are warranted.
Findings suggest that regulations will lead to fairly comprehensive responses to an emergency.
Evidence from the case studies and the test exercise suggests that problems are due to poor execution of emergency plans and lack of resources.
Existing regulations explicitly deal with these problems, which can be reduced by better enforcement of regulations.
More difficult and abstract to assess is the level of cohesiveness in and between emergency organizations.
Our work revealed that cohesiveness as measured in the planning process is strong within the organizations but somewhat weaker between organizations.
The difference was even more pronounced for cohesiveness in emergency response.
The major reason for lack of cohesiveness was poor communications and a lack of knowledge about whom to communicate with.
This is made more problematic by a lack of legitimacy among organizations; that is, some organizations do not have confidence in others or do not believe they play an important role.
In a related fashion, personalities of individuals within organizations constrain effective interaction and heighten the legitimacy problem.
We feel that the communication problem can be solved within the existing regulatory framework.
It is chiefly a matter of better equipment, better training, and greater interaction in the planning process.
The problems and potential resolutions are acknowledged and understood by the organizations involved.
The legitimacy and personality issues are not addressed by current regulations.
Although they contribute to interface problems, it is our belief that they cannot be effectively or even inefficiently solved by regulatory change.
Whether they can be reduced by enforcement of existing regulations is also doubtful.
Any effective solution to the legitimacy problem must come from within the organizational system itself and not through regulatory change, which could create more problems than it could solve.
APPENDIX A

DYNAMICS OF INTERFACE: ORGANIZATIONS DURING EMERGENCIES

BACKGROUND

An elaborate body of literature exists on organizational behavior and interorganizational relationships. A subset of this literature concerns the response of organizations to a variety of disasters.
Several attempts to summarize this literature already exist (Mileti, 1980; Quarantelli and Dynes, 1977; Mileti et al., 1975; Barton, 1969; Dynes, 1970; Fritz, 1961, 1968).
Organizational response to disaster is not unique.
On both theoretical and applied grounds, organizational behavior relationships in disaster reflect the more general findings of organizations research.
In this section, we review what is known about why organizations are effective or not in response to disaster, as well as what is known about why organizational coordination does or does not occur. To do so, we have organized the findings into four categories based on the intra- and interorganizational dichotomy and the predisaster preparedness and warning versus the disaster response period.
From this review, we are able to specify the factors that lead to cohesive planning and response.
ORGANIZATIONAL EFFECTIVENESS IN DISASTER WARNING RESPONSE

Studies that have explicitly focused on the behavior of organizations during disaster warnings have been scant compared to other types of organizational studies of disasters.
Some dozen or so studies, however, have revealed that several factors do seem to affect the effectiveness of organizations during disaster warnings.
A general conclusion of these studies is that disaster experience enhances the ability of an organization to participate in the warning process, as well as respond to warnings (Mileti et al., 1975; Barton, 1970; Dynes, 1970; McLuckie, 1970; Anderson, 1969; Moore, 1956; Eliot, 1932).
The capacity of an organization for communication (Leik, Carter, and Clark, 1981; Mileti et al., 1975; Kennedy, 1970) has also been pointed out as central to organizational effectiveness. A third factor that has been documented as important for organizational warning effectiveness is the perceived probability of the disaster (Anderson, 1969; Fritz, 1961; Fritz and Williams, 1957; Spiegel, 1957; Instituut Voor Sociaal Onderzoek Van Het Nederlandse Volk Amsterdam, 1955).
Organizations are quite reluctant to participate in warning dissemination if organizational officials are not reasonably confident that the hazard will materialize.
Fear of negative public reactions for issuing a false alarm is a main reason.
The fourth factor research has shown to be linked with the effectiveness of organizations during warnings is the structure of the organization itself.
Factors of structure shown to have an influence on the ability of an organization to mobilize in the preimpact situation are varied.
Mobilization is typically quicker and less problematic for organizations that have a dispersed rather than a centralized and formalized decision-making structure (McLuckie, 1970; Instituut Voor Sociaal Onderzoek Van Het Nederlandse Volk Amsterdam, 1955) and little role conflict for organizational members (Thompson and Hawkes, 1962).
ORGANIZATIONAL EFFECTIVENESS IN DISASTER IMPACT RESPONSE

A large number of studies have been performed which provide a sound basis for concluding what determines the effectiveness of organizations, as individual entities, in their disaster response.
The findings of these studies, when brought together, point out that seven key ingredients are necessary for an effective response.
The first of these is labeled normativeness.
That is, it has been found (Adams, 1970; Anderson, 1969; Dynes, Haas, and Quarantelli, 1967) that the less an organization has to change its predisaster functions and role to perform in a disaster, the more effective is its disaster response.
In essence, organizations whose daily operations can be switched to the topic of the emergency at hand do better than organizations that must adopt new operations that are unique to the emergency.
Second, and closely linked to the notion of normativeness, is the ability of an organization to be flexible.
Organizations that are better able to vary from standard operating procedures during the disaster are typically more effective than those that cannot be flexible (Drabek et al., 1981; Kreps, 1978; Stallings, 1978; Weller, 1972; Brouillette and Quarantelli, 1971; Haas and Drabek, 1970; Drabek and Haas, 1969a, 1969b; Dynes and Warheit, 1969; Warheit, 1968; Dynes, 1966; Moore, 1964; Barton, 1962; Form and Nosow, 1958). An organization that is rigid in structure, in general, has a difficult time dealing with the uncertainty of disaster (Dynes, 1969) and adapting to its needs. The result is that effectiveness suffers.
A third major factor affecting the ability of an organization to be effective in disaster emergencies is work definition. Extensive evidence exists on which to conclude that organizations that know what to do, how to prioritize work, and how to administer the activities are more effective.
The issue of work definition is particularly important in organizations for which emergency work is not a daily routine.
In this case, definition of disaster or emergency roles as part of emergency operations is essential (Haas and Drabek, 1973; Adams, 1970; Kennedy, 1970; Dynes, 1969; Thompson, 1967; Barton, 1962; Form and Nosow, 1958).
Organizations must be able to see emergency response as their job and have clearly defined roles to play.
In addition, the internal authority structure of an organization must be clearly spelled out (Dynes, 1969; Form and Nosow, 1958). This need is particularly acute since authority in organizations during emergencies typically shifts from what it is during routine operations. To complement authority, the work domains or territory of each organization, as distinct from other organizations, should be clearly defined (Dynes, Quarantelli, and Kreps, 1972).
Fourth, adequate resources are necessary for effective response (Kreps, 1978; Dynes, Quarantelli, and Kreps, 1972).
It has also been suggested (Form and Nosow, 1958) that interorganizational resource dependence helps ensure an adequate supply of resources. A fifth important ingredient for effectiveness is information and communication ability (Quarantelli, 1970). Organizations that are able to effectively get and share information with others typically enhance effectiveness (Dynes, 1969).
Sixth, organizational legitimation, or an organization's claim to be able to do its emergency-tied work with approval and recognition from other organizations, is related to effective response.
Finally, internal organizational cohesion between members is an impetus for organizational effectiveness. Commitment (Dynes, 1970; Quarantelli and Dynes, 1977), group cohesion (Form and Nosow, 1958), and a lack of role conflict (Dynes, 1969; Barton, 1962; Thompson and Hawkes, 1962; Form and Nosow, 1958) typically all signify that organizational workers are ready to get the job done.
INTERORGANIZATIONAL COORDINATION AND EFFECTIVENESS IN DISASTER WARNING RESPONSE

Little systematic research has been performed on this topic beyond a few studies (cf. Leik, Carter, and Clark, 1981; Mileti et al., 1975; McLuckie, 1970; Anderson, 1969). The results of these efforts indicate that the interorganizational elements essential for effective interorganizational warning-tied interaction and communication are definition of an organizational role in warning (cf. Mileti et al., 1975; Kennedy, 1970) and preemergency patterns of interorganizational communication on which to build during the emergency (cf. Barton, 1969; Dynes, 1970). Put simply, for warnings and information flow between organizations to be effective, organizations must define dissemination as part of their job, and communication will still favor familiar lines. In addition, effectiveness is enhanced if information is clear, unambiguous, and communicated in a speedy fashion (Anderson, 1969).
INTERORGANIZATIONAL COORDINATION AND EFFECTIVENESS IN DISASTER IMPACT RESPONSE

A rich research history has explored the nature and character of interorganizational relationships in emergencies in an effort to trace through their impact on emergency response effectiveness. An overriding conclusion of this research is that interorganizational coordination enhances the effectiveness of the overall response to the emergency. This research has shown that many concepts form a basis for the effectiveness of overall response. When the different approaches of researchers to the topic are taken into consideration, however, the array of concepts fits well into four general categories: (1) domain consensus and role specification, (2) network definition and integration, (3) communication, and (4) autonomy maintenance.
Domain consensus and role specification refer to the degree to which each organization knows what it and other organizations are to do during the emergency (Dynes, 1978; Kreps, 1978; Dynes, Quarantelli, and Kreps, 1972). Put simply, the effectiveness of overall response to an emergency escalates if all responding organizations know who is to do what and if those boundaries are well understood. Knowing, however, is not enough to ensure effective response; it is also necessary that organizations view each other's jobs as being important or legitimate (Dynes, 1978, 1969; Stallings, 1978; Warheit, 1970).
This facilitates clear lines of authority between organizations (Drabek et al., 1981; Thompson and Hawkes, 1962; Rosow, 1955). Clear lines of authority between organizations help avoid conflict and enable conflicts to be resolved when they do occur, although not as well as does a predetermined mechanism for settling disputes.
Integration across organizations is a second important factor and is easier to achieve if participating organizations interact normatively during nonemergency periods (Drabek et al., 1981; Dynes, 1978; Brouillette, 1971; Form and Nosow, 1958; Clifford, 1956). Organizations that are used to interacting with each other are easier to coordinate for interaction in an emergency. Coordination and integration across organizations in emergencies also escalate as a function of organizations having overlapping members or the same people being on boards, panels, committees, and the like across organizations (Dynes, 1969). The notion of interlocking membership suggests that interaction, communication, and coordination are facilitated if people have overlapping organizational roles. In this same vein, the existence of boundary personnel, people who are charged with interorganizational communications, usually guarantees that interaction occurs (Dynes, 1969, 1978). Interaction escalates coordination, and coordination enhances effectiveness (Drabek et al., 1981; Dynes, Quarantelli, and Kreps, 1972; Dynes, 1970; Warheit, 1970; Barton, 1969; Parr, 1969; Drabek, 1968; Fritz and Marks, 1954; Kutak, 1938).
An additional factor that has been shown to facilitate network integration and coordination is knowledge about other organizations (Dynes, 1978). If organizations understand the internal operations and structure of other organizations, it is easier for them to coordinate and communicate with those other organizations. The general idea of network definition and integration for interorganizational coordination, although comprised of several concepts, is straightforward. Interorganizational coordination increases as a result of work to integrate the organizations participating in the interorganizational emergency response.
A key device for enhancing integration is the construction of resource linkages across organizations (Drabek et al., 1981; Dynes, 1978, 1970b; Kreps, 1978; Stallings, 1978; Quarantelli and Dynes, 1977; Warheit, 1968; Demerath and Wallace, 1957). Resource sharing and interdependence across organizations sometimes foster other avenues for interorganizational linkages. The major conclusion of this research suggests that coordination and effectiveness of interorganizational emergency response increase as a result of prior efforts to cast participating organizations into an integrated response network.
Third, communication between organizations is another essential ingredient for an interorganizational emergency response system to be coordinated and for heightened effectiveness (Drabek et al., 1981; Dynes, 1978; Brouillette, 1971; Quarantelli, 1970; Dacy and Kunreuther, 1969). Efficient interorganizational communication is essential for the provision of information between organizations regarding their specialized roles and tasks and for the quick dissemination of news about the changing context of the emergency.
Finally, autonomy maintenance (Mileti et al., 1975; Dynes, 1970; Dynes and Warheit, 1969; Parr, 1969; Warheit, 1968; Quarantelli and Dynes, 1967; Thompson and Hawkes, 1962), or the struggle on the part of individual organizations to resist giving up autonomy, is a major constraint to effectiveness. A requisite for an effective response to an emergency is that participating organizations be convinced that inconsequential losses of autonomy are in the overall interest of an effective response.
SUMMARY AND CONCLUSIONS

The behavior of organizations is a critical component of emergency planning and response. Moreover, factors important to casting the effectiveness of that response are not limited to organizational ones, but also extend to those which profile the type and intensity of interorganizational relationships that go on between organizational actors. Table A-1 presents a summary of factors important for emergency response coordination and organizational and interorganizational system effectiveness which were derived from the literature review.
Three key principles critical to effective response emerge from this table.
First, organizations must know what they are supposed to do and who is to do it.
Second, organizations must be integrated with other organizations.
Third, they must maintain flexibility.
Each specific factor is related to one or more of these general principles.
The specific factors and the three overriding principles will be used to develop evaluation criteria and to guide the analyses used in this research.
Table A-1. Summary of organizational factors related to effective response

Planning and Warning Period

  Organizational:
  - Disaster experience
  - Communications
  - Perceived probability of disaster
  - Organizational structure
    - Flexible decision making
    - Role conflict (territory)

  Interorganizational Relations Network:
  - Role definition
  - Communication

Disaster Response Period

  Organizational:
  - Normativeness
  - Flexibility
  - Work definition
    - Role definition
    - Priority setting
    - Authority
    - Knowledge
  - Resources
  - Information and communications
  - Legitimacy
  - Cohesion

  Interorganizational Relations Network:
  - Domain consensus (boundaries)
  - Legitimacy
  - Dispute resolution
  - Authority
  - Interaction
  - Flexibility
  - Knowledge
  - Resource linkages
  - Communication
  - Autonomy
APPENDIX B

[Tables B-1 and B-2: detailed observation listings; text illegible in source scan]
APPENDIX C
DISCUSSION TOPICS ON ORGANIZATIONAL INTERFACE

INTRAORGANIZATIONAL RELATIONSHIPS

1a. How does your organization fit into the broader scope of emergency response activities in the event of a nuclear power plant accident?
b. Is it clear what your responsibilities are?
2. What things do you anticipate having to do in an emergency that are not covered in the emergency plan and implementing procedures?
3a. In an emergency does the structure of authority within your organization change from normal situations?
b. If yes, how? If no, describe.
4. Is it clearly understood by all who is in charge?
5a. Do anyone's emergency responsibilities overlap with others' in your organization?
b. If yes, how?
6. How clear are the priorities for your organization's emergency response activities?
7. In what ways do your emergency response activities differ from what you do on a daily routine?
8a. Are there people within your organization that you feel don't understand what you do in an emergency?
b. Are there people who don't agree about their own emergency jobs?
9. Are there any problems in communicating with others in your organization about emergency response activities? Why or why not?
10a. How closely are you required to follow procedures in the emergency plan and implementation procedures?
b. What are the consequences of not following them?
11a. Does your organization have adequate resources to carry out its emergency responsibilities?
b. If not, what is lacking?
INTERORGANIZATIONAL RESPONSE

12. Do you have the means of obtaining adequate resources?
13. Is it clear which other organizations you will interact with in an emergency? Please explain.
14a. Do you interact with these organizations as part of your normal activities?
b. How frequently?
15. Is someone specifically responsible for communicating with each of these other organizations during an emergency?
16. Is it clear which other organizations your organization can tell what to do in an emergency? Which can tell yours what to do? Please explain.
17a. Do any other organizations have the same responsibilities in an emergency?
b. If so, where do yours end and theirs begin?
18a. Are there any other organizations that are difficult to work with?
b. In what ways?
19. Do you feel that regulations about what to do in an emergency are a burden?
20a. Are there any other organizations who do not understand what you do in an emergency?
b. Who feel what you do is not important?
FLEXIBILITY

21. What problems do you anticipate in communicating with other organizations in an emergency?
22. Do you anticipate that the control your organization has over what it does would be lost or changed in an emergency?
23a. Do you feel the entire plan and strategy for emergency response would work successfully if an emergency occurred?
b. What are the weakest links?
APPENDIX D
TEST EXERCISE EVALUATION TOPICS

1. What was your role in the exercise?
2. Is that the role you would actually perform in an emergency?
3. Was it clear what your responsibilities were in the exercise?
4. Was everything you did in the exercise described in the emergency plan and Emergency Operating Procedures?
5. Was it clearly understood who was in charge? Explain.
6. Did anyone's emergency responsibilities overlap with yours? How did you coordinate your efforts? What problems did this cause?
7. Was your role in the emergency exercise similar to your normal work responsibility?
8. Was it clear how you were to set priorities among things to do?
9. Did anyone disagree with you over your responsibilities or tasks?
10. Were there others who did not understand your responsibilities? Who did not feel they were important?
11. Did you have adequate resources to carry out your responsibilities?
12. Did you have adequate means of obtaining more resources?
13. Was it clear with whom else you were to communicate? To work with (if different)?
14. Have you ever interacted with these people before? Describe. Was the nature of your interaction different in the exercise?
15. Did you have any communications problems? Explain.
16. Was there anyone difficult to interact with?
17. How closely did you follow written procedures? Why?
18. Did you ever feel things were not under control? Describe.
19. What were the weakest aspects of the emergency response organization?
APPENDIX E
DATA FROM THE CASE STUDIES AND EXERCISE
Table E-1. Case study one matrix

                                      State          Local
Factors                     Utility   A B C D E      A B C D E F

INTRAORGANIZATIONAL RELATIONSHIPS
Role definition                C      C U C C C      C C U C C U
Authority                      C      C U C C C      C C C C C C
Territory                      C      C U C C C      C C C C C N
Priority setting               C      C C C C C      C C C C C N
Normativeness                  N      C C U C N      C C C C C C
Legitimacy                     C      C C C C C      C C C C C C
Communications ability         C      C C C U N      C C C C C N
Knowledge                      C      C C U N C      N C C C C N

FLEXIBILITY
Formalization                  N      C C C C C      C C C C C C
Adaptability                   C      C C C C N      N C C C C C
Control                        C      C N C C C      C C C C C N

INTERORGANIZATIONAL RESPONSE NETWORK
Domain                         C      N U C C C      C C C C C N
Dispute resolution             U      ? C U ? ?      N N C U U N
Legitimacy of roles            C      C N C C C      C C C C C C
Resource adequacy              C      N C U C C      N N C N C C
Autonomy                       C      U C N C C      C N C C C C
Communications ability         N      C C U N C      C C U U U C
Authority                      C      C N N C C      C N C C C N
Interaction clarity            N      C C N C C      N C U U U U
Knowledge                      C      C C N N C      N N N N C U

In this table, C = cohesive value, U = uncohesive value, N = neutral value, and ? = inadequate data.
Table E-2. Case study two matrix

                                      State Govt.    Local Govt.
Factors                     Utility   A B            A B C

INTRAORGANIZATIONAL RELATIONSHIPS
Role definition                C      C C            C C C
Authority                      C      C C            C C C
Territory                      C      C ?            C C C
Priority setting               C      C U            C C C
Normativeness                  C      C C            C C C
Legitimacy                     C      C C            C C C
Communications ability         C      C C            C C C
Knowledge                      C      C C            C C C

FLEXIBILITY
Formalization                  C      C C            C C C
Adaptability                   C      C C            C C C
Control                        C      C C            C C C

INTERORGANIZATIONAL RESPONSE NETWORK
Domain                         C      C N            C C C
Dispute resolution             ?      ? ?            ? ? ?
Legitimacy of roles            C      C C            C C C
Resource adequacy              C      C C            C C C
Autonomy                       C      U ?            C C ?
Communications ability         N      C N            C N C
Authority                      C      C U            C N C
Interaction clarity            C      C C            C C C
Knowledge                      C      C C            C C N

In this table, C = cohesive value, U = uncohesive value, N = neutral value, and ? = inadequate data.
Table E-3. Test exercise matrix

                                      State        Local
Factors                     Utility   A B C        A B C D E

INTRAORGANIZATIONAL RELATIONSHIPS
Role definition                C      C N C        C C C C C
Authority                      C      C U C        C C C C C
Territory                      C      C C C        C C C C C
Priority setting               C      C C C        C C C C C
Normativeness                  C      C C C        C C C C C
Legitimacy                     C      C C C        C C C C C
Communications ability         C      U U N        U C C C C
Knowledge                      C      C N C        C C C C C

FLEXIBILITY
Formalization                  U      C C C        C C C C C
Adaptability                   C      C C C        C C C C C
Control                        C      C C N        N C C C C

INTERORGANIZATIONAL RESPONSE NETWORK
Domain                         C      N N C        C C C C C
Dispute resolution             U      ? U N        ? ? ? N U
Legitimacy of roles            C      C U C        C C C C U
Resource adequacy              N      N C N        N U C C N
Autonomy                       N      C C C        C N C C N
Communications ability         U      U U N        U N U U U
Authority                      C      C U C        C N C C C
Interaction clarity            U      U U C        U U U U U
Knowledge                      C      C N C        N C C C N

In this table, C = cohesive value, U = uncohesive value, N = neutral value, and ? = inadequate data.
APPENDIX F
DESCRIPTION OF THE EXERCISE
Table F-1. Major Exercise Events

Abbreviated controller scenario from exercise plan

2000: Unusual Event reported by personnel at Plant
0410: Radioactive material detected in release piping
0606: Analysis of discharge canal indicates radioactive material
0730: Jet pump problems
0800: Reactor scram
0801: Rods stuck out, reactor critical
0805: Inject stand-by liquid control
0940: Diesel generator #3 on fire
0950: Reactor subcritical
1018: Temperature increase and increase of airborne radioactive gases; gases in reactor building
1020: Heavy steam in minipipe tunnel
1025: Radiation alarm sounded in reactor building
1031: Auxiliary operator missing in reactor building
1040: Auxiliary operator found, seriously injured and contaminated
1040: Radiation levels in reactor building at 40 R/h
1045: Fire extinguished in diesel generator
1100: Reactor building radiation levels at 68 R/h
1200: Reactor building radiation levels at 136 R/h
1200: (Based on above events, it is expected that the state radiation protection section, the Nuclear Regulatory Commission (NRC), the Utility Technical Support Center, and General Electric will jointly decide to do an orderly blowdown and depressurization. Expected reaction: state activates sirens and EBS message.)
Table F-1. Major Exercise Events (continued)

Major Events at the Utility

0519: Unusual Event declared
0622: Release into canal reported
0625: Alert status declared
0738: Manual scram
0743: 16 rods fail to scram
0800: Site Emergency
0900: General Emergency declared because of leak in dry well
0905: Site evacuation sounded
0920: Decision made to tell state that evacuation may be necessary
0945: Recommend state activate warning system to notify public as an advisory but no evacuation notification
0945: EOF activated
1108: Evacuation recommendation issued
1220: EOF begins relocation
Table F-1. Major Exercise Events (continued)

State EOC Posted Log Board

0820: Scrammed reactor. 16 fuel rods failed. Damage to core.
0912: General Emergency with no Protective Action recommended
0945: Plant did site evacuation
1015: Plant EOF at site evacuated
1030: County 1 reported 306 in shelter
1155: Decision at SERT to evacuate 4 sectors (within 2 miles of plant)
1200: County 1 opening 2 shelters
1245: KI being administered to emergency workforce
1245: Evacuation of 3 sectors complete
1300: Evacuation of 4th sector complete

Some county tests, etc. were conducted out of sequence.
REFERENCES Adams, David.
"The Red Cross:
Organizational Sources of Operational Problems." American Behavioral Scientist 13(3) (Jan.-Feb.), 392-403 (1970).
Anderson, William A.,
" Disaster Warning and Communication Processes in Two Communities," Tne Journal of Communication 19(2) (June),92-104 (1969).
Barton, Allen H, "The Emergency Social System," in Man and Society in Disaster, George W. Baker and Dwight W. Chapnan, Eos. (Basic Booxs, New Y ork, 1962).
Barton, Allen H, Communities in Disaster (Anchor Books, New York,1969).
.Brouillette, John R., " Community Organizations Under Stress:
A Study of Interorganizational Comm;nication Networks During Natural Disasters," Dissertation, The Ohio State University, Columbus, 1971.
Brouillette, John R., and E. L. Quarante111, " Types of Patterned Variation in Bureaucratic Adaptations to Organizational Stress,"
Sociological Quarterly 41(Winter), 39-46, (1971).
Clifford, Roy A., The Rio Grande Flood:
A Comoarative Study of Border Communities, National Acadeny of Sciences / National Researcn Council Disaster Study #7, National Acadeny of Sciences, Washington, D.C., 1956.
Dacy, Douglas C., and Howard Kunreuther, The Economics of Natural Disasters (The Free Press, New York, 1969).
Demerath, Nicholas J., and Anthony F. C. Wallace, " Human Adaptation to Disaster," ' Human Organization 16(Summer), 1-2 (1957).
Drabek, Thones E., Disaster in Isle 13, Disaster Research Center, Ohio State University, Columbus,1968.
Drabek, Thomas E. et al., Managing Multiorganizational Emergency Responses: Emergent Search and Rescue Networks in Natural Disaster and Remote Area Settings, Institute of Benavioral Science, University of Coloraco.
Boulder, Colorado,1981.
Drabek, Thomas E., and J. Eugene Haas, " Laboratory Simulation of Organizational Stress," American Sociological Review 34(2) (April) t (1969a).
l
.~
i R-2 Drabek, Thomas E., and J. Eugene Haas, "How Police Confront Disaster,"
Transaction 6(May), 33-38 (1969b).
Drabek, Thomas E., and Enrico L. Quarantelli, "When Disaster Strikes."
Journal of Apolied Social Psychology 1(2) 187-203 (1971).
Dynes, Russell R, " Theoretical Droblems in Disaster Research," Bulletin of Business Research 41 Sept.). 7-9 (1966).
Dynes, Russell R., Organized Behavior in Disasters: Analysis and Conceptualization, Disaster Researen Center, The Onio State Uni versity, Columbus,1969.
Dynes, Russell R., Organized Behavior in Disaster, (D. C. Heath and Comany, Lexington, Mass.,1970).
Dynes, Russell R., "Interorganizational Relations in Communities Under Stress," in Disasters: Theory and Research, E. L. Quarantelli, Ed.,
Sage, Beverly Hills,1978), pp. 49-64 Dynes, Russell R., J. Eugene Haas, and E. L. Quarantelli, " Administrative Methodological, and Theoretical Problems of Disaster Research,"
Indian Sociological Bulletin IV(July), 215-227, (1967).
Dynes, Russell R., E. L. Quarante111, and Gary A. Kreps, A Perspective on Disaster Planning, Disaster Research Center, The Onio State Uni versi ty, Columous,1972.
Dynes, Russell R., and George Warheit, " Organizations in Disasters," EMO National Digest 9(April-May), 12-13 (1969).
Eliot, Thomas D., "The Bereaved. Family," The Annals of the American Academy of Political and Social Science (March),
1-7, (1932).
Form, William H., and Sigmund Nosow, Community in Disaster, (Harper &
Brothers, New York,1958).
Fritz, Charles E., " Disaster," in Contemporary Social Problems, Merton and Nisbet. Eds. (Harcourt, New York, 1961), pp. 651-94 Fritz, Charles E., " Disasters," in International Encyclopedia of the Social Sciences (Macmillan, New York, 1968), pp. 202-207 Fritz, Charles E., and Eli Marks, "The NORC Studies of Human Behavior in disaster," Journal of Social Issues 10, 26-41 (1954).
Fritz, Charles E., and Harry B. Williams, "The Human Being in Disasters:
A Research Perspective," The Annals of the American Academy of Political and Social Science 309(Jan.), 42-51 (1957).
9 R-3 Haas, J. Eugene, and Thones E. Drabek, 9 Community Disaster and System Stress:
A Sociological Perspective," in Social and Psycholocical Factors in Stress, Joseph E. McGrath, Ed. (Holt, Rinehart and winston, Inc., New York,1970), pp. 264-86 Instituut Voor Sociaal Onderzoek Van Het Nederlandse Volk Amsterdam, Studies in Holland Flood Disaster 1953, Committee on Disaster Stucies of tne National Acadeny of Sciences / National Research Council, Volumes I-IV, National Acadeny of Sciences, Washington, D. C.,
1955.
Kennedy, Will C.,
" Police Departments:
Organization and Tasks in D sister," American Behavioral Scientist 13(3) (Jan.-Feb.), 354-61 (1970).
K reps, Gary A., "The Organization of Disaster Response:
Some Fundamental Theoretical Issues," in Disasters: Theory and Research, E. L.
Quarantelli, Ed. (Sage, Beverly Hills,1978), pp. 65-86.
Kutak, Robert I., "The Sociology of Crises:
The Louisville Flood of 1937," Social Forces 17, 66-72 (1938).
Leik, Robert K., T. M. Carter, and J. P. Clark, Community Response to Natural Hazard Warnings, University of Minnesota,1981.
McLuckie, Benjamin F., "A Study of Functional Response to Stress in Three Societies," Unpublished Doctoral Dissertation, Departments of Sociology and Anthropology," The Ohio State University, Columbus, 1970.
Mileti, Dennis W., " Human Adjustment to the Risk of Environmental E x t renes, " Sociology and Social Research 64, 328-47 (1980).
Moore, Harry Estill, "Toward a Theory of Disaster," American Sociological Review 21(Dec.), 734-37 (1956).
Parr, Arnold Rich.ard, " Flood Preparation - 1969:
Observations Concerning the Southern Manitoba Spring Flood Preparations," EMO National Digest 9(June-July), 25-27 (1969).
Quarantelli, Enrico L.,
"The Community General Hospital:
Its Immediate Problems in Disaster," American Behavioral Scientist 13(3)
(Jan.-Feb.), 380-91 (1970).
Quarantelli, E.
L., and Russell R. Dynes, " Response to Social Crisis and Disaster," Annual Review of Sociology 3, 23-49 (1977).
Rosow, Irving L., "Conflict of Authority in Natural Disasters," Unpublished Doctoral Dissertation, Department of Social Science, Harvard University, Cambridge, Mass., 1955.
Spiegel, John P., "The English Flood of 1953," Human Organization 16 (Summer), 3-5 (1957).
Stallings, Robert A., "The Structural Patterns of Four Types of Organizations in Disaster," in Disasters: Theory and Research, E. L. Quarantelli, Ed. (Sage, Beverly Hills, 1978), pp. 87-104.
Thompson, James D., Organizations in Action (McGraw-Hill, New York, 1967).
Thompson, James D., and Robert W. Hawkes, "Disaster, Community Organization and Administrative Process," in Man and Society in Disaster, George W. Baker and Dwight W. Chapman, Eds. (Basic Books, New York, 1962), pp. 268-300.
Warheit, George Jay, "The Impact of Four Major Emergencies on the Functional Integration of Four American Communities," Dissertation, The Ohio State University, Columbus, 1968.
Warheit, George Jay, "Fire Departments: Operations During Major Community Emergencies," American Behavioral Scientist 13(3) (Jan.-Feb.), 362-68 (1970).
Weller, Jack Meredith, "Innovations in Anticipation of Crisis: Organizational Preparations for Natural Disasters and Civil Disturbances," Dissertation, Department of Sociology, The Ohio State University, Columbus, 1972.
NUREG/CR-3524
ORNL-6010
Dist. Category RX

INTERNAL DISTRIBUTION

1. R. B. Braid
2-6. R. M. Davis
7. W. Fulkerson
8. P. M. Haas
9. T. P. Hamrick
10. S. V. Kaye
11. O. H. Klepper
12. A. P. Malinauskas
13. T. C. Morelock
14-18. E. Peelle
19-23. J. H. Reed
24-27. J. W. Sims
28. T. J. Wilbanks
29-30. Center for Energy & Environmental Information
31. Central Research Library
32. Document Reference Section
34. Laboratory Records
35. Laboratory Records (RC)
36. ORNL Patent Office

EXTERNAL DISTRIBUTION

37. Director of Energy Programs and Support, Department of Energy, Oak Ridge Operations, Oak Ridge, Tennessee 37830
38-437. NRC Category RX, Human Factors Safety (NTIS - 10)
438-440. Technical Information Center, Department of Energy, P. O. Box 62, Oak Ridge, Tennessee 37830
441-447. Energy Division Advisory Committee Distribution
U.S. NUCLEAR REGULATORY COMMISSION BIBLIOGRAPHIC DATA SHEET

Report Number: NUREG/CR-3524, ORNL-6010
Title and Subtitle: Organizational Interface in Reactor Emergency Planning and Response
Authors: J. H. Sorensen, D. S. Mileti, E. D. Copenhaver, M. V. Adler
Date Report Completed: March 1984
Performing Organization: Energy Division, Oak Ridge National Laboratory, P. O. Box X, Oak Ridge, TN 37831
Sponsoring Organization: Division of Facility Operations, Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission, Washington, DC 20555, under Interagency Agreement DOE 40-550-75

Abstract: The purpose of this research was to determine if existing regulations have led to effective interfaces between utilities and offsite organizations in emergency planning and response. Findings suggest that regulations have provided the necessary framework for achieving adequate interfaces. That interface has been achieved is demonstrated by comprehensive response plans and good cohesiveness among organizations involved in emergency response. Interface problems identified in the research can be reduced by better implementation of existing regulations rather than by revision of existing ones.

Key Words: Emergency Planning; Organizational Interface; Human Factors
Availability: NTIS
Security Classification: Unclassified
ATTACHMENT E
HUNTON & WILLIAMS
707 East Main Street
P. O. Box 1535
Richmond, Virginia 23212
Telephone 804-788-8200
Telex 6844251

February 13, 1987

24566.300001

Michael S. Miller, Esq.                              By Federal Express
Kirkpatrick & Lockhart
1900 M Street N.W.
Washington, D.C. 20036
Dear Mike:
This morning I received your letter of February 12 which I believe seriously mischaracterizes the course of discovery concerning the LERO training documents and LILCO's responsiveness to your discovery requests.
First, your letter implies that LILCO has not been responsive to requests made during depositions for LERO training documents. As you are well aware, on November 5, 1986, LILCO produced for inspection and copying all LERO training documents generated between the February 13, 1986 Exercise and October 29, 1986. After representatives of your office had the opportunity to review the documents and designate certain of the documents for copying, LILCO copied and produced 3 boxes of documents on November 10, 1986 and produced an additional box of June 6 drill documents in early January. In accordance with our agreement, prior to producing the documents, the names and telephone numbers of LERO personnel were redacted from the documents to be produced.
At the deposition of Dennis Behr on January 13, you made an additional request for "any training documents that have been revised by LERO since the Exercise" and later, by letter dated January 27, 1987, Suffolk County expanded this request to "any training documents prepared since February 13, 1986, and not previously produced."
In response, on February 2, 1987, LILCO sent by Federal Express 1 box of documents responding, in part, to your additional requests.
Production of documents was completed on February 9, 1987, when the remainder of the documents were sent by Federal Express to your offices.
You have also incorrectly characterized this most recent production of training documents as equaling in quantity all training documents produced prior to that time. While the documents delivered in each case may have been of similar volume, the fact remains that the first set of discovery documents represented only about 25% of the training documents made available for Suffolk County's review and identification for copying. By contrast, in responding to your most recent request, LILCO attempted to expedite service by copying all responsive training documents and did not first ask you to identify documents for copying.
Second, your letter implies a continuing obligation on LILCO's part to provide LERO training documents as they are produced. LILCO is under no such obligation. Neither the NRC Rules of Practice nor their Federal Rules of Civil Procedure corollary contemplate continuing requests for documents as they are generated.
Indeed, under the facts of this case, such a request would be absurd.
LILCO's training program, as you well know, is ongoing and documents continue to be regularly produced in the course of that program.
Third, your letter implies that the fact that the December drill reports have not been produced is a calculated plan on LILCO's part to thwart Suffolk County's ability to draft testimony on the training issues. Such an implication is totally unwarranted.
The December drill reports have not been completed. Since December, the same people who would prepare the drill reports have been engaged in numerous activities related to the Exercise litigation, not the least of which have been the depositions taken by Suffolk County, including preparation, and other discovery.
LILCO is under no obligation to allocate its resources on the basis of Suffolk County's desire for production of certain documents. As I have stated to you in the past, when the December drill reports are available, they will be produced to Suffolk County.
Fourth, your letter implies that LILCO has been tardy in responding to the additional discovery requests made by Suffolk County during the depositions of Mary E. Goodkind and Michael K. Lindell. Such is not the case. I received your February 12 letter in the midst of preparing responses to your additional discovery requests made during those depositions, held on January 30 and on February 5 respectively.
We received Ms. Goodkind's deposition transcript only yesterday, and it was immediately reviewed to identify your additional discovery requests. We have yet to receive a transcript for Dr. Lindell's deposition to confirm the precise scope and nature of your requests. As always, LILCO attempts to provide prompt responses to proper discovery requests.
I enclose LILCO's responses to the additional discovery requests made during Mary Goodkind's deposition.
Despite the fact that I have yet to receive a transcript for Dr. Lindell's deposition, I have attempted to define the scope and nature of your requests by relying on my own notes taken during Dr. Lindell's deposition testimony.

Deposition of Ms. Goodkind
Request: Now, today we had mentioned by Ms. Goodkind those materials that she had gathered, or perhaps is still in the process of gathering, from other exercises she attended. And I would, at this time, request production by counsel for LILCO of those documents that have been accumulated and are being accumulated by Ms. Goodkind. (Goodkind Deposition at 116.)
Response: Enclosed are copies of documents referred to by Ms. Goodkind in her deposition. They comprise exercise reports and executive summaries of exercise reports that Ms. Goodkind reviewed in preparation for her deposition. LILCO objects to the request to the extent that it seeks material other than those documents reviewed in preparation for Ms. Goodkind's depositions or materials from exercises at which she was an evaluator. Any additional exercise materials that may be selected, gathered, or reviewed by Ms. Goodkind in the future would be based on litigation preparation decisions made by Ms. Goodkind and her counsel and would constitute protected work product.
Deposition of Dr. Lindell

Request: T. C. Earle and M. K. Lindell, "The role of the news media in the gasoline crisis." BHARC-411/80/002.
Response: Enclosed.
Request: M. K. Lindell, W. L. Rankin, and R. W. Perry, "Warning mechanisms in emergency response systems." BHARC-411/80/003.
Response: Enclosed.
Request: T. C. Earle, M. K. Lindell, and W. L. Rankin, "Risk perception, risk evaluation and human values: cognitive basis of the acceptability of a radioactive waste repository." BHARC-411/81/007.
Response: Dr. Lindell was unable to locate a copy of this report in his personal papers. The report is available to the public through the Battelle Memorial Institute.
Request: M. K. Lindell, R. W. Perry, and M. R. Greene, "Public response to evacuation warnings: Implications of natural hazards evacuations for nuclear emergencies." BHARC-411/81/032.

Response: Enclosed.
Request: T. C. Earle, L. L. Southwick, and M. K. Lindell, "Newspaper coverage of Mount St. Helens: Patterns of content and information sources." BHARC-411/81/035.
Response: Dr. Lindell was unable to locate a copy of this report in his personal papers. Copies are publicly available through Battelle Memorial Institute.
Request: M. K. Lindell, P. A. Bolton, R. W. Perry, G. A. Stoetzel, J. B. Martin, and C. B. Flynn, "Planning concepts and decision criteria for sheltering and evacuation in a nuclear power plant emergency." Atomic Industrial Forum/National Environmental Studies Project. AIF/NESP-031.

Response: This document is publicly available through the Atomic Industrial Forum.
Request: M. K. Lindell, "Communicating risk information to the public: a review of research on natural hazards." BHARC-400/84/026.
Response: Enclosed.
Request: M. K. Lindell, "Social and political aspects of nuclear plant emergency planning." BHARC-400/85/012.
Response: Enclosed.
Request: M. K. Lindell, "Analysis of information flow within the Hanford Emergency Control Center." BHARC-400/85/018.
Response: Enclosed.
Request: R. W. Perry and M. K. Lindell, Handbook of Emergency Response Planning (to be published in late 1987).

Response: The editor at Harper & Row has declined to release the draft of the handbook authored by Dr. Lindell and R. W. Perry. At this point the manuscript for the handbook is not final. As Dr. Lindell stated in his deposition, the thoughts, concepts, and theories memorialized in the handbook are not new and have been discussed at length in his other published works.
Request: J. D. Kartez and M. K. Lindell, "Planning for uncertainty: the case of local disaster planning," Journal of the American Planning Association (in press).

Response: Enclosed.
Request: Presentation by M. K. Lindell, R. W. Perry, and M. R. Greene, Consistency of Attitudes Related to Nuclear Power (Western Sociological Association 1980).

Response: As Dr. Lindell described in his deposition, the substance of presentations is frequently embodied in technical reports. This was the case for Consistency of Attitudes and Behavior Related to Nuclear Power, which became a technical report entitled "Attitude-Action Consistency in Social Policy Related to Nuclear Power." BHARC-411/030 (1979). A copy of that technical report is enclosed.
Request: Notes prepared by Dr. Lindell as a result of his review of various articles and papers.

Response: LILCO objects to this request as seeking protected work product. Notes taken by Dr. Lindell during his review of papers and articles to be considered in his testimony are privileged work product materials.
Request: Any notes, drafts, or final reports prepared by Dr. Lindell concerning training and/or LILCO's proposed fixes in relation to the ARCAs and Deficiencies noted by FEMA in the Post Exercise Assessment which related to training. (Letter of February 12, 1987.)
Response: We have not had an opportunity to verify that this request was made during Dr. Lindell's deposition, and notes taken during the deposition give no indication that such a request was indeed made. However, assuming that the request was made during the deposition, LILCO objects to this request as seeking protected work product. To the extent that any notes, drafts, or final reports prepared by Dr. Lindell do exist, they would have been produced in the course of his preparation for the testimony he will give on Contention EX 50.
Sincerely,

Jessine A. Monaghan

283/6176
Enclosures