ML20196J262

Summary of 981105 Meeting with Nuclear Industry in Rockville, MD, to Discuss Development of Performance Measures to Be Used by NRC to Aid in Assessing Emergency Preparedness at Operating Nuclear Plants
ML20196J262
Person / Time
Issue date: 11/24/1998
From: Reggie Sullivan
NRC (Affiliation Not Assigned)
To: Baranowsky P
NRC OFFICE FOR ANALYSIS & EVALUATION OF OPERATIONAL DATA (AEOD)
References
PROJECT-689 NUDOCS 9812100012
Download: ML20196J262 (88)


Text


UNITED STATES
NUCLEAR REGULATORY COMMISSION
WASHINGTON, D.C. 20555-0001

November 24, 1998

MEMORANDUM TO: Patrick W. Baranowsky, Chief
Reliability and Risk Assessment Branch
Safety Program Division
Office for Analysis and Evaluation of Operational Data

FROM: Randolph L. Sullivan, EP Specialist
Emergency Preparedness and Environmental Health Physics Section
Emergency Preparedness and Radiation Protection Branch
Division of Reactor Program Management
Office of Nuclear Reactor Regulation


SUBJECT: SUMMARY OF NOVEMBER 5, 1998, PUBLIC MEETING REGARDING EMERGENCY PLANNING PERFORMANCE MEASURES

On November 5, 1998, representatives of the nuclear industry met with representatives of the Nuclear Regulatory Commission (NRC) at the NRC's offices in Rockville, Maryland.

Attachment 1 provides a list of meeting attendees.

The purpose of the meeting was to discuss the development of performance measures to be used by NRC to aid in assessing emergency preparedness at operating nuclear plants. The handouts from the meeting are included as Attachment 2.

The following topics and actions were discussed at the meeting:

The group discussed the process used thus far to develop Emergency Preparedness (EP) Performance Indicators (PI's).

The group discussed the latest draft NRC document on EP PI's (attached). Comments were provided and discussed on a line-by-line basis. NRC stated that the EP Cornerstone document would be revised and available for the next meeting. It was committed that this would be provided in advance of the 11/12&13/98 meeting to facilitate understanding by the industry attendees.

Action: NRC to revise document and present it to industry at next meeting (11/12&13/98)


The industry representatives asked for copies of the results of the NRC Assessment Public Workshop. This was provided (attached). NEI discussed continuing industry acceptance of the draft PI's.

The agenda for the next meeting on 11/12&13/98 was discussed.



The group discussed the selection of thresholds for the draft PI process. Draft thresholds were provided and there was consensus. However, thresholds for all the PI's have not been developed yet. The basis for the thresholds selected was a review of past inspection performance. Copies of analysis documents were given out. (Attached)

Action: NRC to propose remaining thresholds and present them at the 11/12&13/98 meeting

Action: NEI/industry representatives to provide data analyses on 4 more years of data by 11/10/98

The group again discussed the weighting of PI statistics for the more risk significant items. In particular, the normalization of PAR successes was discussed. There was no consensus on the merits of this proposal.

The possibility of conducting a workshop on the implementation of the PI process was discussed at length. This will be proposed to management and the industry group.

Project No. 689

Attachments: As stated

cc w/att: See next page


DISTRIBUTION: See attached page

DOCUMENT NAME: G:\NOV5MTG.SUM

OFFICE: PERB/NRR PERB/NRR
NAME: RSullivan FKantor
DATE: 11/24/98 11/24/98

OFFICIAL RECORD COPY


Nuclear Energy Institute
Project No. 689

cc:

Mr. Ralph Beedle
Senior Vice President and Chief Nuclear Officer
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Ms. Lynnette Hendricks, Director
Plant Support
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Mr. Alex Marion, Director
Programs
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Mr. Charles B. Brinkman, Director
Washington Operations
ABB-Combustion Engineering, Inc.
12300 Twinbrook Parkway, Suite 330
Rockville, Maryland 20852

Mr. David Modeen, Director
Engineering
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Mr. Anthony Pietrangelo, Director
Licensing
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Mr. Nicholas J. Liparulo, Manager
Nuclear Safety and Regulatory Activities
Nuclear and Advanced Technology Division
Westinghouse Electric Corporation
P.O. Box 355
Pittsburgh, Pennsylvania 15230

Mr. Jim Davis, Director
Operations
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708


Distribution: Mtg. Summary w/NEI re EP Performance Measures, Dated November 24, 1998

Hard Copy: PUBLIC, PERB R/F, OGC, ACRS, RSullivan

E-Mail: SCollins/FMiraglia, BSheron, BBoger, JRoe, DMatthews, CMiller, BZaleman, RSullivan, FKantor, SRoudier, SMagruder, GTracy (EDO), AMadison


NEI/NRC MEETING ON EP PERFORMANCE MEASURES
10/29/98

List of Attendees

Name, Organization
A. Nelson, NEI
B. McBride, VEPCo
R. Sullivan, NRC/NRR
F. Kantor, NRC/NRR
S. Roudier, NRC/NRR
R. Lewis, Lewis Group
D. Miller, PSG&E
M. Vont, CECO
D. Lurie, NRC

Attachment 1


Emergency Preparedness Cornerstone

GENERAL DESCRIPTION

Emergency Preparedness (EP) is the final barrier in the defense in depth NRC regulations provide for ensuring the public health and safety while allowing operation of civilian nuclear reactors. In this way it is related to the Reactor Safety Strategic Performance Area. 10 CFR 50.47 and Appendix E to Part 50 define the requirements of an EP program, and the licensee commits to implementation of these requirements through an Emergency Plan (the Plan). Licensee maintenance of a high level of readiness to implement the Plan lowers the risk of impact to the public health and safety in the event of a serious radiological emergency.

Key Attributes of Licensee Performance That Contribute to Emergency Preparedness

The most risk significant aspect of an EP program is the readiness of the Emergency Response Organization (ERO) to perform its intended function during emergencies. Figure 1 depicts the most risk significant elements of ERO readiness, and they are defined below:

- Timely and accurate classification of events; including the recognition of events as potentially exceeding emergency action levels (EALs), maintenance of the EAL scheme in an approved and appropriate configuration, and any assessment actions necessary to support the classification;

- Timely and accurate notification of offsite governmental authorities; including adequate performance of notifications as specified in the Plan, the availability of the Alert and Notification System, adequacy of communication channels and the direct interface with offsite authorities;

- Timely and accurate development and communication of protective action recommendations to offsite authorities; including providing protective action recommendations (PARS) to governmental authorities as specified in the Plan, the decision making process to develop the PARS, any accident assessment necessary to support PAR development, the protection of emergency workers and direct interface with offsite authorities; and

- Emergency Response Organization readiness; including adequacy of facilities, timely activation of the ERO, adequate training of the ERO to ensure proficiency, and efficacy of the licensee assessment program to identify and correct deficiencies in ERO proficiency and supporting equipment/facilities.

Measures taken to protect the public from the effects of a radiological emergency must necessarily involve action by both licensee and governmental authorities in the vicinity of the reactor. The facets of the EP program that involve recognition of the accident, mitigation of its effects, assessment of the offsite impact and communication of information to governmental authorities, including protective action recommendations, are generally referred to as onsite EP.

The program, procedures and systems maintained to implement governmental actions are referred to as offsite EP. The licensee is responsible for ensuring the development of the onsite EP program and provides support to the offsite program as required. The NRC is responsible for ensuring the adequacy of the overall program, but relies on assessment of the offsite program by the Federal Emergency Management Agency (FEMA).

November 3, 1998

While both aspects are vitally important to ensuring the EP program can serve its intended function, the development and collection of performance indicators (PI's) for offsite portions of the program is not considered necessary or appropriate. FEMA performs regular assessments of the efficacy of offsite EP programs that are the basis for reasonable assurance that adequate protective measures can and will be taken in the event of a radiological emergency.

PERFORMANCE INDICATORS

Compliance of EP programs with regulation is largely assessed through observation of response to simulated emergencies, although routine inspections of onsite programs are currently conducted by NRC to ensure that a licensee's EP program is maintained in a state of operational readiness.

Demonstration exercises form the key observational tool currently used to support, on a continuing basis, the reasonable assurance finding that adequate protective measures can and will be taken in the event of a radiological emergency. This is especially true for the most risk significant facets of the EP program. This being the case, the PI's contemplated for onsite EP draw significantly from performance during simulated emergencies but are supplemented by licensee self assessment and NRC inspection. NRC assessment of the adequacy of offsite EP will rely (as it does currently) on FEMA assessment of the adequacy of offsite EP, which is based primarily on observation of performance during the biennial exercise.

Statement of Objective

Ensure that the licensee capability is maintained to take adequate protective measures in the event of a radiological emergency.

Desired Result/Performance Expectation

Demonstration that a reasonable assurance exists that the licensee can effectively implement emergency plans to adequately protect the public health and safety in the event of a radiological emergency.

- Drill/Exercise Performance, collected quarterly for use in a two year rolling average.

Fraction (numerator and denominator) of successful performance opportunities over all opportunities for:

Classification of Emergencies
Notification
Protective Action Recommendation

Basis:

Recognition and subsequent classification of events is a risk significant activity of an EP program. It is assumed that classification will lead to activation of the ERO appropriate to the emergency class and notification of governmental authorities.

Timely and accurate notification of offsite authorities is a risk significant activity of an EP program. It is assumed that notification will lead to activation of the governmental response appropriate to the emergency class and mobilization of governmental authorities.

The timely and accurate development and communication of protective action recommendations (PARS) is a risk significant activity of an EP program. It requires that several supporting activities be performed including: accident assessment, quantification of radiological release magnitude, projection of the potential dose to the public and communication to government authorities. It is assumed that communication of PARS will lead to actions by governmental authorities to protect the public health and safety.

If plant staff assigned EP duties consistently perform these activities in a timely and accurate manner, it indicates that the EP program is operating at or above the threshold of licensee safety performance above which the NRC can allow licensees to address weaknesses with decreased NRC action.

Requirements: All activities that are formally critiqued for the timely and accurate performance of these activities shall be included in this statistic. All simulated emergency events that are identified as opportunities for this PI shall be included in the statistics, i.e., a candidate opportunity can not be removed from the data set after actual performance, for instance due to poor performance. Opportunities would include actual emergency declarations, the biennial exercise, any other drills of appropriate scope and operating shift simulator evaluations conducted by the licensee training organization.

Operating shift simulator evaluations would be included when the evolution being evaluated is of such a character as to require classification if it were a real event.



No minimum is set for these observational opportunities, but a statistical analysis performed on the data will recognize that the more opportunities provided, the more accurately the PI numerical value represents licensee performance. Statistical opportunities should include multiple events during a single drill, evolution, etc., if supported by the scenario:

- each expected recognition and classification opportunity should be included,

- notification opportunities should include initial emergency classification notification, upgrade of emergency class, notification of PARS and notification of change in PARS, and

- PAR opportunities should include the initial PAR and any changes to the PAR necessary due to meteorological changes or plant status changes.

All declared emergency events shall be formally critiqued for compliance with approved emergency plan implementing procedures (EPIPs) and adequacy of the PARs, at least in the areas identified as risk significant. All PI statistics from these events shall be reported.
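The statistic described above is a simple ratio of successes to formally critiqued opportunities, accumulated over a rolling window. A minimal sketch of the tally, assuming an illustrative record layout (the quarter labels, area names and eight-quarter window are assumptions for illustration, not part of the draft):

```python
from collections import namedtuple

# One record per formally critiqued performance opportunity
# (classification, notification, or PAR). Layout is illustrative.
Opportunity = namedtuple("Opportunity", ["quarter", "area", "success"])

def dep_fraction(opportunities, quarters):
    """Successes over all opportunities for the given set of quarters;
    a two year rolling average would span the last eight quarters."""
    window = [o for o in opportunities if o.quarter in quarters]
    if not window:
        return None  # no opportunities observed; the draft sets no minimum
    return sum(1 for o in window if o.success) / len(window)

data = [
    Opportunity("1998Q3", "classification", True),
    Opportunity("1998Q3", "notification", True),
    Opportunity("1998Q4", "PAR", False),
    Opportunity("1998Q4", "classification", True),
]
print(dep_fraction(data, {"1998Q3", "1998Q4"}))  # 3 of 4 -> 0.75
```

Passing the quarter set explicitly keeps the same tally usable for both a short trend and the two year rolling average, which matches the requirement that candidate opportunities be fixed before performance and never removed afterward.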

Inspection Interface

Proposed inspection areas that would be necessary to support these PI's include:

Verify that the collection of data is in compliance with the guidelines described.

Review the efficacy of the self assessment program to gather valid statistics through the accurate critique of successes and failures during performance opportunities.

Review the efficacy of the corrective action program to correct identified deficiencies in the risk significant areas.

Review control of the EAL set in an approved and validated configuration. (?)

Review self assessment of actual events during the inspection period.

Review conduct of testing of the Alert and Notification siren system for compliance with FEMA guidance.


Review licensee self assessment program as it relates to adequacy of communication channel testing, communication system availability and timely correction of deficiencies.

Review of the licensee self assessment program as it relates to adequacy of direct interface with offsite authorities during exercises and drills that involve offsite authority participation. Additionally, review the self assessment of the adequacy of direct interface with offsite authorities in the area of PAR communication.

Review of the licensee self assessment program as it relates to adequacy of worker protection during exercises and drills.

Emergency Response Organization Readiness

Percentage of ERO and operating shift crews that have participated in a drill or exercise in the past 24 months.

Basis: EP programs ensure the readiness of licensee personnel, facilities and equipment to support response to emergency situations and protect the public health and safety. The previous PI indicates the performance of segments of the ERO in risk significant activities during simulated and actual emergency situations. However, this PI is meant to indicate the readiness of the total ERO to perform as an integrated organization. There are several supporting activities important to ERO readiness including: ERO activation tests, ERO training and drills, facility and equipment readiness checks, communications channels tests, the licensee corrective action program, licensee self assessment program, management support, effective EOP implementation by licensed operators, severe accident management guide implementation and ERO ability to diagnose plant accident conditions, formulate mitigating actions and implement them under accident conditions.

This PI indicates the training opportunities provided to ERO members and operating shift crews and not their success during those opportunities. It assumes that the training will contribute to proficiency and overall ERO readiness.

If an EP program consistently ensures that the ERO is in a high state of readiness, it indicates that the program is operating at or above the threshold of licensee safety performance above which the NRC can allow licensees to address weaknesses with decreased NRC action.

Requirements: The ERO participation indicated is that of the essential positions committed to in the Emergency Plan and the operating shift crews. Plant


workers, security personnel and others that are on shift or may be called in to support the emergency but do not fill positions on EP duty rosters or are part of the operating shift crews are not intended to be captured in this PI.

Positions that are formally on the EP duty roster, but not committed to in the Emergency Plan, may be included, but only if this is done completely and consistently. Participation could be either as a drill/exercise participant or as an evaluator (but not as an observer). Signature on a drill/exercise attendance form would be adequate documentation, but the intent is that the participation be a meaningful and thorough opportunity to gain proficiency in the assigned position.

Participation in the biennial exercise and any other drills of appropriate scope may be used in statistics, but table top drills that lack meaningful interaction with interfacing ERFs would not be appropriate. Multiple assignees to a given ERO position could take credit for the same drill/exercise if their participation is a meaningful and thorough opportunity to gain proficiency in the assigned position.

Evaluated simulator evolutions that contribute statistics to the Drill/Exercise Performance PI would be considered for operating shift crew participation. However, if all crews have participated in more than one evaluated simulator evolution during the measurement period, it may only be counted as 100%.

Inspection Interface

Proposed inspection areas that would be necessary to support these PI's include:

Verify that the collection of data is in compliance with the guidelines described.

Review of the licensee self assessment program as it relates to adequacy of facility and equipment readiness checks, communications channels tests, the licensee corrective action program, management support, effective EOP implementation by licensed operators, severe accident management guide implementation and ERO ability to diagnose plant accident conditions, formulate mitigating actions and implement them under accident conditions.

Review the efficacy of the corrective action program to correct identified deficiencies in the above mentioned areas.

Offsite EP Program

TBP, but satisfactory performance will be based on:



FEMA evaluation of the biennial exercise

The absence of FEMA withdrawal of reasonable assurance


PI ITEMS UNDER CONSIDERATION

Fraction of LERs that were successfully diagnosed as emergency or non-emergency events, over the total number of LERs.

l Requirements: Recognition of actual plant events as warranting classification as an "q[

emergency or not, is a measure of the quality of EP training and its implementation. While the number of missed declarations is expected to be small, this indicator would reflect program quality. Review of these l

~"

events by an individual (s) that is qualified to judge the timeliness and accuracy of the classification performance is required to determine the ,

l Percent availabi 1[.ics.

Alert and Notification System f Requirements: Statistical information gathered in support of system availabilit/ reports given to FEMA would form basis of this Pl. However, the reporting of availability is not dardized currently. It is proposed that the following i rules be applied t ering of this data: (

- Failure of a siren is indicated by failure of any portion of the system that would have prevented it from performing its safety function, i.e., creating its design sound level and pattern.

- The period assumed for a failure would be IAW the direction to the state/licensee from FEMA on gathering statistics (Industry to discuss inconsistencies in data collection).

- Periodic testing is in accordance with FEMA guidance and actually tests the ability of the siren to perform its intended safety function.

- A failure will be assumed to last at least one day (hour?).
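Under the proposed rules above, percent availability can be computed from siren-days of downtime, with each failure charged at least one day. A sketch assuming an illustrative outage-list layout and the one-day floor reading of the last rule (both are assumptions, since the draft itself notes the reporting is not yet standardized):

```python
def siren_availability(total_sirens, outages, period_days):
    """Percent availability of the Alert and Notification System over a
    reporting period. `outages` is a list of (siren_id, days_out) tuples;
    each recorded failure is charged a minimum of one day of downtime."""
    siren_days = total_sirens * period_days
    downtime = sum(max(1, days_out) for _, days_out in outages)
    return 100.0 * (siren_days - downtime) / siren_days

# 50 sirens over a 91-day quarter, three recorded failures;
# the zero-day outage is still charged one day under the floor rule.
print(round(siren_availability(50, [("S-12", 3), ("S-07", 1), ("S-33", 0)], 91), 2))
```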

Percentage of essential ERO positions that successfully responded in each of the last 4 pager tests.

Requirements: Pager tests indicate the readiness of the ERO to fill duty roster positions during emergencies that take place during off-normal hours. Only statistics from tests during off-normal hours may be included. The results of all such tests shall be included, i.e., a test may not be removed from the data set due to poor performance. The frequency of tests may be set or changed at licensee discretion, but results from the last four must be included in the PI. Successful response means that it could reasonably be expected that the position would have been filled within the time goal expected in the Emergency Plan. A position is filled by one qualified individual. Multiple individuals filling a single position can not be used to improve the statistics.
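One reading of the pager-test indicator above is that a position counts only if it responded successfully in every one of the last four off-hours tests. That reading can be sketched as follows; the set-based layout and position names are illustrative assumptions:

```python
def pager_pi(roster, last_four_tests):
    """Percentage of essential ERO positions that responded successfully in
    every one of the last four off-hours pager tests. `last_four_tests` is
    one set of responding positions per test; layout is illustrative."""
    responded_in_all = [pos for pos in roster
                        if all(pos in test for test in last_four_tests)]
    return 100.0 * len(responded_in_all) / len(roster)

roster = {"Emergency Director", "Dose Assessor", "Security Coordinator"}
tests = [
    {"Emergency Director", "Dose Assessor", "Security Coordinator"},
    {"Emergency Director", "Dose Assessor"},  # one position missed this test
    {"Emergency Director", "Dose Assessor", "Security Coordinator"},
    {"Emergency Director", "Dose Assessor", "Security Coordinator"},
]
print(round(pager_pi(roster, tests), 1))  # 2 of 3 positions -> 66.7
```

Because all four test results must be included and one qualified individual fills one position, the sketch deliberately takes no credit for extra responders to the same position.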



Other Areas for Discussion

Define timely and accurate; a UE declaration or notification that is 5 minutes late may be acceptable, but a GE or PAR notification 5 minutes late is not. While one would expect an EP program to pursue improvement in both cases, perhaps the UE problems should not be included in PI statistics. Must establish criteria for collecting statistics.

Weighting of failures, e.g., is a failure in an actual event more important?


Emergency Preparedness Cornerstone

GENERAL DESCRIPTION

Emergency Preparedness (EP) is the final barrier in the defense in depth NRC regulations provide for ensuring the public health and safety while allowing operation of civilian nuclear reactors. In this way it is related to the Reactor Safety Strategic Performance Area. 10 CFR 50.47 and Appendix E to Part 50 define the requirements of an EP program, and the licensee commits to implementation of these requirements through an Emergency Plan (the Plan). A high level of EP decreases the risk of affecting the public health and safety in the event of a radiological emergency.

Statement of Objective

Ensure that the licensee is capable of implementing adequate measures to protect the public health and safety in the event of a radiological emergency.

Desired Result/Performance Expectation

Demonstration that a reasonable assurance exists that the licensee can effectively implement emergency plans to adequately protect the public health and safety in the event of a radiological emergency.

KEY ATTRIBUTES OF LICENSEE PERFORMANCE THAT CONTRIBUTE TO EMERGENCY PREPAREDNESS

The key attributes of an EP program are the ability of the Emergency Response Organization (ERO) to implement the Plan, the ability to activate the ERO in a timely manner, the readiness and quality of the equipment and facilities that support the ERO and the quality of the emergency plan implementing procedures (EPIPs) the ERO uses to implement the Plan. Figure 1 depicts these key attributes.

Measures taken to protect the public from the effects of a radiological emergency must necessarily involve action by both licensee and governmental authorities in the vicinity of the reactor. The facets of the EP program that involve recognition of the accident, mitigation of its effects, assessment of the offsite impact and communication of information to governmental authorities, including protective action recommendations, are generally referred to as onsite EP.

The program, procedures and systems maintained to implement governmental actions are referred to as offsite EP. The licensee is responsible for developing and implementing the onsite EP program and provides support to the offsite program as required. The NRC is responsible for ensuring the adequacy of the overall program, but relies on the Federal Emergency Management Agency (FEMA) to assess the offsite program. The development and collection of performance indicators (PI's) for offsite EP is not considered necessary or appropriate because FEMA performs regular assessments of offsite EP programs that are the basis for reasonable assurance that adequate protective measures can and will be taken in the event of a radiological emergency.

November 7, 1998

ERO Performance (Human Performance)

The implementation of the Plan is dependent on the performance of the ERO in their EP assignments. These duties are in addition to routine job duties, and the opportunity to acquire proficiency is generally provided only during training opportunities such as drills. There are many areas important to Plan implementation, but the risk significant areas of ERO performance are:

. Timelv and accu lassification of events; including the recognition of events as potentially exceedinFamergency e action levels (EALs) and any assessment actions necessary to support the classification; this is measured by a PI; Timelv and accurate notification ofoffsite govemmental authorities; including adequate performance ofnotifications as ied in the Plan; this is measured by a PI; Timelv and accurate development and communication of orotective action recommendations to offsite authorities; including providing protective action recommendations (PARS) to governmental aut%es, the decision making process to develop the PARS, any accident assessment nemes to support PAR development, the protection of emergency workers and direct in'e e with offsite authorities; this is measured by a PI and inspection; and A

=

Emergency Resoonse Organintion readiness; including training of the ERO to ensure proficiency, the efficacy of the assessment program to identify fi3iencies in ERO proficiency and the efficacy of the corrective action program to orrect these deficiencies; this is measured by a PI and inspection.

ERO Activation

Licensee systems to activate the ERO are critical to implementing the Plan in a timely manner. This involves a notification system for individual ERO members, training in its use and testing to ensure facility activation goals can be met. The risk significant area for ERO activation is:

- The demonstration of timely activation of the ERO; including the functioning of notification systems, efficacy of individual training and adequacy of the duty roster to provide replacement individuals when necessary; this is measured by inspection.


Equipment and Facilities

Equipment and facilities required to implement licensee emergency response are specified in the Plan. The availability and quality of this equipment and facilities is a risk significant area:

- Availability of the Alert and Notification System; including all facets of the system, maintained in accordance with FEMA direction; this is measured with a PI.

- Availability of communication channels; including channels to governmental authorities and between emergency response facilities; this is measured by inspection.

- Availability of equipment and facilities; including surveillance of equipment and facilities; this is measured by inspection.

EPIP Quality

EPIPs are used by the ERO to implement critical facets of the emergency response. The response is tested regularly in drills and exercises, and the quality of EPIPs is generally improved through assessment of performance. The risk significant areas of EPIP quality are:

- Classification of events; this is measured by a PI and inspection of procedure configuration;

- Notification of offsite governmental authorities; this is measured by a PI; and

- Development and communication of protective action recommendations to offsite authorities; this is measured by a PI.

Offsite EP

State/local governmental authorities are responsible for implementing protective actions to protect the public health and safety. While the licensee must supply appropriate information to the governmental authorities to allow the timely implementation of protective actions, only governmental authorities are authorized to implement the actions. The risk significant areas of offsite EP are:

- Timely activation of governmental authorities; including activation of all elements necessary for response; this is measured by FEMA evaluation;

- Implementation of protective actions; including activation of the Alert and Notification System, provision of protective action instructions to the public, traffic control, worker protection, and care of evacuees; this is measured by FEMA evaluation; and

- Offsite Emergency Response Organization readiness; including provision of training for the offsite ERO; this is measured by FEMA evaluation.

PERFORMANCE INDICATORS

Compliance of EP programs with regulation is largely assessed through observation of response to simulated emergencies, although routine inspections of onsite programs are currently conducted by NRC to assess the state of readiness. Demonstration exercises form the key observational tool currently used to support, on a continuing basis, the reasonable assurance finding that adequate protective measures can and will be taken in the event of a radiological emergency. This is especially true for the most risk significant facets of the EP program. This being the case, the PI's proposed for onsite EP draw significantly from performance during simulated emergencies but are supplemented by licensee self assessment and NRC inspection. NRC assessment of the adequacy of offsite EP will rely (as it does currently) on regular FEMA assessments.

1.0 Drill/Exercise Performance (DEP), collected quarterly, for use in a six month trend and a two year rolling average.

Fraction (numerator and denominator) of successful performance opportunities over all opportunities for:

Classification of Emergencies Notification Protective Action Recommendation r 3 Basis: Recognition and subsequent classification of events is a sk significant activity.

Classification should lead to activation of the ERO as appropriate to the emergency class d notification of governmental authorities.

Timely and accurate notification of offsite authorities is a risk significant activity.

Notification should lead to mobilization of governmental authorities, as appropriate.

The timely and accurate development and communication of PARS is a risk significant activity. It requires that several supporting activities be performed including: accident assessment, quantification of radiological release magnitude, projection of the potential dose to the public and communication to government authorities. Communication of PARS should lead to actions by governmental authorities to protect the public health and safety.

If plant staff assigned EP duties consistently performs these activities in a timely and accurate manner, it indicates that the EP program is operating at or above the threshold of licensee safety performance above which the NRC can allow licensees to address weaknesses with NRC oversight through a risk informed inspection program.

Requirements: Only activities that are formally critiqued for the timely and accurate performance of these activities shall be included in this statistic.

Simulated emergency events that are identified in advance of performance as opportunities for this PI shall be included in the statistics, i.e., a candidate opportunity can not be removed from the data set due to poor performance. Opportunities shall include actual emergency declarations and the biennial exercise, and may include other drills of appropriate scope and operating shift simulator evaluations conducted by the licensee training organization.


Operating shift simulator evaluations should be included when the evolution is being evaluated and is of such a character as to require classification if it were a real event. No minimum is set for these observational opportunities, but statistical analyses performed on the data will recognize that the more opportunities provided, the more accurately the PI numerical value represents licensee performance. Statistical opportunities should include multiple events during a single drill, evolution, etc., if supported by the scenario as follows:

each expected recognition and classification opportunity should be included,
notification opportunities should include notifications made to the state/local governmental authorities for initial emergency classification, upgrade of emergency class, initial PARS and changes in PARS, and
PAR opportunities should include the initial PAR and any changes to the PAR necessary due to meteorological changes or plant status changes.

All declared emergency events and the biennial exercise shall be formally critiqued for compliance with approved EPIPs and adequacy of those EPIPs, at least in the areas identified as risk significant. All PI statistics from these events shall be reported.

Data Reporting Frequency: Data would be provided every 3 months.

PI Threshold

The threshold for the green zone (need standard wording here) is two fold:

85% for the previous six months
90% for the previous two years

Discussion to follow; there is a good basis for this and industry acceptance.

TBD

The threshold for the red zone is:

60% for the previous two years
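The DEP arithmetic described above (a success fraction compared against a two-fold green threshold and a red floor) can be sketched as follows. This is our own illustration, not part of the proposal; the function names and the "between" zone label are assumptions.

```python
# Sketch of the DEP PI: the indicator is the fraction of successful
# performance opportunities, compared against the proposed green-zone
# (85% / 6 months AND 90% / 2 years) and red-zone (60% / 2 years) thresholds.

def dep_fraction(successes, opportunities):
    """DEP PI: successful performance opportunities over all opportunities."""
    if opportunities == 0:
        return None  # no observational opportunities in the period
    return successes / opportunities

def dep_zone(frac_6mo, frac_24mo):
    """Classify performance using the proposed two-fold threshold."""
    if frac_24mo < 0.60:
        return "red"
    if frac_6mo >= 0.85 and frac_24mo >= 0.90:
        return "green"
    return "between"  # neither green criterion met nor red threshold crossed

# Example: 17 of 19 opportunities in the last six months,
# 70 of 76 over the rolling two-year window.
six_month = dep_fraction(17, 19)      # about 0.895
two_year = dep_fraction(70, 76)       # about 0.921
print(dep_zone(six_month, two_year))  # green
```

The two-year figure behaves as the rolling average the PI description calls for: each quarter's opportunities simply extend the 24-month numerator and denominator.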

TBD

2.0 Emergency Response Organization Readiness (EROR)

Percentage of ERO and operating shift crews that have participated in a drill or exercise in the past 24 months.

Basis: EP programs ensure the readiness of licensee personnel, facilities and equipment to support response to emergencies and protect the public health and safety. The previous PI (DEP) measures the performance of segments of the ERO in risk significant activities during simulated and actual emergencies. However, there are several supporting activities that are not measured by DEP, such as ERO ability to diagnose plant accident conditions, formulate mitigating actions and implement them under accident conditions. EROR measures opportunities that the total ERO has been given to gain proficiency as an integrated organization. It is expected that the licensee assessment program will critique these drill/training opportunities and identify areas for improvement and that the licensee corrective action program will ensure these improvements are carried out. It is expected that this training will contribute to proficiency and overall ERO readiness. In this way EROR indicates the proficiency of the ERO. EROR also measures the drill/training opportunities provided to operating shift crews.

If a licensee consistently ensures ERO readiness, it indicates that the program is operating at or above the threshold of licensee safety performance above which the NRC can allow licensees to address weaknesses with NRC oversight through a risk informed inspection program.

Requirements: ERO participation indicated is that of the essential positions committed to in the Plan and the operating shift crews. Plant workers, security personnel and others that are on shift or may be called in to support the emergency, but do not fill positions on EP duty rosters or are not part of the operating shift crews, are not required to be captured in this PI. However, positions that are formally on the EP duty roster but not committed to in the Emergency Plan, and others important to emergency response, may be included.

Participation may be either as a drill/exercise participant or as an evaluator (but not as an observer). Only participation in the drills, exercises and evolutions that are used to provide input to the DEP PI may be used in the statistics for this PI. Multiple assignees to a given ERO position may take credit for the same drill/exercise if their participation is a meaningful and thorough opportunity to gain proficiency in the assigned position.

Evaluated simulator evolutions that contribute to the DEP PI statistics would be considered for operating shift crew participation. However, if all crews have participated in more than one evaluated simulator evolution during the measurement period, it may only be counted as 100% and not more.

Data Reporting Frequency: Data would be provided every 3 months.

PI Threshold

The threshold for the green zone (need standard wording here) is:


90% for the previous two years

Discussion to follow; I have industry buy in, but no basis other than our expert opinion.

TBD

The threshold for the red zone is:

TBD

3.0 Alert and Notification System (ANSR)

  • Percent availability of the Alert and Notification System

Basis: The Alert and Notification System (ANS) is a critical link to notifying the public of the need to take protective actions. The licensee maintains the ANS and local governmental authorities operate it when necessary. Assurance that the system has a high rate of availability increases the assurance that the licensee can protect the public health and safety during an emergency.

If an EP program consistently ensures that the ANS is in a high state of readiness, it indicates that the program is operating at or above the threshold of licensee safety performance above which the NRC can allow licensees to address weaknesses with NRC oversight through a risk informed inspection program.

Requirements: Statistical information gathered in support of system availability reports given to FEMA would form the basis of this PI. It is proposed that the following rules be applied to the gathering of this data:

Failure of a siren is indicated by failure of any portion of the system that would have prevented it from performing its safety function, i.e., creating its design sound level and pattern.

The period assumed for the failure would be IAW the direction to the state/licensee from FEMA on gathering statistics (industry to resolve inconsistencies in data collection).

Periodic testing is in accordance with FEMA guidance and actually tests the ability of the siren to perform its intended safety function.

Data Reporting Frequency: Data would be provided every 6 months.

PI Threshold

The threshold for the green zone (need standard wording here) is:

94% for the previous year

Discussion to follow; I do not have industry buy in, and there is no basis other than expert opinion. There is a lot of room between this and 90%.

TBD

The threshold for the red zone is:

90% for the previous year

This is the FEMA requirement and is acceptable to the industry.
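As a rough illustration of the availability arithmetic behind this PI (our own sketch; the actual counting conventions are governed by FEMA direction, as noted in the requirements), siren-hours out of service are accumulated and divided into total siren-hours for the reporting period:

```python
# Illustrative ANS availability computation: availability is the fraction of
# total siren-hours in the period during which the sirens were capable of
# performing their safety function. The counting conventions here are
# assumptions for illustration, not FEMA's actual rules.

HOURS_PER_YEAR = 8760

def ans_availability(n_sirens, outage_hours):
    """outage_hours: out-of-service durations in siren-hours, one per failure."""
    total_hours = n_sirens * HOURS_PER_YEAR
    return 1.0 - sum(outage_hours) / total_hours

# Example: a 50-siren system with three failures totaling 1,500 siren-hours out.
avail = ans_availability(50, [800, 500, 200])
print(f"{avail:.1%}")                 # 99.7%
print(avail >= 0.94, avail >= 0.90)   # above both the green line and the FEMA 90% line
```

Because whole-system availability is dominated by the number of sirens, even a multi-week outage of a single siren moves the annual figure only slightly; this is why the proposed green threshold sits well above the 90% FEMA floor.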

INSPECTION AREAS

The inspection areas discussed below are necessary to ensure the licensee EP program is operating at or above the threshold of licensee safety performance above which the NRC can allow licensees to address weaknesses with NRC oversight through a risk informed inspection program.


  • Verification that the collection of data is in compliance with the guidelines described for PI's.

  • Review the efficacy of the self assessment program to gather valid statistics through the accurate critique of successes and failures during performance opportunities.

  • Review conduct of testing of the Alert and Notification System for compliance with FEMA guidance.


  • Review of FEMA evaluation of the biennial exercise.

  • Review of the FEMA finding of reasonable assurance.

  • Review of the licensee self assessment of ERO activation tests, to include the conduct of tests, the results, trends in results and associated corrective actions. Attention should be paid to the licensee correction of repeat failures.

This area has the potential for development of a PI, but it could not be accomplished in the current time frame. The PI could be similar to the following:

Percentage of essential ERO positions that successfully responded in each of the last 4 pager tests.

Requirements: Pager tests indicate the readiness of the ERO to fill duty roster positions during emergencies that take place during off-normal hours. Only statistics from tests during off-normal hours may be included. The results of all such tests shall be included, i.e., a test may not be removed from the data set due to poor performance. The frequency of tests may be set or changed at licensee discretion, but results from the last four must be included in the PI. Successful response means that it could reasonably be expected that the position would have been filled within the facility activation time goal specified in the Emergency Plan. A position is filled by one qualified individual. Multiple individuals filling a single position can not be counted as more than one success in the statistics.

  • Review that the EAL scheme has been maintained in the NRC approved configuration or that changes have met the requirements of 10 CFR 50.54(q).
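The candidate pager-test PI described above could be tallied roughly as follows (an illustrative sketch; the data layout and position names are assumed, not taken from the proposal):

```python
# Sketch of the candidate pager-test PI: the percentage of essential ERO
# positions that responded successfully in EACH of the last four off-hours
# tests. A position counts only if all four of its results are successes,
# mirroring the "responded in each of the last 4 pager tests" wording.

def pager_pi(results_by_position):
    """results_by_position: dict mapping position name -> list of the last
    4 test outcomes (True = filled within the activation time goal)."""
    passing = sum(1 for outcomes in results_by_position.values()
                  if len(outcomes) == 4 and all(outcomes))
    return passing / len(results_by_position)

results = {
    "Emergency Director":     [True, True, True, True],
    "Rad Protection Manager": [True, False, True, True],  # one miss -> not counted
    "TSC Communicator":       [True, True, True, True],
}
print(f"{pager_pi(results):.0%}")  # 67%
```

Note that under this counting rule a single missed response penalizes a position for four consecutive tests, which is one reason the section leaves the PI as a possibility rather than a proposal.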


Review the efficacy of the self assessment program to identify problems in the following areas:

ERO proficiency in general,
ERO ability to diagnose plant accident conditions, formulate mitigating actions and implement them under accident conditions,
readiness and quality of EP equipment and facilities,
direct interface with offsite authorities during exercises and drills that involve offsite authority participation, e.g., review the self assessment of the adequacy of direct interface with offsite authorities in the area of PAR communication,
adequacy of communication channel testing,
communication system availability,
timely correction of communication channel deficiencies,
implementation of severe accident management guides, and
adequacy of worker protection during exercises and drills.

Review self assessment of actual declared events, the biennial exercise, and missed declarations of events during the inspection period.

Review the efficacy of the corrective action program to correct identified deficiencies, address trends and address repeat deficiencies.

PI's Not Proposed (the EP group has no control over these):

Effective EOP implementation by licensed operators. This was handed off to be picked up elsewhere.


Table 1: Emergency Preparedness Cornerstone

Key Attribute: ERO Performance

Area to Measure: Timely and accurate classification of events
Means to Measure: PI
Comments: Recognition and subsequent classification of events is a risk significant activity. Classification should lead to activation of the ERO as appropriate to the emergency class and notification of governmental authorities.

Area to Measure: Timely and accurate notification of offsite governmental authorities
Means to Measure: PI
Comments: Timely and accurate notification of offsite authorities is a risk significant activity. Notification should lead to mobilization of governmental authorities, as appropriate.

Area to Measure: Timely and accurate development and communication of protective action recommendations to offsite authorities
Means to Measure: PI and risk informed inspection
Comments: The timely and accurate development and communication of PARS is a risk significant activity. It requires that several supporting activities be performed including: accident assessment, quantification of radiological release magnitude, projection of the potential dose to the public and communication to government authorities. Communication of PARS should lead to actions by governmental authorities to protect the public health and safety.

Area to Measure: Emergency Response Organization readiness
Means to Measure: PI and risk informed inspection
Comments: EROR measures opportunities that the total ERO has been given to gain proficiency as an integrated organization. It is expected that the licensee assessment program will critique these drill/training opportunities and identify areas for improvement and that the licensee corrective action program will ensure these improvements are carried out. It is expected that this training will contribute to proficiency and overall ERO readiness. In this way EROR indicates the proficiency of the ERO.

Area to Measure: ERO Activation; the demonstration of timely activation of the ERO
Means to Measure: Risk informed inspection
Comments: Activation of the ERO is critical to implementing the Plan in a timely manner during emergencies.

Key Attribute: Equipment and Facilities

Area to Measure: Availability of the Alert and Notification System
Means to Measure: PI
Comments: The ANS is a critical link to notifying the public of the need to take protective actions. The licensee maintains the ANS and local governmental authorities operate it when necessary. Assurance that the system has a high availability increases the assurance that the licensee can protect the public health and safety during an emergency.

Area to Measure: Availability of communication channels
Means to Measure: Risk informed inspection
Comments: Communications channels are critical to the notification process, the PAR process and the functioning of the ERO during emergencies.

Area to Measure: Availability of equipment and facilities
Means to Measure: Risk informed inspection
Comments: The facilities and equipment identified in the Plan are critical to the functioning of the ERO during emergencies.

Key Attribute: EPIP Quality

Area to Measure: Classification of events
Means to Measure: PI
Comments: This EPIP supports the recognition and subsequent classification of events, a risk significant activity. Classification should lead to activation of the ERO as appropriate to the emergency class and notification of governmental authorities.

Area to Measure: Classification of events
Means to Measure: Risk informed inspection
Comments: This EPIP supports the recognition and subsequent classification of events, a risk significant activity. Classification should lead to activation of the ERO as appropriate to the emergency class and notification of governmental authorities.

Area to Measure: Notification of offsite governmental authorities
Means to Measure: PI
Comments: This EPIP supports the timely and accurate notification of offsite authorities, a risk significant activity. Notification should lead to mobilization of governmental authorities, as appropriate.

Area to Measure: Development and communication of protective action recommendations to offsite authorities
Means to Measure: PI
Comments: This EPIP supports the timely and accurate development and communication of PARS, a risk significant activity. It requires that several supporting activities be performed including: accident assessment, quantification of radiological release magnitude, projection of the potential dose to the public and communication to government authorities. Communication of PARS should lead to actions by governmental authorities to protect the public health and safety.

Key Attribute: Offsite EP

Area to Measure: Timely activation of governmental authorities
Means to Measure: FEMA Evaluation
Comments: State/local governmental authorities are responsible for implementing protective actions to protect the public health and safety. While the licensee must supply appropriate information to the governmental authorities to allow the timely implementation of protective actions, only governmental authorities are authorized to implement the actions.

Area to Measure: Implementation of protective actions
Means to Measure: FEMA Evaluation
Comments: State/local governmental authorities are responsible for implementing protective actions to protect the public health and safety. While the licensee must supply appropriate information to the governmental authorities to allow the timely implementation of protective actions, only governmental authorities are authorized to implement the actions.

Area to Measure: Offsite Emergency Response Organization readiness
Means to Measure: FEMA Evaluation
Comments: State/local governmental authorities are responsible for implementing protective actions to protect the public health and safety. While the licensee must supply appropriate information to the governmental authorities to allow the timely implementation of protective actions, only governmental authorities are authorized to implement the actions.

Table 2: Performance Indicators

PI Name: Drill/Exercise Performance (DEP)
Measurement Areas: Timely and accurate classification of events; timely and accurate notification of offsite governmental authorities; timely and accurate development and communication of protective action recommendations to offsite authorities
Definition: Fraction (numerator and denominator) of successful performance opportunities over all opportunities for: Classification of Emergencies, Notification, Protective Action Recommendation
Threshold: The threshold for the green zone (need standard wording here) is two fold: 85% for the previous six months; 90% for the previous two years

PI Name: Emergency Response Organization Readiness (EROR)
Measurement Area: Emergency Response Organization readiness
Definition: Percentage of ERO and operating shift crews that have participated in a drill or exercise in the past 24 months
Threshold: The threshold for the green zone (need standard wording here) is: 90% for the previous two years

PI Name: Alert and Notification System (ANSR)
Measurement Area: Availability of the Alert and Notification System
Definition: Percent availability of the Alert and Notification System
Threshold: The threshold for the green zone (need standard wording here) is: 94% for the previous year. The threshold for the red zone is: 90% for the previous year


NOTE TO: Randy Sullivan
FROM: Serge Roudier
DATE: 11/02/1998

SUBJECT: DEFINITION OF THRESHOLDS FOR EMERGENCY PREPAREDNESS (EP) CORNERSTONE PERFORMANCE INDICATORS (PIs)

I) Results based on the NRC analysis of inspection findings made in the period 1996-1997

Number of NRC Evaluated Exercises conducted in 1996-1997: 68

Type of Failure | Number of Failures* | Estimated Number of Opportunities** | Average of Successes | Standard Deviation
Classification  | 10 | 272 | 96% | 8.7%
Notification    |  8 | 272 | 97% | 8.2%
PARS            |  6 | 136 | 96% | 13.6%
TOTAL           | 24 |     |     |

* Only failures identified in NRC Evaluated Exercises ("Timely" has been interpreted as not exceeding the 15 minutes threshold by more than a few minutes)
** 4 opportunities per exercise for Classification and Notification (total: 4 x 68 = 272); 2 opportunities per exercise for PARS (total: 2 x 68 = 136)

II) Results based on the NEI analysis of inspection findings made in the period 1997-mid 1998

Number of NRC Evaluated Exercises conducted in 1997-mid 1998 (estimated): 45 (68/1.5)

Type of Failure | Number of Failures* | Estimated Number of Opportunities** | Average of Successes | Standard Deviation
Classification  |  6 | 180 | 97% | ***
Notification    |  9 | 180 | 95% | ***
PARS            |  9 |  90 | 90% | ***
TOTAL           | 24 |     |     |

* Only failures identified in NRC Evaluated Exercises ("Timely" has been interpreted as not exceeding the 15 minutes threshold by more than a few minutes)
** 4 opportunities per exercise for Classification and Notification (total: 4 x 45 = 180); 2 opportunities per exercise for PARS (total: 2 x 45 = 90)

*** Specific data missing to make the determination

III) Results based on the NEI analysis of inspection findings made in the period 1996

Number of NRC Evaluated Exercises conducted in 1996 (estimated): 34 (68/2)

Type of Failure | Number of Failures* | Estimated Number of Opportunities** | Average of Successes | Standard Deviation
Classification  |  3 | 136 | 98% | ***
Notification    |  4 | 136 | 97% | ***
PARS            |  3 |  68 | 96% | ***
TOTAL           | 10 |     |     |

* Only failures identified in NRC Evaluated Exercises
** 4 opportunities per exercise for Classification and Notification (total: 4 x 34 = 136); 2 opportunities per exercise for PARS (total: 2 x 34 = 68)
*** Specific data missing to make the determination


Number of Cornerstone Findings by Facility

02.Nov.96

NRC_ Office Name tot. # Findings Classification Notification PARS R1 Beaver Valley  ? 3 '. . . */, < < e ', I.c e *,.

Calvert Cnffs -1 1 d n .". / , . e. f, e ' ',

Ginne .d. 4 c o,.. f,,.', f . . *' f l Haddam Neck Indian Point 2

~j. 2 2

25*f 1 .f , , % *%

-e 1

(- .  ;

indlan Point 3 -4 0 James A. Fitzpatrick 0 Limerick i 1

{ Maine Yankee d 2 JMillstone '. 5 l

lNine Mile Point ( 2 l Oyster Creek , 3 . :; . ', 1 l Peach Bottom

. O l Pilgrim & 1 l Salem / Hope Creek & 3 ISeabrook d 0 iSusquehanna r 1

! IThree Mile taland i 7 92,5% 1 q q* /, 1 y '

l IVermont Yankee d 2 Total Number of Findings: lLo 39; 2j 1l 2l l 23.36%l 16.87 %! 7.89%l 33.33%  ;

T kJ ik...'.< .' [Y.s ">.: .-

2 .-

L fr 5. CW2 .f4 ;; # P3r

  • a %Ii. '
Ril Browns Ferry 1 0 jBrunswick . _s 1 ?. C + ' 1 ,I f,atawba . ,f 1 p,rystal River .' O 1 ,. ,

l ;Edwin I. Hatch f 0 1

%.k ' #'

H. 8. Robinson 0 Joseph M. Farley .i. 1 IMcGuire A. 0 '].$ e4

. 1 INorth Anna # 1 .

IOcones 2 0 1 -

- -4 bY"'

lSequoyah i 6

!Shearon Harris .f_ 2 iSt Lucie d., 6 iSummer 1 0 * -- --" --

) bj. hc . b i

Surry 1 0 1

Turkey Point i 2 Vogue -1, 3 Watts Bar 1 0 Total Number of Findings: l:1.9- 23l 3l 3l l l 13.77%l 26.00%l 23.08%l ,j O

YO ))av.h , a'( Cs };c*

&, .c 1. . s 1

e .*

'l 4

NRC Office Name Tot. # Findings Class!!ication Notification PARS

' Rill Big Rock Point .t 3 Braidwood -1 1 Byron -d, 1 SC '/, 1 .

Clinton 1 D.C. Cook d 4 q e */, 1 / ( , ..

Davis-Besse ,1 6 Dresden d 1 }_S *f, 1 Duane Amoid d 2 Fermi g 3 Kewaunee .i 5 La Salle County e 5 Monticello s_ 1  ::4*; 1  ! .* , [

lPalisados ,1 5 Pony s_, 4 Kg *f, 1 .', [

Point Beach y 4 l Prairie Island .d. 1 Ka/, 1 1 c, */, 1

['

l Quad Cites 1 2 ,

're', 1 3 r t; 1 ,

l Zion  ? 5l Total Number of Findings: lC: Hj 5l 3l 1l l 32.34%l 41.67%l 23.08%l 16.67%l s 2 e jRIV W nsas 1 4 _ h, j, d . ..

Callaway 4 2 /

Comanche Peak f 5 Cooper 1 2 ,#; -

d " ' '

Diablo Canyon 5 -

pg c,ig 4 3

' L s . :/*' - F Grand Gulf  ? 6_ $ e * ,. 1 lPalo Verde d 2 25*I 1 f.$'4 2 l'; .

PJver Bond -d 4 SC */, 1 5e% 1 San Onofre 1 2

' "f e.'.

'~

South Texas Protect -1 3 3 y .. * -

Trojan 1 -

M/ashington Nuclear  : 3  ?! */, 2 g.p iWaterford do 8 C L% rs % L.A g & g.h.4 by.L

& su c u, .::u Has) -DL

>o

  • ',a -2: o, o s

~ S

-o,c4 g p.L -L3 C a ,py,t-.ta. }g is- w/. M% '%

h.em A u": w. u,q. rm uas a2c

NOTE TO: Randy Sullivan
FROM: Serge Roudier
DATE: 11/05/1998

SUBJECT: DEFINITION OF THRESHOLDS FOR EMERGENCY PREPAREDNESS (EP) CORNERSTONE PERFORMANCE INDICATORS (PIs)

- PROPOSAL 1 -


PERFORMANCE INDICATORS | THRESHOLDS: Short term (6 mos) / Long-term (24 mos)

Classification
- Percent of timely and accurate classifications of simulated emergency events: 85 / 90
- Percent of (operationally oriented) LERs that should have been declared as emergency events but were not: 85 / 90
- Percent of declared emergencies that were timely and accurate: 85 / 90
- Percent of declared emergencies that were later found to be inappropriate (or retracted): 85 / 90

Notification
- Percent of timely and accurate notifications during simulated emergency events: 85 / 90
- Percent of timely and accurate notifications during declared emergencies: 85 / 90
- Percent availability for the Alert and Notification System: 85 / 90

Protective Action Recommendations (PARS)
- Percent of timely and accurate PARS during simulated emergency events: 90 / 95

Emergency Response Organization (ERO) Readiness
- Percent of ERO that has participated in a drill or exercise in the past 24 months: ?
- Percent of operating shift crews that have participated in a drill, exercise or evaluated simulator evolution in the past 12 months: ?
- Percent of essential ERO positions that successfully responded in each of the last 4 pager tests: ?

The above table presents the proposed indicators to measure licensee performance in the Emergency Preparedness cornerstone. For each indicator of performance, two thresholds are being proposed that will be used to trigger increased NRC action. The first threshold, also referred to as the "short-term performance threshold," is designed to trigger NRC action upon a licensee's performance decline over the past 6 months. The second threshold, also referred to as the "long-term performance threshold," is designed to trigger NRC action upon a licensee's performance decline over the past 24 months. This dichotomy between "short-term" and "long-term" thresholds was deemed necessary to balance the significance of short-term and long-term performance indications (a slight decline in performance over a long period of time may be as significant as a sharper decline noted in a shorter period). In consequence, the "short-term" thresholds are lower than the "long-term" thresholds.

Determination of the proposed "short-term" and "long-term" thresholds

The lack of risk models in the emergency preparedness area makes the definition of risk-informed thresholds for each performance indicator category difficult. This is why a qualitative/comparative approach to risk has been used, instead of a quantitative one, to come up with usable "short-term" and "long-term" thresholds. This qualitative/comparative approach recognizes that a higher risk generally exists (for the public) from an erroneous PAR than from an erroneous Classification of event or Notification of offsite governmental authorities. On the other hand, it has not been possible to determine, between an erroneous Classification of event and an erroneous Notification of offsite governmental authorities, which was the most risk significant. Consequently, the "PAR" thresholds have been set higher than the "Classification" and "Notification" thresholds, and the "Classification" and "Notification" thresholds have been considered to be the same value.

Because the ability of a licensee to generate timely and accurate Protective Action Recommendations (PARS) to offsite authorities has a direct impact on public health and safety, it is expected that licensees demonstrate extremely good performance in this area. Such a level of performance is believed to be established if the corresponding "short-term" and "long-term" Performance Indicators show a success rate above 90% and 95% respectively.

The ability of a licensee to recognize and appropriately classify events exceeding emergency action levels (EALs) in a timely manner is expected to be very good. As previously indicated, the same level of performance is not required from a licensee for "PARS" and "Classifications," to recognize the difference in the risk significance of the two activities. The NRC will have confidence in the licensee's "Classification" ability if the corresponding "short-term" and "long-term" Performance Indicators show a success rate exceeding 85% and 90% respectively.

Likewise, the ability of a licensee to notify offsite authorities in a timely manner is expected to be very good. The NRC will have confidence in the licensee's "Notification" ability if the corresponding "short-term" and "long-term" Performance Indicators show a success rate exceeding 85% and 90% respectively.

Adequacy of the proposed PI Thresholds based on past experience

Emergency preparedness findings resulting from NRC Exercise Inspections conducted in 1996 and 1997 were systematically reviewed in order to retrospectively evaluate the licensees' global and individual performance in the risk-significant areas of "Classification," "Notification" and "PARS" during simulated emergency events. The result of the findings review is presented in the following table:

Number of NRC Evaluated Exercises conducted in 1996-1997: 68

Type of Failure | Number of Failures* | Estimated Number of Opportunities** | Average of Successes | Standard Deviation
Classification  | 10 | 272 | 96% | 8.7%
Notification    |  8 | 272 | 97% | 8.2%
PARS            |  6 | 136 | 96% | 13.6%
TOTAL           | 24 |     |     |

* Only failures identified in NRC Evaluated Exercises ("Timely" has been interpreted as not exceeding the 15 minutes threshold by more than a few minutes)
** 4 opportunities per exercise for Classification and Notification (total: 4 x 68 = 272); 2 opportunities per exercise for PARS (total: 2 x 68 = 136)

The above table shows that the industry as a whole is well above the proposed Performance Indicator Thresholds in the "Classification" and "Notification" areas, with results of 96 and 97% respectively, compared to the 90% threshold. With regards to "PARS," the industry as a whole is also above the proposed threshold, but with less margin: 96% versus the 95% threshold value.

It has been recognized by the NRC that the overall emergency preparedness performance of the industry was good in the last few years. The proposed system of performance indicators and associated thresholds confirms this assessment.

NOTE TO: Randy Sullivan
FROM: Serge Roudier
DATE: 11/05/1998

SUBJECT: DEFINITION OF THRESHOLDS FOR EMERGENCY PREPAREDNESS (EP) CORNERSTONE PERFORMANCE INDICATORS (PIs)

- PROPOSAL 2 -

PERFORMANCE INDICATORS | THRESHOLDS

Classification
- Percent of timely and accurate classifications of simulated emergency events: uc - 1.645 sc / sqrt(n)
- Percent of (operationally oriented) LERs that should have been declared as emergency events but were not: ?
- Percent of declared emergencies that were timely and accurate: ?
- Percent of declared emergencies that were later found to be inappropriate (or retracted): ?

Notification
- Percent of timely and accurate notifications during simulated emergency events: un - 1.645 sn / sqrt(n)
- Percent of timely and accurate notifications during declared emergencies: ?
- Percent availability for the Alert and Notification System: ?

Protective Action Recommendations (PARS)
- Percent of timely and accurate PARS during simulated emergency events: ur - 1.645 sr / sqrt(n)

Emergency Response Organization (ERO) Readiness
- Percent of ERO that has participated in a drill or exercise in the past 24 months: ?
- Percent of operating shift crews that have participated in a drill, exercise or evaluated simulator evolution in the past 12 months: ?
- Percent of essential ERO positions that successfully responded in each of the last 4 pager tests: ?

(u = mean success rate, s = standard deviation, n = number of opportunities; subscripts c, n and r denote Classification, Notification and PARS respectively.)

The above table presents the proposed indicators to measure licensee performance in the Emergency Preparedness cornerstone.

Basis:

The average industry performance in EP for 1996-1997 is considered good and can form the base of acceptable performance above which NRC action is not justified.

  • #ci, #ui, #ri, o ic ,o i nand opi can then be calculate 'b d on apt exp ce

. The threshold is by essence variable because then ber of o rturlitj 'n'y 'es one licensee to another and from one evaluating' d fro' another

. Tae specific threshold for the " Classification" P1 would be ased on data from 1996 and 1997 Exercise Inspections: c = 96 % and ac i  : ;~ ;.

. :s .e T,i = 96 % - 1.645 x 8.7 % _ '~@ - M

.;x h- pr J u gt. .

Or =

T,i = 88.8 % for n f(..lvaluat ercise . x 4 opport./exe.)

T,i = 92.4 */ f = 16 (4 aluated exercises x 4 opport./exe.)
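The threshold arithmetic above can be checked directly. A minimal sketch (the function name is ours; μ = 96% and σ = 8.7% are the 1996-1997 exercise inspection figures quoted above, and 1.645 is the one-sided 95% quantile of the standard normal distribution):

```python
import math

def pi_threshold(mu_pct, sigma_pct, n):
    """Variable lower threshold for a percent-based PI: mu - 1.645*sigma/sqrt(n).

    n is the number of opportunities (e.g., evaluated exercises x 4
    classification opportunities per exercise), so the threshold tightens
    as a licensee accumulates more opportunities.
    """
    return mu_pct - 1.645 * sigma_pct / math.sqrt(n)

# Classification PI, mu_c = 96%, sigma_c = 8.7%:
print(round(pi_threshold(96.0, 8.7, 4), 1))   # -> 88.8 (1 exercise x 4 opportunities)
print(round(pi_threshold(96.0, 8.7, 16), 1))  # -> 92.4 (4 exercises x 4 opportunities)
```

This reproduces the 88.8% and 92.4% thresholds in the note, which is why the threshold is "by essence variable": it depends on n.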


10/26/98 MON 09:55 FAX 5056653447

Results of the NRC's Performance Assessment Public Workshop

Prepared by:
Heidi Ann Hahn, Ph.D.
F. Kay Houghton
Jerome Morzinski
Rebecca R. Phillips, Ph.D.
Daniel L. Pond, Ph.D.
Los Alamos National Laboratory
Los Alamos, NM 87545

Prepared for:
Office of Nuclear Reactor Regulation
Division of Reactor Projects
Washington, DC 20555

October 25, 1998


ABBREVIATIONS

AEOD    Office for the Analysis and Evaluation of Operational Data
ALARA   as low as reasonably achievable
BWR     boiling water reactor
CAL     confirmatory action letter
CFR     Code of Federal Regulations
CP      civil penalty
EP      emergency preparedness
ERO     emergency response organization
FEMA    Federal Emergency Management Agency
FME     foreign materials exclusion
IAP     integrated assessment process
IRAP    integrated review of assessment processes
LANL    Los Alamos National Laboratory
LER     licensee event report
LOCA    loss of coolant accident
NEI     Nuclear Energy Institute
NOV     notice of violation
NRC     Nuclear Regulatory Commission
PAR     protective action recommendation
PI      performance indicator
PIM     plant issues matrix
PPR     plant performance review
PRA     probabilistic risk assessment
RCS     reactor coolant system
SALP    systematic assessment of licensee performance
SAMG    severe accident management guidelines
SMM     senior management meeting
SSC     structures, systems, and components
SSPI    safety system performance indicators

CONTENTS

Abbreviations  ii

1 Introduction  1-1
  1.1 Background  1-1
  1.2 The Cornerstone Framework  1-4
  1.3 About the Workshop  1-9

2 Fundamental Issues  2-1
  2.1 General Policy Issues: Safety Performance Expectations / Regulatory Oversight Process  2-1
  2.2 Use of Risk Insights in Assessment  2-4
  2.3 Use of Performance Indicators and Integration with Inspection Results in Assessing Licensee Performance  2-6
  2.4 Role of Enforcement in Regulatory Oversight / Range of NRC Actions / Communications  2-9

3 Cornerstone Development  3-1
  3.1 Initiating Events  3-1
  3.2 Mitigation Systems  3-4
  3.3 Barrier Integrity  3-7
  3.4 Emergency Preparedness  3-12
  3.5 Radiation Safety  3-14

4 Concluding Comments  4-1
  4.1 Further Development of the Framework and Cornerstones  4-1
  4.2 Implementation Issues  4-2

Tables

Table 3.1 Measurement areas and methods of barrier integrity cornerstone.  3-9
Table 3.2 Areas to measure and measurements for the occupational radiation safety cornerstone.  3-16
Table 3.3 Additional areas of measurement and measurements for the public radiation safety cornerstone.  3-18

Figures

Figure 1.1 Preliminary cornerstone framework.  1-5
Figure 1.2 Underlying structure of cornerstones.  1-6
Figure 1.3 Final cornerstone framework.  1-9
Figure 3.1 Graphical representation of initiating events cornerstone.  3-3
Figure 3.2 Graphical representation of mitigation systems cornerstone.  3-6
Figure 3.4 Graphical representation of emergency preparedness cornerstone.  3-12
Figure 3.5 Graphical representation of radiation safety cornerstone.  3-17


Appendices

A: Workshop Participant List
B: Background Materials Describing NRC and NEI Assessment Proposals
C: Workshop Facilitator List


1 INTRODUCTION

On September 28 through October 1, 1998, the Nuclear Regulatory Commission (NRC) conducted a public workshop on performance assessment. The stated purpose of the workshop was to explore and develop a framework for oversight of operating commercial nuclear reactors that takes into account a graded threshold approach. A starting point for this framework, called the cornerstone approach (or just cornerstones) throughout this paper, was proposed. General acceptance of the framework concept, modifications and additions to the approach, and detailed development of the framework were the outcomes of the workshop. About 350 people, drawn from the NRC, industry, and the public, participated in the workshop. (See Appendix A for a participant list.) This report documents the workshop results.

1.1 Background [1]

The individual components of the current NRC assessment processes for operating commercial nuclear reactors were developed and implemented at different times. The first major assessment process component, the systematic assessment of licensee performance (SALP), was being developed before the Three Mile Island accident and was implemented in 1980. It was intended to provide a systematic, long-term, integrated evaluation of overall licensee performance.

The second major assessment process component, the senior management meeting (SMM), was developed in response to the 1985 Davis-Besse loss-of-feedwater event and was first implemented in 1986. It was developed to bring to the attention of the highest levels of NRC management those plants where operational safety performance was of most concern.

The third major component, plant performance reviews (PPRs), were developed to provide for better allocation of NRC resources and were implemented in 1988. PPRs are conducted more frequently than SALPs or SMMs and were developed to provide mid-course adjustments in inspection focus in response to changes in licensee performance and emerging plant issues.

The plant issues matrix (PIM) provides an index of the primary issues, generated through inspection findings and licensee event reports (LERs), that are evaluated during the SALP, SMM, and PPR processes. It was developed as part of the effort to improve the integration of inspection findings following the South Texas Lessons Learned Task Force, and was implemented in 1996.

Numerous reports have been prepared by NRC staff, industry, and third-party analysts noting weaknesses with the various assessment processes, many of which arose due to the piecemeal approach by which the processes were developed. Among these weaknesses are the following:

- Many of the process components are redundant and have similar end products.
- The assessment criteria differ between process components, especially SALP and the SMM, and are not viewed as being sufficiently objective.
- The processes are subject to inconsistent implementation among the regions.

[1] The background material contained in this section is representative of the information reviewed with participants at the start of the workshop.


- The processes are more resource-intensive than originally intended, particularly when the safety-significance of the results obtained is considered.

The cornerstone approach resulted from the confluence of two alternative proposals generated by the NRC staff and industry (specifically, the Nuclear Energy Institute [NEI]), respectively, aimed at addressing the problems with the current processes. The assessment process proposed by the NRC, called the integrated assessment process (IAP), uses inspection findings as its primary data source and also provides a mechanism for checking the inspection-based assessment results against other data sources, such as industry performance indicators (PIs), the trending methodology developed by the NRC's Office for the Analysis and Evaluation of Operational Data, and licensee-generated self-assessment data. In contrast, the NEI proposal uses a performance-based model that relies primarily on licensee-generated performance indicator data and requires minimal NRC involvement unless a performance threshold is crossed. Each of these proposals is briefly described below. The more detailed set of descriptive materials related to the two approaches that were provided to workshop participants are contained in Appendix B.

1.1.1. NRC integrated assessment process. The NRC proposal grew out of a process reengineering project, called the integrated review of assessment processes (IRAP), that involved NRC regional and headquarters staff, and which used a principle-based approach (i.e., starting with objectives and attributes, then designing processes to achieve them) to evaluate existing processes and design new ones, where necessary. The IAP proposed that the inspection program would continue to observe licensee performance and document those observations in inspection reports. Performance issues would be entered into the PIM and would be assigned a significance rating and template category tag. The template is a tool for sorting inspection issues and includes both functional and cross-functional categories. Functional categories include: operational performance, material condition, engineering/design, and plant support. Cross-functional categories include: human performance, problem identification and resolution, and programs and process. Each PIM entry would be binned into both a functional and a cross-functional category.

The graded PIM entries would be aggregated by template category, and numerical thresholds would be used to produce a color assessment rating for each template category, as follows:

- Green - performance that generally meets or exceeds regulatory requirements.
- Yellow - performance that demonstrates a pattern of non-compliance with regulatory requirements or that results in a number of performance issues indicative of a programmatic weakness that warrants increased licensee attention or corrective actions.
- Red - performance that demonstrates significant non-compliance with regulatory requirements in a systematic or pervasive manner that warrants licensee corrective action.

The performance of every plant would be assessed annually, at a regional meeting. This meeting would allow for the review of, and reconciliation between, the template assessment and other indicators. A decision logic model would be applied to the assessment results to determine the range of NRC actions that should be considered as well as the appropriate communication methods. NRC actions would be taken in a graded approach, with different levels of NRC management responsible for the action, depending upon licensee performance. Assessment results would be issued in writing to both the licensee and the public and would be reviewed with the licensee at a public meeting; again, NRC (and licensee) participation in the public meeting would be graded based upon the assessment results.
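The IAP's aggregation step can be sketched as follows. This is our illustration only: the proposal did not specify the numerical thresholds, so the severity scale and cutoffs here are hypothetical.

```python
def rate_template_category(severities, yellow_cutoff=3, red_cutoff=6):
    """Roll graded PIM entries for one template category up to a color.

    severities: significance ratings assigned to the PIM entries binned
    into this category (e.g., 1 = minor ... 3 = significant). The cutoffs
    are hypothetical; the IAP proposal left the numerical thresholds open.
    """
    score = sum(severities)
    if score >= red_cutoff:
        return "red"     # significant, systematic/pervasive non-compliance
    if score >= yellow_cutoff:
        return "yellow"  # pattern of issues warranting licensee attention
    return "green"       # generally meets or exceeds requirements

print(rate_template_category([1, 1]))     # -> green
print(rate_template_category([1, 3]))     # -> yellow
print(rate_template_category([3, 3, 2]))  # -> red
```

Whatever the actual cutoffs, the design point is that a category's color is driven by the aggregate of graded entries, not by any single finding.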


1.1.2. NEI risk-informed, performance-based assessment process. The NEI approach would use the existing regulatory requirements (primarily 10 Code of Federal Regulations [CFR] Parts 50 and 100) as a basis for setting licensee performance expectations that relate to public health and safety. For assessment purposes, these performance expectations would be grouped into three tiers:

- Tier I: Public health and safety - maintaining the barriers for radionuclide release, and controlling radiation exposure and radioactive materials.
- Tier II: Safety performance margin - minimizing operational events that could challenge the barriers and ensuring that engineered safety systems can perform their intended safety functions.
- Tier III: Overall plant performance - plant safety performance trends are used as leading indicators for problems that might develop in the Tier II performance areas.

Each performance expectation would have a set of PIs that would be used to evaluate the achievement of the expectation. For Tier I, the performance expectation of barrier integrity would be evaluated using three PIs: reactor coolant system (RCS) activity (level of fission products), RCS boundary (leakage rate from primary boundary), and containment integrity. The performance expectation of control of exposure and radioactive materials would also have three PIs: emergency preparedness, radioactive material control (release and shipment of radioactive materials), and exposure control (for both workers and the public).

For Tier II, the performance expectation related to operating challenges would be monitored using four PIs: unplanned automatic scrams, safety system actuation, shutdown operating margins, and unplanned operating transients. The performance expectation related to mitigation capability would be assessed on the basis of the performance of high-risk-significant structures, systems, and components (SSCs).

Tier III would be monitored using an index of plant safety, which would be trended to show the direction overall safety performance could be headed.

l With the exception of the Tier HI trending indicators, allindicators would have an objective regulato threshold and a safety threshold value. The regulatory threshold defines the level of performance at which the safety performance margin has declined to a point where regulatory attention may be warranted. The safety threshold defines the level of performance at which the safety perfonnance margin has declined to a point where plant operation ss not pennitted until corrective accon is t restote margm.

The thresholds,in turn, define three response bands: the utility response band, the regulator response band, and the unacceptable band. If performance is within the utihty response band, utility management would maintain performance within the control band; the NRC would perform a core baschne inspection program and monitor the PIs.

  • Ibe regulator response band defines the point at which the regulatory response increases bey inspection to questioning the adequacy oflicensee corrective actions and programs and processe related to the performance area for which the band has been crossed. The degree of regulato would be determined by how close perfonnance is to the unacceptable band. Performance far fr unacceptable band thresho'd would receive minimal regulatory action while performance unacceptable band would receive more aggessive action, such as increased mspection, confirm action letters (CALs), and civil penalties (dPs). .


- The unacceptable band defines the point at which plant operation is no longer allowed until corrective action is taken.
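The two-threshold, three-band logic of the NEI proposal can be sketched as follows. This is a hypothetical illustration: thresholds would be set per indicator, and the sample values below are invented.

```python
def response_band(pi_value, regulatory_threshold, safety_threshold):
    """Classify a percent PI reading (higher = better) into the NEI bands.

    Assumes safety_threshold < regulatory_threshold: below the safety
    threshold operation is not permitted until corrected; between the
    thresholds the regulator responds; above the regulatory threshold
    the utility manages performance itself.
    """
    if pi_value < safety_threshold:
        return "unacceptable"        # operation not permitted until corrected
    if pi_value < regulatory_threshold:
        return "regulator response"  # increased inspection, CALs, CPs
    return "utility response"        # baseline inspection, PI monitoring

# Hypothetical thresholds for one indicator: regulatory 92%, safety 80%
print(response_band(95.0, 92.0, 80.0))  # -> utility response
print(response_band(88.0, 92.0, 80.0))  # -> regulator response
print(response_band(75.0, 92.0, 80.0))  # -> unacceptable
```

The graded degree of response within the regulator band (minimal action far from the safety threshold, aggressive action near it) is not modeled here.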

The licensees and the NRC would have different, but complementary, roles in this assessment approach. The NRC would first assess results, by verifying the PIs and reviewing inspections and corrective actions. Based on the assessment results, the NRC would develop and implement inspection plans, with the scope of those plans being defined by the response bands, as described above. Similarly, the NRC would take regulatory action as indicated by the response band. The licensee would monitor and report on the PIs; inform the NRC of its self-assessment and audit plans and make the results of self-assessments and audits available to the NRC prior to planned inspections; and perform root cause analysis, identify corrective action, and report the status of corrective actions to the NRC prior to NRC regulatory actions.

1.2 The Cornerstone Framework

The cornerstone framework is a hierarchical structure that begins with a focus on the NRC's overall safety mission and identifies strategic areas in which performance must be maintained in order for the overall safety mission to be achieved. Each strategic performance area, in turn, has a set of cornerstones, or areas that support the strategic performance area. PIs, inspection, and other information sources provide the data to assess performance on each cornerstone. Decision thresholds are used to determine the regulatory action warranted by licensee performance in each area. A draft framework, shown as Figure 1.1, was developed as a starting point for the workshop.

As can be seen from Figure 1.1, for the strategic area exposure from reactor accident releases, four cornerstones were proposed: initiating events, mitigation systems, containment systems, and emergency preparedness. Exposure from non-reactor accident radiological plant releases had two cornerstones: operational and events. Radiological worker exposure had a pair of cornerstones similar to those for exposure from non-reactor accident releases, namely, operational and over-exposure incidents. The reactor plant safeguards strategic performance area had one cornerstone related to physical plant protection.

Draft objectives were provided for each cornerstone, as follows. For the exposure from reactor accident releases, the cornerstone objectives were:

- Initiating events: limit the frequency of those events which challenge the heat removal capability of the reactor plant, commensurate with their safety significance. These events include both operating events such as loss of main feedwater and shutdown events such as loss of RCS inventory.
- Mitigation systems: ensure that the reliability and capability of those systems required to prevent and/or mitigate core damage are maintained at a level commensurate with their safety significance.


Figure 1.1. Preliminary cornerstone framework.

- Containment systems: ensure that the reliability and capability of those systems required to ensure containment integrity are maintained at a level commensurate with their safety significance.
- Emergency preparedness: ensure that the capability is maintained to take adequate protective measures in the event of a radiological emergency.

For exposure from non-reactor accident radiological plant releases, the cornerstone objectives included:

- Operational: maintain the exposure to the public resulting from plant operation as low as reasonably achievable (ALARA).
- Events: ensure that releases from events do not exceed licensed and 10 CFR Part 20 limits. Included would be events such as transportation accidents, spent fuel pool accidents, and large spills.

For radiological worker exposure, the cornerstone objectives included:

- Operational: maintain radiation worker exposure during plant operation ALARA.
- Over-exposure incidents: maintain radiation worker exposure below licensed and 10 CFR Part 20 limits.

For reactor plant safeguards, the cornerstone objective was:

- Physical protection: protect vital plant equipment and prevent the diversion of special nuclear materials.

Within the framework, each cornerstone has an underlying structure comprising its desired results, attributes important to achieving those results, areas to measure, and means of measurement. This is shown diagrammatically as Figure 1.2. Defining this underlying structure for each of the cornerstones was the primary objective of the workshops. The structures that were developed are documented in Section 3.

Throughout the workshop, issues were raised relative to the framework and its premise of relying primarily on PIs to measure performance. These included that the:

- mission statement implies that the NRC understands the public's view of health and safety; further definition of these terms is probably warranted;
- strategic performance areas focus on exposures, not risk - the public feels consequences beyond exposure (such as economic impacts); there was a sense that the strategic performance areas should be expressed as positives rather than things to be prevented;
- strategic performance areas are not expressed using common language - some are results-driven, others not;
- framework diagram implies that all the strategic performance areas are of equal weight; similarly, all cornerstones within a strategic performance area are shown as equal;
- reactor accident releases strategic performance area should consider secondary site issues, such as steam generator ruptures and interim plugging, not just core damage, in its scope;

Figure 1.2. Underlying structure of cornerstones.


  • question of where,if at all, transportation issues are dealt with in the comerstones must be addressed;

" sorter" issues, such as human performance, organizational performance (tmining, quality, etc.),

safety culture and safety-conscious work environment, managernent, and pr,ograins and processes, are not easily represented in PIs and are not easy to measure directly; assessmg these issues in the absence of an event is particularly difficult;

  • PIs may not be sufficiently leading; where do precursors, such as use of procedures and licensing information, fit into the framework?
  • question of setting the thresholds of descction, and whether the threshold would detect slippage prior to failure must be addressed; also there was the question of whether the threshold for crossing into a band (in a negative direction) might be different for the threshold for crossing back out of tat band;
  • thresholds need to be fixed, and what they are needs to be wcn understood by the NRC, industry, and the public; further, consequences at the given thresholds also need to be known a priori;
  • thresholds (and indicators) need to recognize differences be4 ween plants with respect to their licensing bases, co e damage frequency estimates, etc.; and 1-7 s - _ ,_me m .m _

10/26/96 MON 10:00 FAI s056653447 @ 012 new approach must be validated, using appropriate success criteria;in addition to showing that the approach has efficacy in tenns of providmg accurate assessments, it rnust also be shown to reduce regulatory burden.

De cross-cutting issues enumerated above were seen potentially requiring a combination of measurement methods. Several options for dealing with these issues were explored. In initial discus: ions, rtany of these topics were recommended as strategic performance areas out of a concern I that they would become "least common denominators" and be factored out of the assessment process.

Alternatively, the cross-cutting issues could be included in the detailed attributes under the individual cornerstones (as was done in the case of the initiating events cornerstone; see section 3.1.2) or could be shown as a common set of attributes across cornerstones. Finally, there was a proposal to let evaluation of the cross-cutting issues be triggered by inspection, using the intangibles to rebut PI data, rather than being called out specifically in the framework. Lack of visibility of these issues in the framework was raised as a potential barrier to public acceptance of the new approach. The approach of having the cross-cutting issues be common attributes across cornerstones was tentatively adopted, and the attributes were shown explicitly on the framework. As is the case for all indicators, validity must be demonstrated for any indicators used to address the cross-cutting issues. The issue of whether there needed to be a separate evaluation or roll-up of the cross-cutting issues was left unresolved.

In addition, more global issues related to the regulatory relationship between the NRC and industry in any assessment process were raised. Here, it was noted that there needs to be a balance in the public perception, such that the regulatory relationship is neither an adversarial one nor one that is perceived as the NRC having abrogated its responsibility and allowed self-regulation by the utilities. This discussion implies that there needs to be a high standard for the utility response band, but also that failures on single indicators should not cause slippage into the regulatory response band. Finally, there must be a trip point at which performance is viewed as unsafe and in need of correction prior to continued operations. The meaning of the "red" threshold was questioned - does this mean unacceptable or unsafe?

Further, there was a question regarding the roles of the licensee and the NRC in data verification and validation, with the options being either that the licensee validates and the NRC verifies via inspection, or that the NRC validates the data and the licensee's processes for developing it. In either case, initial development of the indicators should involve validation. This, too, implies changes to the inspection program. The resident inspectors would assume responsibilities currently performed by team inspections, such as checks of the licensee's implementation of corrective action, work control, and the maintenance rule. Examination of the licensee's self-assessment and corrective action programs would be used to focus inspections, to allow the inspectors to "work smart."

Finally, there was some discussion in the plenary sessions about what adoption of the framework might imply for other regulatory processes. It was acknowledged that the core inspection program would likely need to change to accommodate the new assessment model. There was concern, for example, that the current inspection program would not identify results that would rebut the PIs. In addition, there was discussion as to when the core inspection program would be triggered - either only for areas not covered by PIs, or also for verification of PIs - and how PIs and inspection findings would be integrated. (These issues were explored in detail in the fundamental issues session documented in section 2.3 and will not be discussed further here.) Further, there was a concern expressed that the framework approach could result in a weakening of the licensing function.

By the conclusion of the workshop, a modified version of the framework that addressed many of these issues (especially when viewed in conjunction with the work on fundamental issues and cornerstone development documented in sections 2 and 3) was adopted. The final framework is shown as Figure 1.3. Several major differences from the preliminary framework should be noted.


Figure 1.3. Final cornerstone framework.

First, the two strategic performance areas that dealt with radiological exposure of workers and the public were combined into a single strategic performance area, called radiation safety, which has two cornerstones: public and occupational. (Details of the cornerstone definitions can be found in section 3.5.) Second, the strategic performance areas were renamed to represent topical areas in which safety must be maintained (positive consequences) rather than exposures. Third, the cross-cutting areas of human performance, safety-conscious work environment, and problem identification and resolution were added to the high-level framework in acknowledgement that they underpin all of the cornerstones and are deserving of a high degree of visibility.

These modifications, however, do not address all of the concerns with the conceptual design of the high-level framework. Specifically: (1) there was no further definition of the terms related to public health and safety in the mission statement; (2) the relative weighting of the strategic performance areas, if any, has not been addressed; (3) the safeguards strategic performance area is still ill-defined; and (4) there are still several "orphan" areas, such as environmental protection and transportation, that have not been evaluated for inclusion in the framework.

1.3 About the Workshop

The workshop was divided into three segments, with a section each on: (1) background and introduction, which was aimed at bringing participants to a common level of understanding and acceptance of the framework concept; (2) the fundamental issues or guiding principles underlying the framework; and (3) detailed development of the cornerstones. Each segment contained working sessions or break-out groups, where the objective was to achieve alignment or consensus on the topic being discussed. Here, alignment and consensus are used interchangeably to convey a general sense that the group was moving in the same direction - 100% agreement with all points of the discussion was not required to achieve consensus; rather, consensus implied that participants were supportive of


the general concepts. Note that the scope of these discussions was largely limited to assessment, although the broader topic of regulatory oversight needs also to consider the related processes of inspection and enforcement. These were dealt with only in the context of fundamental issues discussions.

With the exception of the industry and NRC sessions held to bring participants up to speed on the cornerstone concept, all break-out sessions were facilitated by a three-person facilitation team. (See Appendix C for a listing of the facilitators for the fundamental issue and cornerstone development break-out sessions.) A professional facilitator (from Los Alamos National Laboratory [LANL]) was responsible for the overall administration of the session, including ensuring that all attendees had equal opportunity to participate, seeing that issues were adequately discussed to either ensure consensus or understand why alignment was lacking, and documenting the session results. Two technical facilitators, one from the NRC and the other from industry, were responsible for representing their organizational technical positions, answering technical questions related to the issues, and ensuring that the technical aspects of the discussion remained relevant to the issue being discussed. Technical facilitators were provided with a brief training session highlighting "do's and don'ts" prior to participating in these sessions.

The LANL facilitators were responsible for producing this report. The NRC technical facilitators checked all documentation for accuracy and completeness prior to publication.

Both the content and conclusions of the background and introduction section of the workshop are documented above. Section 2 of this report describes the methods and results of the fundamental issues discussions, while Section 3 describes the methods and results of the cornerstone development activities. Section 4 provides overall conclusions and commentary regarding next steps.


2 FUNDAMENTAL ISSUES

Prior to doing detailed development work on the cornerstones, it was necessary for the convened group to achieve alignment on some of the underlying policy and regulatory issues. Alignment was expressed in terms of positions or guiding principles to be used in cornerstone development. In the few cases where alignment could not be achieved, it was acceptable to "stake out the poles," or express the two ends of a position, so that cornerstone development efforts could be mindful of the effects of choosing one path or the other.

Workshop participants self-selected into one of four fundamental issue break-out sessions: general policy issues, including safety performance expectations and the regulatory oversight process; use of risk insights in assessment; use of performance indicators and integration with inspection results; and a catch-all session that included enforcement, range of NRC actions, and communications. The objective of these break-out sessions was to discuss those fundamental issues for which decisions and attributes needed to be agreed upon to support the continued development of the framework for the regulatory oversight process. The intent was that the decisions and criteria decided upon during these break-out sessions would then form the defining principles used as the basis for the more detailed oversight framework development.

Each group was presented with a prepared set of issue questions for which they were to provide a consensus position. In addition, groups were permitted to add scope to their topic area. Details of the fundamental issues and the results of the break-out sessions are provided in sections 2.1 through 2.4 below.

The numbers of participants in each group ranged from about 25 to more than 100. Groups were generally fairly proportionately mixed between NRC and industry participants. Although participants were encouraged to stay with a single group, some between-group movement occurred.

2.1 General Policy Issues: Safety Performance Expectations / Regulatory Oversight Process

Regardless of the specific assessment process used, the NRC has stated that all licensees will continue to receive some level of independent NRC inspection. The NRC will continue to assess licensee performance in order to allocate inspection resources and take appropriate action to address licensee performance issues. The purpose of the general policy fundamental issue break-out session, then, was to discuss the high-level philosophy governing the regulatory oversight process. Note that, although some of the questions posed in this session appear to be similar to those assigned to other fundamental issue break-out sessions, the level of expected discussion differed among the groups, with other groups being expected to work at a more detailed level than the general policy group.

The group addressed four assigned issues (performance thresholds for decreased NRC interaction, leading indications, NRC independence in a framework of licensee-generated assessment data, and use of NRC inspection findings) and, then, developed and considered a fifth (an evaluation structure for the new process). The group elected to tackle the issues in a different order than that presented in the workshop briefing materials, and engaged in a detailed discussion of the issue of how NRC inspection findings should be factored into the overall assessment process before dealing with thresholds for decreased NRC interaction, leading indicators, or NRC independence. Because this discussion revealed a number of assumptions that might also impact the reader's understanding of the positions developed for the other issues, it is documented first.


2.1.1. Use of NRC Inspection Findings. Current assessment processes are based on using inspection findings to provide a rebuttable presumption on licensee performance, with PIs used to place the inspection findings in context and to provide compelling reasons to override the inspection findings. An alternative could be that PIs would form the basis for a rebuttable presumption on licensee performance. When warranted, inspection findings could then be used to develop a compelling reason to override the indicators. Basically, the discussion subject for this issue asked participants to decide between these alternatives. The presenting question for the discussion was: "Given that NRC will continue inspecting selected areas of licensee performance, how should NRC inspection findings be factored into the regulatory oversight process?"

The group engaged in very lengthy discussion on this topic. Although, as detailed below, alignment was achieved regarding a proposed position, this was based on a number of assumptions, and it was agreed that further discussion is required to address unanswered questions.

The assumptions on which the proposed position is based include:

  • inspection areas beyond performance indicators must be defined;
  • performance indicators will drive focused reexamination of the conditions being measured (i.e., focused inspection in certain areas);
  • with respect to the NEI white paper and the "Regulating Action Model," the performance indicator determines color (i.e., green, white, or red);
  • the intent/interpretation of all "green" indicators does not imply full compliance;
  • baseline inspection:
    - determines validity of PI color
    - can challenge PIs without implying a color change
    - influences NRC response, based on where the NRC staff thinks the utility is
    - is structured such that the burden of proof is on NRC when performance indicators are in white
    - must not be redundant
    - must be defined so that it cannot be abused
    - must be risk-informed
    - is used for confirmation
    - has a defined threshold

The questions raised in the session but which remain unanswered include:

  • what is the influence of inspection findings on performance indicators and vice versa? are inspections used simply to confirm, or do they "trump" (i.e., supersede) performance indicators?
  • are performance indicators and/or inspections associated with a score?
  • how will circumstances which aren't covered by performance indicators be assessed?


  • how will situations be handled in which there are differences between inspection and performance indicator results (it was assumed that there will be a block of issues which are not addressable by performance indicators)?

  • how will NRC and industry aggregate information to accurately reflect performance while taking into account the "audience" which will have access to the results?

(Note that many of these questions were addressed by the fundamental issue group on PIs, as documented in section 2.3 below.)

Finally, it was noted that the focus of this Performance Assessment Workshop becomes instead about "oversight" if/when the issue of how to reconcile all inputs with performance indicators is addressed.

Despite the number of unconfirmed assumptions and unresolved questions, the group successfully came to alignment on the following proposed position: The NRC will perform baseline inspection. In a process that uses objective PIs to form a rebuttable presumption, this baseline inspection will validate the accuracy of the performance data and be used to assess the risk-significant areas not adequately covered by PIs, to ensure that cornerstones are being met.

2.1.2. Performance thresholds for decreased NRC interaction. Currently, the NRC interacts with licensees at a very low threshold of performance weaknesses. Licensee procedure quality and human performance errors with little or no safety significance are reviewed by the NRC and extrapolated to make broad conclusions regarding effectiveness of licensee programs. The issue to be explored here is whether that relationship can change: "Is there a threshold of licensee safety performance above which the NRC can allow licensees to address weaknesses with decreased NRC action? In other words, at what level above 'zero-defect' tolerance can licensees safely operate with decreased NRC interaction?"

The discussion focused initially on establishing relevant assumptions including, for example, that decreases in NRC action are not limited to enforcement activities and that each threshold has a risk basis or safety assumption. It was noted a number of times that although there is a need to establish different thresholds for different utilities, there is also a need for consistent treatment of utilities by NRC. The proposed position for this issue as developed/modified by the group is: There is a level of licensee safety performance which could warrant decreased NRC interaction if (1) the threshold considers risk insights or an acceptable level of performance; (2) there is a safety margin below the threshold to account for uncertainty; and (3) performance is measurable with clear criteria.

2.1.3. Leading Indications. The Commission has directed the NRC staff to consider the development of performance indicators that provide leading or concurrent indications of plant performance, to the extent practicable. The group was asked to address the issue: "Given the threshold for NRC interaction decided in the preceding discussion, what type of leading indication does the NRC need to have in order to provide an early indication of changes in licensee performance trends? Do additional performance criteria need to be monitored to provide a leading indication of declining performance to ensure timely and complete corrective actions by licensees?" Potential topics of discussion in addressing this issue include the philosophical basis for desiring leading indicators (prevention of plant events due to failures of SSCs? prevention of extended shutdowns due to performance issues? assurance of public health and safety by minimizing risk-significant events?) and the respective roles of leading and lagging indicators.


The group recommended caution that "indicators" not evolve to be viewed as "performance" and, therefore, that reaching an indicator threshold not become equated with poor performance. With reference to the NEI white paper and the "Regulating Action Model," an assumption was formulated that initial licensee interest is in indicators depicting performance moving from the green to the white band, while NRC interest increases as performance indicators show movement from the white towards the red band. It was generally assumed that licensees already have "defense in depth," that leading indicators determine corrective actions, and that "leading" suggests prediction. Questions remain, however, regarding whether thresholds are continuous and/or step functions and whether they will apply to safety and/or performance situations. Nonetheless, alignment was achieved with respect to the following proposed position: It should not be expected that any process could provide a "leading" indication of every plant event caused by the failure of an SSC or human error. The assessment process should establish thresholds that would trigger early interventions to arrest declining performance, and to trigger an escalated approach, prior to performance becoming unacceptable.
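The green/white/red band model referenced above lends itself to a simple illustration. The following sketch is purely hypothetical: the function, the threshold values, and the example indicator are invented for illustration and are not taken from the NEI white paper, the workshop, or any NRC guidance.

```python
# Hypothetical sketch of the green/white/red band model discussed above.
# All thresholds, values, and responses are invented for illustration.

def classify_indicator(value, white_threshold, red_threshold):
    """Map a performance indicator value to a color band.

    Higher values are assumed to represent declining performance.
    """
    if value < white_threshold:
        return "green"   # licensee monitors; baseline NRC interaction only
    elif value < red_threshold:
        return "white"   # early intervention to arrest declining performance
    else:
        return "red"     # escalated NRC response

# Example: a hypothetical scram-rate indicator evaluated against the bands
for rate in (1.0, 3.5, 6.2):
    print(rate, classify_indicator(rate, white_threshold=3.0, red_threshold=6.0))
```

Whether real thresholds would be step functions like this, or continuous measures, was left by the group as an open question.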

2.1.4. NRC Independence in a framework of licensee generated assessment data.

The industry has suggested that much of the NRC inspection effort duplicates licensee inspections, self-assessments, and quality assurance audits. The industry has suggested that the NRC use more licensee-generated information to make assessments in lieu of NRC inspection. This must be balanced against the NRC's principle of good regulation that states that the NRC must maintain independence in regulatory oversight. Given this background, the group was asked to consider whether "the NRC [can] place more emphasis on licensee generated information to perform assessments and still maintain its required regulatory oversight function?"

On one hand, concern was expressed about possible use of proprietary and other company information and its incorporation into a public document. On the other hand, there was concern that restricting what is used in order to protect the company's privacy could limit NRC's ability to assess performance which, in turn, could adversely affect the public's confidence in the process. A hope was expressed that self-assessment summary information would be sufficient for this purpose, but it was agreed that further discussion was required to determine what information to docket in order to enable self-assessments to take the place of NRC inspection. It was agreed that the process should allow NRC to reduce duplication of licensee self-assessments while maintaining independence and, additionally, that NRC can rely on more than licensee information. The proposed position for this issue as developed/modified by the group is: The NRC can place reliance on licensee-generated assessment results. However, the NRC needs to ensure adequate means of validating the data used in its regulatory and decision-making process.

2.1.5. Evaluation structure for the new assessment process. As noted above, the group decided that inclusion of a fifth issue was warranted. Although, due to time constraints, little or no discussion took place to amplify this issue or to develop a proposed position, it was collectively agreed that any new performance assessment system must, as a key part of the development process, include a "controlled feedback" (i.e., evaluation) structure for NRC and licensees to assess the new system.

2.2 Use of Risk Insights in Assessment

The NRC is committed to applying probabilistic risk insights, as appropriate, to the regulatory oversight process. Thus, the goal of this session was to collect input as to the appropriate level of application of probabilistic risk insights to be applied to the regulatory oversight process. Three key issues were presented for discussion: (1) the roles of risk-informed and deterministically based PIs in the oversight process; (2) the role of a set of PIs in the assessment of the integrated risk significance of the licensee performance; and (3) the use of individual site PRAs in the development of site-specific indicators. The second and third issues were part of the discussion of the first issue, and therefore little time was devoted to their in-depth discussion.

2.2.1 Role of risk-informed performance indicators. The key issue presented for discussion was: "To what extent should the PIs and their associated performance thresholds be risk-informed? What is the role of deterministically based indicators in an oversight process? What processes and criteria should be followed to select these information sources?" In addressing the final question presented in this issue, participants were asked to consider the criteria for selecting risk-informed, site-specific PIs.

During the session, it became clear that a definition of "risk-informed" and an overarching philosophy were needed as a basis for further discussion. The overarching philosophy stated that "risk-informed insights and indicators should be used to focus resources on those areas of greatest importance to the public health and safety." This philosophy depends on the definition of risk-informed. The differences between risk-informed, risk-based, and deterministically based became a focal point. The group agreed that criteria for decision making should not be limited to risk-informed, risk-based, or deterministically based information. A mix should be used as appropriate. It was agreed that a definition of risk-informed should encompass risk-based and deterministically based information.

Therefore, risk-informed was defined as "consider[ing] consequences and probabilities of those consequences, balanced with operational experience and existing regulations." Alignment on the definition was achieved after much discussion. The definition proposed was not unanimously considered "the" definition; however, it was agreed that it was a suitable working definition that could be further refined.

The above definition of risk-informed consists of two parts. The first part, "consider consequences and probabilities of those consequences," is based on a risk-informed judgement that may not be risk-based. The second part, "balanced with operational experience and existing regulations," was added to reflect the differences in types of plants (peer groups) and in individual plants, and the risk-based nature of radiological concerns. (Radiological limits are calculated using known dose-rate consequences and are therefore risk-based.)
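The quantitative core of the first part of this working definition, consequences weighted by their probabilities, can be sketched numerically. Everything below is hypothetical: the issue names, probabilities, and consequence values are invented for illustration and do not come from the workshop or from any PRA.

```python
# Hypothetical sketch: ranking issues by expected consequence
# (probability x consequence), the quantitative core of the
# "risk-informed" working definition. All values are invented.

issues = {
    # issue: (annual probability, relative consequence if it occurs)
    "loss of offsite power":      (1e-2, 100.0),
    "single pump unavailability": (5e-2,   5.0),
    "minor procedure deviation":  (5e-1,   0.1),
}

def expected_consequence(prob, consequence):
    return prob * consequence

# Sort highest expected consequence first to see where attention should focus
ranked = sorted(issues.items(),
                key=lambda kv: expected_consequence(*kv[1]),
                reverse=True)

for name, (p, c) in ranked:
    print(f"{name}: {expected_consequence(p, c):.3f}")
```

In a genuinely risk-informed process, this purely risk-based ranking would then be balanced against operational experience and existing regulations, per the second part of the definition.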

After a working definition of risk-informed and the general philosophy were agreed on, the discussion continued and revolved around the extent to which PIs and their thresholds should be risk-informed.

Each of the premises in the proposed position provided in the workshop materials ("PIs and their performance thresholds should be risk-informed to the maximum extent possible. This may involve the generation of certain site-specific PIs. The use of risk-informed objective PIs should be balanced with deterministic approaches as appropriate.") was examined, as described below.

With respect to the notion of having PIs be risk-informed to the maximum extent possible, it was noted that being risk-informed is connected with probabilistic risk assessment (PRA), and while it might be possible to perform some PRA or PRA-like analysis, it may not be cost effective or the issue may not warrant an extended analysis. Therefore, various analysis techniques must be evaluated to determine their merits and cost effectiveness. Thus, the group concluded that performance indicators should be used to the maximum extent "practicable." This allows flexibility in using the most appropriate analysis tool.

The discussion of the use of "site-specific PIs" included the differences between various peer groups and differences in individual plants. It was agreed that there should be consistency across plants with the understanding that each plant is also unique. It was determined that PIs should be established by peer groups (said another way, one criterion for selecting PIs would be applicability to a peer group). If there were unique plant-specific characteristics, the thresholds could be adjusted appropriately.


Beyond the notion that PIs should be peer-grouped, no criteria for adopting PIs or setting thresholds were developed.

Finally, it was concluded that risk-based includes deterministic approaches, such that deterministic approaches do not need to be considered as a separate entity.

At the conclusion of the discussion, alignment was reached on the following position: PIs and other measures and their thresholds should be risk-informed (based on the agreed working definition) to the maximum extent practicable. PIs and thresholds may be peer-grouped or may be plant-specific.

2.2.2. Independence of cornerstones. The value of risk assessment is in the ability to evaluate and identify higher risk-significant events through the integration of individual issues which may have had lower risk-significance. For example, individual equipment failures, which may not individually cross PI thresholds, may, when combined, result in situations or configurations with high risk significance. The challenge posed by this discussion, then, was to determine how a set of performance indicators could be used to assess the integrated risk significance of licensee performance.

Although the presenting question used to introduce this discussion was phrased in terms of integrated risk significance, the proposed position contained in the workshop materials, which focused on independence of the cornerstones, was what actually framed the dialogue.

There was concern that the cornerstone approach may not address the significance of complex events. In the case of complex events, with issues such as scrams and multiple complications and their integrated safety significance, event assessment techniques may be necessary.

Two important points were generated during the discussion. First, exceptions are not handled with only PIs. Because they are exceptions and are often complex events, they warrant special handling. Thus, exceptional events will be handled exceptionally. The second point was the existence of the close relationship between the confidence in the PIs and the plant's corrective action program, including root cause analysis and lessons-learned programs.

At the conclusion of the discussion, the following proposed position was stated: Conclusions should be drawn from a broad set of indicators of the need to perform more in-depth inspections, without performing risk calculations in all cases. Handle exceptions exceptionally. The level of inspections should be determined by the confidence in the plant's ability to perform root cause analysis and lessons-learned.

2.2.3. Use of individual site-specific risk analysis in developing risk-informed performance indicators. The key question presented for this area was "To what extent do we need to use individual site PRAs to develop risk-informed, site-specific performance indicators to complement the generic performance indicators?" Participants were asked to consider the role of risk monitors and configuration risk management, reliance on site-specific vs NRC-developed PRA models, required quality of site-specific PRAs, and the use of expert panels to supplement site-specific PRAs in forming their position on this issue. Due to the time constraints, however, many of these topics were not addressed specifically.

Rather, the proposed position, "Individual site-specific risk analysis should be used to the extent necessary to develop risk-informed PIs and to set risk-informed thresholds," was discussed briefly and was modified to agree with the positions developed for the previous issues. The following position was the result of the modifications: Individual site-specific risk analysis should be used to the extent necessary to develop risk-informed peer group PIs and risk-informed site-specific thresholds.

2.3 Use of Performance Indicators and Integration with Inspection Results in Assessing Licensee Performance

This session was organized around five key issues, as described below. Context for the session was provided in the form of an assumption or ground rule, namely, that the NRC is committed to the development of objective standards for measuring licensee performance that reduce subjectivity and establish a clear level of performance expectation through the use of risk-informed performance indicators and inspections. In effect, this statement set a boundary condition that mandated a preference for decision making based on objective measures, such as PIs, over some of the more subjective assessments enabled by current assessment processes.

General discussions in the various plenary sessions revealed that there was another underlying assumption relevant to this fundamental issue area. This assumption can be stated as follows: PIs, although not necessarily able to measure all attributes of a particular cornerstone, are sufficiently broad to provide confidence that the desired results of the cornerstone have been met. Not all workshop participants were equally confident that this assumption was valid, and noted that use of PIs as primary indicators, as recommended by the break-out session participants in section 2.3.4 below, would be inappropriate if the assumption was proved invalid. Because it was impossible to verify the validity of the assumption in the context of the workshop, the convened group agreed to the following position: If, when the detailed PIs have been developed, the assumption that the PIs are sufficiently broad to provide confidence that the desired results of the cornerstone have been met is proved valid, then it will be acceptable to use inspection findings in a rebuttable presumption mode (i.e., as a check on PIs rather than as direct inputs to the assessment). Verification of the assumption was left as a task for the validation effort.

2.3.1. Role of performance indicators. The key issue presented for discussion was as follows: "What is the role of PIs in assessing licensee performance?" The consensus position was that PIs should be used to provide an objective measure in those areas where they can reasonably be developed (e.g., a basis exists, they can be validated, etc.). The rebuttable presumption about licensee performance is formed from them, and requires a preponderance of other data in order to be overridden. PIs bring consistency, discipline, and objectivity to the assessment process.

It should be noted that it is yet to be determined how much weight will ultimately rest on PIs vs inspection/inspector's judgement. Industry expects inspections to continue, but wants to be allowed to fix problems without being hindered by the inspection/assessment process, especially if the PI indicates satisfactory performance.

2.3.2. Characteristics and attributes of performance indicators. This issue was framed in terms of two very similar focus questions: "What are the characteristics of PIs (e.g., generic, specific) that would provide a reasonable measure of licensee performance?" and "What are the necessary attributes of these performance indicators?" In answering the questions, participants were asked to consider the roles and relationships of generic vs plant-specific PIs. They were also asked to consider how one would verify the adequacy and accuracy of the PIs.

Discussion focused on the characteristics of the PIs themselves as well as the thresholds set for the PIs. With respect to the PIs per se, the following observations were made. PIs should:

  • represent a broad sample of licensee performance.
  • be risk-informed to the extent practical; however, it was acknowledged that some PIs will need to be deterministic, such as in cases where the cornerstones are not developed from risk insights, e.g., radiation protection.
  • be capable of being objectively measured and validated/verified.
  • be timely, but it was also noted that timeliness depends on the nature of the data (for routine data, perhaps quarterly reporting is acceptable, within one month of the end of the quarter).
The group expects that both PIs and thresholds will have a combination of generic and plant-specific components. Like PIs, thresholds should be risk-informed to the extent practical. Thresholds must be "adequately leading," which means that thresholds (margins) need to be set so as to allow trends and problems to be identified before crossing a threshold of unacceptable performance.

The consensus position proposed for this issue is as follows: The PIs may be generic across the industry but can have plant-specific thresholds. The PIs should: (1) be capable of being objectively measured, (2) be timely (taking special circumstances into account), (3) have risk-informed thresholds, (4) identify trends and allow problems to be addressed before a threshold is crossed, (5) provide a reasonable sample of overall performance, (6) be validated and verifiable, and (7) result in only appropriate licensee and NRC actions.
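The idea of identifying trends so that problems can be addressed before a threshold is crossed can be sketched with a simple projection. The sketch below is hypothetical: the linear-trend fit, the threshold value, and the quarterly data are invented for illustration and imply nothing about how real PI trending would be performed.

```python
# Hypothetical sketch of an "adequately leading" check: fit a simple
# linear trend to recent quarterly PI data and project it forward to
# see whether the unacceptable-performance threshold would be crossed
# soon. All numbers are invented for illustration.

def projected_crossing(history, threshold, horizon):
    """Return True if a straight-line fit of `history` would cross
    `threshold` within `horizon` future periods."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope_num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
    slope_den = sum((x - mean_x) ** 2 for x in xs)
    slope = slope_num / slope_den
    for step in range(1, horizon + 1):
        if mean_y + slope * ((n - 1 + step) - mean_x) >= threshold:
            return True
    return False

# Quarterly values of a (hypothetical) declining-performance indicator:
quarters = [1.0, 1.4, 1.9, 2.5]
print(projected_crossing(quarters, threshold=4.0, horizon=4))  # prints True
```

A flat or improving history would not be flagged, while the deteriorating series above triggers the early warning before the threshold itself is reached.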


When this group reported their results to the plenary, an additional characteristic was proposed, namely, that if plant-specific indicators or thresholds are used, they must be developed so as to be equitable across plants. No further development of this proposal occurred during the workshop.

2.3.3. Characteristics of a risk-informed inspection program. The key issue for discussion here was "In light of the proposed PI characteristics, what are the characteristics of a risk-informed inspection program to monitor licensee performance?" Three major points emerged during the group discussion. First, the group stated that the inspection program should include some assessment of the risk significance of inspection findings and should provide follow-up accordingly. Second, the group acknowledged that changes in the assessment program could result in changes to the inspection program and noted that the exact nature of the inspection program is yet to be determined, i.e., can/should items covered by PIs also be subject to inspection, and if so, under what circumstances? Finally, the group asserted that there should be increasing inspection coverage of elements covered by PIs as performance goes from green to red.

The consensus position that the group arrived at is that: The baseline inspection program should cover risk-important attributes of the safety functions, programs, and processes not covered directly by PIs (e.g., design issues, human performance, etc.). Inspection findings should: (1) clearly indicate the risk-significance of discrepancies, (2) provide a reasonable sample of overall performance, and (3) verify (sometimes validate) the performance indicators. The inspection program will include inspections to address root causes and corrective actions of deteriorating performance as indicated by the PIs and/or other inspection findings, and will address specific events or conditions that indicate risk-significant degradations in performance.

2.3.4. Integration of performance indicators, inspection results, and other sources of information. Discussions about integration of various indicators were predicated on several assumptions: that (1) a performance-based assessment process will include several individual PIs used in combination to assess licensee performance; (2) results of continuing baseline inspections and other information sources will also feed into the assessment process; and (3) it is necessary to integrate this disparate information in some way so that NRC action can be taken in a scrutable and transparent manner. Given that context, the key issue presented for discussion was as follows: "How should PI results be integrated with inspection results and other sources of information (e.g., event reports, [Federal Emergency Management Agency] reports) to assess licensee performance?" Participants were asked to consider this question from several angles, including how PIs should be integrated with one another; how PIs and other, possibly more subjective, information sources should be integrated; how the meaning of the other data sources can be evaluated in a more objective way (i.e., using a "PI-like" method); and how subjective methods such as PPRs and SMMs should be used in evaluating the various information sources.

Discussion focused primarily on integrating PIs with other sources of information. There was some lack of clarity as to what was meant by "integration"; the group's working definition allowed integration to include a situation in which objective and subjective results would not be mixed. Said another way, the group's view was that areas not covered by PIs should be handled separately from areas covered by PIs. The group noted that the relative weight of PI vs. inspection results is presently unknown. Their view was that validated PIs should be the primary indication of performance, with additional information generated as necessary, in a risk-informed manner. Further, a preponderance of evidence would be required to refute a PI with conflicting inspection findings.

The position on this issue developed by the group is as follows: The means should be developed for NRC actions to be taken in a manner that is scrutable, transparent, and predictable to both the licensees and the public. The assessment process should have the following characteristics: (1) PIs will be the primary indicator of acceptable performance; (2) additional information will be generated when necessary, in a manner that is risk-informed; (3) supplemental information (from areas not covered by PIs) will be treated separately from PIs; and (4) an approach will be developed to integrate the two types of information (PIs, which are objective, and inspection results, which may be subjective) when they conflict.

Note that the position statement does not address what role, if any, current NRC management processes such as the PPR and SMM or their correlates, the regional review and agency action meetings proposed by IRAP, should have in integration/evaluation of assessment results. Specific details as to how the PIs themselves should be integrated have also not been addressed.

2.3.5 Assuring reliability and validity of performance indicators. Much of the data used in an improved assessment process, including site-specific PIs, licensee self-assessments, and quality assurance audit results, could be generated and submitted by the licensees. This, however, raises the following questions, which became the focus for the break-out session's final discussion:

"How can the NRC ensure that PI data is accurately developed, recorded, and submitted in a timely manner? Should this be motivated by a voluntary program or by rulemaking? What controls should be placed on this process?" In their discussion, participants noted that a voluntary licensee program, i.e., an industry-wide initiative, is preferred to rulemaking. In such a scenario, there would be a distinction between what data were made available to the NRC vs. what data would be available to the general public: all data (i.e., the detailed raw data) would be available to inspectors, but publicly available data would only include high-level PI results, not the underlying/supporting data.

The consensus position proposed by the group basically reflects the discussion captured above. It states: A function of the baseline inspection program should be to verify data from the PIs. A

2-9

voluntary industry initiative to develop, maintain, and submit PI data is preferable to rulemaking. PI data should be available to the public, but the supporting details need not be. (See section 2.3.2 for a discussion of timeliness of PI data.)

Implicit in this statement is the notion that the core inspection program would be the "control" on the process by serving to verify the accuracy, timeliness, etc. of the PIs reported by the utilities.

2.4 Role of Enforcement in Regulatory Oversight / Range of NRC Actions / Communications

As stated previously, this session covered several different topics, including the role of enforcement in the regulatory oversight process, the range of NRC actions that could/should be outcomes of the assessment process, and communication of assessments. As such, the session was organized around three issues, as described below. In addition, the group raised an enforcement-related issue, namely, the degree to which enforcement could/should be risk-informed. Due to time constraints, this was discussed only briefly, and that discussion is captured in the section on enforcement (section 2.4.1) below.

2.4.1 Role of enforcement. The key issue presented for discussion was as follows: "What should the role of enforcement be in the overall regulatory oversight process?" Specifically, participants were asked to consider whether the current relationship between the assessment and enforcement processes, namely, that enforcement actions act as an input to the assessment process, should change.

The NRC technical facilitator provided some additional background related to planned changes in the treatment of what are currently Level IV violations in enforcement space. Under the NRC proposal, no notice of violation (NOV) or NOV response would be required for Level IV violations, regardless of who identified the violation, if and only if the following conditions were met: corrective actions had been taken in the appropriate time frame; the information about the violation is docketed in an LER, inspection report, etc.; and the item is entered into the licensee's corrective action program. In these cases, NRC inspection staff would not close out individual items. Rather, inspection would focus on verification of the licensee's corrective action program. However, in cases of willful or repetitive acts or situations where compliance was not appropriately restored, violations would be cited and appropriate actions taken. Sanctions and administrative actions will remain unchanged for Level III violations and above.

Initial discussion focused on defining and providing the scope for enforcement. In general, regardless of the level of violation, enforcement was viewed as a regulatory action taken to return a licensee to a desired safety threshold, addressing safety significant issues, including deterrence and remediation.

Enforcement was not seen as fulfilling a performance assessment role. Possible actions include sanctions, such as notices of violation (NOVs), civil penalties (CPs), and orders, as well as administrative actions such as CALs, demands for information, and notices of deviation. Such regulatory actions may occur in response to inspection findings of violations or may be event-driven. In cases where violations are uncovered in the course of inspection, the same information will be used in both the enforcement and assessment processes. Because situations (events or inspection findings) that give rise to enforcement also affect performance as measured by the assessment process indicators, it is not necessary to have separate indicators to account for enforcement actions in the assessment process. One subtle, but interesting, comment was made regarding the appropriate uses of enforcement, namely, that the NRC practice of "rolling up" small violations before taking enforcement action, in effect, moves the enforcement into assessment space.

2-10


The primary area for discussion centered around whether the performance bands developed in the assessment process should be used as a factor to adjust enforcement results. This discussion separated into two themes: should the level of the citation be adjusted based on measured performance, and should the sanctions levied for a given violation be adjusted based on performance? For the citation level, the answer was "no": the citation should be based on the safety significance of the violation. (In contrast, in the short discussion of risk-informed enforcement, the general sentiment seemed to be that it was appropriate for the severity level of a violation to be adjusted based on the risk significance of the affected systems or functions, with risk insights being developed a priori either by the NRC or the plant.)

With regard to sanctions levied (specifically CP), positions were more varied. Some participants believed that giving consideration regarding CP in cases where a plant was in the green performance band would encourage good performance. Several proposals were proffered as to how this could work. These included: (1) two years without an enforcement action, or a green performance band, equals no or reduced CP; and (2) a more complicated formula under which four circumstances could serve as mitigators: extended shutdown, voluntary formal initiative, a two-year performance window with green performance and no enforcements, and effective corrective action. Culpability, such as willful violations, would be considered separately under these schemes. Other participants believed that sanctions should be related to the safety significance of the violation, which does not change as a function of the plant's assessment performance band, and should not be mitigated by performance. Some group members expressed the view that mitigation of CP based on performance leads to a perception that historical good performers sometimes "got away with murder" in enforcement space, while historical bad performers were treated with a "heavy hand"; this, in turn, exacerbates issues with fairness and scrutability of NRC processes.

The position developed as a result of the group discussion contained some points of consensus and some more controversial areas. Consensus was reached on the following: The role of escalated enforcement is to address safety significant violations. Severity level is based solely on safety significance and is independent of the assessment process and its results. Because the situations that give rise to enforcement also feed assessment measures, no specific PIs are needed for enforcement.

The treatment of assessment results in CP determination remained controversial. A slight majority took the position that plants with overall green assessment results should be given favorable consideration when determining CP amounts. The minority position was that CP amounts should be independent of assessment.

2.4.2 Range of actions. The key issue presented for discussion was as follows: "What NRC actions are most effective in encouraging timely licensee corrective actions?" Discussion focused on actions that would be appropriate outcomes of an assessment process. Note that the results recorded below (as well as in section 2.4.3) assume that there will be an overall assessment result, that is, a single "color score" that will describe overall performance. Not all working groups shared this assumption, and a final consensus position remained outstanding at the conclusion of the workshop, so this will need to be decided as implementation details are worked out. Should the decision result in something other than an overall assessment result, decisions that were predicated upon that assumption may need to be revisited.

The consensus position developed for this issue is as follows: There should be a graded approach to the assessment outputs, with outputs being directly related to overall assessment grade. The range of outputs should run from core inspection, routine correspondence, and event-driven inspection for overall "green" assessment results; and management meetings, some additional inspection, and

2-11

demands for information for overall "white" results; to confirmatory action letters, orders, increased inspection, and/or justification for continued operation for overall "red" results.

Given this position, there is a public perception/communication issue that must be addressed, namely that, if red results do not necessarily imply a voluntary shutdown or a shutdown order, perhaps the use of red as the color used to communicate the assessment needs to be reevaluated. A suggestion was made that using yellow as the lowest threshold color would provide a more accurate communication tool in this case.

2.4.3 Communication of assessment results. The key issue presented for discussion was as follows: "What methods of communicating performance assessment results and enforcement actions are most effective, accurate, fair, and objective?" Discussion focused on communication of assessment results, and should not be inferred to also apply to enforcement actions. Note also that this discussion was very general and that details of specific communication devices are to be determined in the implementation phase.

Participants noted that the actions taken as a result of the assessment are, in themselves, a communication tool, at least between the NRC and the licensee. It was also noted, however, that there are multiple audiences to whom assessment results must be communicated and that the needs of these audiences vary. For some audiences, such as the public, it may be necessary to provide interpretation regarding what the overall findings mean. The degree of interpretation needed would be higher in cases where a threshold band was crossed. There was some debate regarding the need/desirability of making detailed data on the individual performance indicators available for inspection, such as via the World Wide Web, but the group did not provide a recommendation on this issue. The NRC audience could use a mechanism similar to today's plant performance reviews to communicate assessment information and its implications for resource allocation.

The consensus position developed for this issue is as follows: Communication of assessment results must provide a clear assessment and must be meaningful to all audiences, who will have a range of technical knowledge. Written assessment reports, beyond just inspection reports, are needed. These reports must address the current assessment results, trends in those results, and actions. Public meetings are seen as a potentially effective communication method, but a graded approach is needed such that the required level of participants is higher (for example, includes NRC headquarters participation) for more negative assessment results.

2-12


3. CORNERSTONE DEVELOPMENT

Workshop participants self-selected into one of five break-out sessions, each of which was organized around a particular cornerstone²: initiating events, mitigation systems, containment systems, emergency preparedness, and radiological controls. The safeguards strategic performance area, with its associated physical protection cornerstone, was not addressed at the workshop³. The objective of these break-out sessions was to develop a process which identified what information can be gathered and monitored to adequately assess licensee performance in each of the cornerstones. Each group used a top-down approach to developing their cornerstone, which included:

- identifying the objective and scope of the cornerstone;
- identifying the desired results and important attributes of the cornerstone;
- identifying what should be measured to ensure that the cornerstone objectives are met;
- determining which of the areas to be measured can be monitored adequately by performance indicators, and the associated criteria for selecting the performance indicators;
- determining whether inspection or other information sources are needed to supplement the performance indicators, and the associated criteria for selecting the performance indicators; and
- determining the criteria for determining thresholds.

Following the workshop, the LANL facilitators produced the graphical representations of the results shown in the sections below.

As was the case with the fundamental issues break-out sessions, the number of participants in each group ranged from about 25 to more than 100, and some movement occurred between groups.

3.1 Initiating Events

Three important points served as means of focusing the discussion during the development of the initiating events cornerstone: (1) limit the initiating events to those that are risk significant; (2) the goal is not "zero-defect"; and (3) the goal is the development of monitoring tools. When the discussion seemed to be off scope, these three statements became the criteria to determine if the topic needed to be

² Note that the first four topics are all cornerstones that support the reactor safety strategic performance area of the framework. The fifth topic is a bit different, as it represents all cornerstones for the radiation safety strategic performance area, which had not been as fully developed as the reactor safety area at the start of the workshop. (Recall that this started out as two strategic performance areas, with four proposed cornerstones.) Hence, the task of this group was broader than that of other groups, as it also required further development at the strategic performance area level of the framework.

³ A public meeting, which is expected to be similar in format to the working sessions for the radiological controls strategic performance area, is planned. Results of that public meeting will be incorporated into this report when they become available.

3-1


included as part of the central discussion. As such, they provide important context for understanding the results of this workshop session.

3.1.1 Objective and scope. The objective of the initiating event cornerstone is to limit the frequency of events that upset plant equilibrium and challenge critical safety functions. During the development of the objective, the terms "upset plant equilibrium" and "risk significant" were carefully selected. The use of "upset plant equilibrium" vs. "transient" was discussed in detail. Many participants thought that transients encompassed all events that would challenge the critical safety functions. Other participants presented the position that transients included a finite set of events, and did not include other potential events. During the discussion the participants returned to the concept that zero-defects was not a goal. At the conclusion of the discussion alignment was reached and it was agreed to use the term "upset plant equilibrium" instead of transient because transient has a set of limiting definitions associated with it and does not include shutdown. It was agreed that "risk significant" includes sensitivity as well as the overall contribution to risk.

The scope of PIs in the initiating events cornerstone is limited to events that are risk significant in PRA (core damage frequency or LERF). Thus, non-risk significant events will not be assessed. The risk significant events to be considered should be closely tied to risk-informed decisions. The definition developed in the fundamental issues break-out session on risk insights was adopted as the working definition of risk informed. Thus, risk informed was defined as "consider[ing] consequences and probabilities of those consequences, along with operational experience and existing regulations."

3.1.2 Desired results and important attributes. The desired result of this cornerstone is based directly on the objective statement and is to limit the frequency of events that upset plant equilibrium and challenge critical safety functions.

The following key attributes were selected: configuration control, common cause failure, procedure quality, and human performance. (Note that this set of attributes is identical to those presented as an example in the workshop materials. In that sense, they can be considered a "default" set. Further, the group noted that if there is a desire to have consistent key attributes for the initiating events, mitigation system, and barrier system cornerstones, they would have no objection to modifying the identified attributes.) The participants agreed that all of the PIs developed would reflect all of the key attributes.

3.1.3 Areas to measure and measurement methods. While there was a great deal of discussion during the development of areas to measure, it was readily agreed that equipment caused events, personnel caused events, and process caused events were the broad areas of initiating events.

Two additional areas were proposed: "operational characteristics as precursors to events" and "effectiveness of corrective action program." The participants had very differing opinions concerning the scope of precursors to events. Based on the goal of developing monitoring tools and not aiming for zero defects, the discussion was refocused. During the discussion a list of examples of operational characteristics that may serve as precursors was generated. It was determined that many of the examples were closely connected to the corrective action program, and a good corrective action program will identify the operational characteristics of concern. Thus, the two were combined into a fourth area to measure.

To summarize, alignment was reached on the following four areas to measure: (1) equipment caused events, (2) personnel caused events, (3) process caused events, and (4) backstopped by inspection of the effectiveness of problem identification and corrective action programs (for example, a critical element is operational characteristics as precursors to initiating events). The example is considered part of the position statement and is not simply an aside.

3-2


In order to measure these areas, two areas of PI development and two areas for inspections were identified. The PIs identified form a complete set of PIs and constitute a rebuttable presumption, and a preponderance of evidence from the inspection program would be required to reverse conclusions based on performance indicators. The relationships among attributes, measurement areas, and measurement methods are shown in Figure 3.1 below.

Two sets of PIs were identified, one for transients and one for loss of coolant accidents (LOCAs). The PIs for transients include the following initiating events: unplanned scrams, safety system actuations, unplanned operating transients, and shutdown operator margins. The granularity of the definitions of the areas to measure (for example, significant scrams vs. non-significant scrams) was discussed at length. The "keep it simple" philosophy was adopted to limit the number of items to measure. Thus, all scrams are counted equally. It was agreed that manual scrams would be included; however, there needs to be further study of this issue. The primary reason for including manual scrams was that it is the common practice to initiate a manual scram when an automatic scram seems imminent. However, there was a concern that if manual scrams are included, operators will not initiate the manual scram.

The PIs for LOCAs developed by the containment (barriers) cornerstone group will be accepted for use in the initiating events cornerstone. It is assumed that the PIs will include RCS boundary leakage or crack indicators.

Two areas for inspection were identified: (1) conduct of operations, which would use a risk-informed inspection program to examine aspects of plant operations (maintenance, etc.) that have a direct influence on initiating event frequency, and (2) corrective action program effectiveness. There were unresolved questions concerning the methods for counting events such as fire, flooding, and loss of power. It was agreed that double counting should be avoided. Thus, guidelines need to be developed to determine when such events are included in the initiating events PIs and when they are included in the mitigating system PIs.

Figure 3.1. Graphical representation of the initiating events cornerstone.
3-3

w/w/w tw w:tm t/8J USUWs3447 SO49_

3.I.4. Thresholds. De general criteria for regulatory and safety thresholds were developed; due to time constraints, specific thresholds remain to be develo NEI white paper "A New Regulatory Oversight Process." It was apsedped. that thereThe scriteriaiould bewere two larpely ba thresholds, the regulatory threshold that when exceeded would require regulatory involvement and a safety threshold. The regulatory threshold should be established where there is a significant deviation from expected normal performance. The safety threshold should be established where there is a significant inenase in risk or impact on performance. The balance between thresholds must provide '

suf6cient margin to allow corroedve acuans to avoid or exceed the safety threshold.

Several guiding principals were discussed and are summarized below:

ne regulatory threshold should be a constant norm consistent with the IPE assumption.

Thus, the threshold should not change as performance improvea.

He development of thresholds should be based on safety and not on public relations.

Limits should be low enough to consider crosscutting issues.

nc PIs should be independent. Thus, the thresholds are developed for each PI not for the sum of the Pis.

He threshold number should be set to avoid ambiguities. The threshold should be set such that the number of PIs is clearly above or below the threshold.

3.2 Mitigation Systems

3.2.1 Objective and scope. The objective of the mitigation systems cornerstone is to ensure that those systems required to prevent and/or mitigate core damage perform at a level commensurate with their safety significance, both at power and during shutdown. This objective statement implies that performance of mitigation systems is event-driven; the cornerstone does not address routine activities related to these systems. Further, the objective statement limits concern about mitigation systems to those events that could lead to core damage; it does not include all design basis accidents. Use of the term "core damage" is meant to imply loss of core geometry.

This cornerstone encompasses high risk significant SSCs for core damage mitigation. In the near-term, the SSCs to be monitored and the measures would be derived from the Institute of Nuclear Power Operations' safety system performance indicators (SSPIs). These SSCs would generally be common across plants, with some differences between pressurized and boiling water reactors. In all, five to six systems would be included. In the longer-term, it is recommended that an additional two to three plant-specific high-risk SSCs be selected for each plant, based on maintenance rule implementation analyses and/or significance determinations from plant probabilistic safety analyses.

Note that the scope statement does not necessarily limit measurement to hardware systems; personnel could also be considered as a system for core damage mitigation.

Criteria for determining the scope of the cornerstone included: (1) a desire to have a meaningful sample of systems that would be common across sites while also allowing some plant-specific customization so that, for example, plants could have the flexibility to include some of their non-failure-tolerant systems that would not otherwise be included; (2) a desire to use existing indicators, if possible; (3) inclusion, either directly or indirectly, of both direct and supporting systems; and (4) frequency of available data. Several issues with using the SSPI indicators were noted. First, the indicators only

3-4

. as e a - = ^ *

  • Y-9
  • ^ "^^
  • 9 P,_ g e? G $ 4 Y - 'fm -9 9 'ON"
  • I '"# ' ' ' ' ' ' ' ' ~ * *"" ~* - ' ' " ~ '

10/2s/es NON 10:22 FAI 505ss53447 I

address reliability (functional failures and demands, and running time before failure) and availability (planned and unplanned down-time, technical specification required time available, and unavailable hours due to failure of support systems). Capability is not directly measured with the SSPI indicators, although it was noted that the Equipment Performance and Information Exchange system could evolve as a source for capability indicator data. Further, participants questioned the fidelity of the SSPI data, noting that there are differences in how the plants report their data.

3.2.2 Desired results and important attributes. The results desired from the mitigation systems cornerstone can be summed up with the word "-ability": that the mitigation systems will perform their intended safety functions when needed and in their intended manner for the required duration. This statement of desired results encompasses SSC reliability, availability, and capability. However, because the key attributes remained the same regardless of which "-ability" was being discussed, it was determined that a global term was more appropriate for describing the desired result.

Key attributes that must be assessed to determine whether this desired result is achieved include:

- design,
- configuration control,
- maintenance,
- operator performance,
- verification and testing, and
- support functions.

Several cross-cutting issues, including common cause failures, corrective action, and operating experience feedback, were identified, but were not listed as key attributes as they were viewed as underlying all of the listed attributes. It is intended that these be treated globally in the overall framework, as was discussed in the early sessions related to the framework itself (see section 1.2 above).

3.2.3 Areas to measure and measurement methods. For each attribute, the group identified one or more areas to measure, as listed below:

- Design: adequacy of initial design, control of design modifications, and understanding and dealing appropriately with conditions outside the design basis;
- Configuration control: configuration management (i.e., the correct alignment of systems);
- Maintenance: adequacy of preventive and corrective maintenance;
- Operator performance: post-event actuation, operation, and monitoring of mitigation SSCs;
- Verification and testing: routine monitoring, completion of required tests, and adequacy of tests performed (including the frequency of performance); and
- Support functions: reliability and availability of functions supporting mitigation SSCs.

Some combination of SSPIs, inspection, and review of licensee self-assessments was considered adequate to measure all of these areas. In determining the measures, preference was given to the SSPI indicators. In cases where the SSPI indicators did not provide adequate coverage of a measurement area, inspection or licensee self-assessment programs were also considered as potential measurement methods. Note that all the measurement methods that involve licensee-generated data assume that there

3-5


will be underlying NRC inspection aimed at verifying licensee reporting of indicators and/or the adequacy of licensee self-assessment programs. The relationships among attributes, measurement areas, and measurement methods are shown in Figure 3.2 below.

For the design measurement area, the SSPI indicators were seen as providing partial information regarding adequacy of initial design and control of design modifications. However, inspection of design bases and test and surveillance programs was seen as a necessary augmentation to the indicators. In addition, periodic design basis reviews, conducted and assessed via licensee self-assessment programs, were seen as contributing to verification of control of design modifications. SSPI indicators do not address the understanding and control of conditions outside the design basis. Rather, review of licensee self-assessment programs related to reviews of the design basis was seen as the appropriate measurement method.

The configuration management measurement area under the configuration control attribute was viewed similarly to the initial design area, with some information being gleaned from the SSPI indicators but with augmentation needed from inspection. Specifically, inspection would focus on licensee implementation of the maintenance rule. This, however, was an area for which participants believed that a performance indicator could be developed. Industry participants were challenged to propose a risk-informed indicator that would reflect upon the adequacy of licensee maintenance rule implementation.

Both measurement areas under the maintenance attribute were seen as being primarily covered by the SSPI indicators. However, inspections of maintenance rule implementation and licensee programs related to industry operating experience were also seen as providing valuable information to augment that provided by the SSPI indicators.

Operator performance was the one attribute for which no SSPI indicators were seen as applicable to the measurement areas of normal and post-event operations. Rather, licensee accreditation and operator requalification programs were seen as reflecting on whether operators would be able to actuate, operate, and monitor mitigation systems during an event. There was some sense, however, that it might be possible to create indicators for these measurement areas in the longer term.

Figure 3.2. Graphical representation of mitigation systems cornerstone.

3-6



Routine monitoring, frequency of testing, and test completion were seen as all being adequately monitored using the SSPI indicators. Other aspects of the adequacy of the test program (i.e., that the tests actually achieve their intended purpose) need to be verified via inspection.

Finally, reliability and availability of support functions were seen as being primarily covered by the SSPI indicators.

3.2.4 Thresholds. Workshop participants began drafting criteria for setting thresholds for the SSPI indicators, and produced the following statements:

•	the threshold for the green-white interface should be set within one or two standard deviations of the licensee criteria under the maintenance rule (with the number of standard deviations being determined to minimize the number of false positives), and
•	the threshold for the white-red interface should be set based on Regulatory Guide 1.174 criteria.

Note that these criteria both result in plant-specific thresholds. These thresholds should not be considered final, however, as the break-out session participants also developed a list of desired attributes of threshold criteria, but did not verify that their draft thresholds possessed these attributes.

Desired attributes of thresholds include that the thresholds:

•	are set using risk insights;
•	are statistically meaningful, that is, that they provide a sufficient margin to limit the number of false positives;
•	are based on criteria that are either already in existence or are easy to generate;
•	are benchmarked;
•	reinforce conservative decision making, that is, that they do not trigger NRC action unnecessarily;
•	are easily understood, especially by the public;
•	are set using generically-based principles, while accommodating plant-specific design variations; and
•	can be consistently developed, applied, and evaluated.
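As a concrete illustration of the first draft criterion (a green-white threshold set one or two standard deviations from licensee maintenance rule criteria), the arithmetic might be sketched as follows. This is a hypothetical sketch only: the function name, the indicator history, and the choice of two standard deviations are illustrative assumptions, not values from the workshop.

```python
import statistics

def green_white_threshold(history, num_std_devs=2):
    # Hypothetical plant-specific green-white threshold: the historical
    # mean of an SSPI-style indicator plus a margin of num_std_devs
    # standard deviations, with the margin chosen to limit false positives.
    mean = statistics.mean(history)
    std_dev = statistics.stdev(history)
    return mean + num_std_devs * std_dev

# Illustrative monthly unavailability fractions for one mitigating system.
history = [0.010, 0.012, 0.008, 0.011, 0.009, 0.010]
threshold = green_white_threshold(history, num_std_devs=2)

# A new observation would cross into "white" only if it exceeds the margin.
new_value = 0.018
crossed = new_value > threshold
```

Because the margin is computed from each plant's own operating history, the resulting thresholds are plant specific, consistent with the criteria described above.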

3.3 Barrier Integrity

Based on the discussions at the various plenary and fundamental issues break-out group sessions, the group's first action was to expand its specified focus from "Containment Systems" to "Barriers." After extended consideration, containment, fuel, and RCS were then designated as the three classes of barriers to be included in the group's scope. The group did not address the subject of thresholds for the PIs they recommended; this will need to be dealt with in future development of the cornerstones.

3.3.1 Objective and scope. The objective of the barrier integrity cornerstone is to assure that the physical design barriers (containment, fuel, and RCS) protect the public from radionuclide releases due to accidents or events. In addition to expanding the scope to include the additional areas of fuel and RCS, it was further decided that severe accident performance should be addressed not just by the barriers cornerstone group but also by those focusing on initiating events and mitigation systems as well. This recommendation was made during the final, full-group plenary session.

3-7

3.3.2 Desired results and attributes. For each of the three classes of physical design barriers, the agreed-upon desired result was defined as maintaining the functionality of the barrier, and the important attributes of each were specified to be design features, human performance, configuration control, procedure quality, and barrier/equipment performance.

3.3.3 Areas to measure and measurement methods. Although assessment of the three classes of physical barriers requires examination of each attribute, the areas to measure and the measurement methods vary somewhat across the types of barriers, as shown in Table 3.1. Note that the same measurement area may apply to multiple attributes and multiple barriers. Note also that measurement methods were not specified for all measurement areas; the fuel area received more attention in this regard than the other areas. Figure 3.3 uses F for fuel, RCS for reactor coolant system, and C for containment in conjunction with the areas to measure to indicate to which desired result each measurement area applies.

With respect to the PIs, the group identified a list of near-term or currently available indicators as well as a list of measurements for which PIs may be available/applicable in the longer term. Included in the list of near-term or currently available PIs are the following:

•	RCS activity
•	LLRT (cumulative running total)
•	RCS leakage (total)
•	Transients
•	Primary/secondary leakage

Included in the list of measurements for which PIs might be available/applicable in the longer term are:

•	CIV reliability (which would use maintenance rule data)
•	Containment support systems
•	RCS off-gas (for boiling water reactors (BWRs))
•	Chemistry
•	Reactivity excursions
•	Technical specification actions
•	MSIV leakage (for BWRs)
•	ATWS
•	ECCS
•	Valve alignment

Alignment among group members was also realized with respect to designation of items for inspection/self-assessment, to include:

•	Corrective action programs, including configuration control, design, procedure quality, and human performance;
•	Severe accident management (SAMG);
•	Fuel handling and storage;

3-8


Barrier: Fuel

Attribute: Design
  RCS activity: PI
  Reactivity excursions: Corrective action program
  COLR: Inspection/self-assessment
  Design errors: Corrective action program
  10 CFR Part 50.46: Inspection/self-assessment
  Capacity

Attribute: Human performance
  Foreign materials exclusion (FME): Corrective action program
  Rod control: Inspection/self-assessment; Corrective action program
  Transients: PI
  Reactivity excursions: Corrective action program
  Design errors: Corrective action program

Attribute: Configuration control
  Primary chemistry: PI
  Core loading: Inspection/self-assessment

Attribute: Procedure quality
  Reactivity excursions: Corrective action program

Attribute: Equipment performance
  RCS activity: PI
  Transients: PI
  Loose parts: Inspection/self-assessment; Corrective action program
  Reactivity excursions: Corrective action program
  Oxidation: Inspection/self-assessment
  Control rods: Inspection/self-assessment
  ATWS performance*

* Assumed to be measured in the mitigating systems cornerstone.

Table 3.1.a. Measurement areas and methods for fuel barrier of barrier integrity cornerstone.

3-9


Barrier: RCS

Attribute: Design
  Modification
  Design errors
  EQ

Attribute: Human performance
  Following procedures
  FME
  Training
  Knowledge, skills, and abilities
  Maintenance errors

Attribute: Configuration control
  Modification
  Valve alignment
  Security
  Primary chemistry
  Secondary chemistry
  EQ

Attribute: Procedure quality
  Valve alignment
  Training

Attribute: Equipment performance
  Eddy current: Inspection/self-assessment
  Non-destructive evaluation/in-service inspection
  Transients
  FME
  Surveillance
  EQ reliability
  RCS leakage rate: PI
  Maintenance
  Primary/secondary leakage: PI

Table 3.1.b. Measurement areas and methods for RCS barrier of barrier integrity cornerstone.

3-10

Barrier: Containment

Attribute: Design
  Support systems (sprays, recombiners*, modifications*, EQ): if risk significant, use maintenance rule data

Attribute: Human performance
  Following procedures
  FME
  Training

Attribute: Configuration control
  Valve alignment
  Doors and hatches
  EQ
  Clearance problems
  Physical security

Attribute: Procedure quality
  Risk-important
  Post-accident
  Training

Attribute: Equipment performance
  ILRT: PI (as found/as left)
  LLRT: PI (as found/as left)
  Surveillance
  Appendix J
  MSIV leakage: PI
  CIV reliability: PI
  Secondary containment drawdown time (if risk significant)

* Assumed to be captured by the mitigating systems cornerstone.

Table 3.1.c. Measurement areas and methods for containment barrier of barrier integrity cornerstone.

3-11

Figure 3.3. Graphical representation of barrier integrity cornerstone.

•	Human performance; and
•	Design.

3.4 Emergency Preparedness

The focus of the emergency preparedness (EP) working group was on the interfaces between the NRC and the utility, although the involvement of other organizations, such as FEMA, was acknowledged. In spite of comments made to the contrary by plenary session participants and/or participants of the fundamental issues break-out sessions, developers of the EP cornerstone believe that it is risk-informed, based on the fact that it is built on a planning standard that is risk based.

Due to time considerations, the group agreed that some items worthy of discussion would be identified but not discussed in detail in this breakout. They captured the items in a "parking lot" and returned to them as time permitted at the conclusion of the session. Their discussions of these issues are documented in section 3.4.4 below.

3-12


3.4.1 Objective and scope. The objective of the EP cornerstone is to ensure that the licensee capability is maintained to take adequate protective measures in the event of a radiological emergency.

The scope of the cornerstone was limited to onsite emergency preparedness. Offsite emergency plans were removed from the scope, but are believed to be bounded by FEMA's finding of reasonable assurance. Included in what is considered "offsite" are the following: effective evacuation in the event of radioactive release; public information/confidence; adequacy of state/local facilities; protection of emergency workers; adequate communication channels; and alert and notification systems.

3.4.2 Desired results and important attributes. The desired result of the cornerstone is effective implementation of onsite emergency plans. A corollary desired result can also be expressed with respect to offsite emergency plans, namely, effective implementation of offsite emergency plans (with effective being equivalent to sufficient to warrant a FEMA finding of reasonable assurance). Given the limited scope of the cornerstone, however, neither attributes nor measures were developed for this desired result.

Four important, safety-significant attributes of EP were identified: the timeliness and accuracy of classification, the timeliness and accuracy of notification, the timeliness and accuracy of protective action recommendations (PARs), and emergency response organization (ERO) readiness.

The group also discussed public confidence as a potential attribute. They noted that it is a very important attribute but one that is impossible to objectively measure. Licensee methods for creating public confidence that were discussed include: accurate information flow, calendars/brochures, media day, and operation of a joint information center.

Training, human performance, and procedures were identified as cross-cutting issues that are part of each of the four important attributes listed above.

3.4.3 Areas to measure and measurement methods. For each attribute, the group identified several areas to measure, as listed below:

•	Classification: event recognition and emergency action levels (EALs);
•	Notification: alert and notification system; adequacy of communication channels (voice and data), including the communication systems necessary to make emergency notifications to offsite agencies; and provision of a direct interface to offsite;
•	PARs: protection of emergency workers and provision of a direct interface to offsite (including dose assessment and field team functions); and
•	ERO readiness: adequacy of facilities and activation of the ERO.

Within the PAR attribute, the protection of emergency workers is a broad area that encompasses emergency dose limits, applied health physics, evaluation of the need for potassium iodide, relocations, and re-entries.

The ERO readiness measurement areas are also fairly broad. The adequacy of facilities focuses on equipment and procedures having safety significance. It includes the availability of safety-significant equipment in the emergency response facilities. The activation of the ERO includes timely staffing with required facility staff, testing of safety-significant activation equipment and systems, and verification that minimally required facility staffing is available 24 hours per day, seven days per week.

3-13


Accident assessment, which was initially viewed as a measurement area for classification and PARs, was ultimately deleted, as it was viewed as a cross-cutting area that would be subsumed within the other attributes. Indeed, there was some discussion among the group that the attributes of classification, notification, and PARs could be combined into a single attribute called accident assessment.

A combination of inspection, licensee self-assessment, and PIs was considered adequate to measure all of these areas. Data for these measurement methods will be drawn from events, exercises, and testing. There remains a concern, however, that the inputs from exercises and events may not provide a sufficient number of data points to provide an indication of reasonable assurance. This issue must be addressed as the cornerstone is further developed. Because the cornerstone is highly dependent on inspection, self-assessment, and on-site verification, interpretation of what triggers an NRC "visit" to the site is considered critical. However, the group did not determine thresholds for the measurement areas. This, too, will need to be completed as the cornerstone is further developed.

The relationships among attributes, measurement areas, and measurement methods are shown in Figure 3.4 below. Note that inspection and self-assessment are shown together in the diagram. This reflects the fact that the workshop participants always referred to them in combination when they were recommended as the measurement method for a particular area. Details of how each measurement area could be measured, including the data sources that would be used, are described in the subsequent paragraphs.

With respect to the classification attribute, the event recognition measurement area was seen as amenable to measurement using exercises, events, and PIs. Specifically, the percentage of timely and accurate classifications could be developed as a numerical PI. The PI could be weighted based on the severity of the event (whether the event is classifiable or not, and the severity of the classifiable event). Some of the sources of input include: actual events, exercises, LERs (which represent a potential classifiable events database), and crew/team evaluations. EALs would be evaluated as part of the self-assessment/inspection process.
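To make the idea of a severity-weighted classification PI concrete, one possible computation is sketched below. The weights, event records, and function name are hypothetical illustrations, not values the workshop specified.

```python
def classification_pi(events):
    # Hypothetical severity-weighted percentage of timely and accurate
    # classifications. Each event record carries a severity weight and a
    # flag for whether it was classified both timely and accurately.
    total_weight = sum(e["weight"] for e in events)
    if total_weight == 0:
        return None  # no classifiable events in the period
    good_weight = sum(e["weight"] for e in events if e["timely_and_accurate"])
    return 100.0 * good_weight / total_weight

# Illustrative inputs drawn from actual events, exercises, and LERs.
events = [
    {"weight": 3, "timely_and_accurate": True},   # e.g., correctly classified Alert
    {"weight": 1, "timely_and_accurate": True},   # e.g., Unusual Event
    {"weight": 2, "timely_and_accurate": False},  # e.g., missed escalation in a drill
]
pi_value = classification_pi(events)  # (3 + 1) / (3 + 1 + 2) * 100, about 66.7
```

The same shape of calculation would apply to any percentage-of-timely-and-accurate measure, such as those discussed for notifications and PARs.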

With respect to notification, a PI could be developed relating to the availability of the alert and notification system. Such a measure would include operability of the equipment, would be bounded by what is required to be maintained by the licensee, and would need to be consistent with the criteria already established by FEMA for system availability. The adequacy of communication channels would be measured primarily through the use of the self-assessment/inspection process. There was a concern that metrics such as counting the number of pieces of communication equipment would not be adequate as a PI. Finally, a numerical PI could be developed for provision of a direct interface to offsite by tracking the percentage of timely and accurate notifications. Sources of input would include initial and upgrade emergency notifications, notification of PAR changes, and 10 CFR 50.72 notifications.

Figure 3.4. Graphical representation of emergency preparedness cornerstone.

3-14

The measurement of the direct interface with offsite would be similar for the PAR attribute. The percentage of timely, technically based, and accurate PARs could serve as a numerical PI. Sources of input include actual events, exercises, crew/team evaluations, and other site-specific means of self-assessment. The protection of emergency workers would be measured by self-assessment/inspection.

Finally, both of the ERO readiness measurement areas, adequacy of facilities and activation of the ERO, would be measured using self-assessment and inspection.

3.4.4 Other issues. The applicability of several additional areas to the EP cornerstone was discussed. Although these discussions did not result in modification of the cornerstone during the workshop, they are captured here for future consideration. In some cases, the EP group assumed that other cornerstone working groups would "pick up" certain items; this should be verified as the full framework is developed.

Training and management systems (support) were discussed as potential areas to consider when evaluating ERO readiness. With respect to training, measures could be developed for the percentage of emergency plan ERO personnel that participate in drills/exercises and for the qualifications of ERO personnel. Number of drills could be used as an indicator of the robustness of management support.

Several other areas were determined to not be part of EP. It was decided that SAMG would not be handled as an EP item, as it is not regulatory based; the decision was made within the EP working group to discuss only regulatory-based issues. The EP group believed that, if evaluated at all, SAMG would probably be handled under the containment cornerstone.

Effective implementation of emergency operating procedures and assessment of control room habitability were considered to not be EP reportable items.

3.5 Radiation Safety

The two preliminary strategic performance areas that dealt with radiological release/exposure were combined into one area called "Radiation Safety." There are two cornerstones under that area: Occupational Radiation Safety and Public Radiation Safety. The group that considered radiation safety spent most of its time on the occupational branch, but noted that the outcome concerning the occupational branch was largely applicable to the public branch as well. What follows, therefore, is an account of the sessions dealing with the occupational radiation safety cornerstone. At the end of this section, there are comments about relevance to the public radiation safety cornerstone.

3.5.1 Objective and scope. The objective of the occupational radiation safety cornerstone is to assure adequate protection of worker health and safety from radiation exposure. This cornerstone addresses radiation doses to workers from normal plant operations and anticipated operational occurrences. It excludes industrial hygiene and safety, and excludes special nuclear materials safeguards.

3.5.2 Desired results and important attributes. The desired results from this cornerstone are to maintain worker dose below regulatory limits, and to maintain an effective ALARA (As Low As Reasonably Achievable) program. There are four key attributes that must be assessed to determine if the desired results are being achieved: plant, programs/processes, procedures, and human performance.

3-15

3.5.3 Areas to measure and measurements. For each attribute, the group identified one or more areas to measure, plus one or more measurements in each area. These are shown in Table 3.2 below. Figure 3.5 shows the same information graphically.

3.5.4 Performance indicators. Due to lack of time, the group was not able to thoroughly discuss performance indicators. However, the group was able to identify one PI with three components for this cornerstone.

Attribute: Plant
  Source term control: collective dose, coolant activity, trend in extent (number/area) of HRAs, trends in radiation fields, chemistry, long-term use of temporary shielding
  Facilities and equipment: instrumentation/dosimetry out-of-calibration, inoperable ventilation systems, availability of process/area radiation monitors

Attribute: Programs and processes
  Radioactive material control: lost sources, inadvertent release of material outside RCA, plant area contamination
  Exposure control: individual dose, collective dose, unplanned dose events, failure to survey adequately, personnel contamination events, inadequate control of HRA/VHRA access
  Planning: excess dose from rework, excess dose from emergent work, projected vs. actual collective dose at the job level
  Self-assessment and corrective action: overdue corrective action, duration of closure, reoccurrence, percent self-identified, percent self-revealing

Attribute: Procedures
  Organization/administration: events due to inadequate procedures
  Procedure quality: events due to failure to follow procedures

Attribute: Human performance
  Organization and management: supervisory oversight (hours), amount of overtime, turnover, management of contractors
  Training: attendance, events due to inadequate training, personnel error rate, management observation of training

Table 3.2. Areas to measure and measurements for the occupational radiation safety cornerstone.

3-16

Figure 3.5. Graphical representation of radiation safety cornerstone.

The PI is the number of radiation safety events, and the components are:

•	Substantial loss of radiation safety barriers,
•	Unplanned significant dose, and
•	Exposure above regulatory limits.

3.5.5 Thresholds. The threshold for individual dose (the third component of the PI) is the regulatory limit (5 rem). There are no explicit thresholds for maintaining an effective ALARA program, but the first two components of the PI implicitly express a threshold: unplanned significant dose (individual or collective), and substantial loss of radiation safety barriers. Thresholds for these components can be site specific. The core inspection program would remain an element in assessing effectiveness of the ALARA program.

3.5.6 Relevance to the public radiation safety cornerstone. Many direct parallels between the occupational radiation safety cornerstone and the public radiation safety cornerstone were discussed and are documented below.

3-17


3.5.6.1 Objective and scope. The objective of the public radiation safety cornerstone is to assure adequate protection of public health and safety from radiation exposure. This cornerstone addresses radiation exposure of the public from effluents associated with normal operations and anticipated occurrences. It includes radioactive effluents (solid, liquid, and gas), direct radiation, transportation of radioactive material, and radioactive material release.

3.5.6.2 Desired results and important attributes. The desired results are to maintain public dose below regulatory limits, and to maintain public dose ALARA. The public branch shares the same four key attributes as the occupational branch: plant, programs/processes, procedures, and human performance.

3.5.6.3 Areas to measure and measurements. Public radiation safety generally uses the same areas of measurement and measurements as the occupational branch, but adds two areas under programs/processes, as shown in Table 3.3 below.

3.5.6.4 Performance indicators. The PI and components for this cornerstone are identical to those for the occupational branch. The PI is the number of radiation safety events, and the components are:

•	Substantial loss of radiation safety barriers,
•	Unplanned significant dose, and
•	Exposure above regulatory limits.

3.5.6.5 Thresholds. The thresholds are again implied in the three components of the PI. Public dose must be maintained below regulatory limits. Elements of maintaining public dose ALARA would include measurement of individual dose (site specific), and the core inspection program would be used to assess instances of substantial loss of radiation safety barriers or unplanned significant dose.

Attribute: Programs and processes
  Areas of measurement: Radioactive environmental monitoring program; Transportation of radioactive material
  Measurements: will be similar to those listed for the programs and processes attribute of the occupational branch

Table 3.3. Additional areas of measurement and measurements for the public radiation safety cornerstone.

3-18


4 CONCLUDING COMMENTS

Although a great deal of progress was made during the workshop, much work remains. This work can be categorized as being of two types: work on the framework (providing further detail on the cornerstones themselves) and work on implementation of the new assessment process. The main activities remaining for each of these are outlined in its own section below. (Work will also be needed to ensure that the inspection program supports the assessment process, but commentary on that subject is beyond the scope of this report.)

4.1 Further Development of the Framework and Cornerstones

As was indicated in section 1.2, if safeguards is to remain as a strategic performance area (and it was the conclusion of the performance assessment workshop that it should), it will need to be developed to the same level as the other areas. It is recommended that a process similar to that documented herein, with NRC and industry participation in the development process, be followed.

During the course of the workshop, there were many cases in which one cornerstone development group made assumptions that certain areas would be covered in another cornerstone. The "big picture" framework, with all the cornerstone substructure, should be evaluated to ensure that these assumptions did not lead to gaps in coverage. In addition, the need for inclusion of the areas known to have been omitted from the framework produced to date should be evaluated.

Along similar lines, it is evident from even a cursory examination of section 3 of this report that there is a great deal of unevenness with respect to the attributes developed by the various groups. Indeed, the definition of what constitutes an attribute is not clear: some groups used functions (such as design and maintenance) as attributes, others used characteristics (such as ERO readiness), and others used activities (classification, notification). Several groups used a mixture of these types. In addition, several groups used cross-cutting areas, such as human performance, as attributes in spite of the group decision to include them as part of the larger framework. Further development efforts should evaluate the merits of consistent use of attributes (which is not to imply that each cornerstone should have the same attributes, only that all attributes should be of some limited set of types) and enforce whatever degree of consistency is deemed appropriate.

There is also unevenness with respect to the PIs, with some groups saying simply that the measurement area is amenable to measurement using a PI while others came closer to specifying what the PI should be. In addition, the groups were not consistent with regard to whether they preferred site-specific or generic PIs. Finally, some proposed PIs are already in existence, while others are in need of development. Clearly, before a new assessment process can be implemented, a decision is needed as to the preferred PI type (or that a mix is acceptable). Regardless of which type of PI (if any) is given preference, it will be necessary to develop all of the PIs in detail, noting exactly what will be measured and how, including with what frequency. For those areas for which inspection or self-assessment are the primary measurement methods, a specification is needed as to what will be inspected or assessed and how the findings will be reported. Because licensee self-assessment programs currently vary widely, it will also be important to discuss the degree of standardization expected (with the issue again being whether the measurement methods can be site specific or must be generic, recognizing that an insistence on generic processes could mean major reengineering of existing self-assessment methods for some licensees).

Finally, the criteria by which each PI will be considered to have crossed both the regulatory and safety thresholds need to be specified for each PI. Again, the issue of whether to allow site-specific thresholds for PIs, and, if so, how to ensure equity among licensees, must be addressed. More

4-1

difficult, but equally as important, is the issue of what types of inspection or self-assessment findings are to be considered adverse enough to warrant saying that a threshold of performance has been crossed for those areas for which inspection and self-assessment are the primary measurement methods.

The LANL analysts suggest that an important next step in framework development is to combine the graphical representations of the individual cornerstones provided in section 3 into a single representation of the entire framework. This "master" framework can then be used as a convenient structure from which to evaluate discrepancies at each level of the hierarchy (such as in the language used to describe attributes); redundancies, in which the same attribute, area of measurement, and/or measurement method is used across cornerstones and could be combined; and gaps.

4.2 Implementation Issues

Because the focus of the workshop was on designing the assessment methodology (and so had as its primary interest the specification of what and how to measure), many of the implementation details needed to describe how the assessment process will work were either addressed by assumptions (some of which were not widely shared) or not addressed at all. While the following discussion is not intended to be a comprehensive enumeration of all the implementation details that need to be considered, it does address some of the most important issues.

First, a process needs to be developed to demonstrate how PIs are combined to assess a particular cornerstone. Questions that must be answered include:

  • does each cornerstone receive some type of overall assessment achieved by "rolling up" the PI results, or are the results of each PI left as independent information sources? and
  • how does one avoid double jeopardy (in which the licensee is downgraded in several areas) in cases where the same PI is used in different cornerstones?

If there is a roll-up, the mechanism for accomplishing the aggregation of findings must be developed and must address the following questions:

  • is the mechanism an intermediate roll-up in which PI results roll up first to attribute-level results, and attribute-level results then roll up to a cornerstone result?
  • if there is no intermediate roll-up, how are PI results translated into cornerstone results?
  • are PIs weighted, or do all contribute equally?
  • how many PIs must have crossed a threshold before the attribute is considered to have crossed a threshold?
  • can performance on the cornerstone be viewed as within licensee prerogative if any attribute has crossed a threshold?
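One possible answer to these questions can be sketched in code. The fragment below is purely illustrative, not a mechanism proposed at the workshop: it assumes an intermediate roll-up in which an attribute is considered crossed once a minimum number of its PIs have crossed their thresholds, and a cornerstone is flagged if any attribute has crossed. The attribute names, the `min_crossed` rule, and the "attention" / "licensee prerogative" labels are all assumptions introduced for this example.

```python
# Hypothetical intermediate roll-up: PI results roll up to attribute-level
# results, which in turn roll up to a cornerstone result. All names and
# threshold rules here are illustrative assumptions.

def attribute_crossed(pi_results, min_crossed=2):
    """An attribute is considered crossed if at least `min_crossed` of its
    PIs (True = threshold crossed) have crossed their thresholds."""
    return sum(1 for crossed in pi_results if crossed) >= min_crossed

def cornerstone_assessment(attributes):
    """Flag the cornerstone for attention if any attribute has crossed;
    otherwise leave performance within licensee prerogative."""
    crossed = [name for name, pis in attributes.items()
               if attribute_crossed(pis)]
    return ("attention", crossed) if crossed else ("licensee prerogative", [])

# Example: one cornerstone with two attributes, each measured by three PIs.
attributes = {
    "equipment readiness": [True, True, False],
    "staff performance": [False, False, False],
}
status, flagged = cornerstone_assessment(attributes)
print(status, flagged)  # -> attention ['equipment readiness']
```

Note that a weighted scheme would replace the simple count in `attribute_crossed` with a weighted sum, which is one way the "are PIs weighted" question above could be resolved.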

If there is no roll-up, criteria must be developed for assessing the cornerstone or for interpreting the results of the individual PIs. Finally, regardless of whether the approach is a data roll-up or something else, in cases where either inspection or self-assessment findings are used as the primary assessment methods, there needs to be a process for combining these data with the PIs.





Similarly, a position needs to be developed as to whether each licensee receives an overall score, and, if so, how the cornerstone results are combined to produce it. The questions to be addressed here are similar to those needing to be answered relative to assessing each cornerstone. Regardless of the approach taken, a process needs to be developed to communicate the assessment results to the licensees and the public. A decision not to produce an overall assessment, or an aggregated assessment on a per-cornerstone basis, will produce a particular challenge with respect to communicating the safety significance of the assessment results to the public.

In cases where inspection findings and licensee self-assessment data are used to rebut PIs, there needs to be a process for reconciling differences, if any, between PIs and other information sources. This process must be sensitive to the need for a balance between objectivity and scrutability on the one hand and the ability of NRC senior managers to apply judgment to the assessment on the other.

Finally, there is an identified need to design a validation study addressing both the efficacy of the new process with respect to the results achieved and the conformance of the new process with objectives such as scrutability and reducing regulatory burden on both the NRC and the licensees. Evaluation of process efficacy should consider agreement with previous results (i.e., validation against historical data), concurrent validation (i.e., in which results of the new process are compared to results of existing processes for the same time period), and internal consistency of the results achieved, both within a single assessment period and across assessment periods. Evaluation of internal consistency should address reliability issues, such as whether two assessors would come to the same conclusions given the same data set, whether all plants are collecting and reporting PIs consistently, and the like.
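The inter-assessor reliability question above can be made concrete with a simple statistic. The sketch below uses plain percent agreement, chosen for clarity rather than as the study's actual design; a real validation study might prefer a chance-corrected measure such as Cohen's kappa. The plant ratings are invented for illustration.

```python
# Illustrative internal-consistency check: do two assessors reach the same
# conclusions given the same data set? Percent agreement is an assumption
# made here for simplicity; the ratings below are hypothetical.

def percent_agreement(assessor_a, assessor_b):
    """Fraction of cases on which two assessors give the same rating."""
    if len(assessor_a) != len(assessor_b):
        raise ValueError("assessors must rate the same set of cases")
    matches = sum(a == b for a, b in zip(assessor_a, assessor_b))
    return matches / len(assessor_a)

# Hypothetical cornerstone ratings for eight plants by two assessors.
a = ["ok", "ok", "degraded", "ok", "adverse", "ok", "degraded", "ok"]
b = ["ok", "degraded", "degraded", "ok", "adverse", "ok", "ok", "ok"]
print(percent_agreement(a, b))  # -> 0.75
```

A validation study would set an acceptance target for such a statistic in advance (e.g., agreement above some fraction), so that "reliable enough" is defined before the data are examined.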

Because these issues were considered in some detail during the IRAP, examination of the project documentation for the IRAP to determine which of its proposed implementation details either apply directly or can be modified to apply to the cornerstone approach might prove to be a fruitful next step.
