ML20197H973

Summary of 981022 Meeting with NEI Re Team Charters, To Present Integrated Schedule & Plans for Future Interactions Between NRC & NEI to Complete Review of Regulatory Oversight Process. Attached Matls Presented & Discussed
Person / Time
Issue date: 11/06/1998
From: Isom J., NRC (Affiliation Not Assigned)
To: NRC (Affiliation Not Assigned)
References: NUDOCS 9812140222




November 6, 1998

MEMORANDUM TO: File

FROM: James A. Isom, Operations Engineer
Inspection Program Branch
Office of Nuclear Reactor Regulation

SUBJECT: SUMMARY OF THE OCTOBER 22, 1998 MEETINGS WITH THE NUCLEAR ENERGY INSTITUTE TO DISCUSS OPTIONS FOR REVISING THE REGULATORY OVERSIGHT PROCESS

On October 22, 1998, the NRC met with facilitators, NEI, and others to discuss the team charters, to present an integrated schedule, and to discuss the plans for future interactions between the NRC and NEI to complete the review of the regulatory oversight process. NEI provided data on the validation of NEI indicators and insights. Plans for benchmarking data and how thresholds might be set were also reviewed. The attached materials were presented and discussed.


CONTACT: James A. Isom, 301-415-1109


Attachments:
1. Agenda
2. List of Attendees
3. Framework Task Group Charter
4. Inspection Task Group Charter
5. Assessment Task Group Charter
6. Performance Assessment and Inspection Initiatives - Preliminary Schedule of Stakeholder Interface Meetings
7. Performance Indicator Workshop Emergency Preparedness Breakout Session
8. Occupational and Public Radiation Safety Cornerstones Industry Preliminary Proposal
9. Insights From Data Analysis
10. Unavailability Graphs (Emergency AC; AFW/RHR; High Pressure Injection)

DISTRIBUTION:
PUBLIC, Central Files, PIPB R/F, F. Gillespie, M. Johnson, C. Holden, A. Madison, J. Isom, T. Frye, J. Jacobson, D. Gamberoni, R. Barrett, P. Baranowsky, B. Mallet

DOCUMENT NAME: MTG1022.SUM
To receive a copy of this document, indicate in the box: "C" = Copy without enclosures, "E" = Copy with enclosures, "N" = No copy

OFFICE: PIPB:DISP | PIPB:DISP
NAME: JAIsom | MRJohnson
DATE: 11/4/98 | 11/6/98
OFFICIAL RECORD COPY

AGENDA FOR NRC/NEI MEETING ON OCTOBER 22, 1998

PURPOSE: Present NRC Charters; Present Integrated Schedule; Discuss Future Interactions

I. Opening Remarks - Alan Madison
II. Schedule Issues - Jeff Jacobson
III. Framework Development - Pat Baranowsky
. Charter
. Initiating Event Cornerstone
. Barriers Cornerstone
. HP Cornerstone
. Security Cornerstone
. PI Benchmarking
IV. Inspection Program Rebaselining - Steve Stein
. Charter
V. Assessment Program Development - Mike Johnson
. Charter
VI. Future Interactions - General Discussion

Attachment 1

OCTOBER 22, 1998, NRC/NEI MEETING ON PERFORMANCE ASSESSMENT PROCESS IMPROVEMENTS - SIGN-IN SHEET

NAME               ORGANIZATION               PHONE NUMBER
Ralph Andersen     NEI                        202-739-8111
MaryAnn Ashley     NRC/NRR                    301-415-1073
Rudolph Bernhard   NRC/R-II
Jin Chung          NRC/NRR                    301-415-1071
Tom Dexter         NRC/R-IV                   301-415-1452
Rich Enkeboll      NEI                        202-739-8110
Stephen Floyd      Nuclear Energy Institute   202-739-8078
Timothy Frye       NRC/NRR                    301-415-1287
David Garchow      PSEG                       609-339-3250
Clare Goodman      NRC/NRR                    301-415-1047
Cornelius Holden   NRC/NRR                    301-415-1037
Tom Houghton       NEI                        202-739-8107
Jeffrey Jacobson   NRC/NRR                    301-415-2977
Michael Johnson    NRC/NRR                    301-415-1241
Larry B. Kuzo      NRC/R-II                   301-415-8111
Ron Lloyd          NRC/AEOD                   301-415-7479
Alan Madison       NRC/NRR                    301-415-6412
Sam Malur          NRC/NRR                    301-415-2963
Jim McCarthy       Virginia Power             804-273-2699
Alan Nelson        NEI                        202-739-8110
Peter Prescott     NRC/AEOD                   301-415-7591
Larry Scholl       NRC/R-I                    301-415-1332
Nick Shah          NRC/R-III                  301-415-1406
Mel Shannon        NRC/R-II                   (423) 842-8001
Greg Smith         NRC/R-I
Steven Stein       NRC/NRR                    301-415-1296
David Stellfox     McGraw-Hill                202-383-2162
Randy Sullivan     NRC/NRR                    301-415-1123
Mike Tschiltz      NRC/EDO                    301-415-1733

Attachment 2

FRAMEWORK TASK GROUP CHARTER

PURPOSE

The purpose of the task force is to develop details of the framework for a more objective, risk-informed, performance-based approach to licensee performance assessment and related bases for inspection activities. Information developed as part of the task will be used in the development of risk-informed baseline inspection and performance assessment tasks.

SCOPE

This activity includes: articulation of the principles, bases, and logic of the framework; identification and evaluation of performance indicators (PIs) and associated performance thresholds for initial implementation of the framework; and determination of the limitations of PIs used for performance assessment and development of inspection bases for rebaselining the inspection program. The work of the task force will follow and build on the defining principles and cornerstone development effort that was begun at the Performance Assessment Workshop held September 28, 1998, through October 1, 1998. It is recognized that this program will evolve and be refined over a period of years. Therefore, the intent of the task force is to develop sufficient detail to allow the Commission to make a decision on the efficacy and direction of this new approach to licensee oversight and, if approved, lay the groundwork for initial implementation.

PRODUCT

By November 25, 1998, the task force will provide to the Director, Division of Inspection and Support Programs, NRR, and the Director of NRR documents describing the overall framework, performance indicators and thresholds, and related bases for the inspection program. These documents will contain the principles, bases, logic, and supporting technical information and will be in the form of appendices to a Commission paper.

KEY TASKS

1. Oversight Framework Development
. Review and, as necessary, enhance articulation of oversight framework related defining issues (consider including appropriate enforcement items)
. Identify and evaluate other framework or regulatory program interface issues (e.g., role of compliance with respect to risk and performance, role of generic issues, operating experience evaluation and insights, other risk considerations)
. Outline safety construct, framework, bases, operational process, interfaces
. Draft oversight framework section
. Internal review and external coordination
. Finalize oversight framework section

2. PI Development
. Review and, as necessary, enhance articulation of defining issues specific to identification, definition, validation, and limitation of PIs
. Review and finalize identification of PIs for both power and shutdown
. Identify PI limitations, inspection interfaces, relation to risk and the Maintenance Rule
. Identify and evaluate V&V issues, phased development of improved (risk-based) PIs, and implementation issues
. Prepare write-up template for each cornerstone
. Prepare write-up for each cornerstone
. Prepare overview/introductory write-up for PI section
. Internal review and external coordination
. Finalize PI section write-up

3. PI Thresholds
. Review and, as necessary, enhance articulation of defining issues specific to thresholds
. Review and finalize attributes of PI thresholds
. Review and evaluate NEI proposed thresholds, bases, analyses
. Perform independent analyses of PI response; evaluate correlation to peer performers
. Perform analyses of risk sensitivity to changes in performance relevant to PIs
. Finalize position on, details for, and proposed values of PI thresholds
. Prepare write-up outline for PI thresholds
. Prepare write-up
. Internal review and external coordination
. Finalize PI threshold write-up

4. Inspection Bases
. Review and, as necessary, enhance articulation of related defining issues (e.g., purpose/role of inspection, risk context of inspections and findings)
. Detail needs for PI V&V, inspection areas not covered (or partially covered) by PIs, performance-driven inspections, and principles for risk-informed inspections and evaluation of findings
. Coordinate with inspection rebaselining task group
. Prepare write-up outline
. Prepare write-up
. Internal review and external coordination
. Finalize write-up

Attachment 3

INSPECTION TASK GROUP CHARTER

PURPOSE

The purpose of the task force is to develop a program for the NRC baseline inspection of power reactor licensee performance. The program must be meshed with the programs being developed for the framework, assessment, and enforcement. The program must have a sound basis linked to safety risk.

SCOPE

The development is limited to the inspection program for operating nuclear power reactor facilities. The task force product must address the issues that have been raised regarding the NRC's current and proposed inspection program. A brief summary of some of these is contained in Chairman Jackson's speech entitled "The challenge before the NRC: A bend in the road or a modulation of trajectory," which was given at the July 14, 1998 Senior Management Meeting.

PRODUCT/DELIVERABLE

By December 11, 1998, the task force will provide to the Director, Division of Inspection and Support Programs in NRR and the Director of NRR a document describing the developed program and a draft input to a Commission paper. The document will be in the form of a handbook with a general description of the key elements in the inspection program and a referenced attachment. The attachment should describe the basis for each key element of the inspection program.

KEY TASKS TO ACCOMPLISH DURING THE PROJECT

1. Determine from internal and external stakeholders what should be included in the scope of the baseline inspection program - that is, what issues the program needs to address.

2. Determine what the NRC inspector should examine during inspections of nuclear power reactor plants. This should incorporate the results of the Assessment Task Force and work being conducted by the Office of Nuclear Regulatory Research. The product of this task should establish the population of inspectable items and key program elements.

3. Develop the risk-informed basis for each key inspection element. Use this as a basis and develop a hierarchy for the elements.

4. Determine the role in the baseline program for the issues developed in item No. 1.

5. Determine the value added from each proposed element of the inspection program.

6. Develop guidance for the inspection program process attributes. Guidance should include type of inspection (team vs. individual), type of inspector (resident, region-based), depth of inspection, frequency of inspection, link to enforcement, and link to the assessment process. Guidance should also include how the generic baseline program will be modified with site-specific risk information.

7. Benchmark inspection programs used by other agencies/industry.

8. Communicate results continuously to stakeholders.

Attachment 4


ASSESSMENT TASK GROUP CHARTER (DRAFT)

PURPOSE

The purpose of the assessment task group is to develop the process which will allow the NRC to take various information sources, make objective conclusions on licensee performance, take actions based on these conclusions in a predictable manner, and effectively communicate these results to the licensees and to the public. This task group will also develop recommendations on the best methods to transition from the existing regulatory oversight processes and implement any proposed new oversight processes.

SCOPE

The scope of this effort is limited to operating power reactors and does not include, for example: (1) permanently shutdown and/or decommissioned power reactors, and (2) fuel fabrication or material licensees.

PRODUCT/DELIVERABLE

The task group will deliver recommendations which address each of the key tasks for developing an assessment process and a transition plan. These recommendations need to be completed by the end of November 1998 and will form part of the Commission paper which will forward recommendations for improvement to the regulatory oversight processes.

KEY TASKS TO ACCOMPLISH

1. Develop a methodology for the integration of the information inputs within the cornerstones so that the assessment results are objective and transparent. Evaluate methods for the integration of risk-significant inspection findings and other information sources with the performance indicator results.

2. Develop decision criteria/model so that NRC actions can be taken in a manner that is scrutable and predictable by both the licensees and the public.

3. Develop the appropriate frequency for routine assessment of licensee performance. Identify the appropriate staff positions for conducting assessment, and their responsibilities. Determine the level of senior NRC management involvement required for the performance assessments. Develop a process for handling changes in performance indicator results as they occur so that appropriate action is not reliant on the performance of a periodic assessment.

4. Determine methods for communicating to both the licensees and the public the assessment results and NRC actions taken. Evaluate methods for taking a graded approach to accomplishing these activities.

5. Determine how licensee performance in response to NRC actions is monitored and measured, and how the results feed back into the assessment process.

6. Determine appropriate methods to transition from the current assessment processes. Develop a recommendation regarding the continued suspension of SALP.

7. Develop a recommendation for implementing a new process. Identify the necessary actions required to validate a new process prior to implementation. Evaluate and recommend either a phased-in approach for all licensees or a pilot program with targeted or volunteer participation.

8. Identify the necessary program requirements to support a voluntary licensee assessment data reporting process. Develop a recommendation on how those licensees who decline to participate in a voluntary program would be accounted for and assessed (interface with framework group).

9. Develop a methodology for the continuous self-assessment of program effectiveness subsequent to implementation.

10. Interface with the Office of Enforcement to develop concepts of how assessment results affect enforcement actions, and how "regulatory significance" is defined and used in the oversight processes.

Attachment 5

Performance Assessment and Inspection Initiatives - Preliminary Schedule of Stakeholder Interface Meetings

Date      Place and Time                Meeting Purpose
10/22/98  O-5B4, 1 - 4 pm               Discuss overall schedule, NRC team charters, and initial issues
10/27/98  O-12B4, 8:30 am - 4:30 pm     Working meeting on safeguards cornerstone
10/28/98  TBD, 1:00 pm - 4:30 pm        Working meeting on safeguards cornerstone
10/28/98  O-5B4, 9 am - 12 pm           Working meeting for NRC to present preliminary results of cornerstone reviews (suggested PIs and inspection areas)
10/29/98  NEI, 1776 I St. NW, Washington, DC, 8:30 am - 12:30 pm   Working meeting on radiation protection cornerstone
10/29/98  O-14B11, 1 pm - 4 pm          Working meeting on emergency preparedness cornerstone
11/4/98   O-5B4, 9 am - 12 pm           Receive feedback from stakeholders on suggested PIs and inspection areas; discuss NEI validation efforts
11/12/98  O-6B11, 1 - 4 pm              NRC presents final draft of cornerstone reviews (suggested PIs and inspection areas); working meeting to discuss methodology for integrating inspection results with PIs, decision criteria, and communication methods; NRC presents first draft of inspection scope
11/18/98  O-5B4, 9 am - 12 pm           Receive feedback on inspection scope; discuss transition strategy for new oversight process; discuss licensee data reporting program; discuss preliminary threshold analysis for PIs
11/20/98                                ACRS subcommittee brief
11/25/98  O-5B4, 9 am - 12 pm           NRC presents recommended scope, depth, and frequency for baseline inspection program
12/2/98   O-5B4, 9 am - 12 pm           TBD
12/9/98   O-6B11, 9 am - 12 pm          TBD
12/16/98  O-5B4, 9 am - 12 pm           Present final draft of recommended PIs and baseline inspection

Attachment 6

PERFORMANCE INDICATOR WORKSHOP
EMERGENCY PREPAREDNESS BREAKOUT SESSION
09/30/98 - 10/01/98

Table of Contents
. Summary
. Performance Indicator (PI) Discussion
  . Classification
  . Notification
  . Protective Action Recommendations
  . Emergency Response Organization Readiness
. Key Discussion Notes 09/30/98
. Key Discussion Notes 10/01/98
. EP Parking Lot Item List
. 10/01/98 EP Parking Lot Item Discussion

Attachment 7

Emergency Preparedness Breakout Session Summary

OBJECTIVE - Identify Performance Indicators for Emergency Preparedness

NRC's Overall Safety Mission
Public Health and Safety as a Result of Civilian Nuclear Reactor Operation

Strategic Performance Areas
. (Minimize/Prevent) Exposure From Reactor Accident Releases

Cornerstone/Strategic Performance Indicators
. Emergency Preparedness
. Objective - Ensure that the licensee's capability is maintained to take adequate protective measures in the event of a radiological emergency.
. Scope - Onsite emergency plans; offsite emergency plans were removed from the scope but are believed to be bounded by FEMA's Finding of Reasonable Assurance.

Desired Result/Performance Expectations
. Effective implementation of onsite emergency plans.

Important Attributes (Safety Significant)
. Classification
. Notification
. Protective Action Recommendations (PARs)
. Emergency Response Organization (ERO) Readiness

Summary Matrix
Means (How) to measure: Exercise (EX), Events (EV), Testing (T), Inspection (I), Self Assessment (SA), Performance Indicator (PI)

Important Attribute | Areas (What) to Measure | Means (How) to Measure
Classification | (Timely and Accurate) Event Recognition | EX, EV, PI
Classification | Accident Assessment | Delete - Crosscutting
Classification | Emergency Action Levels | EX, EV, SA, I
Notification | (Timely and Accurate) Alert and Notification System | EX, EV, T, PI
Notification | Adequate Comm Channels (voice, data) | EX, EV, T, SA, I
Notification | Provide Direct Interface To Offsite | EX, EV, PI
PARs | Accident Assessment | Delete - Crosscutting
PARs | Protection of Emergency Workers | EX, EV, SA, I
PARs | (Timely and Accurate) Direct Interface with Offsite Agencies | EX, EV, PI
ERO Readiness | Adequacy of Facilities | EX, EV, T, SA, I
ERO Readiness | Activation of ERO | EX, EV, SA, I

PERFORMANCE INDICATORS (PI) DISCUSSION

Classification
. Event Recognition
  . % of timely and accurate classifications may be a numerical PI
  . The numerical indicator may need to be weighted based on the severity of the event (whether the event is classifiable or not; the severity of the classifiable event)
  . Sources of input: actual events; exercises; Licensee Event Reports (LERs) (potential classifiable events database); crew/team evaluations
. EALs
  . EALs are considered part of the classification process and would be evaluated as part of the self-assessment/inspection process
. Accident Assessment
  . Accident Assessment is considered to be a cross-cutting issue and is subsumed in the other facets of EP being evaluated

Notification
. Provide Direct Interface to Offsite
  . % of timely and accurate notifications may be a numerical PI
  . Sources of input: initial and upgrade emergency notifications; notification of PAR changes; 50.72 notifications
. Alert and Notification System
  . % relating to the availability of the Alert and Notification System
  . Bounded by what is required to be maintained by the licensee
  . Includes operability of the equipment
  . May need to be consistent with the criteria already established by FEMA for system availability
. Adequate Communication Channels
  . Would include the communication systems necessary to make emergency notifications to the offsite agencies
  . Concern that counting the number of pieces of equipment may not be worthwhile as a PI
  . Decision to include as a self-assessment/inspection process
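The candidate indicator discussed above, a percentage of timely and accurate classifications possibly weighted by event severity, can be read as a weighted success ratio. A minimal sketch follows; the severity weights and the function name are hypothetical, since the workshop fixed no formula or weighting:

```python
# Illustrative weighted success ratio for the candidate PI discussed above:
# percentage of timely-and-accurate classifications, with each event
# carrying a (hypothetical) severity weight.

def weighted_pi(events: list[tuple[bool, float]]) -> float:
    """events: (timely_and_accurate, severity_weight) per classification.
    Returns the severity-weighted percentage of successful classifications."""
    total = sum(weight for _, weight in events)
    if total == 0:
        return 100.0  # no classifiable events in the period
    good = sum(weight for ok, weight in events if ok)
    return 100.0 * good / total

# Three events: two handled correctly (weight 1.0 each), one severe miss
# (weight 3.0). The severe miss dominates the indicator.
score = weighted_pi([(True, 1.0), (True, 1.0), (False, 3.0)])  # 40.0
```

Weighting by severity is one way to address the concern, raised later in the notes, that sparse exercise and event data may not give a representative picture on their own.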

PERFORMANCE INDICATORS (PI) DISCUSSION (CONT)

Protective Action Recommendations
. Direct Interface with Offsite
  . % of timely, technically based, and accurate PARs may be a numerical PI
  . Would include dose assessment and field team functions in the assessment process
  . Sources of input: actual events; exercises; crew/team evaluations; other site-specific means of self-assessment
. Accident Assessment
  . Agreed that Accident Assessment should not be included as a "what must be measured" under the PAR key attribute. Agreed that dose assessment functions/activities would be included as part of the Classification key attribute
  . Accident Assessment is a crosscutting issue
. Protection of Emergency Workers
  . Would include emergency dose limits, applied HP, evaluation of the need for KI, facility relocations, re-entries
  . Would be measured by self-assessment/inspection

ERO Readiness
. Adequacy of Facilities
  . Focuses on equipment and procedures of safety significance
  . Includes availability of safety information in the emergency response facilities
  . Agreed this would be measured by self-assessment/inspection
. Activation of ERO
  . Includes timely staffing of minimally required facility staffing
  . Testing of safety-significant activation equipment and systems
  . Verification that minimally required facility staffing is available 24 hours a day, 7 days a week
  . Agreed this would be measured by self-assessment/inspection

Key Discussion Notes 09/30/98

. Agreed that some items noteworthy of discussion would be identified but not discussed in this breakout due to time considerations, and that these items would be captured for later discussion in the "Parking Lot" (see below).
. Agreed that the focus of the EP working group should be on NRC/utility interfaces, realizing that offsite issues may need to be fully discussed but would probably be bounded if a FEMA Finding of Reasonable Assurance was found.
. Agreed that EP was risk-informed based on the fact that it is built around a planning standard that is risk based.
. Public confidence is a very important attribute but one that is impossible to objectively measure.
  . Licensee methods for creating it that were discussed include: accuracy of information flow, calendars/brochures, media day, operation of the Joint Information Center (JIC).
. Agreed that the following fall under "offsite":
  . Effective evacuation in the event of a radioactive release
  . Public information/confidence
  . Adequacy of state/local facilities
  . Protection of emergency workers
  . Adequate communication channels
  . Alert and Notification systems
. Agreed that the important attributes and areas to measure identified may lead to development of performance indicators and self-assessment items that reflect the safety-significant aspects of an effective EP program.
. Agreed that training, human performance, and procedures were cross-cutting issues that were part of all of the important attributes that were identified.

Key Discussion Notes 10/01/98

. Challenge to distinguish between areas measured and attributes
. Consider combining classification, PARs, and notification into accident assessment
. Concern that mitigation of the accident is not included in the key attributes
. Concern that the number of data inputs from exercises and events may not provide sufficient points to provide an indication of reasonable assurance
. Is "Readiness" the correct word to use to describe the 4th attribute (ERO Readiness)?

. Need to consider the self-assessment program and the corrective action program as performance indicators
. Need to ensure that performance indicators do not become performance drivers
. It was agreed that when discussing performance indicators it may be appropriate to use one of the following words or groups of words often: rebuttable presumption, over-arching, cross-cutting, or preponderance.

EP Parking Lot Item List
. Training
. Management Systems (Support)
. Severe Accident Management (SAMG)
. Effective implementation of Emergency Operating Procedures (EOPs)
. Control Room Habitability (hand-off from 9/30/98)
. Offsite Emergency Plan
. Long Term Concerns (Ingestion Pathway)
. Mitigation/Damage Control
. Access Control/Security

10/01/98 EP Parking Lot Item Discussion
. Training
  . Discussed the potential need to consider the following in ERO Readiness:
    . % of E-Plan ERO personnel that participate in drills/exercises
    . Qualifications of ERO personnel (training)
. Management Systems (Support)
  . Discussed the potential need to consider the following in ERO Readiness:
    . Number of drills as an indicator of the robustness/management support of EP
. SAMGs as an EP item
  . SAMGs are not regulatory based, and the decision was made within the EP working group to only discuss regulatory based issues
  . If evaluated at all, SAMGs will probably end up being reported under the Containment Cornerstone based on the final General Session
. Effective implementation of EOPs
  . The group believes that this is not an EP reportable item
. Assessment of Control Room Habitability as an EP item
  . Decided that it should not be a part of EP

Occupational and Public Radiation Safety Cornerstones - Industry Preliminary Proposal
(Discussed at the Workshop - Sep 28 - Oct 1)

Objectives

Occupational Radiation Safety
. Keep doses to workers below regulatory dose limits.
. Maintain an effective ALARA program.

Public Radiation Safety
. Keep public doses below regulatory limits and ALARA (e.g., as defined in 10 CFR Part 50, Appendix I).

Performance Indicators

Performance indicators for occupational and public radiation safety will consist of tracking occurrences which:
. result in significant unplanned/unintended individual or collective dose.
. involve a substantial loss of radiation safety barriers.

Performance indicators should be dose-based (as an analogue for risk-based), overarching (if practical), and need not be comprehensive across all key elements - i.e., performance indicators can represent a selected sample of key elements. Cross-cutting issues, e.g., human performance, safety-conscious work environment, etc., are likely to be reflected in overall performance and do not need separate performance indicators.

Attachment 8

(Discussed at the Oct 20 NRC-NEI Meeting)

Thresholds

Functional definitions:
. The regulatory threshold delineates the level of performance above which the NRC maintains a reduced level of regulatory attention - permitting the licensee to manage performance and identify and correct problems without NRC intervention - and below which there will be increased regulatory attention on a progressive basis in conjunction with continued decline in performance.
. The safety threshold delineates an unacceptable level of performance that is persistent and pervasive, requiring restriction or cessation of the operations relevant to the degraded performance.

Performance Assessment:
. Thresholds should be defined as exceeding a specified number of reported occurrences during the assessment period.
. Reporting should be quarterly.
. The assessment period should be the duration of a fuel cycle (e.g., from the end of one refueling outage to the end of the next).
. Threshold criteria (i.e., the acceptable or tolerable number of events per assessment period) should permit trending and response by the licensee prior to regulatory intervention.
. Threshold criteria should be determined on a site-specific basis to reflect differences in the duration of fuel cycles (e.g., 12, 18, or 24 months) and the number of reactor units (e.g., 1, 2, 3).
. Threshold criteria should reflect the level of the detailed criteria for what is a reportable occurrence - i.e., the lower the detailed criteria, the larger the number of acceptable/tolerable occurrences during the assessment period (or vice versa).
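Taken together, the bullets above describe a counting rule: compare the number of reported occurrences in an assessment period against a site-specific criterion scaled for fuel-cycle duration and number of units. A minimal sketch of that rule, assuming a hypothetical base criterion and simple linear scaling (the proposal sets no actual values):

```python
# Sketch of the counting rule described above. The base criterion and the
# linear scaling by cycle length and unit count are illustrative only;
# the proposal left the actual values to be determined.

def occurrence_threshold(base_per_12_months: int, cycle_months: int, units: int) -> int:
    """Scale a base criterion (events per unit per 12-month cycle) to a site."""
    return round(base_per_12_months * (cycle_months / 12) * units)

def threshold_exceeded(occurrences: int, base_per_12_months: int,
                       cycle_months: int, units: int) -> bool:
    """True when reported occurrences in the assessment period pass the criterion."""
    return occurrences > occurrence_threshold(base_per_12_months, cycle_months, units)

# Example: a 2-unit site on an 18-month cycle with a hypothetical base
# criterion of 2 events per unit per 12 months tolerates 6 occurrences
# per assessment period; a 7th would cross the threshold.
limit = occurrence_threshold(2, 18, 2)    # 6
flag = threshold_exceeded(7, 2, 18, 2)    # True
```

The last bullet above would add a second dimension: the stricter the definition of a reportable occurrence, the smaller the tolerable count, and vice versa.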

Performance Indicators

Occupational Radiation Safety:
. Single overarching indicator.
. "Significant unplanned/unintended individual dose" should be defined as a percent (%) of regulatory limits and applied uniformly across all relevant limits. There is not yet consensus on what the % value should be (due on 10/29).
. There is not yet a consensus on defining "significant unplanned/unintended collective dose" (due 10/29).
. "Substantial loss of radiation safety barriers" is intended to capture "near-misses" and should be determined by a two-step process:
  . First, there must be a failure of one or more barriers, including:
    1. Failure to identify and control the hazard.
    2. Inadequate procedure or RWP.
    3. Loss of a physical barrier (posting, locked door, shielding, etc.).
    4. Failure to provide adequate monitoring/surveillance (e.g., HP technician coverage, remote monitoring or surveillance, etc.).
    5. Improper action by the worker due to inadequate knowledge or understanding of instructions or requirements.
  . Second, there must be a "substantial potential" (as used in regulatory jargon) for an exposure to have occurred in excess of the % value of a dose limit (i.e., "significant unplanned/unintended dose").
. Events that meet the first-step, but not the second-step, criteria would be tracked in the licensee corrective action program but not recorded as a "reported occurrence."
. Significant events that are currently reportable under Part 20 or Part 50 (e.g., regulatory overexposure) should be treated as a single event, but should be handled by reactive inspection. Such an inspection may uncover a preponderance of other information that rebuts the performance indicator data.
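The two-step process above amounts to a small screening procedure: an event counts as a "reported occurrence" only if at least one barrier failed and there was a substantial potential for a dose above the chosen percentage of a limit. A sketch under assumed inputs; the barrier names and the 25% figure are placeholders, since the actual % value had not yet been agreed:

```python
# Illustrative sketch of the two-step screening above. The barrier names
# mirror items 1-5; the 0.25 default is a placeholder for the undecided
# "% of regulatory limit" value.

BARRIERS = {
    "hazard_identification",    # 1. failure to identify and control the hazard
    "procedure_or_rwp",         # 2. inadequate procedure or RWP
    "physical_barrier",         # 3. posting, locked door, shielding, etc.
    "monitoring_surveillance",  # 4. HP coverage, remote monitoring, etc.
    "worker_knowledge",         # 5. improper action, inadequate knowledge
}

def is_reported_occurrence(failed_barriers: set[str],
                           potential_dose_fraction: float,
                           pct_of_limit: float = 0.25) -> bool:
    """Step 1: at least one recognized barrier failed. Step 2: substantial
    potential for a dose exceeding the chosen fraction of a regulatory limit."""
    step1 = bool(failed_barriers & BARRIERS)
    step2 = potential_dose_fraction > pct_of_limit
    return step1 and step2

# Meets step 1 only: goes to the corrective action program, not a
# reported occurrence.
is_reported_occurrence({"physical_barrier"}, 0.10)   # False
# Meets both steps: counts as a reported occurrence.
is_reported_occurrence({"procedure_or_rwp"}, 0.40)   # True
```

Events failing only the second step fall through to the corrective action program, exactly as the bullet above describes.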


Public Radiation Safety:
. There may be a need for separate performance indicators in the areas of (1) effluents; (2) clearance of materials offsite; and (3) transportation of radioactive materials.
. The criteria for performance indicators for effluents/direct radiation and clearance/release of materials will be structured similarly to the criteria for the occupational radiation safety performance indicator (above). The dose-based criteria will utilize individual dose only.
. The criteria for transportation may have to be structured differently (to be determined by the 10/29 meeting).

Proposed Schedule

Meeting Date | Objective(s)/Deliverable(s)
10/29 | NRC and NEI/industry bring detailed proposals, including specific criteria, definitions of barriers, etc. Reach alignment on draft performance indicator framework for NRC and industry review and comment (10/30 - 11/9).
11/3 | Verify and validate draft performance indicator framework against historical record of events, violations, and operating experience. Determine scope of relevant baseline inspection program.
11/10 | Finalize performance indicator framework. Reach alignment on scope of relevant baseline inspection program.
11/17 | To be determined.

Insights From Data Analysis

1. Plants with historical strong performance typically have data values and trends in the upper half of the green band, with an occasional "dip" into the lower half of the green band.

2. Average performing plants typically have data values and trends in the lower half of the green band, with an occasional "dip" into the top of the white band, usually only on one indicator at a time.

3. Plants exhibiting a declining trend that was corrected show a gradual decline in performance into the top of the white band on several indicators. The decline typically occurs over a several-year period. Likewise, the correction of the trend to restore performance into the green band also occurs over a several-year period.

4. Plants that have recently been on (or still are on) the NRC watch list have several indicators that were (or are) in the white band for extended periods.

5. The barrier integrity indicators (fuel activity, RCS leakage, and containment integrity) show little or no variation for excellent and average plants. There is some variation for plants with declining trends or on the watch list. Most of the detectable variation in performance occurs in the operating challenges and mitigation capability set of indicators.

6. For operating challenges, the plant transient indicator appears to be somewhat leading to the scrams and safety system actuations indicator. The shutdown indicator shows a very low frequency of shutdown events and a low severity index for the few events that do occur.

7. No consistent data is yet available for the mitigation set of SSPI systems. The AEOD data for safety system failures was used as a surrogate for the near term. The safety system failure data is consistent with plants that have exhibited declining trends or were on the NRC watch list.
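The band vocabulary used in insights 1-4 can be expressed as a small classifier that places an indicator value in the upper or lower half of the green band, or in the white band. The threshold values below are hypothetical; the analysis quotes none:

```python
# Illustrative band classifier matching the green/white vocabulary above.
# Assumes lower indicator values are better (e.g., unavailability), so the
# "upper half of the green band" is the better-performing half. Threshold
# values are hypothetical.

def classify(value: float, green_white: float, green_mid: float) -> str:
    """Place an indicator value in a performance band.
    green_mid splits the green band; green_white is the green/white boundary."""
    if value <= green_mid:
        return "green-upper"   # historically strong performance
    if value <= green_white:
        return "green-lower"   # average performance
    return "white"             # declining-trend / watch-list territory

classify(0.01, green_white=0.05, green_mid=0.025)  # 'green-upper'
classify(0.04, green_white=0.05, green_mid=0.025)  # 'green-lower'
classify(0.08, green_white=0.05, green_mid=0.025)  # 'white'
```

Under this reading, insight 4 corresponds to several indicators sitting in "white" for extended periods.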

Summary

1. The set of indicators provides an overall perspective of safety performance.

2. Indicators do distinguish levels of performance - not in all indicators simultaneously, but in enough to be a viable assessment tool for allocating resources.

Attachment 9

Key

Plants                   Performance History
A, B, C                  Excellent
D, E                     Average
F, G, H                  Declining Trend
I, J, K, L, M, N, O, P   Watch List

Attachment 10 - Unavailability Graphs (Emergency AC; AFW/RHR; High Pressure Injection)

[Three charts plotting unavailability for each system; the plot data is not recoverable from the scanned original.]