ML20202D265

Summary of 971106 Public Meeting to Discuss Improvements to Current NRC Performance Assessment Processes. Meeting Attendees Listed in Encl 1.
Person / Time
Issue date: 11/19/1997
From: David Louis Gamberoni, NRC (Affiliation Not Assigned)
To: NRC
References: PROJECT-689, NUDOCS 9712040216
Download: ML20202D265 (25)


Text

UNITED STATES
NUCLEAR REGULATORY COMMISSION
WASHINGTON, D.C. 20555-0001

November 19, 1997

MEMORANDUM TO: File

FROM: David Louis Gamberoni
      Inspection Program Branch
      Division of Inspection and Support Programs
      Office of Nuclear Reactor Regulation

SUBJECT: PUBLIC MEETING ON INTEGRATED REVIEW OF ASSESSMENT

On November 6, 1997, the NRC staff held a public meeting to discuss improvements to current NRC performance assessment processes and the Integrated Review of Assessment (IRA). Attachment 1 is a list of the meeting attendees. Attachment 2 is a copy of the NRC handout that was used in the meeting. Attachment 3 is a copy of a Nuclear Energy Institute handout that was used in the meeting.

The staff made brief presentations that addressed:

(1) the information base for the senior management meeting (SMM),
(2) improvements to the SMM process, and
(3) the Integrated Review of Assessment.

Following the staff presentations, the Los Alamos National Laboratory contractor facilitated a comment period.

Comments from the public and industry included:

- The trend models do not include scrams, significant events, and safety system actuations. These performance indicators are tied closely to safety. A trend model (if used) should be based on public health and safety.

- Plants are unique and cannot be graded on a single scale or against each other.

- Eliminate the SALP program and Watch List because they provide no meaningful information.

- Alternatively (if necessary), consider annual presentations to the Commission that describe safety performance for each plant in a region and the NRC's regulatory priorities.

- Match SALP functional areas (if retained) to template categories.

- Economic indicators should not be used because they cannot discriminate between a plant that is cutting corners and one that is improving productivity.

- The trend models are event driven and are not useful. They are inconsistent with Commission staff requirements memoranda.

- Develop performance expectations for each area that have a direct relationship to public health and safety. Objective indicators should be defined for determining the degree to which performance expectations are being met.

- Assessments should be accurate, timely, and objective. Assessments should be tied to public health and safety and focus on specific safety issues.

- SALP assessments are untimely. The Watch List is untimely, misleads the public, and is open to political pressures.

- The Watch List results in unfair treatment of licensees because there is no licensee response and no opportunity for hearing.

- Allegations should not be used for performance assessment because it could result in fewer allegations being raised.

- If a new assessment process is put in place, it is very important to communicate the new process to the public.

The staff invited the attendees to provide written comments.

The Integrated Review Team will consider the comments received at the meeting and any written comments that are received.

Attachments:
1. List of Attendees
2. NRC Handout
3. Nuclear Energy Institute Handout

cc w/att: See next page

Distribution: Central Files, PUBLIC, Project No. 689, PIPB R/F, RBorchardt, MMalloy

DOCUMENT NAME: P:\MEMO.IRA
To receive a copy of this document, indicate in the box: "C" = Copy without attachment/enclosure, "E" = Copy with attachment/enclosure, "N" = No copy

OFFICE  PIPB:DISP       PIPB:DISP
NAME    DLGamberoni     MRJohnson
DATE    11/  /97        11/19/97

OFFICIAL RECORD COPY


Meeting Summary on Integrated Review of Assessment dated November 19, 1997

Mr. Ralph Beedle
Senior Vice President and Chief Nuclear Officer
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Ms. Lynnette Hendricks, Director
Plant Support
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Mr. Alex Marion, Director
Programs
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Mr. David Modeen, Director
Engineering
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Mr. Anthony Pietrangelo, Director
Licensing
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Mr. Nicholas J. Liparulo, Manager
Nuclear Safety and Regulatory Activities
Nuclear and Advanced Technology Division
Westinghouse Electric Corporation
P.O. Box 355
Pittsburgh, Pennsylvania 15230

Mr. Jim Davis, Director
Operations
Nuclear Energy Institute
Suite 400
1776 I Street, NW
Washington, DC 20006-3708

Integrated Review of Assessment - Public Meeting
November 6, 1997

Name                 Organization
Bill Borchardt       NRC/NRR/PIPB
Mike Johnson         NRC/NRR/PIPB
David Gamberoni      NRC/NRR/PIPB
Tim Frye             NRC/NRR/PIPB
Bill Dean            NRC/OEDO
Glenn Tracy          NRC/OEDO
Gail Marcus          NRC/NRR/DRPM
Bill Reckley         NRC/NRR/DRPM
Melinda Malloy       NRC/DRPM/PGEB
Larry Nicholson      NRC/Region I
Mark Lesser          NRC/Region II
Michael Parker       NRC/Region III
Bill Johnson         NRC/Region IV
Ernie Rossi          NRC/AEOD
Alan Madison         NRC/AEOD
Peter Prescott       NRC/AEOD
Jose Ibarra          NRC/AEOD
Joel Kramer          NRC/RES
Heidi Hahn           Los Alamos National Laboratory
Pamela Ulibarri      Los Alamos National Laboratory
Steve Floyd          Nuclear Energy Institute
Herb Fontecilla      Virginia Power
David Lochbaum       Union of Concerned Scientists
John Matthews        Morgan, Lewis & Bockius LLP
Deann Raleigh        SERCH
David Stellfox       McGraw-Hill
M. Straka            NUS Info Services

PUBLIC MEETING

PRESENTATION ON IMPROVEMENTS TO NRC PERFORMANCE ASSESSMENT PROCESSES

NOVEMBER 6, 1997

OUTLINE

o Information base for the senior management meeting
o Improvements to the SMM process
o Integrated Review of Assessment Processes

OBJECTIVE - CONSISTENT - LEADING - SCRUTABLE

THE COMMISSION HAS INITIATED A COMPREHENSIVE REVIEW OF THE SENIOR MANAGEMENT MEETING PROCESS

A SERIES OF STAFF REQUIREMENTS MEMORANDA HAVE CALLED FOR INDICATORS THAT:

- "CAN PROVIDE A BASIS FOR JUDGING WHETHER A PLANT SHOULD BE PLACED ON OR REMOVED FROM THE WATCH LIST"
- ARE "OBJECTIVE, MEANINGFUL AND LEADING"
- "REDUCE RELIANCE ON EVENT-DRIVEN ASSESSMENTS"
- "ESTABLISH(ES) AN UNDERSTANDABLE LEVEL OF PERFORMANCE EXPECTATION"
- "IDENTIFY FACILITIES IN A CONSISTENT MANNER"

IMPLEMENTATION PLAN

o Template
  - Indicators and measures
  - Criteria for watch list plants
o Trending Methodology
  - Criteria for discussion plants
o Economic Indicators

PERFORMANCE ASSESSMENT INFORMATION

[Diagram: information sources (inspection and performance issues, event reports, allegations, investigations, enforcement actions, reliability data, and monthly operating reports) feed objective data (the performance template, performance indicators, and economic indicators), which in turn support assessments of performance trends and economic trends.]

PLANT PERFORMANCE TEMPLATE

1  Operational Performance (Frequency of Transients)
   1A Normal Operations
   1B Operations During Transients
   1C Programs & Processes
2  Material Condition (Safety System Reliability/Availability)
   2A Equipment Condition
   2B Programs & Processes
3  Human Performance
   3A Work Performance
   3B Knowledge/Skills/Abilities
   3C Work Environment
4  Engineering and Design
   4A Design
   4B Engineering Support
   4C Programs & Processes
5  Problem Identification & Resolution
   5A Identification
   5B Analysis
   5C Resolution
6  Organizational Effectiveness

TEMPLATE INPUT MEASURES

o Multiple sources of "issues"
  - Start with regional Plant Issues Matrix (PIM)
  - Safety-significant LERs, significant events, ASPs
  - Escalated enforcement and civil penalties
  - Substantiated allegations and investigation findings
o Issues evaluated by appropriate staff based on guidance from HQ (see the sketch after this slide)
  - Merge redundant issues
  - Assign risk significance (high/medium/low)
  - Map issues to template subcategories
o Headquarters audit of implementation
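The slide lists the evaluation steps but not their mechanics. The sketch below is one plausible reading, not the staff's documented method: a hypothetical `Issue` record is deduplicated (keeping the highest staff-assigned risk ranking) and grouped under template subcategories. The record layout, merge rule, and sample data are all invented for illustration.

```python
# Minimal sketch of the issue-evaluation step, assuming hypothetical data
# structures; the handout does not specify records or merge rules.
from collections import defaultdict
from dataclasses import dataclass

# A few subcategory codes from the Plant Performance Template slide.
TEMPLATE_SUBCATEGORIES = {
    "1A": "Normal Operations",
    "2A": "Equipment Condition",
    "3A": "Work Performance",
    "5A": "Identification",
}

@dataclass(frozen=True)
class Issue:
    source: str       # e.g. "PIM", "LER", "escalated enforcement"
    description: str
    risk: str         # assigned by staff: "high", "medium", or "low"
    subcategory: str  # template subcategory code, e.g. "2A"

def merge_redundant(issues):
    """Collapse issues describing the same problem (same text and
    subcategory), keeping the highest assigned risk significance."""
    rank = {"low": 0, "medium": 1, "high": 2}
    merged = {}
    for issue in issues:
        key = (issue.description.lower(), issue.subcategory)
        if key not in merged or rank[issue.risk] > rank[merged[key].risk]:
            merged[key] = issue
    return list(merged.values())

def map_to_template(issues):
    """Group the merged issues under their template subcategories."""
    template = defaultdict(list)
    for issue in merge_redundant(issues):
        template[issue.subcategory].append(issue)
    return dict(template)

if __name__ == "__main__":
    sample = [  # hypothetical inputs
        Issue("PIM", "EDG failed surveillance test", "medium", "2A"),
        Issue("LER", "EDG failed surveillance test", "high", "2A"),
        Issue("allegation", "procedure not followed during lineup", "low", "3A"),
    ]
    for code, found in map_to_template(sample).items():
        label = TEMPLATE_SUBCATEGORIES.get(code, "?")
        for i in found:
            print(f"{code} ({label}): [{i.risk}] {i.description} <- {i.source}")
```

Note the duplicate EDG finding collapses to a single high-risk entry under 2A, which is the kind of merge-then-map behavior the bullets describe.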

[Figure: Plant 103 performance trend model (6-quarter window), plotting quarterly values against a threshold line and the running average for 1Q92 through 4Q96; a companion panel plots forced outage rate (percent) over the same quarters.]
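The figure tracks a quarterly measure against a threshold and a running average, but the handout does not give the arithmetic. Below is a minimal sketch assuming a simple 6-quarter trailing average and a fixed threshold; both the flagging rule and the sample counts are assumptions for illustration.

```python
# Sketch of a 6-quarter trend calculation consistent with the figure's
# layout; the actual model's arithmetic is not given in the handout.
def six_quarter_trend(values, threshold):
    """Return (quarter, value, 6-qtr trailing average, above_threshold)."""
    window = 6
    results = []
    for i, value in enumerate(values):
        recent = values[max(0, i - window + 1): i + 1]
        average = sum(recent) / len(recent)
        results.append((i + 1, value, round(average, 2), value > threshold))
    return results

if __name__ == "__main__":
    # Hypothetical quarterly counts (e.g., scrams per quarter), 1Q92-4Q93.
    counts = [2, 1, 3, 5, 4, 6, 2, 1]
    for qtr, value, avg, flagged in six_quarter_trend(counts, threshold=4):
        mark = " <-- above threshold" if flagged else ""
        print(f"qtr {qtr}: value={value} 6-qtr avg={avg}{mark}")
```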

[Figure: Economic variable trends for a multi-unit facility, 1991-1997, with panels for an economic index, cost per MWh ($), coverage, and nonfuel O&M costs ($M); plant data are plotted against an industry median for multi-unit facilities. Note: actual data are lagged one year.]
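The figure plots plant economics against an industry median with data lagged one year, but it does not define the index. A minimal sketch follows, assuming the index is simply the plant's cost relative to the industry median (100 = at the median) with the one-year lag the figure notes; the normalization choice and all sample figures are hypothetical.

```python
# Sketch of an economic index relative to an industry median, with the
# one-year data lag noted in the figure; the normalization is an assumption.
def economic_index(plant_costs, industry_medians, lag_years=1):
    """Index of plant cost vs. industry median (100 = at the median),
    reported with a one-year data lag."""
    index = {}
    for year in sorted(plant_costs):
        data_year = year - lag_years
        cost = plant_costs.get(data_year)
        median = industry_medians.get(data_year)
        if cost is not None and median:
            index[year] = round(100 * cost / median, 1)
    return index

if __name__ == "__main__":
    # Hypothetical nonfuel O&M costs ($/MWh) and industry medians.
    plant = {1991: 14.2, 1992: 13.8, 1993: 13.1, 1994: 12.5}
    median = {1991: 13.0, 1992: 12.9, 1993: 12.7}
    print(economic_index(plant, median))
    # {1992: 109.2, 1993: 107.0, 1994: 103.1}
```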

ASSESSMENT PROCESS

[Diagram: decision process - issues feed the template (categories and subcategories) and performance indicators feed the trend model; both flow into integrated decision factors that lead to remedial actions.]


OVERVIEW OF CURRENT ASSESSMENT PROCESSES

o SYSTEMATIC ASSESSMENT OF LICENSEE PERFORMANCE (SALP)
  - Implemented in 1980 following TMI event
  - Allowed for a systematic, long-term, integrated evaluation of overall licensee performance

o SENIOR MANAGEMENT MEETING (SMM)
  - First implemented in April 1986 following the 1985 Davis-Besse loss-of-feedwater event
  - Developed to bring to the attention of the highest levels of NRC management those plants whose performance was of most concern
  - Process developed so that the primary focus of the SMM is on operational safety
  - Allowed senior NRC managers to plan a coordinated course of action

OVERVIEW OF CURRENT ASSESSMENT PROCESSES (Continued)

o PLANT PERFORMANCE REVIEW (PPR)
  - Initial process implemented in October 1990 as a quarterly activity
  - Designed to provide mid-course adjustments in inspection focus in response to changes in licensee performance and emerging plant issues
  - A major emphasis to improve the PPR process occurred following the South Texas Lessons Learned Task Force

o PLANT ISSUES MATRIX (PIM)
  - Implemented across the regions in Spring 1996
  - Developed as part of the effort to improve the integration of inspection findings following the South Texas Lessons Learned Task Force
  - Provides an index of the primary issues that are evaluated during the PPR, SALP, and SMM processes

(+) STRENGTHS / (-) WEAKNESSES OF CURRENT ASSESSMENT PROCESSES

o PPR
  - (+) PPR provides short-term, integrated assessments and is effective at identifying leading indicators of change in performance
  - (-) PPR is not as effective at identifying long-term trends and recurring issues

o SALP
  - (+) Periodic, integrated reviews of licensee performance over an extended time period are effective at identifying long-term trends
  - (-) Due to a long assessment period, the SALP process is backward looking and provides lagging indicators of licensee performance
  - (+) SALP process categorizes licensee performance so that relative performance between plants can be measured
  - (-) SALP scores are not clearly defined, not well understood by the public, and often misused by the public, financial institutions, and industry

(+) STRENGTHS / (-) WEAKNESSES OF CURRENT ASSESSMENT PROCESSES (Continued)

o SMM
  - (+) SMM provides for a coordinated agency position for both declining and superior performance
  - (-) Significant administrative requirements are placed on staff and senior managers in preparing for and participating in SMMs
  - (+) SMM process effective at highlighting agency concern to licensees; plant performance often increases following Watch List designation and issuance of trending letters

o GENERAL
  - (-) Many assessment processes are redundant and have similar end products
  - (-) Assessment criteria differ between processes such as the SALP and SMM
  - (-) Processes have potential for inconsistent implementation among the regions
  - (-) Processes have gone through many changes and require more resources for implementation than originally intended

SUCCESS CRITERIA FOR INTEGRATED REVIEW

o ATTRIBUTES TO MAXIMIZE
  - Single assessment process. Early identification of declining licensee performance. Ability to detect long-term trends and recurring events
  - Staff job assignments for critical assessment activities well defined
  - Open dialogue of assessment results with the industry and public

o ATTRIBUTES TO MINIMIZE
  - Inconsistent assessment criteria between different steps of the process
  - Overlapping responsibilities among staff. Excessive administrative requirements to support the process
  - Latitude among regions/HQ in implementing the process
  - Opportunities for conflicting messages on performance


INTEGRATED REVIEW OF ASSESSMENT

o PROCESS
  - NRR has project lead
  - A series of meetings will be held with active participation from all regions and several program offices

o SCHEDULE
  - March 1998 - Integrated Review and Assessment Results Finalized
  - May 1998 - Public/Industry Comments Received and Reviewed
  - June 1998 - Implementation Plan Developed
  - June 1998 - Commission Briefing For Approval of Process and Implementation
  - July 1998 - Commission Approval For Process Implementation
  - December 1998 - Implementation of New Assessment Process

SCHEDULE/MILESTONES

o PUBLIC COMMENT PERIOD AND WORKSHOP FOR BOTH PROJECTS: SPRING 1998

o RECOMMENDATION FOR COMMISSION DECISION: SUMMER 1998

o IMPLEMENTATION: END OF CALENDAR YEAR 1998
  - REVISION TO MANAGEMENT DIRECTIVE 8.14

DRAFT

Guiding Principles for Performing Safety Assessments

1. The objectives of the assessment activity should be clearly defined.

2. Performance expectations should be well defined and be clear and understandable for each assessment area.

3. Performance expectations should have a direct relationship to public health and safety.

4. Objective indicators should be defined for determining the degree to which performance expectations are being met. Attributes of appropriate indicators are:
   - direct relationship between the indicator and safety
   - necessary data should be available or capable of being generated
   - able to be expressed in quantitative terms
   - unambiguous
   - meaningful
   - significance should be understood
   - not susceptible to manipulation
   - able to be validated

5. Assessment findings should be supported by the direct measurement of the performance indicators.

6. Assessment findings should be scrutable and repeatable.

Attachment 3