ML20154A688

Transcript of 880428 Briefing in Rockville, MD, Re Status of Performance Indicators Program. Pp 1-57. Supporting Documentation Encl.
Person / Time
Issue date: 04/28/1988
From:
NRC COMMISSION (OCM)
To:
References
REF-10CFR9.7 NUDOCS 8805160104
Download: ML20154A688 (90)


Text

UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION

Title: Briefing on Status of Performance Indicators Program

Location: Rockville, Maryland

Date: Thursday, April 28, 1988

Pages: 1 - 57

Ann Riley & Associates, Court Reporters
1625 I Street, N.W., Suite 921
Washington, D.C. 20006
(202) 293-3950

DISCLAIMER

This is an unofficial transcript of a meeting of the United States Nuclear Regulatory Commission held on April 28, 1988, in the Commission's office at One White Flint North, Rockville, Maryland. The meeting was open to public attendance and observation. This transcript has not been reviewed, corrected or edited, and it may contain inaccuracies.

The transcript is intended solely for general informational purposes. As provided by 10 CFR 9.103, it is not part of the formal or informal record of decision of the matters discussed. Expressions of opinion in this transcript do not necessarily reflect final determination or beliefs. No pleading or other paper may be filed with the Commission in any proceeding as the result of, or addressed to, any statement or argument contained herein, except as the Commission may authorize.

UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION

BRIEFING ON STATUS OF PERFORMANCE INDICATORS PROGRAM

Public Meeting

One White Flint North
Rockville, Maryland 20555

Thursday, April 28, 1988

The Commission met, pursuant to Notice, at 10:00 a.m., the Honorable Lando W. Zech, Jr. (Chairman of the Commission), presiding.

COMMISSIONERS PRESENT:

LANDO W. ZECH, JR., Chairman of the Commission
THOMAS M. ROBERTS, Commissioner
FREDERICK M. BERNTHAL, Commissioner
KENNETH M. ROGERS, Commissioner

NRC STAFF AND PRESENTERS SEATED AT COMMISSION TABLE:

S. Chilk
W. Parler
V. Stello
E. Jordan
C. Johnson
J. Rosenthal
R. Singh

P R O C E E D I N G S

(10:00 a.m.)

CHAIRMAN ZECH: Good morning, ladies and gentlemen. Commissioner Carr will not be with us this morning, and Commissioner Bernthal will be joining us shortly.

This is an historic occasion to be here at our new building. It's many years in coming, and I think the whole agency, and certainly the Headquarters people, are to be congratulated on the way they've accommodated to the new building. We have some growing pains still that are being worked out, as you know, but I appreciate everybody's forbearance, and I think our new facility will certainly allow us to work together a little better perhaps. Rather than being in 10 or 11 buildings in town, we've got roughly 60 percent of us in this one building now. It's a big step in the right direction, and I sincerely appreciate the way everybody has been patient through the kinks we've had and are still having.

But in general, I think the accommodations are good, and I want particularly to give my respects to the Headquarters staff for the way they have kept smiling through the growing pains of getting settled in a new building.

This morning we are here to talk about performance indicators, and the NRC Performance Indicator Program is one tool for senior NRC management to monitor the trends and overall performance of each of the licensed power reactors in our country.

The staff recently provided the Commission a paper recommending certain modifications and additions to the Performance Indicator Program. The purpose of today's meeting is for the staff to brief the Commission concerning the current status of the Performance Indicator Program and the staff's plans for modifying the current set of performance indicators.

Today's meeting should help us to better understand the staff's recommendations. This is an information briefing, and the meeting is intended to prepare the Commission to make some decisions on this program at some time in the near future. I understand that copies are available for those of you that are here, in the back of the room. Do any of my fellow Commissioners have any opening comments to make before we begin?

(No response.)

If not, Mr. Stello, would you proceed, please?

MR. STELLO: Thank you, Mr. Chairman. I agree completely, it is really a pleasure to be here and have our first meeting at One White Flint North, and for us and the staff, we see the benefits of consolidation in a real way rather than the usual 45-minute trip to come downtown. It's really nice to be able to get up from your desk and come down to a meeting and be able to do so quickly. And I think as we get used to our new quarters we're going to find that it really does improve the efficiency of this agency in a real way and let us do our jobs better.

CHAIRMAN ZECH: Let me just make one comment, too, on that before we get into the meeting. To encourage senior managers, now that we are in one building, to take advantage of this fact and get out of your offices and see your building and see the other people that we've had such a difficult time really communicating with, even in this one metropolitan area. I'm going to try to do that, I have tried to do that, and I intend to continue to get out and see the various people, all the people, every single one of them in this building.

So I commend the office managers to get out from behind your desks and go see your people. I think you're going to find that we have some absolutely wonderful people in this organization that we haven't had a chance to get to know very well. Let's take advantage of it.

Go ahead, Mr. Stello.

MR. STELLO: Let me get on to introducing the subject of performance indicators. As you know, we have been, over the past few years now, trying to understand what information we can get out of it, and some weeks ago we presented to the Commission for the first time the benefits of collecting this information that now is starting to show the very purpose that the agency exists, and that's to be sure that the current reactors are safe. The information we've collected from this has started to show that the trends are clearly there. Overall, reactors are in fact safer, and we can now do this in a quantitative sense in showing that the trends are such that we are being able to make that point.

I think one has to be careful, however, that one does not take performance indicators on any one plant or look at these in such a way that they become a definitive way to classify quickly the plants, and that's an admonishment that I think we all have to keep in mind, and we do keep that in mind all the time. And the others that tend to want to use information that we collect in this fashion need to remember whenever they use it that it's just a tool to assist us in assessing the overall safety, and there are many, many other sources of information that need to be included.

What we're going to do today is to identify a couple of new areas that look like perhaps they are fruitful for us to get further into, and with the Commission studying the matter and deciding that is fruitful, we would propose to go forward and do that.

We are not here to suggest to the Commission that we have studied and learned enough about this subject to say that we can conclusively tell you here are the right collection of performance indicators and we say this without question or reservation. We're just clearly not there, and it will be some time, years, before we feel really comfortable with that idea.

With that introduction, I'll ask Mr. Jordan to go through the briefing for us and get started. Mr. Chairman, if you'll excuse me, I have another appointment I could not get out of that coincided with this meeting because of the need for changing it because of the hearing. So with that, I'll ask Mr. Jordan to start and I'll leave.

CHAIRMAN ZECH: Yes, I understand you have a conflict, Vic. Thank you very much. Mr. Jordan, you may proceed.

MR. JORDAN: Thank you. I'd like to ask Robbie Singh to move up here as well to assist us. First, I'd like to introduce the people who will be participating directly. Carl Johnson from the Office of Research, Reliability and Human Factors, is here on my right and will be giving a discussion on the support that Research has provided in developing performance indicators.

Robbie Singh is the Section Chief responsible for the Performance Indicator Program and has been implementing the program and working with the further development. Jack Rosenthal is the Branch Chief from the Reactor Operations Analysis Branch who has been supervising those developmental efforts. Tom Novak, Director of Safety Programs, is behind me; and Denny Ross from the Office of Research is here to respond where a research question is offered; and Arlene McKenna from NRR is present so that she may respond.

I would like to refer back to the last briefing that we gave June 9, 1987, where we received some direction from the Commission. We feel we are proceeding towards implementation of indicators that we were in the process of developing at that time, and continuing development of other indicators. As Mr. Stello indicated, we are proceeding very cautiously, and I think there are two reasons for caution. One is to assure that we maintain continuity with indicators that we have, in order to use them fully and to maintain stability of the definitions. And then secondly, so that new indicators, as we develop them, are validated, confirmed to be valuable before we fully implement them.

Could I have the first slide, please.

(Slide.)

CHAIRMAN ZECH: Let me ask the people in the audience, can you see the slide? Good. Okay. Let's see if we can see it here.

MR. JORDAN: It may be easier to look at your handout, sir.

COMMISSIONER ROBERTS: Is there any difference between the handouts dated April 21st and those dated April 28th?

MR. JORDAN: Yes, sir. The difference is we changed some of the slides so that they would project and be visible for people here.

CHAIRMAN ZECH: All right, let's proceed.

MR. JORDAN: The program status -- at this time, we are continuing development. We have support from the Task Group, which is composed of personnel from the Office of Research, from the Office of NRR, and from the regions. This is an advisory group that meets periodically and gives us insights on the development process. It's been a very, very valuable cooperative effort.

The implementation is the responsibility of the Office of AEOD, and in development we receive a great deal of support from the Office of Research and work very closely with them.

(Slide.)

The products that we're presently putting out are the quarterly reports, and these are all prepared and issued to management; after about two weeks they are sent to the Public Document Room, and that data then becomes input to the senior management for the semi-annual performance review programs. The next one is scheduled for June.

We feel that in the implementation we have improved the graphic display, and I will show you that briefly. We've made the packages more user-friendly, and the users are quite pleased with them. That is, the managers from the program offices in the regions.

(Slide.)

On the development side, and that's principally what we're here to talk about, the areas are risk-based PIs, and this is the system availability/safety system trends. Programmatic indicators are ranging from management to maintenance, and we're looking at a number of specific areas in the maintenance area that we will describe to you.

Program improvements -- we're trying to continue to improve the presentation and the packaging, the usefulness of it for the managers. And an area that I'd like to bring to your attention is an expanded program; that is, we're participating with selected utilities in getting plant-specific data, data that they are already collecting, and observing their trending, and that's helping us with validation of trial indicators. So that's a very valuable part of our program.

And then there is other related research that ties back into performance indicators indirectly -- the modeling of management is one example.

(Slide.)

I would briefly touch on the resources that we apply to the Performance Indicator Program. We have budgeted and are spending about four FTE staff in this office, and Research is budgeted, and I understand spending, about .5 and .75 respectively for the two areas that they're supporting us in. The contractor support -- we are actually spending more in AEOD than budgeted. We have reprogrammed about $310,000 in order to accelerate the maintenance performance indicator development.

COMMISSIONER ROBERTS: Then you have to ask the obvious question: where did that come from? What was cut?

MR. JORDAN: We cut some of our other data programs and shifted the emphasis to these maintenance indicators. So that was my decision, and I believe it's in the best interest of the office and the agency.

COMMISSIONER ROBERTS: Good. That was not a criticism; I'm just curious.

MR. JORDAN: Yes, sir.

In order to accelerate the maintenance program itself, we have some two years of data that were collected by one of the regional offices for 20 plants, and so we're having a contractor analyze that data to help support extracting from that more maintenance PI information.

COMMISSIONER BERNTHAL: Let me see if -- I was distracted here for a minute, but I understand that -- let me put it this way. Do I understand that the maintenance performance indicators are based on your tagging the other performance indicators, this cause tagging that we talked about, or -- ?

MR. JORDAN: The cause code is one way of getting at maintenance, and we will talk in some detail about that, but that's not a leading indicator; it's really a trailing one. So we're looking further at indicators that may be leading in terms of maintenance backlog, those kinds of data. Today you may see the backlog building but the plant is still running okay, and in a year and a half, or in some later timeframe, because of this building maintenance backlog, the lack of maintenance, then you would see a deterioration in the way the plant is running. We'd like to have that leading indicator assisting us.

COMMISSIONER BERNTHAL: So you're trying to develop certain indicators. Well, let's take that example, though. In the absence of a regulation, a requirement of some kind, or even a Reg Guide at this point, how do you know that there is any uniformity at all -- in fact, we already know there is no uniformity --

COMMISSIONER ROBERTS: You know that answer on the front end; there is none.

MR. JORDAN: There is trend information on an individual plant that is useful, and I don't believe that data will be comparable -- certain of those data will not be comparable between plants; it would be simply trending that particular plant, and you do not have requirements for utilities to maintain data in that fashion.

And if you recall from our previous presentation to you, we had proposed internally the maintenance backlog, and the industry comments back on that and interactions with industry were that it was subject to manipulation, it was not comparable, and so it didn't fit some of our selection bases. And so we deferred selection. INPO has some indicators similar to that which they are tracking, and we're reviewing those with them, as well as data that we've collected from selected plants in those specific plants. So we have not given up.

COMMISSIONER BERNTHAL: Well, it seems to me that in principle that sort of thing is useful, but I've got to say that if you expect it really to be useful, comparing the utility with itself based on however it may choose, evolving with time, to categorize outstanding maintenance items, which box to put them in -- you know, they tend to have different categories, as we all know -- without some sort of regulatory requirement, and now you're talking about getting into a maintenance rule, which certainly would happen, if it ever does, long after I'm gone from this place, it seems to me that that is the only way that you're going to get some uniformity. There would have to be a uniformity of requirement there which not only would ensure that a single utility maintains a consistent system, but that across utilities then there is the basis for comparison in maintenance performance.

Right now, we don't have any of that, and I think while the idea isn't a bad one, I'm skeptical that the data can't be cooked very easily.

MR. JORDAN: And that's why we have to be very careful in selecting, and if we put new requirements on, assuring that the new requirement has a beneficial effect and not an adverse effect. So it's a very real concern.

Slide No. 6, please.

(Slide.)

I was just going to briefly touch on the data presentation that we presently have. We feel that the presentation is very important. We, as you know, started out with simple tabular data, and it was impossible to use for 100 plants. We have now condensed it down to two pages per plant that we issue quarterly. This is one box out of six that would be on the lefthand page in a package. And so from that one panel -- for instance, this is Safety System Actuations -- you can see the actual number of actuations for that particular parameter. You can see the critical hours of the plant, so if the operational time has some apparent effect during a period, or for instance there is a long shutdown, you can see that very easily.

The dark bar is a four-quarter moving average for this particular parameter, and then the light bar down low is the older plant average. So on that one chart you can easily compare this particular plant, with that parameter, with itself and with the other plants, and see the actual data. So I think the staff did an outstanding job in assembling in one picture a very large amount of data.
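The chart Mr. Jordan describes can be sketched numerically: a four-quarter moving average for one plant's indicator counts shown alongside a peer-group ("older plant") average for the same quarters. This is a minimal illustration; the function name and all of the counts are invented for the example, not NRC definitions or data.

```python
# Minimal sketch of the quarterly trending described in the transcript.
# All names and numbers here are hypothetical.

def four_quarter_moving_average(counts):
    """Average of the most recent (up to) four quarters at each point."""
    averages = []
    for i in range(len(counts)):
        window = counts[max(0, i - 3): i + 1]
        averages.append(sum(window) / len(window))
    return averages

# Safety System Actuations per quarter for one made-up plant.
plant_counts = [4, 3, 5, 2, 1, 2]
moving_avg = four_quarter_moving_average(plant_counts)

# Peer average: mean of the same quarter across a made-up peer group.
peer_group = [[3, 3, 4, 4, 3, 3], [5, 4, 4, 3, 3, 2]]
peer_avg = [sum(q) / len(q) for q in zip(*peer_group)]

for qtr, (avg, peer) in enumerate(zip(moving_avg, peer_avg), start=1):
    print(f"Q{qtr}: 4-qtr avg {avg:.2f} vs peer {peer:.2f}")
```

On one panel, the dark bar would be `moving_avg` and the light bar `peer_avg`, so a single chart compares the plant with itself over time and with its peers.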

Could I have Slide No. 7, please.

(Slide.)

I confess that what I call the finger charts, this slide and the next one, have been my favorites, but I'm beginning to look more at the more detailed chart. These are easier to skim through, and especially when there is a lot of coherence between the charts. For six indicators, this is the trend chart which shows a trend from the previous four quarters to the last two quarters. So this is, in terms of standard deviations, the actual delta change that has occurred in each of those parameters. So from this -- and this is not a real plant -- from this sample, this would show a plant that is improving in every characteristic, and it would be easy to pass that one up in terms of trends for concern.

Could I have Slide No. 8, please.

(Slide.)

This chart is a similar presentation, but it's comparing this particular plant with the mean of all other plants of that same generation, so these are the older plant means that we're comparing to. So this particular plant, although improving, would be seen to be below average performance. And in our normal sense, this would not be a basis for having the performance indicators cause us a concern that would need further attention. However, in the management reviews, the results of inspections, the SALP scores and interactions with NRR would be a basis for perhaps considering further issues with this particular plant. The performance indicators -- it's below average but improving measurably.

At this point I'd like to turn to Jack Rosenthal and ask him to pick up on describing the maintenance performance indicator development and where we stand.
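The "finger chart" statistic described above -- the shift from the previous four quarters to the last two, expressed in standard deviations -- can be sketched as follows. The transcript does not give the NRC's exact formula, so this is one plausible reading, with invented data; a negative delta on a count-type indicator means the indicator is falling, i.e. the plant is improving.

```python
# Hypothetical reading of the trend-chart statistic: change from the
# prior four quarters to the last two, in units of the prior four
# quarters' (sample) standard deviation. Not the official definition.
from statistics import mean, stdev

def trend_delta(quarterly_values):
    """Shift of the last two quarters relative to the prior four."""
    prior, recent = quarterly_values[-6:-2], quarterly_values[-2:]
    spread = stdev(prior)
    if spread == 0:
        # A perfectly flat history: report no trend rather than divide by zero.
        return 0.0
    return (mean(recent) - mean(prior)) / spread

# Six quarters of a made-up indicator count.
values = [6, 5, 7, 6, 3, 2]
print(f"trend: {trend_delta(values):+.2f} standard deviations")
```

Repeating this for six indicators gives the six "fingers" on one chart; the second chart would replace the plant's own prior quarters with the peer-group means.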

MR. ROSENTHAL: We had the opportunity, some of us, to meet with your tech assistants yesterday, and I am to discuss maintenance and cause codes, and we think that maintenance should come first because of our understanding of your interest. So I'd really like to go to Slide No. 17 in your packet, and then, given time, I'll come back to cause factors.

CHAIRMAN ZECH: Fine, go right ahead.

(Slide.)

MR. ROSENTHAL: Let me start out by saying that the staff is mindful of the guidance that the Commissioners have prepared over the year with respect to developing maintenance programmatic indicators. For example, Commissioner Rogers' guidance of 1/21. And we're working very hard to be responsive.

The program has been a top-down program where the first thing you did was to implement events that take place in a plant. The next thing you'd like to do is say, well, what's causing those events; how effective are the programs that you have in place at maintaining the readiness of the plant? And I think we're ready to go on effectiveness.

With measures of the effectiveness of programs in place, it seems to me that the next place you go is into attempting to measure the program itself. If you can succeed in that, then you can even look at inputs to that program. And at least in my mind, this is more than a question of scholasticism in drawing these distinctions. Effective maintenance should result in things like a low forced outage rate due to equipment failures, and that's in the program already. That will be a measure of balance of plant.

And you'd like to know something about the operability of standby systems, of safety systems; are they ready to do their intended purpose. And our, what we've termed, safety system unavailability -- but we think a better term now might be safety system function trends -- where you look at those trains of equipment that you know are important, provides good measures of how effective the program is and measures the overall goal of that maintenance program. Are you achieving results.

Now, I can distinguish between that and attempting to measure the program. When I measure the program I look at things like overdue preventive maintenance, or a maintenance backlog, or a ratio of corrective to preventive maintenance, and depending on the hardware mix of the specific plant, the optimal ratio will be different.

When I measure the program I look at 22 things like overdue preventive maintenance or a maintenance 23 backlog or a ratio of corrective to preventive maintenance, and 24 depending on the hardware mix of the specific plant, the 25 optimal ratio will be different.

18 1

On the input side of the coin you could look at 2

corporate goals, plant management direction, you can look at 3

money that's being spent on the program.

The program over the 4

last two years has gone this top-down approach.

Could I have 5

the next slide, please, Slide No. 18.

6

[ Slide.]

7 We were very interested in maintenance rework because 8

of its -- just from an engineering stand -- its face validity.

9 You go in, you maintain a piece of equipment.

Especially for 10 safety systems which you perceive as being very reliable, at 11 the next quarterly surveillance or the next time it's demanded 12 to work you surely expect it to be operable.

And so if you 13 find evidence of cases where you have to rework an item -- it's 14 a steam-driven aux feedwater pump, and then you go in at the 15 next surveillance and if you have to work on it again, that's 16 an indication of inadequate maintenance.

17 We attempted, using our automated systems, 18 information that's reported to us, to come up with a measure of 19 maintenance rework.

We were unsuccessful.

In our systems as 20 we concurrently access and manipulate them, it just isn't there 21 to support that item.

22 We tried a surrogate, and that is to say well, let's 23 look at the trend level.

They go in and work on --

24 CHAIRMAN ZECH:

Excuse me, we're going to turn the 25 air conditioning on and see what happens.

It does make a

19 1

little noise but it's getting a little warm, so please bear 2

with us until we experiment a little bit.

If it's difficult to 3

hear with the air conditioning on I hope somebody in the back 4

row or the front row will raise your hand and we'll do our best 5

to accommodate to our new facility.

6 Please proceed.

7 MR. ROSENTHAL:

It was attempted to look at times 8

that individual systems worked Vow ideally in the rework, you 9

would say okay, some bearing had to be replaced and then you'd 10 go back and say, my gosh, three months later you had to replace 11 that same bearing.

Well, what was wrong with your maintenance 12 program that you didn't do it right the first time?

13 CHAIRMAN ZECH:

Excuse me, let me interrupt you for a 14 second.

Before we go any further, let me ask Mr. Springer, Mr.

15 Chilk, let's see what we can do to quiet this down a little 16

bit, 17 MR. CHILK:

Have already asked, Mr. Chairman.

18 CHAIRMAN ZECH:

It's -- I think most people can hear 19 but it is distracting, and it seems to me that the engineers 20 we're dealing with perhaps can do a little better than that.

I 21 feel sure they can but please follow through.

22 COMMISSIONER BERNTHAL:

Don't pay their bill if you 23 haven't yet.

24 (Laughter.)

25 CHAIRMAN ZECH:

Let's just keep it going if it's not

20 1

too distracting.

If'it is, please raise your hand and we'll 2

turn it off again.

3 COMMISSIONER ROBERTS:

I'd rather be distracted than 4

hot.

5 CHAIRMAN ZECH:

We'll make a big judgment if we have 6

to, but please proceed.

7 MR. ROSENTRAL:

The concept there is that one time 8

you may have gone in and worked on a pump; another time you may 9

have gone in and worked on a valve within a train.

But 10 somehow, you surely expect that when you re-declare a train 11 operable, that it's all running.

So if you have to go back in 12 and look at that train or you find it deficient at the next 13 surveillance or the next demand, it's an indication of 14 inadequate maintenance.

It's not as good as the maintenance 15 rework which would get down on the pump or the valve level.

16 That shows more potential for us to be able to i

17 extract it from our automated systems, and we're now attempting 18 to do that.

19 We looked at items out of service -- that's just a 20 gross count of what is out of service at that plant, and you 21 can trend it at that plant.

And of course it will be different 22 at every plant.

23 We tried NPRDS and it's not working out.

The next 24 place to go would be some direct plant, once a month type count 25 of what's out of service.

21 1

That's what we did, and we're running out of data 2

that is currently reported to us that would allow us to come up 3

with maintenance indicators.

4 CHAIRMAN ZECH:

But you are going to come up with a i

f 5

maintenance indicator.

I 6

MR. ROSENTHAL:

One way or the other, yes, sir.

7 CHAIRMAN ZECH:

You know, as I recall, it was the 8

sense of the Commission that we wanted one and we asked you to 9

give us one and you haven't so far, but I'm pleased to hear 10 that you're still working on it and you're going to come up 11 with one.

12 I appreciate, and I know my colleagues do, how 13 difficult it is to come up with a really good maintenance 14 indicator.

You've shown us that and pointed out the wide range 15 of things you may look at to determine a good maintenance 16 indicator.

But for that very reason I think it shows me and I 17 think it should show most of us the value of a maintenance 18 indicator.

So we want one.

Keep working on it.

19 MR. JORDAN:

Yes, sir.

20 MR. ROSENTHAL:

Let's move to the backup slide which 21 repeats some of the information on this slide.

22 (Slide.)

23 This is my best shot at where I think we'll come out 24 on maintenance indicators.

25 COMMISSIONER BERNTHAL:

Which slide is this now?

22 1

MR. ROSENTHAL:

It's a backup slide.

And I 2

specifically don't have them in any special order.

Equipment 3

out of service --

4 COMMISSIONER BERNTHAL:

Am I blind or is that out of 5

focus?

I guese my eyes are going.

6 CHAIRMAN ZECH:

It's jumping around a little bit.

7 MR. ROSENTHAL:

Why don't I read down the list.

8 Equipment out of service; maintenance work request backlog; 9

safety system rework -- we're still interested in that; an 10 overdue surveillance rate; control room alarms and instruments; 11 preventive maintenance overdue; preventive to total 12 maintenance; high priority total work orders -- high priority 13 work order requests to total work orders; the ratio of 14 preventive to corrective maintenance; mean time to return to 15 service.

Our best shot now.

Now, how are we going to get 16 there?

17 Somehow we have to validate, trial indicators against 18 data in plants, and that's been hard to get.

But we've managed 19 now to get data on a voluntary basis from a representative 20 sample set of plants, and that allows us to perform a 21 validation exercise.

And we've developed, with the cooperation 22 of several utilities some of which have excellent 23 recordkeeping, a cooperative program where they have agreed to 24 provide us data -- it will be selected plants, just a few --

25 where we can do in-depth studies of what's going on at those

23 7.

1 plants.

And it's been painful to work out these agreements, 2

but that's in place.

That now provides the basis to go 3

forward.

4 In parallel with that, we have an expanded indicator 5

program at a few plants.

Slide No. 19, please.

6 (Slide.]

7 We are working with the regions and with NRR and 8

attempting to get data at a few plants that would be more 9

detailed --

10 COMMISSIONER ROBERTS:

Excuse me, how many is a few?

11 MR. ROSENTHAL:

Four.

12 COMMISSIONER ROBERTS:

Thank you.

13 MR. JORDAN:

This'is generally data that those plants 14 were collecting for their own purposes.

15 COMMISSIONER ROBERTS:

How did you select the four?

16 MR. JORDAN:

These were plants that were in a restart 17 program and so they are under special agency scrutiny and 18 realized it and are developing their own indicators, as it 19 were.

20 MR. ROSENTHAL:

We visited those plants, proposed 21 what information we'd like to get, found out what information 22 they already collected and are customizing the programs to 23 minimize the burden on them.

And so we end up with this data 24 that they would be collecting anyway and they're sharing it 25 with us.

COMMISSIONER BERNTHAL: How many of these categories that you've listed in the backup slide -- how many of those has INPO focused on? In other words, has the industry itself taken any steps to voluntarily gather and collate that kind of information?

MR. ROSENTHAL: INPO has 11 or 12 major indicators, and then they have supplemental information. They have, one, a corrective maintenance backlog greater than three months old; two, the ratio of highest-priority maintenance work requests to total maintenance work requests completed; three, preventive maintenance items overdue; four, the ratio of preventive to total maintenance; maintenance overtime; and maintenance radiation exposure.

COMMISSIONER BERNTHAL: And the industry, pretty much every plant, is committed to gathering that kind of information?

MR. JORDAN: Yes.

MR. ROSENTHAL: Yes. And we could well adopt some or all of those indicators.

COMMISSIONER BERNTHAL: Why don't we just get them from INPO, on a voluntary basis, of course?

MR. JORDAN: We have, of course, been communicating with INPO. We have now recently received three years of back data for the four common indicators that we have, and so we're reviewing their data and our data for verification. We would like to somewhat independently select an indicator, decide that it in fact has validity, and where there is a sufficiently common definition, then accommodate with INPO and then adopt what --

COMMISSIONER BERNTHAL: Well, do you think INPO's indicators somehow are deficient, or that they have not picked the best ones? Why --

MR. JORDAN: We know that they've put a great deal of resource into it and they have, we think, a very fine program that is benefiting industry, but our purpose is regulatory and theirs is not focused at regulatory. So we want to make sure that the indicators we select are, in fact, associated with our regulatory tools; that they are for plant safety performance.

COMMISSIONER BERNTHAL: Well, I agree, but what you're saying then is you have not yet reached a judgment, or it sounds like you have reached a judgment, that the indicators that they have are not adequate or sufficient for our purposes. The only reason I'm pressing you on this is because the key word here is voluntary, unless and until there is a regulatory requirement. And if it's the same set of data and it's being sent to both places, it just seems to me that we could maybe cut through some junk here.

MR. JORDAN: That's correct. And our intent would be that if we adopt a common indicator, we would access the INPO data. But we're not simply going to embrace the INPO program because their data become our data; I don't think that would be an appropriate way to handle it.

COMMISSIONER BERNTHAL: But their data are the utilities' data, and the utilities' data are our data. Unless we have a regulatory requirement, and we don't.

MR. JORDAN: Right. And for the indicators that we have adopted thus far, there is a regulatory nexus connected to them. We obtain that data already and there is no problem. But where we are getting data that the utilities are not required by regulatory requirement to collect, then I think we have to be very certain that that is the data we want and that we're willing to, if necessary, issue a regulation to obtain it. We can very nicely study data that's being collected for another purpose, but once we decide that it is to be one of the performance indicators, then I believe we should be willing to make that regulatory judgment. And we have to be certain in those cases.

CHAIRMAN ZECH: Let's proceed.

MR. ROSENTHAL: I am essentially finished with that. Slide No. 10, please.

(Slide.)

We disaggregate information typically --

CHAIRMAN ZECH: You weren't finished. You meant you were finished on that subject?

MR. JORDAN: Yes, we were finished on the maintenance indicators.

CHAIRMAN ZECH: Okay, and now we're going to a new subject, fine.

COMMISSIONER BERNTHAL: Let me see if I can get a clear answer here on one point, though. It appears that you are convinced that the maintenance indicators that INPO already gathers and collates from the industry, which are being supplied voluntarily now by the industry and which are the same data that, at least in those categories, we would get voluntarily from the industry, our not being in a position to require it at this point -- you're convinced that there are other things that we need to ask for that are not currently included in that body of data? Or you haven't reached a judgment on that.

MR. JORDAN: I would say it differently. We believe that some of those might be excellent candidates for us to use, but we believe there are others that will be of greater benefit to us for our regulatory purposes.

COMMISSIONER BERNTHAL: For our purposes. Can you give me one or two examples of things that aren't there now, aren't going to be done, are not being done by INPO, perhaps are not even being collated routinely by the industry, that you really feel are good candidates for what we would need to look at?

MR. ROSENTHAL: I'd like to -- rework. Rework has an enormous amount of face validity.

COMMISSIONER BERNTHAL: Rework means you do the job once and it isn't done right?

MR. ROSENTHAL: You do it once; you think from your PRAs that you have .01 trains, which means you've got to have .001 components. That means that if you work on that and you fix it, fine. It broke, you fix it. You surely expect next time you demand it, or next time you surveil it, at whatever the surveillance interval is, that it will be operable. And if you have a plant that has to go in -- and we have the example with RCIC or steam-driven aux feedwater, which we know are important -- where they go in and, surveillance interval by surveillance interval, go in and fiddle with it and fiddle with it and fiddle with it.

COMMISSIONER BERNTHAL: Right. Sort of like my car. But rework is one. Give me another example.

MR. SINGH: Another example is the number and duration of items out of service.

COMMISSIONER BERNTHAL: And INPO does not collect those data right now.

MR. SINGH: No.

COMMISSIONER BERNTHAL: Okay, thank you.

CHAIRMAN ZECH: Proceed, please.

MR. ROSENTHAL: Cause codes, Slide No. 10.

(Slide.)

The NRC has in the past, and continues to, disaggregate LER data, and we find it useful. Again and again, whether it's a commissioner site visit or input to NRR in a program or looking at PI data, you have relatively little data; it is seemingly disparate events -- there are two trips and one safety system failure -- and you say, what does it all mean? And there's this need for sewing a common thread between those events. We think that associating those events with broad programmatic areas provides some useful information, and that's why we want to do it.

We believe that by disaggregating information that's reported to us into these broad programmatic areas, we can identify areas of weakness. Surely it's softer than how many trips did you get or what was your forced outage rate, but we think it would be useful.

And we think we can monitor the effectiveness of corrective actions, and I'm very interested in the case where the corrective action and the cause don't match -- where you see a hardware deficiency and it's fixed by additional training of the operator or a procedural change. And it's not one, but when you see many lining up, then there's reason to question.

We think that disaggregating the causes will be predictive. Right now about half the LER's we receive end up in the current PI program as some data point, and about half don't. There's been work over the years to say that the little problems that you may find -- the event that wasn't safety-significant but reveals a programmatic deficiency -- will come back to haunt you in some bigger event. So by using all the information that's available to us, we think we can get a leg up in terms of deficiencies in programs.

You have to use this stuff in conjunction with everything else you have; you can't make any judgments from these unto themselves. And in fact the maintenance indicators you'd use in conjunction with all the other information. It gives us an independent view of programmatic effectiveness.

I want to talk a little bit about quality control.

COMMISSIONER ROGERS: Before you leave that subject, the specificity of some of these categories is really very different from one to the other, and your Category 4, design, installation, fabrication -- that is such a broad category. Do you really feel that has value? It's so enormously broad; what do you do when you see something there? It involves a number of different kinds of activities.

MR. ROSENTHAL: You have established plants that are 10, 15, 20 years old, and we still read about problems in these areas. But you expect in a mature plant to have relatively few of this nature. After all, they were built, they were tested, they'll run. You aggregate into this category; you ideally will have very few things falling in there. And when you do, it causes you to then go look at the details of what went on. So it's simply a trigger --

COMMISSIONER ROGERS: You expect that to be rather infrequent then.

MR. ROSENTHAL: Hopefully. And then -- but you don't use that category alone. All you do is you say, okay, that's a trigger. Now, because that stuff is important, I've got to understand each event.

COMMISSIONER ROGERS: It's just that the way you ultimately want to display this -- and you will be saying something about that in these pie charts -- that's going to be buried; it's going to be hard to see any activity in that and changes in it.

MR. JORDAN: That's why we would be simply looking for a plant with an unusually large wedge in that area, and then backing up, for instance, to the corrective action. If they have a large wedge there in corrective action, it's very frequently design modification or equipment replacement. One can learn more from that.

COMMISSIONER ROGERS: From what I'm hearing it sounds to me like you're going to see -- that's going to be almost a little straight line in that pie chart --

MR. JORDAN: That's what we anticipate.

COMMISSIONER ROGERS: -- of zero area, almost always.

MR. JORDAN: It's an important element. If we look at the logic chart in terms of what contributes to risk, if the plant has a number of design issues, then this might be a way of getting an early indication of it.

MR. ROSENTHAL: My branch reviews 3500 LER's a year, and I think last year, 1987, we had roughly about 100 that we would put into that category. But we tend to classify those as significant.

Let me get to the QA and then I'll get back to your other question. One, you have to read those LER's by knowledgeable engineers; you can't rely on boxes checked. Two, and this is true for the entire PI program, we collect data and we send that data out to the regions and we send it out to NRR, and it's a two-way cycle that we go through, and then we publish the data. So that LER in which we've associated, let's say, a maintenance deficiency can have the imprint of the regions, who know far better what's going on at that specific plant. So that's one way of upping the QA.

The next thing is that we have envisioned that Synet will ultimately come into being, and that will allow us to correlate other regional products, such as inspection reports, with LER's, and that also will increase the quality.

And then the last thing is, of course, you'll have to be mindful that you don't ascribe to this more than is justifiable.

COMMISSIONER BERNTHAL: I want to make a comment as well. Commissioner Rogers picked up on an important point, I think. I noticed that there also are -- there is a diversity, to be sure, but at least what appears to be two or three rather different and important categories, as you well know I guess, in these cause codes, and the one that interests me the most, frankly, or the several that interest me most, are the ones that clearly relate to management. We've talked about management in many respects around here, the difficulty of getting any kind of management performance indicator. I've seen individual utilities, frankly, attempt to do that with some greater and lesser degrees of success. We aren't very good at it, period, around here, probably never will be, if anybody is very good at doing predictive management assessment. There's no evidence yet that we are.

But there just may be something in this; things like licensed operator error, other personnel error, administrative control problem. And you might even go out to Clinton, because I know that's one plant -- you've probably already done this -- where they have attempted, at the initiative of the plant manager there, to set up a kind of management system like that.

I'm not suggesting another great new realm that we delve into here, but if simply by the cause code approach, which I think in principle can work fairly well, you can get that done, that kind of information, that would be good.

MR. ROSENTHAL: The admin control is particularly of interest to us in the kind of events we see. I have to say something about random equipment failure and maintenance problems: often you see people ascribe equipment failure as the cause of 50 percent of the problems, and we want to distinguish between what we call random equipment failure and a maintenance problem.

A lightning strike I'll say is random. An alpha particle ripping through a chip and causing you to drop a bit I'll call a random failure. But a clogged fuel strainer on a diesel line in my mind is a maintenance deficiency and not a random failure. A rotating bearing that blows -- you ought to find it before it catastrophically fails. And so we'll ascribe those to maintenance rather than to equipment failure, and we're going to be very critical in assessing what's going on there.

MR. JORDAN: That's the reason that the definitions that we lay out for these are very important, and that we have trained reviewers who do it in a consistent way continually thereafter.

CHAIRMAN ZECH: Let's proceed.

(Slide.)

MR. ROSENTHAL: Slide No. 12 shows corrective actions, and we intend to monitor those, and I'd like, if there are no questions, to move on to Slide No. 13, please.

(Slide.)

In Volume II of our report now we list each event that has taken place and give a one-line description of that event. Let's say it's a safety system failure -- well, what system was it, when did it fail, how did you find out, was it an LER, a 50.72? And for people on, let's say, my level, I find that Volume II more interesting than some of the top-level graphics.

We've had people from the region say, you know, I knew about each of those events, but I never realized that there were six events in the last eight months all involving, let's say, RCIC. That was a specific case. And putting it on a piece of paper where you can read each event -- RCIC, RCIC, RCIC -- then you start getting a pattern. And that Volume II will have the corrective action. We're not proposing a graphic display at this time. And you'll look and you want to see how many times are they going for procedures, and how many times is the cause described as operator personnel. So that's the backup for the graphics, and it gives us an auditability of the system, which I think is very important.

Anyone reading that, be it a licensee or somebody in the region, can see how the top-level graphics were developed from the individual events. It's all out in the open. That's the way I think we ought to go. Next slide, please.

COMMISSIONER ROGERS: Excuse me, just on that, you're going to have some kind -- you've got a review process. This cause now, where does that come from? Is that the one that the licensee -- or is that the one that's at the end of your process?

MR. ROSENTHAL: We have done a lot, so I randomly picked an LER and I read it for myself, and this is a real LER, and I said, could you really say what's going on here? Okay? And if you read that LER -- it's a real one -- it really is --

COMMISSIONER ROGERS: But you're developing a data base here. These sheets will be a data base for the future.

MR. ROSENTHAL: Right.

COMMISSIONER ROGERS: Now, what is the status of that cause analysis? Where did that come from? In the future, as you accumulate this.

MR. ROSENTHAL: From an engineer reading that LER, from a QA cycle that involves the region and NRR prior to publication, and ultimately enhanced by advance data. So that's the --

COMMISSIONER ROGERS: That's the terminal assessment then of this cause; it's not a preliminary, it's not one that just comes from the licensee --

MR. JORDAN: No, that's right.

COMMISSIONER ROGERS: -- when the LER is filed. It's a review after that. And it could be different from what the licensee --

MR. ROSENTHAL: It will often be different.

MR. JORDAN: And presently, we have cause codes in the Oak Ridge data system, and so we're issuing new instructions on what the causes are and what the rules are for ascribing the causes.

COMMISSIONER ROGERS: Okay, fine.

(Slide.)

MR. ROSENTHAL: We can prepare bar charts that are not fancy but that are consistent with the other information in the program. Now, you couldn't use pie charts -- Slide No. 15, please.

(Slide.)

-- without Slide No. 14, because if you're talking about relatively few data -- you know, if you have a lot of data, then the pie charts look wonderful. But if you have relatively little data, you'll see swings in the pie chart that really don't have anything backing them up. So you have to use the package as a whole. I like the pie charts because they show a real good graphic snapshot of what's the relative mix of this plant, and how is it changing. Slide No. 16, please.

(Slide.)

And you could also look at the plant and compare it to some sort of industry average.

Now, the relative proportions here are going to be different than you're used to seeing from, let's say, EPRI-type splitting, where you get half equipment and half personnel, because of our driving that maintenance versus random equipment failure. So it would be different from, let's say, the EPRI program.

And with so much information, it sums to 100 percent, so what you're looking for here is: does that plant profile look significantly different than the industry average profile? It's something to light you off, and then I go look and read that Volume II stuff, and that's when the real work starts.

With that, I'd like to turn the presentation over to Carl Johnson.
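[Editor's illustration: the profile comparison described here, a plant's cause-code mix against an industry average, can be sketched as a simple tally. The category names and event counts below are hypothetical and for illustration only, not the staff's actual categories or data.]

```python
from collections import Counter

CATEGORIES = [
    "licensed operator error", "other personnel error",
    "maintenance problem", "random equipment failure",
    "administrative control problem", "design/installation/fabrication",
]

def cause_profile(cause_codes):
    """Turn a list of per-event cause codes into the fractional mix
    that a pie chart would display (the values sum to 1.0)."""
    counts = Counter(cause_codes)
    total = sum(counts.values())
    return {c: counts.get(c, 0) / total for c in CATEGORIES}

def largest_divergence(plant, industry):
    """The category where the plant differs most from the industry
    average -- the unusually large wedge worth reading the Volume II
    event descriptions about."""
    return max(CATEGORIES, key=lambda c: abs(plant[c] - industry[c]))

# Hypothetical quarter: a plant heavy in maintenance-coded events.
plant = cause_profile(["maintenance problem"] * 6 +
                      ["other personnel error"] * 2)
industry = cause_profile(["maintenance problem"] * 3 +
                         ["random equipment failure"] * 3 +
                         ["other personnel error"] * 2)
```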

MR. JOHNSON: Thank you, Jack. Slide No. 20, please.

(Slide.)

As you recall, the logic model we used for performance indicators is based on risk considerations. For example, safe plant operation implies a low frequency of transients and a high availability of safety systems, and a high availability of safety systems in turn implies a high availability of individual trains and a low potential for common cause failures due, for example, to programmatic problems of maintenance, training and so forth.

The existing indicator for unavailability of safety systems is safety system failures as reported in licensee event reports. During the past year, we've developed an improved indicator for safety system availability based on research done by Brookhaven and SAIC, and in ongoing discussions in the staff in the few days since we just sent you this paper, we've made two changes in this indicator for unavailability. One is we changed the name to safety system function trends in order to emphasize that the value of this indicator is for trending unavailability, not in the absolute value itself. And secondly, instead of aggregating the data to an overall indicator at a plant level, we believe it is more accurate to aggregate the data to individual system levels for some important safety systems and look at the trends of the important safety systems.

CHAIRMAN ZECH: Is the goal here for predictive purposes, predictive validity and looking into the future? Is that what you're trying to do here?

MR. JOHNSON: This is an outcome indicator; this is what has the plant been doing in the recent past, and looking at trends so that we can use the trends to say, if that trend continues, is it getting better or worse?

CHAIRMAN ZECH: Okay, fine.

MR. JOHNSON: May we have the next slide, please.

(Slide.)

I'd like to discuss the basis for this indicator, what it means, the input data and possible sources for these data, and our plans for future development.

The existing indicator for safety system unavailability is the safety system failures reported in LER's. This uses input data at the system level; in other words, complete loss of safety system function -- these are events that don't happen very often, so they're difficult to trend. The new indicator uses input data at the train level and aggregates it to an estimate of system availability, so that we can see the trends a lot faster.

In order to evaluate the practicality of this indicator, we collected data from a BWR and a two-unit PWR and analyzed them to see, is it practical. We're also in the process of evaluating some data from another two-unit PWR. And to evaluate how well the indicator responds to changes in plant performance, we used computer simulation. This is illustrated on the next chart.

(Slide.)

Because this indicator uses train-level data, this new indicator responds a lot faster than the existing indicator, and thus it's more predictive of what's happening. This is a computer simulation of -- the trend line going up is a computer simulation of unavailability of a typical two-train system where the train failure rate is input to the computer as having doubled at time equals zero. And you can see that within four to six quarters you have a distinct trend of degrading availability of this system.

With the existing indicator, we go for two or three years before we get the first data point -- namely, a safety system failure -- and of course it would be longer before you could distinguish a trend.
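[Editor's illustration: the staged simulation just described can be sketched as follows. This is a minimal sketch only, not the staff's actual model; the failure rate, repair time, and quarter length are hypothetical numbers chosen for the example.]

```python
import random

HOURS_PER_QUARTER = 2190          # roughly one calendar quarter of hours
MEAN_REPAIR_HOURS = 24.0          # hypothetical out-of-service time per failure
BASE_FAILURE_RATE = 1.0 / 8760    # hypothetical failures per train-hour

def simulate_quarter(failure_rate, trains=2):
    """Return (train failures, hours out of service) for one quarter."""
    failures = sum(
        1 for _ in range(trains)
        if random.random() < failure_rate * HOURS_PER_QUARTER
    )
    return failures, failures * MEAN_REPAIR_HOURS

def unavailability(hours_out, trains=2):
    """Fraction of train-hours the system's trains were out of service."""
    return hours_out / (trains * HOURS_PER_QUARTER)

random.seed(1988)
trend = []
for quarter in range(-8, 8):
    # The train failure rate doubles at "time equals zero," as in the briefing.
    rate = BASE_FAILURE_RATE * (2 if quarter >= 0 else 1)
    failures, hours_out = simulate_quarter(rate)
    trend.append(unavailability(hours_out))

# A rolling average of the sparse quarterly estimates is what would make
# the step change visible within a few quarters, as the chart shows.
rolling = [sum(trend[i - 3:i + 1]) / 4 for i in range(3, len(trend))]
```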

CHAIRMAN ZECH: Do you have any real data -- this is computer data.

MR. JOHNSON: Yes, we have real data from these three units that I mentioned, but they were used to try out the practicality of using this indicator. Do we find some things when we try to take real data and use it; is there something strange there that we didn't understand?

CHAIRMAN ZECH: Does the real data correlate to your computer simulation?

MR. JOHNSON: Yes, in that it does work and does the same thing, but we were not able to take real data and say -- let's take data from a plant that we believe is going to have an accident and then say, could we prove it. So we broke it into two pieces, saying use the real data to test out the practicality of using this kind of an indicator, and use the computer simulation to say what it could show us.

CHAIRMAN ZECH: Here's what I mean. Did you use real data, or could you use real data that you've had on plants we know we've had problems with in the past few years -- take that data and trend it like you did and put it in your computer simulation? Would the problems we've had in the past on certain plants -- real data -- would that correlate, and would it have shown trends that would be significant?

MR. JOHNSON: We did this initially when we started the job, to try it out with Davis-Besse, looking at unavailability of feedwater, and over the several-year period between when it first started commercial operation and when the loss of feedwater accident occurred some time ago, there were several occasions when that unavailability of the auxiliary feedwater system would have raised a flag in people's minds about the performance of that plant.

CHAIRMAN ZECH: Did you put it on a chart like this?

MR. JOHNSON: We did not put it on a chart like this to see what we do.

CHAIRMAN ZECH: Why don't you do that? What would it look like?

MR. JOHNSON: I don't know, sir.

CHAIRMAN ZECH: Why don't we do it?

MR. JOHNSON: All right, we'll do it.

CHAIRMAN ZECH: Okay.

MR. JOHNSON: The next chart, please.

(Slide.)

The next chart shows the criteria we use to select the safety systems that we want to monitor. We selected seven important safety systems for monitoring trends in unavailability; six of these account for approximately 90 percent of the increase in core melt likelihood due to taking an individual train out of service. Then we also added a containment function, containment spray. So the scope of this indicator is the important safety systems that prevent core melt and an important containment system.

(Slide.)

This slide shows what this indicator means. The new indicator monitors trends in unavailability, and this is then directly related to trends in safety performance. Also, this indicator directly relates to maintenance effectiveness, both overall maintenance and preventive maintenance, in the following way. We define maintenance as the aggregate of functions aimed at preserving and restoring safety, reliability and availability. So therefore, a degrading trend in availability of some important safety systems is one indicator of ineffective maintenance.

Also, we can analyze the same input data in a different way and estimate the mean time between failures of these same important safety systems. Now, a preventive maintenance program is intended to prevent failures, so a trend in mean time between failures of these safety systems is indicative of a trend in preventive maintenance effectiveness.

Now, usually these two parameters will trend the same way: availability and mean time between failures will either be improving or degrading together. But if these trends diverge, that tells you some different information; additional, supplemental information. For example, if the availability is degrading but the mean time between failures is remaining the same, then that could indicate an increase in trains out of service for maintenance -- a change in the balance of preventive maintenance in the program, perhaps not helping safety.

This indicator, then, safety system function trends, provides information on trends in safety system availability and maintenance effectiveness.
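[Editor's illustration: the two estimates just described, system unavailability and mean time between failures from the same train-level inputs, can be sketched roughly as below. The quarter length and the sample numbers are hypothetical, and the formulas are a simplification, not necessarily the staff's actual method.]

```python
HOURS_PER_QUARTER = 2190  # roughly one calendar quarter of hours

def system_unavailability(hours_out_per_train):
    """Quarterly unavailability estimate: the fraction of train-hours
    that the system's trains were out of service."""
    trains = len(hours_out_per_train)
    return sum(hours_out_per_train) / (trains * HOURS_PER_QUARTER)

def mean_time_between_failures(total_failures, trains):
    """Quarterly MTBF estimate in hours of exposure per failure;
    None when no failures were recorded in the quarter."""
    if total_failures == 0:
        return None
    return trains * HOURS_PER_QUARTER / total_failures

# Hypothetical quarter for a two-train system: one train failure,
# 60 hours out of service on one train and 12 on the other.
u = system_unavailability([60.0, 12.0])
mtbf = mean_time_between_failures(1, trains=2)
```

Trending both series quarter by quarter, as described, is what would flag the divergent case: unavailability rising while mean time between failures holds steady.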

Next slide, please.

(Slide.)

The input data are simple. For each of these seven selected safety systems, we need two pieces of data. One is the number of train failures that occurred in each of the trains of these seven safety systems during the quarter, and the other is the hours that each of these trains was taken out of service, for maintenance or whatever reason.

We also considered a third category of data, but did not recommend it. This third category would include event-type information on each of the train failures, particularly the time of the train failure and the causes. This would make the trends more predictive, because if you knew the causes -- causes tend to persist -- you could have more confidence that the trend would persist, or, if you thought the cause had been corrected, you could adjust your prediction of the trend.

However, this would increase the burden of reporting, and we felt, instead of asking everybody to report that information, rather we would go collect that information on an "as needed" basis. The Regions -- if there were some cause for concern about a trend, then in that case we could find out this additional information, rather than asking everybody to provide that kind of information.

MR. JORDAN: I think this is a case where the cause codes will be complementary; that is, you are looking at a different set of data, but if the utility has a problem with one particular activity, you should be able to see it through the cause codes attached to all the LERs.

MR. JOHNSON: In addition to this information on train failures and hours out of service, we do need some one-time data on the design of the plant, namely, how many trains are in each of these systems and how many trains are required to perform the system function. Also we need the surveillance interval.

These data on train failures and hours out of service are not now reported to NRC, and the next chart shows two options for obtaining these data.

(Slide.)

These two options are rulemaking and voluntary data collection. The staff is assessing rulemaking for obtaining these data and reviewing a similar indicator being developed by INPO, and discussions with INPO are continuing.

(Slide.)

In summary, we have developed a risk-based indicator that will monitor trends in unavailability, and these trends are directly related to trends in plant safety performance and plant maintenance effectiveness.

Our plans for future development of risk-based performance indicators are to improve the indicator for frequency of transients and to improve the treatment for evaluating common cause events. These methods are being developed in 1988 and 1989, and the results will further improve our capability to monitor trends in plant safety performance.

MR. JORDAN: In summary, we are continuing with the development of performance indicators. We have two that we would like to implement on a trial basis. We are proceeding with developing the rulemaking in parallel with our further coordination with INPO; one of those is a success path, and we will come back to the Commission with a recommendation of which path.

With the cause codes, we would like to proceed with trial implementation there. That data is already in our possession. We are instructing our personnel who extract this information in a new procedure, upon your approval, and then the trial implementation would consist of providing that data to the Office Directors for their trial use in the evaluations that are done semi-annually.

47 1

We would not as yet publish it as performance 2

indicators that are in final form and adopted for full 3

implementation.

4 I believe the staff is proceeding consciously but we 5

feel we are responsive to the Commission's direction.

We feel 6

the maintenance area is one in which we have a very good 7

understanding of maintenance effectiveness.

We can give you 8

views about plants and their maintenance effectiveness, but we 9

do not yet have good leader indicators that would give you 10 indications about the programs, the candidates, and we are 11 working hard on those candidates and we will continue to do so.

CHAIRMAN ZECH: All right.

MR. JORDAN: We do have some coordination with IAEA. Robbie Singh has participated in some development efforts with them. He was involved in OSART, which uses some of their indicators. We are well apprised of their activities and continue to have that full understanding and coordination.

CHAIRMAN ZECH: All right.

MR. JORDAN: We do have a program that is fully implemented. The data has been found to be useful by senior management. We feel that the industry performance trend data, which was a by-product, has been extremely valuable. We were very pleased that was a by-product. We plan to continue the development of the performance indicators in the two directions we have described.

CHAIRMAN ZECH: All right, are you concluded?

MR. JORDAN: Yes, sir.

CHAIRMAN ZECH: Thank you very much. Questions from my fellow Commissioners? Commissioner Roberts?

COMMISSIONER ROBERTS: No.

CHAIRMAN ZECH: Commissioner Bernthal?

COMMISSIONER BERNTHAL: No.

CHAIRMAN ZECH: Commissioner Rogers?

COMMISSIONER ROGERS: I think this whole question of validation is key to this. Everything that I heard sounds really very good, but it is a question in my mind of really what ultimate use it has for predicting the possibility of problems arising, and I think the computer model, of course, will give you a little comfort that you are sort of on the right track in some way, but I would really urge you to spend a significant effort to try to validate these approaches.

I think there is a lot at stake in doing that: to convince the volunteers that it would be useful to them to see what these outcomes are, so they will supply the data to us that is necessary to run this program on a voluntary basis, and also give us some confidence that we are not just doing something that sounds very good but really, when all is said and done, isn't that useful. I think if you can go back and validate any of these indicators, and also particularly the safety system function trends, that looks very, very exciting to me. It is a marvelous idea.

I would sure like to see some real data that says if we had generated this trend, we would have had an indicator early on, on something that we missed. That also brings to mind some of the latest things that we have heard about, that INPO turned up, that we weren't aware of. How about that situation? Can we look at any data there in the light of these safety system function trends, or any of your other performance indicators that you are suggesting we haven't been looking at, that would have indicated a problem?

MR. JORDAN: Yes, sir, I understand. We will look in that direction. The validation is very difficult. We have actually, I think, made some change in the process of validation. We were doing statistical correlations among the indicators and then making that correlation to SALP.

Commissioner Bernthal at our last meeting identified that we may have some greatly correlated indicators that may not be really measuring safety. We feel that the logic model is our preferred method: that we establish it does have face validity, it does fit into a safety model, and then actually do the trial program; the real data is the best way to assess it.

As you say, look at performance as it has evolved at a particular plant and see if we can understand the data.
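The statistical-correlation validation Jordan describes, correlating the indicators with one another and with SALP scores, can be pictured with the sketch below. The indicator names and numbers are invented for illustration; this is not the staff's actual method or data.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# One value per plant (illustrative only).
scrams = [3, 1, 4, 2, 6, 0]       # automatic scrams while critical
failures = [2, 1, 3, 2, 5, 1]     # safety system failures
salp = [2, 1, 2, 1, 3, 1]         # SALP-style score, lower taken as better here

print(pearson(scrams, failures))  # high correlation between indicators may just mean redundancy
print(pearson(scrams, salp))      # correlation to SALP was the validation check
```

As Commissioner Bernthal's objection suggests, two indicators can correlate strongly with each other without either one measuring safety, which is part of why the staff moved toward the logic-model approach.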

COMMISSIONER ROGERS: It looks to me like a very professional and well constructed approach to these problems. I really want to compliment you. I am very impressed with what I have heard. Again, let's go that next step of convincing us all that it is real.

MR. JORDAN: We have had excellent support from the Offices and the Regions. That is why the program is working well.

COMMISSIONER BERNTHAL: I want to second what Commissioner Rogers has said. I think it is a tribute to you, Ed, that you have developed a philosophy, for one thing; you are being very systematic in the way you are approaching it. I would hope you are also cautious and go step-wise as you move along in this entire area, because you have talked about a lot of things, a lot of different indicators, a lot of ways to gather, perhaps require eventually, reporting of information. That could easily get out of hand.

I think you want to proceed deliberately and make sure that what we are doing here doesn't fail to serve the regulatory needs of the agency and safety in the industry.

I was curious about one point. I think this is the one Ken was alluding to. What about the Hatch case? We have been at this performance indicator thing now for a little while at least. How come we didn't seem to pick up what INPO did?

MR. JORDAN: We did look at the performance indicators for the last quarter, and Hatch would not, from the performance indicators, have been brought to our attention, nor would we bring it to any other management's attention.

COMMISSIONER BERNTHAL: Is there anything that is in your proposed bag of tricks, so to speak, that would have caused Hatch to appear on the scope?

MR. JORDAN: We don't know yet. We will examine that data to see if any of those in fact would have.

COMMISSIONER BERNTHAL: I have one item that I would like the general counsel to comment upon. I understand the Memorandum of Agreement with INPO that had been proposed, that OGC had some comments on that or some concerns about it. Are you prepared to speak to that, Bill?

MR. PARLER: I'm prepared to speak to the fact that there were some concerns with the Memorandum, beyond those concerns, which certainly could be accommodated from OGC's standpoint but perhaps not from the staff's standpoint. The staff can speak to that. It was my understanding a policy question was raised as to whether there was a need for an additional Memorandum.

We already have a Memorandum of Understanding with INPO after a Court decision, where we had to have such a Memorandum of Understanding to comply with the Court decision, as published in our materials in the Federal Register as part of the policy.

The first question I have is why isn't that good enough. The basic issue that is involved in anything such as this is this: this agency is the regulatory agency that is responsible for the public health and safety of the American people. INPO provides a very useful role which this agency, in performing its regulatory mission, can use to complement its responsibility. Now, if one starts in official agreements carving up our regulatory responsibility, that is when this general counsel starts having legal problems. That is the basic issue. Beyond that, I guess we can discuss it later on.

COMMISSIONER BERNTHAL: Thank you. I appreciate it.

CHAIRMAN ZECH: Let me just say, Mr. Jordan, I think you in particular, with your leadership in this program, have done an excellent job. I would like to compliment your own people, too, both the AEOD staff and the research staff, as well as others who have contributed, the Regions, too, all those who have helped. I think you have made significant progress.

I would like to make several comments. First of all, you know, the whole purpose of this program is to see if we can't predict problems. We are trying to look into the future. That is very difficult, as we all know. One of the ways to look into the future is to look into the past: get some trend data that will perhaps indicate declining performance in time enough for us to take actions to prevent more serious incidents. That is what we are trying to do.

The whole performance indicator program, and I think it is important to keep that in mind, the purpose of the program is to try to help us do our job better by preventing incidents. I think you are getting there. This is a very difficult initiative, and it needs maturing, and it needs all the kind of thinking that you and your fine staff are putting into it.

I am very encouraged by what I have heard. Keep your eye on the ball. We are trying to design a program, not to just gather a lot of data for data purposes. We are trying to do something extremely difficult. We are trying to design a program where we really can extrapolate; the line gets dotted with uncertainty, but if we can design a good program with a certain degree of confidence, by quantifying as much as we can, perhaps we can have justification for a prediction of perhaps trouble ahead, and then we can do something about it to prevent that trouble from occurring. That is what we are trying to do. What I am saying is, in all the design you are doing, keep track of the forest as well as the trees.

Let me give you one example of what I heard this morning in your safety system function trends, which I think is excellent. It is an excellent logical process. If I understand it properly, previously you had one system that you put existing performance indicators into, for safety system failures and all that, in one indicator, if I understand it right.

MR. JORDAN: Correct.

CHAIRMAN ZECH: Now what you have done is taken several more, maybe six or seven different indicators, focused on specific systems that would perhaps result in a safety system failure. Is that correct?

MR. JORDAN: That is correct.

CHAIRMAN ZECH: You have gone from one to many. That is fine, in a way, I suppose. I think what you are telling me, the reason you did that, is because when you just had one, you kind of had a flat line; it didn't really show you very much. On your new system, you will perhaps get more signals. That is fine.

MR. JORDAN: We should get earlier signals.

CHAIRMAN ZECH: Earlier signals and perhaps more. All I would say is remember what we are trying to do, and perhaps in that system you are designing you could use not all seven systems, six or seven, but perhaps you could combine them and still come up with what you are trying to do. What we are really trying to do is make a system that senior management can use, and if you need to get down in that level, that is what we will do. If you don't need to get down there, and if you, with your very professional people, can figure out, keeping the forest and the trees in mind, a way to make those trends come together, so that you can with a fair degree of confidence and logic put them together, so you don't have to look at seven of them but you are looking at one or maybe two, it might be something for you to think about.

We do need a maintenance indicator. I know it is hard. We need one.

Just a word on the subjective versus objective things. As we have discussed before, you are trying to quantify it; you are trying to make as much objective data as we can get our hands around, and that is exactly what you should be doing. As we develop the program and as it matures, it is my experience we will have more confidence in moving from the objective to the subjective. For example, management. We don't know how to quantify management. If we think of this program enough and you get a number of different categories, for example, that you could talk about management and quantify it (for example, turnover rate, operator exams, supervision, even industrial accidents, various things that you can kind of relate to management), have half a dozen of those that your staff might work on, and then result in one perhaps or two categories of management assessment.

In other words, we don't want to just say management is a concern without really knowing what we are talking about. My view is that as you develop this and as your people think about it, perhaps you can show from the regulatory/safety standpoint, which is our mission, that there are certain things that would add up to something we ought to be concerned about for management. It is a very difficult task. Maybe almost too difficult.

Predictive validity, and the ability to be confident of the data you have and to use real data to proceed to come up with a system: that is extremely important.

Let me just say a little bit about the incidents we have had in the past that we were concerned about. If you can take those significant incidents we have had in the past few years (fortunately, we haven't had any recently, but we have had over the past few years Davis-Besse, Rancho Seco, the TVA situation, the Pilgrim problem, Peach Bottom), if you could go into those, now we know we had problems, do what I've heard termed in the past retrogressive analysis or retrospective analysis. What it really means is analysis by looking back and taking real data from those plants. Now we know what happened; now you can go back and see if, at the time we measured all that data, we would have been able to predict something. You can learn something from that. That is real data.

If you can look back and trend that, perhaps then it might help you design performance indicators; are there common performance indicators of those plants that would have triggered our attention, perhaps?

MR. JORDAN: Those did lead us to the existing set of indicators, so we did look at that early data.

CHAIRMAN ZECH: That is excellent. Very good. That gives us a fair degree of confidence that we are not just designing something without real significant data.

Those are the comments I wanted to make. I think you are right on track. It is a very important initiative. We are not going to finish it soon, but I feel good about it. You are making progress in a very difficult area that perhaps will help us do our job of safety and public health and safety. That is what we are trying to do: keep the plants operating safely and knowing more. We have all this data, as you have discovered, Ed, and your people, and now we are trying to bring it together and use it more intelligently and do the very difficult task of perhaps looking into the future and seeing where we can make a very reasoned and at least fairly confident judgment.

I am very pleased with what I have heard, and I commend you and your people for the efforts you have made to date. Keep up the good work. We hope to hear from you again in the future when you have developed the program.

MR. JORDAN: We will bring you more on maintenance.

CHAIRMAN ZECH: Good. Anything else from my fellow Commissioners?

[No response.]

CHAIRMAN ZECH: If not, thank you very much for an excellent briefing. We stand adjourned.

[Whereupon, at 11:32 a.m., the briefing was concluded.]

REPORTER'S CERTIFICATE

This is to certify that the attached events of a meeting of the U.S. Nuclear Regulatory Commission entitled:

TITLE OF MEETING: (handwritten entry, not legible)
PLACE OF MEETING: Washington, D.C.
DATE OF MEETING: (handwritten entry, not legible)

were held as herein appears, and that this is the original transcript thereof for the file of the Commission taken stenographically by me, thereafter reduced to typewriting by me or under the direction of the court reporting company, and that the transcript is a true and accurate record of the foregoing events.

Ann Riley & Associates, Ltd.

[Attachment: briefing viewgraphs follow. Title viewgraph rotated in the scan; text not legible.]

PI PROGRAM STATUS (viewgraph; flowchart)

PI Program Task Group (AEOD); development (RES/AEOD); trial implementation (RES/AEOD); implementation (AEOD), producing quarterly reports and input to senior management.

Development (RES/AEOD) covers: risk-based PIs (RES); program improvements (AEOD); programmatic PIs (RES); expanded PI program (AEOD); other related research (RES).

RESOURCES FOR THE PI PROGRAM (viewgraph)

                           STAFF FY88   CONTRACTOR FY88
AEOD                       4            $230K ($540K)
RES - risk-based PIs       0.5          $400K
RES - programmatic PIs     0.75         $350K

[FIGURE 1: Performance indicator plots for PLANT X, quarters 86-1 through 87-4; legend: 4-quarter moving average, critical hours. Panels: 1. Automatic Scrams While Critical; 2. Safety System Actuations; 3. Significant Events; 4. Safety System Failures; 5. Forced Outage Rate.]

[Chart: Safety System Actuations, older plant average, with 4-quarter moving average and critical hours, quarters 86-1 through 87-4.]
6

t PLANT Y: Trends Performance Indicators Declireo irrpeoved l AutomCte. Seroms While Criticol (2 Otr. Avg end 87-4) -

CAS

2. Sofety System Actuotons '? Otr. Avg end 87-4) f os llt 3 Sig6ficont Events (2 Otr. Avg end 87-4) -

o go Sofety System Fodures (2 Otr. Avg end 87-4).

A 4

e4 $

l S. Fe<ce Outoge Rcte (2 Ctr. Avg end 87-4) -

0 66

6. Equement Forced Outooes/1000 Cret Mrs E

(2 Cir. Avg end 87-4) -

0 57 8JU'

-2.5-$.0 -i5 -lo -O 5 0.0 0'S l'O t'5 2'O 2.5 Devictions from Previous 4 Otr. Piont W (Mecstsed in Stond&d Devstions) eens i

l l

l t

i I

t

PLANT Y: Deviations from Older Plant Means Performance incicatore Below Avg. Perf. Above Avg. Perf.

1 Automotec Scroms WMe Criticol (4 Otr. Avg end 87-4) -

-022

2. Sofety System Actuctions (4 Qtr. Avg and 87-4) -

-2. 5

3. Significent Events (4 Ctr. Avg. end 87-4).

-os i

4. Scfety System Feues (4 Otr. Avg end 87-4).

-c47 g a

5 Forced Outoge Rote (4 Otr. Avg end 87-4).

.te ;};;l

6. Equement Forced t

Mrs I

hs/1000 Crit

. Avg end 87-4) -

08 L,

-2.5-i0-15 -to -d 5 0 0 Ch L'O t'5 2 0 2.5 Devictions from Okw Plant ueons (Meosured in Standard Devotions) l l

l l

l

}

8 l

l

4 l

i i

i j

a, STAFF'S PLANS i?

INCLUDE CAllSE CODES IN THE PROGRAM

{

ASSESS MEASilRES FOR IMPLEMENTING l

SAFETY SYSTEM FUNCTION TRENDS l

CONTINilE DEVELOPMENT AND IMPLEMENTATION l

1 i

i i

9

)

1

.i s

1 I

i j

4 I

CAllSE CODES VALUE AS A DIAGNOSTIC - IDENTIFY AREAS i

0F WEAKNESS MONITOR EFFECTIVENESS OF CORRECTIVE ACTIONS 4

  • PREDICTIVE.

}

USE IN CON.IllNCTION WITH CilRRENT EVENT

)

BASED INDICATORS 4

INDEPENDENT VIEW 0F PROGRAMMATIC i

EFFECTIVENESS QUALITY CONTROL i

10 1

I m

i i

i I

I CAUSE CODES j

(PROGRAMMATIC FACTORS) i LICENSED OPERATOR ERROR OTHER PERSONNEL ERROR 1

l MAINTENANCE PROBLEM J

DESIGN / INSTALLATION / FABRICATION PROBLEM 1

ADMINISTRATIVE CONTROL PROBLEM l

RANDOM EQUIPMENT FAILURE 11 i

l-1 1

i 1

i e

i 4

h 1

1 j

CORRECTIVE ACTION i

j TRAINING PROCEDURE DISCIPLINE MANAGEMENT CHANGE i

DESIGN MODIFICATI.ON E0llIPMENT REPLACEMENT /ADJUSTEMENT t

i i

i l

J NA 12/17/87 LER#: 34187055 50.72#:

10933 POWER: 607.

s CAUSE:

OPERATOR / PERSONNEL i

j CORRECTIVE ACTION:

PROCEDURE / TRAINING SYSTEM:

REACTOR CORE ISOLATION COOLING l

SYSTEM DESC:

ISOLATION OF RCIC STEAM LINE WITH RESULTING llIGil DIFFERENTIAL PRESSURE l

4 f

13 i

F

l

.l Cause Code

' ' +

Legend:

Criticoi soors 4 Ouorter Moving Average

2. Other Personnel Error 16 2500

_ 2250 ~

. 2000f

]

e g12.

1750 5 w

r

_ 1500,

O 8

_ 1250 y%

7 i

H

- 50 3

'3 is r

.4

- 750 y 4'

.l.f

'?

?

_ 250

_ 500 5 3

3 tr 3

~

j 6h

h ib 5

~~-

0 O

85'-385'-486'-186'-286'-386'-487'-187'-2 Yeor - Overter 14

PERFORMANCE INDICATOR CAUSE CODES PLANT Z: Trends nRscua

" *G 2'

tweetwet

s, 1

A CMMT7 0%%T3 magg,a

' es Expsc.,-

ECLFACNT

<f N-i; 9.0;j

. ~~

AIAe G RATPE j

gj ACut 5TRATvt ttson PREV. 4 OTR. AVG.

2 OTR. AVG. ENDING 87-4 FOR 31 LERs FOR 12 LERs i

l l

15

PERFORMANCE INDICATOR CAUSE CODES PLANT Z: Deviations From Industry PEPscn c

@'24 S~Aru N

L*PJDA*CE j 6g.,-.g.gg C;E 9tCF m

4

- CE RATCR s=,, E :.ne.'

Eccescur

!V Arv s uivE ccscs as,

l lNDUSTRY DISTRSUTON FOR 1987 PLANT DfSTR8lJTON FOR 1987 FOR 30 LERs/ PLANT FOR 37 LERs l

l l

l 16 l

l

t l

i l

i l

I i

I s

1pp T

\\

.ggigTEBA"CE 3

B "1 "

~

i-s

,I f

43b,. /

3 b

4 Y

I f1EASURES (PROGRAMMATIC INDICATORS) 17 i

i M

i i

l l

i CANDIDATE INDICATORS OF MAINTENANCE l;

MAINTENANCE REWORK REPEAT EQUIPMENT FAILURES IN INDIVIDilAL SYSTEMS i

ITEMS OllT OF SERVICE MEASURES OF MAINTENANCE BACKLOG j

OVERDUE PM CORRECTIVE A PREVENTIVE MAINTENANCE RATIOS MEANTIfiE TO REPAIR a

l 1

I 18 I

i i

c EXPANDED PI PROGRAM VOLUNTARY FEW PLANTS RAPID ASSESSMENT OF PLANTS l

INFORMATION (EXAMPLES)

LC0 HOURS NUMBER OF ITEMS OUT OF SERVICE-OVERTIME SIMllLATOR TRAINING HOURS 1.9

PLANT SAFETY LOGIC MODEL PLANT SAFETY I

1 LOW FREQUENCY HIGH AVAILABlUTY FEATURES AND LOW OF TRANSIENTS OF SAFETY SYSTEMS POTENTIAL FOR COGNfTIVE ERRORS I

I I

]

8 l

l HIGH TRAIN LOW MEWL 8

AVAILABILITY M

N 8

US AI URES I

I I

Pts Pts Pls Pls NEW Pts oCAUSE CODES eCAUSE CODES

  • CAUSE CODES
  • CAUSE CODES eSafety System eSafety System Function Trends Function Trends EXISTING Pts OSCRAMS e SAFETY SYSTEM e EQUIPMENT FAILURES
O SAFETY SYSTEM FORCED ACTUATIONS OUTAGES e COLLECTIVE RADIATION
  • SIGNIFICANT EVENTS EXPOSURE eFORCEO OUTAGE RATE 20

SAFETY SYSTEM FUNCTION TRENDS (viewgraph)

- Existing PI: safety system failures, counted at the system level
- New PI: safety system function trends, counted at the train level

[Chart: Safety system function trends respond faster than safety system failures; indicator response over 0 to 3 years after the failure rate doubled (computer simulation).]
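The faster response claimed in this viewgraph can be seen from expected counts alone. For a two-train system tested on demand, individual train failures occur roughly in proportion to the per-demand failure probability p, while a whole-system failure needs both trains and scales as p squared, so it is far rarer and slower to reveal a change. A back-of-the-envelope sketch assuming independent trains and invented numbers, not the staff's simulation:

```python
def expected_counts(p, demands):
    """Expected train-level and system-level failure counts over `demands` tests
    of a two-train system with independent per-demand failure probability p."""
    train_events = 2 * demands * p      # either train failing its test
    system_events = demands * p ** 2    # both trains failing together
    return train_events, system_events

for p in (0.01, 0.02):                  # failure probability before and after doubling
    trains, systems = expected_counts(p, demands=100)
    print(f"p={p}: ~{trains:.1f} train failures vs ~{systems:.3f} system failures")
```

Doubling p doubles the expected train-level count from about 2 to about 4 per hundred tests, a visible change, while the system-level count stays well below one event.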

SELECTED SYSTEMS (viewgraph): six selected systems account for approximately 90% of the increase in core-melt likelihood due to train outage, plus containment spray.

[Rotated viewgraph; text not legible in scan.]

[Rotated viewgraph; text not legible in scan.]

DATA SOURCES (viewgraph)

- Rulemaking: assessing rulemaking for obtaining the data.
- Voluntary: reviewing similar indicator being developed by INPO. Discussions with INPO will continue.

FUTURE DEVELOPMENT OF RISK-BASED PERFORMANCE INDICATORS (viewgraph)

- Improved indicator of frequency of transients.
- Improved method for evaluating common-cause events.
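The basic quantity behind the risk-based indicator discussed in the briefing is train unavailability over a reporting period. A minimal sketch with invented outage hours; a quarter is roughly 2190 hours:

```python
def unavailability(outage_hours, period_hours):
    """Fraction of the period a train was unavailable."""
    return outage_hours / period_hours

quarterly_outage_hours = [30.0, 45.0, 60.0, 95.0]  # illustrative, one train
trend = [unavailability(h, 2190.0) for h in quarterly_outage_hours]
print(trend)  # a rising trend is the signal tied to maintenance effectiveness
```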

SUMMARY (viewgraph)

- Program fully implemented
- Reports beneficial to senior management
- Industry performance trend a useful byproduct
- Staff plans to add cause codes and safety system function trends
- Development of PIs continuing

TRANSMITTAL TO: Document Control Desk, 016 Phillips
ADVANCED COPY TO: The Public Document Room
FROM: SECY Correspondence & Records Branch
DATE: (handwritten, not legible)

Attached are copies of a Commission meeting transcript and related meeting document(s). They are being forwarded for entry on the Daily Accession List and placement in the Public Document Room. No other distribution is requested or required.

Meeting Title: (handwritten, not legible)
Meeting Date: (handwritten, not legible)   Open: X   Closed:

Item Description: 1. TRANSCRIPT   Copies Advanced to PDR: 1

* PDR is advanced one copy of each document, two of each SECY paper. C&R Branch files the original transcript, with attachments, without SECY papers.