ML20212G338

Forwards 990729 Meeting Summary to Discuss Revised Reactor Oversight Program. List of Attendees & Info Exchanged at Meeting Also Encl
ML20212G338
Person / Time
Issue date: 08/19/1999
From: Spector A
NRC (Affiliation Not Assigned)
To:
NRC
Shared Package
ML20212G343 List:
References
NUDOCS 9909290200
Download: ML20212G338 (29)


Text


August 19, 1999

MEMORANDUM TO: File

FROM: August K. Spector, Communication Task Leader
Inspection Program Branch
Division of Inspection Program Management
Office of Nuclear Reactor Regulation
(Original signed by:)

SUBJECT:

PUBLIC MEETING HELD ON JULY 29, 1999

On July 29, 1999, a public meeting was held to discuss the Revised Reactor Oversight Program. A summary of the meeting, the attendance list, and information exchanged at the meeting are attached.

Attachments: As stated

Contact:

August K. Spector, 301-415-2140

Distribution:

Central Files, IIPB R/F, PUBLIC


DOCUMENT NAME: A:\MTG729

  • See previous concurrence.

To receive a copy of this document, indicate in the box: "C" = Copy without enclosures, "E" = Copy with enclosures, "N" = No copy

OFFICE: IIPB:DIPM
NAME: AKSpector
DATE: 08/  /99
OFFICIAL RECORD COPY



Summary of Meeting, July 29, 1999

1. PPEP Update/Overview: The first meeting of the PPEP was held on July 28, 1999. The first part of the meeting was concerned with organizational issues. "Success Criteria" were discussed, with agreement to rename them "Evaluation Criteria." The Panel will provide feedback to NRC in the future, acting as an advisory committee, and will provide recommendations.

The PPEP is planning to meet again in August and will set schedules for future sessions. The PPEP expects its report to be available for the Commission meeting planned for January 2000.

2. NRC made available and distributed the NRC June 21-25, 1999 SDP Training Materials and the Pilot Plant Specific SDP Data. These materials are in electronic form on computer disks.


3. The Chief Nuclear Officers meeting with NRC will be held August 3, 1999. The group discussed issues that could be raised there, including the PPEP process and the involvement of licensees in training their managers and staff in the new program and procedures.

4. Brief discussion to clarify how observations and findings will be documented in NRC reports.

5. Performance Indicators were discussed. NRC distributed draft definitions of unavailability and other PI information; see attachment. A question was asked whether NEI was receiving data from non-pilot plants; NEI indicated that it was working with plants to secure data.

This information represents minor changes from last month's meeting. NRC indicated that August data, to be reported in September, would be due to NRC August 11. NEI distributed a draft Containment Leakage report; see attached.

NRC initiated discussion of safety system functional failures, but the discussion was tabled until the next meeting.

6. Supplemental Inspection Program: NRC distributed a draft manual chapter; see attached.

NEI indicated that it will send comments on the draft manual to NRC.

7. Shutdown Performance Indicator: A handout was distributed regarding this PI, to be discussed at the August 11, 1999 meeting.
8. Brainstorming session on future implementation activities: In the afternoon session, NRC and NEI led a brainstorming exercise during which participants identified activities that NRC and industry must complete in order to fully implement the new program.

See attachment.


Decision Items:

1. Definition of unavailability handout. NEI to provide input 8/11.

2. Containment Leakage PI. Additional information to be provided 8/11.

3. Unplanned power issue: To be reviewed during pilot. NRC to provide examples of unplanned power PI.

4. Safety System Functional Failures: NRC to provide examples 8/11.

5. Reactor Coolant System: NRC to provide examples 8/11.

6. Security system: NEI and NRC to resolve issues related to comment field 8/11.

Attachments:

1. Performance Indicator Issues From the Pilot Program - Draft

2. NRC Inspection Manual, Supplemental Inspection for Issues Categorized Contained in Column Three of the Assessment Action Matrix - Draft

3. Inspection Procedures to be used for assessing extent of condition/generic implications of column three issues

4. Containment Leakage - Draft

5. Shutdown Operating Margin - Draft

6. Brainstorm Flip Charts Developed at afternoon meeting

7. NRC June 21-25, 1999 SDP Training Materials - electronic copy and paper copy

8. Pilot Plant Specific SDP Data - electronic copy on disks. No paper copy available.


Prepared by: August Spector, August 3, 1999


BRAINSTORM FLIP CHART # 1: Status Report to Commission

  • Possible Date - October 1999?
  • Shared responsibility:

NRC - TTF, PPEP
NEI organization
Industry representatives
Public - UCS and others?

Pre-tasks for status report to Commission

  • On-going feedback/metrics/update
  • Feedback forms to be reviewed by NRC - TTF; NEI/industry review their forms
  • Need for consistent evaluation criteria. NRC has developed this and distributed a draft to PPEP

BRAINSTORM FLIP CHART # 2: What do we want to see to accomplish final implementation?
1. Commission approval - final - by April 2000, latest.
2. Chief Nuclear Officer commitment and buy in to voluntary program
3. Final implementation by April 1, 2000
4. Readiness by both NRC and industry:

Considerations:

  • Resources in terms of people and dollars
  • Cultural change internal to organization
  • Training (skill/knowledge) in new processes
  • Establishment of organizational infrastructure (management, IRM, personnel, etc.)
  • Communication to and by public, stakeholders. There must be an on-going dialogue leading to April 2000 and then beyond.
6. Short-term pilot success - good data, four NRC objectives/themes. As a process, were we successful?
7. Long-term success - achievement of NRC's four objectives/themes, safe plant operation.

8. Informed decisions by decision makers

BRAINSTORM FLIP CHART # 3: ROLE CLARIFICATION

Area: Training in new process
NRC:
  • Has begun internal training of regional inspectors and managers.
  • Continuing training is currently planned for Winter 1999 and Spring 2000.
  • Desk-Top Guide for distribution to plants.
  • Planning to hold regional workshops with NEI.
Industry: Need to begin planning and offering internal training by non-pilot plants in PI, SDP, PSA, etc.
NEI:
  • Training for pilot plants August 12-13 related to SDP.
  • Is planning development of internal training.
  • Is planning with NRC to hold regional workshops.

Area: Cultural/Organizational Development Change
NRC:
  • Top management presented at each region and to various HQ offices.
  • Issued to all employees publications about new program.
  • Issue news, stories, updates about new program on monthly basis in agency newsletter.
  • Established Change Coalition in each region and will begin to expand effort.
  • Currently considering additional organizational development efforts such as focus-discussion groups, gaining management support, etc.

Area: Regional Workshop Activities
NRC:
  • Regional workshops with NEI.
  • Need to consider additional activities.
Industry:
  • Regional workshop with NEI/NRC.
  • Need to consider additional activities.
NEI:
  • Regional workshop with NRC.
  • Need to consider additional activities.

[Attachment: implementation milestone matrix of activities and NRC/NEI responsibilities by month, through April 2000 (rotated table, illegible in the scanned original).]

ATTENDEES

US NUCLEAR REGULATORY COMMISSION

August Spector, NRR
Tim Frye, NRR
Don Hickman, NRR
Alan Madison, NRR
Michael Johnson, NRR
Lee Miller, NRR
Randy Sullivan, NRR
Jim Pulsifer, NRR
Marie Pohida, NRR
Bill Dean, NRR
Jeff Jacobson, NRR
Ken Brockman, NRR

OTHER

R. Cary Gradle, BGE
Pat Loftus, ComEd
Wally Beck, ComEd
Gabe Salamon, PSEG
Richard Jaworski, OPPD
Bob Evans, NEI
Jim Sumpter, NPPD
Yue Gwan, ASTM, Inc.
David C. Tubbs, MidAmerican Energy
David N. Perkey, NUS Information Services
James A. Hutton, PECO Energy
Dennis Hassler, PSEG Nuclear
Wade Warren, Southern Nuclear
Tom Houghton, NEI
John Butler, NEI
Mark Burzynski, TVA
Kevin Borton, PECO Energy


DRAFT

PERFORMANCE INDICATOR ISSUES FROM THE PILOT PROGRAM

1. Definition of unavailability: This is intended to be a measure of the capability of the equipment to perform its intended function when demanded; it is not intended to be a measure of the ability of operators to restore system function during an accident. We propose the following for NEI 99-02:

Causes of planned unavailable hours include, but are not limited to, the following:

  • preventive maintenance, corrective maintenance on non-failed trains, or inspection requiring a train to be mechanically and/or electrically removed from service
  • planned support system unavailability causing a train of a monitored system to be unavailable (e.g., ac or dc power, instrument air, service water, component cooling water, or room cooling)
  • testing, unless the test configuration is automatically overridden by a valid starting signal, or the function can be immediately restored either by an operator in the control room or by a dedicated operator stationed locally for that purpose. Restoration actions must be contained in a written procedure, must be uncomplicated (generally, a single action), and must not require diagnosis or repair. Credit for a dedicated local operator can be taken only if (s)he is positioned at the proper location throughout the duration of the test for the purpose of restoration of the train should a valid demand occur. The intent of this paragraph is to allow licensees to take credit for restoration actions that are virtually certain to be successful (i.e., probability nearly equal to 1) during accident conditions.

  • any modification that requires the train to be mechanically and/or electrically removed from service

Note: Most PIs, including the SSU, monitor events or conditions that represent varying degrees of increased risk to public health and safety. The NRC has determined that the increase in risk is not safety significant (excluding exceptional events, which are promptly identified and handled outside the PI and baseline inspection programs) unless the white-yellow threshold is exceeded.

The NRC responds to PI events or conditions to determine their safety significance when the green-white threshold has been exceeded. There are situations in which a safety system is unable to respond automatically to a valid demand signal, yet there is virtually no increase in risk to public health and safety. These situations occur in some surveillance test configurations where the operator is able to immediately restore system availability. Therefore, conditions in which the probability of restoration is essentially 1.0 need not be reported, because the increase in risk is negligible. Any other condition would be captured by the SSU because it represents some increased risk to public health and safety, which is what this PI is intended to monitor.

2. Unplanned power changes: The intent is to "provide an indication of stability of plant operations" (A New Regulatory Oversight Process, NEI, Draft dated 9/10/98). This would include problems that perturb steady-state operation for any reason, planned or unplanned, with few exceptions. The word "unplanned" may cause licensees to take actions for the sole purpose of avoiding a count, rather than for plant safety or electricity generation. We propose to eliminate the word "unplanned" and count all power reductions of greater than 20%, with only a limited number of exceptions (for refueling and mid-cycle maintenance outages, load following at the direction of the load dispatcher, and anticipatory power reductions for plant safety).

3. Safety System Functional Failures

- Some licensees do not appear to be using the five years of SSFF data that we provided, and/or do not agree with our determinations. This was intended to be the same indicator we have been using in our PI program for 13 years.

- The appropriate date to be assigned to an SSFF is the date of the event, unless that date is not included in the calculation period of the current PI, in which case the discovery date would be used. NEI 99-02 says to use the LER date, which is not appropriate.

- Licensee representatives at the INPO workshop on EPIX reporting requested that the name be changed to avoid confusing SSFFs with Maintenance Rule functional failures.
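The date-assignment rule above (event date if inside the current calculation period, otherwise discovery date) can be sketched as a small decision function. This is only an illustration; the function name and the example dates are hypothetical and not taken from NEI 99-02.

```python
from datetime import date

# Illustrative sketch (not from NEI 99-02) of the SSFF date-assignment rule:
# assign the event date if it falls within the current PI calculation period;
# otherwise fall back to the discovery date.
def ssff_report_date(event_date, discovery_date, period_start, period_end):
    if period_start <= event_date <= period_end:
        return event_date
    return discovery_date

# Event inside the period: the event date is assigned.
assert ssff_report_date(date(1999, 3, 1), date(1999, 7, 15),
                        date(1999, 1, 1), date(1999, 6, 30)) == date(1999, 3, 1)

# Event before the period opened: the discovery date is assigned instead.
assert ssff_report_date(date(1998, 11, 2), date(1999, 2, 10),
                        date(1999, 1, 1), date(1999, 6, 30)) == date(1999, 2, 10)
```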

4. Containment Leakage: An acceptable method, if it can be used by all licensees, is as follows: as each valve is tested, if it meets its acceptance criterion, total leakage is known to be below some value less than 0.6 La; if any valve exceeds its acceptance criterion, total leakage would be calculated (calculational method to be discussed).
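The bounding logic in item 4 can be sketched as follows. All valve names, acceptance criteria, and the value of La are made-up illustration values; the fallback branch is shown only as a simple sum, since the memo leaves the actual calculational method to be discussed.

```python
# Illustrative sketch of the bounding method in item 4; all numbers and
# valve names are hypothetical, and La is an arbitrary allowable-leakage value.
La = 100.0
acceptance = {"valve_a": 10.0, "valve_b": 15.0, "valve_c": 20.0}  # per-valve criteria
measured = {"valve_a": 4.0, "valve_b": 12.0, "valve_c": 18.0}     # as-tested leakage

if all(measured[v] <= acceptance[v] for v in acceptance):
    # Every valve met its criterion, so total leakage is bounded by the sum
    # of the criteria, a fixed value chosen to lie below 0.6 La.
    bound = sum(acceptance.values())
    assert bound < 0.6 * La
else:
    # Otherwise total leakage must be calculated explicitly (method TBD
    # in the memo; a plain sum stands in here).
    total = sum(measured.values())
```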
5. RCS Leakage: This indicator was intended to be a measurement of total RCS leakage. There is a problem identifying the appropriate denominator (allowable leakage), however, since licensees measure unidentified leakage against one limit and identified leakage against another limit.

So the indicator was changed to be identified leakage only, since unidentified leakage is small in comparison. We then found out that some licensees have tech specs on unidentified and total leakage, and not on identified leakage. NEI proposes that they subtract the two to get identified leakage and subtract the two limits for the denominator.

This could result in a yellow PI when no tech spec limit is exceeded. The simple answer is for those licensees to use total leakage, especially since that is what we wanted from the start.

6. Security Equipment Performance Index

- The thresholds were specified as decimal numbers rather than percentages like the other PIs that are ratios. They were specified to two decimal places, which does not achieve the same number of significant figures as the other thresholds. This resulted in one pilot plant calculating the index as 0.054, which they rounded to 0.05 in accordance with the NEI 99-02 guidance. When this is plotted, it is clearly in the white band, yet the table says it is on the 0.05 threshold. The thresholds for this PI must be changed from 0.05 and 0.15 to 5.0% and 15.0%.

- The method for calculating the index results in progressively higher unavailability required as the number of zones increases from 1 to 20. At 20 and above, the required unavailability is 99.75%, which cannot be met by available equipment. The calculational method must be revised.
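The significant-figures problem described for this index comes down to simple rounding arithmetic; the sketch below just reproduces the 0.054 example against the existing two-decimal threshold and the proposed percentage form. The variable names are illustrative.

```python
# Illustrative arithmetic for the rounding problem with the security index.
index = 0.054          # index value computed by the pilot plant
green_white = 0.05     # current NEI 99-02 green-white threshold

# Rounding to the threshold's two decimal places collapses 0.054 onto 0.05,
# so the tabulated value sits exactly on the threshold even though the
# plotted (unrounded) value is in the white band.
reported = round(index, 2)
assert index > green_white
assert reported == green_white

# Stating the threshold as 5.0% and reporting in percent to one decimal
# place keeps the distinction visible (5.4% vs 5.0%).
reported_pct = round(index * 100, 1)
assert reported_pct > 5.0
```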

7. Comment Field: There is no guidance on what to put in this field. Some licensees have explained changes to previously reported values by stating only that the first report was wrong. We need an explanation of why it was wrong, such as that they got new information or discovered data they had missed the first time. Some licensees have provided lengthy comments even though every PI was green. There needs to be some standardization on what to put in this field.

DRAFT

NRC INSPECTION MANUAL AAAA

Inspection Procedure XXXXX

SUPPLEMENTAL INSPECTION FOR ISSUES CATEGORIZED CONTAINED IN COLUMN THREE OF THE ASSESSMENT ACTION MATRIX

PROGRAM APPLICABILITY: 2515

FUNCTIONAL AREA: Initiating Events, Mitigating Systems, Barrier Integrity, Emergency Preparedness, Occupational Radiation Safety, Public Radiation Safety, Physical Protection

INSPECTION BASIS:

The NRC's revised inspection program includes three parts: baseline inspections; generic safety issues and special inspections; and supplemental inspections performed as a result of risk significant performance issues. The inspection program is designed to apply NRC inspection assets in an increasing manner when risk significant performance issues are identified, either by inspection findings evaluated using the significance determination process (SDP) or when performance indicator thresholds are exceeded. Accordingly, following the identification of an inspection finding categorized as risk significant (i.e., white, yellow, or red) via Table 2 of the SDP, or when a performance indicator exceeds the "licensee response band" threshold, the NRC regional office will perform supplemental inspection(s). The scope and breadth of these inspections will be based upon the guidance provided in the NRC's "Assessment Action Matrix" and the Supplemental Inspection Selection Table (included in 2515 xxxxxxx).

This procedure provides the supplemental response for performance indicators or inspection issues included in Column Three of the Assessment Action Matrix. The guidance provided in this procedure was developed with consideration of the following boundary conditions:

  • supplemental inspection will not be done for single or multiple green issues;
  • the baseline inspection procedure for identification and resolution of problems is independent of the supplemental response; however, information previously obtained during the baseline inspection can be used to address the inspection requirements contained within this procedure;
  • the inspection requirements contained in this procedure will be completed for all issues contained in Column Three of the agency's Assessment Action Matrix;
  • the inspection requirements are the same for any issue contained in Column Three of the Assessment Action Matrix regardless of whether the issue emanated from a performance indicator or from an inspection finding;
  • new inspection issues (other than programmatic) resulting from supplemental inspections will be evaluated and categorized in a similar manner to that of the baseline inspection program using the significance determination process; and
  • programmatic issues resulting from this supplemental inspection will not be evaluated using the significance determination process, but will be addressed via appropriate enforcement actions or additional supplemental inspection.

Issue Date: 07/09/99 Draft XXXXX

The supplemental inspection program is designed to support the NRC's goals of maintaining safety, enhancing public confidence, improving the effectiveness and efficiency of the regulatory process, and reducing unnecessary regulatory burden.


XXXXX-01 INSPECTION OBJECTIVE(S)

01.01 To provide assurance that the root causes and contributing causes are understood for individual and collective issues contained in Column Three of the Assessment Action Matrix.

01.02 To provide assurance that the generic implications and extent of condition are identified for individual and collective issues contained in Column Three of the Assessment Action Matrix.

01.03 To provide assurance that licensee corrective actions are sufficient to address the root causes and the contributing causes, and prevent recurrence, for individual and collective issues contained in Column Three of the Assessment Action Matrix.

01.04 To independently assess the extent of condition and generic implications associated with both individual and collective issues contained in Column Three of the Assessment Action Matrix.

XXXXX-02 INSPECTION REQUIREMENTS

It is expected that the licensee will complete an evaluation for each white or yellow issue contained in Column Three of the Assessment Action Matrix. In addition, for those instances where entry into Column Three is due to multiple white issues, it is expected that the licensee will perform a collective evaluation (self-assessment) of the multiple issues. The intent of this procedure is to assess the adequacy of these licensee evaluations and to perform an independent NRC inspection to assess the validity of the licensee's conclusions regarding the extent of condition and generic implications of the individual and collective issues. The licensee's evaluations should generally address each attribute contained in the inspection requirements section; however, the depth of the evaluation may vary depending on the significance and complexity of the issue(s). In some cases, the answers to specific inspection requirements will be self-evident with little additional review or analysis required.

A review of individual issues that were previously reviewed during the supplemental inspection procedure for Column Two need not be repeated; however, a review should be performed of the licensee's collective evaluation for multiple white issues.

02.01 Problem Identification

a. Determine that the evaluation identifies who (i.e., licensee or NRC), and under what conditions, the issue(s) was/were identified.

b. Determine that the evaluation documents how long the issue(s) existed, and prior opportunities for identification.

c. Determine that the evaluation documents the plant specific risk consequences and compliance concerns associated with the issue(s), both individually and collectively.

02.02 Root Cause and Extent of Condition Evaluation

a. Determine that the issues were evaluated using a systematic method(s) to identify root cause(s) and contributing cause(s).

b. Determine that the root cause evaluation was conducted to a level of detail commensurate with the significance of the problem.

c. Determine that the root cause evaluation included a consideration of prior occurrences of the issue(s) and knowledge of prior operating experience.

d. Determine that the root cause evaluation included consideration of potential common cause(s) or generic impacts (extent of condition) of the issue(s).

02.03 Corrective Actions

a. Determine that corrective action(s) are specified for each root/contributing cause, or that there is an evaluation that no actions are necessary.

b. Determine that the corrective actions have been prioritized with consideration of the risk significance and regulatory compliance.

c. Determine that a schedule has been established for implementing and completing the corrective actions.

d. Determine that quantitative or qualitative measures of success have been developed for determining the effectiveness of the corrective actions to prevent recurrence.

02.04 Independent Assessment of Extent of Condition and Generic Implications

a. Perform a focused inspection(s) to independently assess the validity of the licensee's conclusions regarding the extent of condition and generic implications of the issues. The inspection(s) chosen should be selected from the list contained in Appendix A or Appendix B to Inspection Manual 2515. The objective of this procedure should be to independently sample performance in areas related to the issues contained in Column Three of the Assessment Action Matrix, as necessary to provide assurance that the licensee's evaluation regarding extent of condition and generic implications is sufficiently comprehensive. The intent is not to re-perform the licensee's evaluation, but to assess the validity of the licensee's evaluation by independently sampling performance in related areas. The results of the inspection should be documented in this inspection report.

XXXXX-03 INSPECTION GUIDANCE

General Guidance

This inspection procedure is designed to be used to assess the adequacy of the licensee's evaluation(s) of issues categorized as White or Yellow and entered into Column Three of the agency's Assessment Action Matrix. As such, a reasonable time (generally within 30-60 days) should be allowed for the licensee to complete their evaluation; however, all corrective actions may not be fully completed upon implementation of this procedure. The adequacy of the completed corrective actions will be assessed in the baseline portion of the inspection program. The inspection report associated with this inspection will contain the NRC's assessment for each inspection requirement, whether positive or negative.

Weaknesses in the licensee's evaluation will be considered for enforcement action. In addition, weaknesses in the licensee's evaluation may require additional NRC supplemental inspections, including a more comprehensive review of the licensee's root cause analysis and corrective action programs.

The following sections of the procedure are provided as guidance to help the inspector fulfill the specific inspection requirements contained in paragraph 02. It is not intended that the inspector verify that the licensee's evaluation of the issues address every attribute contained in the inspection guidance section. The intent is that the inspector use the guidance sections of the procedure to look for weaknesses in the licensee's evaluation that might indicate an issue associated with one of the inspection requirements.

Definitions

Root Cause(s) are defined as the basic reason(s) (i.e., hardware, process, human performance) for a problem, which, if corrected, will prevent recurrence of that problem.

Contributing Cause(s) are defined as causes that by themselves would not create the problem, but which are important enough to be recognized as needing corrective action. Contributing causes are sometimes referred to as causal factors. Causal factors are those actions, conditions, or events which directly or indirectly influence the outcome of a situation or problem.

Repeat Occurrences are defined as two or more independent conditions which are the result of the same basic causes.

Common Cause is defined as multiple failures (i.e., two or more) of plant equipment or processes attributable to a shared cause.

Generic Impact is defined as the extent to which an identified problem has the potential to impact other plant equipment or processes in the same manner identified in the root cause analysis. Generic impact is sometimes referred to as extent of condition.

Consequences are defined as the actual or potential outcome of an identified problem or condition.

Specific Guidance

Sections 03.01 through 03.03 apply to the licensee's evaluation of both individual and collective issues.

03.01 Problem Identification

a. The evaluation should state how and by whom the issue was identified. When appropriate, failure of the licensee to identify the problem at a precursor level should be evaluated. Specifically, the failure of the licensee to identify a problem before it becomes risk significant may be indicative of a more substantial problem. Examples would include a failure of the licensee's staff to enter a recognized non-compliance into the corrective action program, or to raise safety concerns to management, or the failure to complete corrective actions for a previous problem, resulting in further degradation. If the NRC identified the issue, the evaluation should address why licensee processes such as peer review, supervisory oversight, inspection, testing, or quality activities did not identify the problem.

b. The evaluation should state when the problem was identified, how long the condition(s) existed, and whether there were prior opportunities for correction. For example, if a maintenance activity resulted in an inoperable system that was not detected by post-maintenance testing or by quality assurance oversight, the reasons that the testing and quality oversight did not detect the error should be included in the problem identification statement and addressed in the root cause evaluation.

c. The evaluation should address the plant specific risk consequences of the issues, both individually and collectively. A plant specific assessment may better characterize the risk associated with the issues due to the generic nature of the performance indicators and the significance determination process. For conditions that are not easily assessed quantitatively, such as the unavailability of security equipment, a qualitative assessment should be completed. The evaluation should also include an assessment of compliance. As applicable, some events may be more appropriately assessed as hazards to plant personnel or the environment. The inspector's review of the risk assessment should be coordinated with the Senior Reactor Analyst.

03.02 Rgpt Cause Evaluation

a. The licensee's evaluation should generally make use of a systematic method(s) to identify root cause(s) and contributing cause(s).

The root cause evaluation methods that are commonly used in nuclear facilities are:

. Events and causal factors analysis -- to identify the events and conditions that led up to an event;

. Fault tree analysis -- to identify relationships among events and the probability of event occurrence;

. Barrier analysis -- to identify the barriers that, if present or strengthened, would have prevented the event from occurring;

. Change analysis -- to identify changes in the work environment since the activity was last performed successfully that may have caused or contributed to the event;

. Management Oversight and Risk Tree (MORT) analysis -- to systematically check that all possible causes of problems have been considered; and

. Critical incident techniques -- to identify critical actions that, if performed correctly, would have prevented the event from occurring or would have significantly reduced its consequences.

The licensee may use other methods to conduct the root cause evaluations.

A systematic evaluation of a problem using one of the above methods should normally include:

. A clear identification of the problem and the assumptions made as a part of the root cause evaluation. For example, the evaluation should describe the initial operating conditions of the system/component identified, staffing levels, and training requirements as applicable.

. A timely collection of data, verification of data, and preservation of evidence to ensure that the information and circumstances surrounding the problem are fully understood. The analysis should be documented such that the progression of the problem is clearly understood, any missing information or inconsistencies are identified, and the problem can be easily explained/understood by others.

. A determination of cause and effect relationships resulting in an identification of root and contributory causes which consider potential hardware, process, and human performance issues. For example:

- hardware issues could include design, materials, systems aging, and environmental conditions;

- process issues could include procedures, work practices, operational policies, supervision and oversight, preventive and corrective maintenance programs, and quality control methods; and

- human performance issues could include training, communications, human system interface, and fitness for duty.

b. The root cause evaluation should be conducted to an adequate level of detail, considering the significance of the problem.

. Different root cause evaluation methods provide different perspectives on the problem. In some instances, using a combination of methods helps to ensure the analysis is thorough. Therefore, the root cause evaluation should consider evaluating complex problems which could result in significant consequences using multi-disciplinary teams and/or different and complementary methods appropriate to the circumstances. For example, problems that involve hardware issues may be evaluated using barrier analysis, change analysis, or fault trees.

The depth of a root cause evaluation is normally achieved by repeatedly asking the question "Why?" about the occurrences and circumstances that caused or contributed to the problem. Once the analysis has developed all of the causes for the problem (i.e., root, contributory, programmatic), the evaluation should also look for any relationships among the different causes.
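The questioning process described above can be illustrated with a minimal sketch. The event, causes, and names below are hypothetical examples invented for illustration; they are not taken from this procedure:

```python
# Hypothetical "Why?" chain for an illustrative mis-positioned valve event.
# Each entry answers "Why?" for the entry before it; questioning continues
# until the remaining causes are within the licensee's control to correct.
why_chain = [
    "Valve found mis-positioned after maintenance",            # problem statement
    "Tagging order restored the wrong valve lineup",           # why?
    "Tagging procedure did not require an independent check",  # why?
    "Procedure revision process lacked an operations review",  # candidate root cause
]

def candidate_root_cause(chain):
    # The deepest answer in the chain is the candidate root cause.
    return chain[-1]
```

Contributory causes would be the intermediate entries in the chain; relationships among separate chains for the same problem should also be examined.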

The depth of the root cause evaluation may be assessed by:

. Determining that the questioning process appeared to have been conducted until the causes were beyond the licensee's control.

For example, problems that were initiated by an act of nature, such as a lightning strike or tornado, could have the act of nature as one of the causes of the problem. However, the act of nature would not be a candidate root cause, in part, because the licensee could not prevent it from happening again. In contrast, a licensee's failure to plan for or respond properly to acts of nature would be under management control and could be a root cause for the problem.


. Determining that the problem was evaluated to ensure that other root and contributing causes were not inappropriately ruled out due to assumptions made as a part of the analysis.

For example, a root cause evaluation may not consider the adequacy of the design or process controls for a system if the problem appears to be primarily human performance focused. Consideration of the technical appropriateness of the evaluation assumptions and their impact on the root causes would also be appropriate.

. Determining that the evaluation collectively reviewed all root and contributory causes for indications of higher level problems with a process or system. This is particularly true when entry into Column Three of the Assessment Action Matrix is due to multiple issues.

For example, a problem that involved a number of procedural inadequacies or errors may indicate a more fundamental or higher level problem in the processes for procedural development, control, review, and approval. Issues associated with personnel failing to follow procedures may also be indicative of a problem with supervisory oversight and communication of standards.

. Determining that the root cause evaluation properly ensures that correcting the causes would prevent the same and similar problems from happening again or sufficiently minimizes the chances of recurrence. Complex problems may have more than one root cause as well as several contributory causes. The evaluation should include checks to ensure that corrections for the identified root causes do not rely on unstated assumptions or conditions which are not controlled or ensured.

For example, root causes based upon normal modes of operation may not be valid for accident modes or other "off normal" modes of operation.

. Determining that the evaluation includes an explanation for rejecting possible root causes. Providing a rationale for ruling out alternative possible root cause(s) helps to ensure the validity of the specific root cause(s) that are identified.

c. The root cause evaluation should include a proper consideration of repeat occurrences of the same or similar problems at the facility and knowledge of prior operating experience. This review is necessary to help in developing the specific root and contributing causes and also to provide indication as to whether the White issue is due to a higher level concern involving weaknesses in the licensee's corrective action program.

The evaluation should:

. Broadly question the applicability of other similar events or issues with related root or contributory causes.


For example, root cause evaluations associated with outage activities and safety related systems could include a review of prior operating experience involving off-normal operation of systems, unusual system alignments, and infrequently performed evolutions.

. Assess previous root cause evaluations and/or corrective actions to identify what root causes were missed or inappropriate and what aspects of the prior corrective actions did not preclude recurrence of the problem.

For example, an adequate evaluation would include a more detailed validation of the implementation of the previously specified corrective actions and a reassessment of the identified root causes to determine process or performance errors which may have contributed to the repeat occurrence.

. Determine if the root cause evaluation for the current problem specifically addresses those aspects of the prior root cause evaluation or corrective actions that were not successfully addressed.

For example, if, during the review of a tagging error that resulted in a mis-positioned valve, the licensee determines that a previous similar problem occurred and the corrective actions only focused on individual training, then the root cause evaluation for the repeat occurrence should evaluate why the previous corrective actions were inadequate.

. Include a review of prior documentation of problems and their associated corrective actions to determine if similar incidents have occurred in the past.

For example, the licensee should consider in its review of prior operating experience internal self-assessments, maintenance history, adverse problem reports, and external data bases developed to identify and track operating experience issues. Examples of external data bases may include Information Notices, Generic Letters, and vendor/industry generic communications.

The inspectors should discuss the problem with other resident or regional inspectors associated with the facility to assess whether other similar problems or root causes for dissimilar problems have occurred at the facility that should have been considered.

d. The root cause evaluation should include a proper consideration of generic impacts (extent of condition) of the problem, including whether other systems, equipment, programs, or conditions could be affected. The evaluation should:

. Assess the applicability of the root causes across disciplines or departments, for different programmatic activities, for human performance, or for different types of equipment.


For example, the Fire-Protection Organization considered that the root causes identified for the mis-alignment associated with the safety injection system could potentially affect their systems since they shared a common tagging and alignment method with operations. As a result, feedback was provided to the incident review committee to include modification of the Fire-Protection control procedure and provide formal training to all Fire-Protection personnel.

03.03 Corrective Action

The proposed corrective actions to the root and contributing causes should:

a. Address each of the root and contributing causes to the White issue and the generic impact of the issue. The corrective actions should be clearly defined. Examples of corrective actions may include, but are not limited to, modifications, inspections, testing, process or procedure changes, and training.

The proposed corrective actions should not create new or different problems as a result of the corrective action. If the licensee determines that no corrective actions are necessary, the basis for this decision should be documented in the evaluation.

b. Include consideration of the results of the licensee's risk assessment of the issue in prioritizing the type of corrective action chosen. Solutions that involve only changing procedures or providing training are sometimes over-utilized when more comprehensive corrective actions such as a design modification would be more appropriate. The corrective action plan should also include a review of the regulations to ensure that if compliance issues exist, the plan achieves compliance.

Also, the licensee should ensure that:

c. The corrective actions are assigned to individuals or organizations that are appropriate to ensure that the actions are taken in a timely manner. Also, the licensee should ensure that there is a formal tracking mechanism established for each of the specific corrective actions.

d. A mechanism exists to validate the effectiveness of the overall corrective action plan. Specifically, a feedback loop and data acquisition/validation techniques should be established to measure, either quantitatively or qualitatively, the effectiveness of the corrective actions. Effective mechanisms would include, but are not limited to, assessments, audits, inspections, tests and trending of plant data, or follow-up discussions with plant staff.

03.04 Independent Assessment of Extent of Condition and Generic Implications

a. In choosing the inspection procedure(s) to assess the validity of the licensee's conclusions regarding extent of condition and generic implication, consideration should be given to whether entry into Column Three of the Assessment Action Matrix was due to a single issue or was due to multiple issues. For those instances where multiple issues were the cause, a broad based inspection(s) which would assess performance across the associated strategic performance area should be considered. If entry into Column Three was due to a single yellow issue, a more focused inspection would likely be appropriate.

Consideration should also be given to the comprehensiveness of the licensee's evaluation(s). In those cases where significant weaknesses are identified in the licensee's evaluation(s) during implementation of paragraphs 02.01 through 02.03 of this procedure, consideration should be given to performing a more in-depth programmatic review of the licensee's corrective action program using Inspection Procedure XXXXXXXX.

XXXXX-04 RESOURCE ESTIMATE

The resources required to complete this procedure will vary greatly depending upon the specific procedure(s) chosen to independently assess the validity of the licensee's evaluation of extent of condition and generic implications. Generally it would be expected that the procedure could be completed within 40-240 hours.

XXXXX-05 REFERENCES

None

END


INSPECTION PROCEDURES TO BE USED FOR ASSESSING EXTENT OF CONDITION / GENERIC IMPLICATIONS OF COLUMN THREE ISSUES

INITIATING EVENTS

Equipment Performance / Design / Configuration Control / Protection Against External Events / Human Performance / Procedure Quality

71111-01 71111-14 42700 71111-08 93803 71111-04 71111-05 71707 72701 71111-07 93807 71111-13 71111-06 XXXX-Human Perf. TBD 71111-12 93811 71111-20 62700 71111-03 62707 71707 41500 50001 62707 71715 55050 50002 55100 52001 62704 52002 62705

General Procedures 90712 92700 93801 93802 93806 93808


Issue Date: 07/22/99 DRAFT Attachment 1 2515 Appendix B


MITIGATING SYSTEMS

Design / Protection Against External Events / Configuration Management / Equipment Performance / Procedure Quality / Human Performance

71111 71111-01 71111 62700 42700 71111-11 71111 71111-05 71111-04 62707 72701 71111-14 71111-23 71111-06 71111-13 71111-03 71111-16 93803 71111-16 71111-07 42001 71707 93807 71111-20 71111-09 73052 XXXX-Human Perf. TBD 93810 71707 73756 71111 62707 71111-15 93811 71111-19 41500 71111-21 71715 52001 71111-22 52002 71111-23 93810 71111-17 49001

55050 55100 57050 57060 57070 57080 57090 62704 62705 93810

General Procedures 90712 93803 92700 93807 93801 93808 93804


BARRIER INTEGRITY

Fuel Cladding Performance / RCS Equip. & Barrier Perf. / Containment SSC & Barrier Perf. / Human Performance / Procedure Quality / Design Control / Configuration Control

61702 62700 62700 71111-11 42700 71111-02 71111-04 62707 62707 71111-14 71111-17 71111-13 61705 71111-08 71111-10 71707 70307 71111-23 71111-20 61706 71111-17 71111-22 xxxxx - 72701 93803 71111-03 61707 71111-22 71111-17 Human 73052 93811 71707 61708 71111-23 71111-23 Perf -TBD 62707 61709 61728 61715 50001 61710 73753* 70313 41500 50002 73051* 70323 71715 73755*

71111-09 49001 55050 55050 55100 55100 57050 57050 57060 57060 57070 57070 57080 57080 57090 57090 62003 62704 62704 62705 62705 73756

General Procedures 90712 92700 93801 93808

* - to be combined into one procedure


Emergency Preparedness


ERO Readiness / Facilities and Equipment / Procedure Quality / ERO Performance / Offsite EP

71114 / 71114 "Facilities and Equipment Evaluation" - TBD / 71114 "Performance Evaluation" - TBD / "Performance Evaluation" - TBD / No NRC inspection of this key attribute - evaluation performed by FEMA



Public Exposure

Facilities/Equipment / Program/Process / Human Performance

65051 / Solid Radioactive Waste Processing Program - TBD / Training and Qualifications - TBD
Solid Radioactive Waste Processing Program - TBD 86740 84725

Occupational Dose

Facilities and Equipment / Program/Process / Human Performance

Control of Internal Exposures - TBD / Control of Internal Exposures - TBD / Training and Qualifications - TBD
Control of External Exposures - TBD / Control of External Exposures - TBD
79701 83728

General Procedures
Response to Radiation Overexposure Events - TBD
Significant Potential for an Overexposure - TBD


Cornerstone: Physical Security

Physical Protection System / Access Authorization / Access Control System / Response to Contingency Events

81018 81018 81018 81018 81020 81020 81020 81020 81034 81034 81034 81022 81038 81038 81038 81034 81042 81502 81042 81038 81046 81700 81046 81042 81052 81054 81052 81058 81058 81054 81062 81064 81058 81064 81070 81062

81066 81072 81064 81078 81074 81066 81084 81080 81078 81810 81080 81084 81088 81110 81501 81601 81700


DRAFT

CONTAINMENT LEAKAGE

Purpose

This indicator monitors the integrity of the containment, the third of the three barriers to prevent the release of fission products. It measures Type B and Type C containment leakage as defined in 10 CFR 50, Appendix J, and is a percentage of the technical specification allowable leakage.

Indicator Definition

Containment Leakage is the highest monthly Type B and Type C minimum path leakage as a percentage of the design basis leak rate (La), as determined in accordance with 10 CFR 50, Appendix J.

Data Reporting Elements

The following data are reported for each unit:

. the highest monthly value of the leak rate of Type B and Type C test results, as a percentage of La, for each month of the quarter (three values).

Calculation

The unit values for this indicator are calculated as follows:

Indicator = the highest monthly value of Type B and Type C leak rate test results, as a percentage of La

This indicator is calculated for each month.

Definition of Terms

Containment leakage, expressed as a percentage of La, is calculated per the unit's Technical Specifications.

Clarifying Notes

This indicator is calculated and recorded monthly, and is reported quarterly. The quarterly report will display the monthly indicator results for the previous four quarters.

This determination of Type B and Type C testing is routinely made by the plant to ensure compliance with the Technical Specifications limit of 0.6 La for Type B and Type C leakage.

If the leak rate of Type B and Type C penetrations being tested cannot be determined because of excessive leakage, it is assumed that the limiting value of 60 percent of La is exceeded. In this case, a monthly value of 1.0 La is reported. A comment should be added to the report to reference the applicable LER number.

For months when no leak rate testing takes place, the reported value is the last known calculated value.

If an outage spans the entire month, a value of N/A should be reported.

For months when an outage is begun but not completed, the reported value should reflect the highest value during power operation.

For months when an outage is ended, the reported value should reflect the highest end-of-outage summation of minimum path leakage values unless the at-power value is higher.

Post-repair leakage measurements are not counted toward the highest monthly value unless the repair is accepted.

Maximum path values may be used in lieu of minimum path values if desired.
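The monthly reporting rules above can be sketched as follows. This is an illustrative sketch only, not part of the indicator definition; the function and parameter names are hypothetical:

```python
def monthly_indicator(test_results_pct_la, last_known_pct_la, excessive_leakage=False):
    """Return the monthly containment leakage indicator as a percentage of La.

    test_results_pct_la -- Type B and C leak rate test results for the month,
                           each expressed as a percentage of La
    last_known_pct_la   -- last known calculated value, carried forward when no
                           leak rate testing took place this month
    excessive_leakage   -- True if a leak rate could not be determined because
                           of excessive leakage
    """
    if excessive_leakage:
        # The limiting value of 60 percent of La is assumed exceeded;
        # a value of 1.0 La (100 percent) is reported.
        return 100.0
    if not test_results_pct_la:
        # No leak rate testing this month: report the last known value.
        return last_known_pct_la
    # Otherwise report the highest monthly value.
    return max(test_results_pct_la)
```

For example, a month with test results of 12 and 35.5 percent of La would report 35.5; a month with no testing would carry forward the prior value.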


SHUTDOWN OPERATING MARGIN

The following indicators identify losses of control representing a green to white risk change.

Loss of Thermal Margin (Loss of RHR)

NEI's proposed indicator is adequate for BWRs and PWRs:


(change in RCS temperature due to loss of RHR) / (change in temperature that would cause boiling) > 0.2

Loss of Level - PWRs

Inadvertent loss of 2 feet of RCS inventory when not in midloop, OR
Inadvertent entry into midloop conditions, OR
Inadvertent loss of 2 inches of RCS inventory when in midloop conditions

The NEI level indicators can miss losses of level control during midloop operation. The NEI level indicators also do not cover refueling-cavity-flooded operation well. A significant amount of RCS inventory would need to be lost to trigger the indicator under these conditions.
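The threshold checks above can be sketched as follows. This is a sketch only, with hypothetical function and parameter names; it assumes temperature changes in consistent units and level changes in inches:

```python
def thermal_margin_tripped(delta_t_event, delta_t_to_boiling):
    """Loss of thermal margin (loss of RHR) criterion.

    delta_t_event      -- RCS temperature rise due to the loss of RHR
    delta_t_to_boiling -- temperature rise that would cause boiling
    """
    # Trips when the event consumed more than 20 percent of the margin to boiling.
    return delta_t_event / delta_t_to_boiling > 0.2


def pwr_loss_of_level_tripped(inventory_loss_inches, in_midloop, inadvertent_midloop_entry):
    """PWR loss-of-level criteria; meeting any one criterion is sufficient."""
    if inadvertent_midloop_entry:
        # Inadvertent entry into midloop conditions
        return True
    if in_midloop:
        # 2 inches of RCS inventory lost while in midloop conditions
        return inventory_loss_inches >= 2.0
    # 2 feet (24 inches) of RCS inventory lost when not in midloop
    return inventory_loss_inches >= 24.0
```

A 15-degree rise against a 50-degree margin to boiling (a ratio of 0.3) would trip the thermal margin check, for instance, while a 5-degree rise (0.1) would not.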

Loss of Level - BWRs

We are still thinking about this one. We believe that the NEI level indicator may miss events where RCS level drops below the "spill over point" (the steam separator turnaround point), resulting in a loss of circulation between the downcomer and the core region.
