ML20207K006

| ML20207K006 | |
|---|---|
| Issue date: | 03/05/1999 |
| From: | Spector, A., NRC (Affiliation Not Assigned) |
| To: | NRC (Affiliation Not Assigned) |
| References: | NUDOCS 9903170010 |
March 5, 1999

MEMORANDUM TO: File

FROM: August K. Spector, Communication Task Lead (original signed by)
      Inspection Program Branch
      Division of Inspection Program Management
      Office of Nuclear Reactor Regulation
SUBJECT: SUMMARY OF THE FEBRUARY 10, 1999 MEETING WITH THE NUCLEAR ENERGY INSTITUTE TO DISCUSS THE CONTINUED DEVELOPMENT OF PERFORMANCE ASSESSMENT PROCESS AND INSPECTION PROGRAM IMPROVEMENTS
On February 10, 1999, a public meeting was held between the NRC and NEI to continue exchanging information and views in further developing the concepts sent to the Commission for improving the process for overseeing the safety performance of nuclear power reactors. The meeting agenda, a list of those who attended the meeting, a copy of written information exchanged at the meeting, and summary minutes are attached.
Attachments: As stated

Contact: August K. Spector, 301-415-2140
Distribution:
Central Files
PUBLIC
PIPB R/F

DOCUMENT NAME: G:\SECY2\210MIN
To receive a copy of this document, indicate in the box: "C" = Copy without enclosures, "E" = Copy with enclosures, "N" = No copy

OFFICE: PIPB:DIPM
NAME: AXSpector
DATE: 3/5/99

OFFICIAL RECORD COPY
9903170010 990305
PUBLIC MEETING MINUTES

Date: February 10, 1999
Time: 9:00 a.m. to 3:30 p.m.
Topic: NRC/NEI MEETING TO DISCUSS THE CONTINUED DEVELOPMENT OF PERFORMANCE ASSESSMENT PROCESS AND INSPECTION PROGRAM IMPROVEMENTS
Attendees: See attached listing
Items distributed: See attachments
Overview:
For NEI and the NRC to discuss and review the NRC's development of performance assessment process and inspection program improvements. Participants shared progress by the NRC's Transition Task Force (TTF) on the new regulatory oversight initiative and gained input from NEI and the public.
Issues discussed:

Training initiatives:
NRC discussed proposed dates for conducting three workshops/training sessions. Two of these will be open to the public prior to the initiation of the pilot study. Public sessions are to be designed to provide information related to performance indicators, reporting activities, etc. Workshops are planned to be held in Region III, Region II, and possibly Washington, DC, subject to space availability. NEI will provide instructional assistance at these sessions. One workshop, closed to the public, will be designed to train internal NRC employees on the new procedures.
Communication initiatives:
NRC distributed an updated listing (see attached) of currently planned meetings, conferences, training sessions, and other communication activities. NEI provided a calendar of nuclear industry meetings/conferences planned for the next several years. NEI also provided a listing of questions from its members related to the proposed plant assessment process, derived from its recent conference (see attached).
Contact: August Spector, NRC, 301-415-2140
Inspection Program initiatives:
The NRC TTF Inspection Task Lead introduced members of the Inspection Task Group, which will be meeting during the next few weeks on developing inspection program activities.
Performance Indicator initiatives:
Cornerstone areas were discussed, with NRC indicating current efforts to clarify cornerstone areas. NEI indicated that it is developing a document which will describe the EP, Security, and Radiation Protection cornerstones. This document will be considered for inclusion in the NRC response. NEI briefed participants on its recent workshop and distributed to NRC a workshop participant notebook which includes viewgraphs of each presentation. NRC distributed its draft Process for Characterizing Risk Significance of Inspection Findings (see attached) and discussed its framework.
Pilot Project:
NEI reported that all proposed pilot plants, with the exception of Prairie Island, have been notified and that NEI would confirm participation to NRC. It was agreed that NRC would notify each State Program Office of those plants selected for participation in the pilot program. NRC distributed and reviewed its draft Objectives of the Regulatory Oversight Process Improvement Pilot Program report (see attached) and the current draft of the PI Pilot Program Reporting Manual.
Feasibility Study:
NRC distributed and reviewed the draft Inspection Non-conformance Evaluation Matrix in relation to the emergency preparedness cornerstone (see attached). Discussion between NRC and participants related to the correlation of enforcement policy and regional application during the pilot study.
Next meeting:
Agreed to hold the next public meeting on February 24, 1999, from 9:00 a.m. to 3:30 p.m.

Meeting adjourned at 3:30 p.m.
ATTENDEES
Public Meeting, February 10, 1999
NUCLEAR ENERGY INSTITUTE
Ellen Ginsberg
Tom Houghton
Stephen D. Floyd

NUCLEAR REGULATORY COMMISSION
David Gamberoni, NRR, DISP
Peter Eselgroth, Region I
Lee Miller, TTC
Pete Wilson, NRR
Jim Lieberman, OE
Alan Madison, NRR, DISP
Tim Frye, NRR, DISP
Steven Stein, NRR, DISP
Michael Runyan, Region IV
Jim Heller, Region III
McKenzie Thomas, Region III
Dave Nelson, OE
Barry Westreich, OE
William Dean, NRR, DISP
Frank Gillespie, NRR, DISP
Michael R. Johnson, NRR, DISP
August Spector, NRR, DISP
Jeffrey Jacobson, NRR, DISP
Donald Hickman, NRR, DISP
Morris Branch, NRR, DISP
Garreth Parry, NRR
Renee Pedersen, OE
R.W. Borchardt, OE
Desiree R. Calhoun, OE
Terry Reis, OE
Douglas Coe, NRR, DISP

OTHERS
Joe Burton, NPPD/CNS
Michael Callahan, Self
Rosemary Reeves, NUS-IS
Sidney Crawford, Self
Robert W. Boyce, PECO
James McCarthy, Virginia Power
Kevin Nietmann, Baltimore Gas & Electric Company
Jeffrey Reinhart, INPO
OTHERS (continued)
Deann Raleigh, Bechtel Power
Rosemary Reeves, NUS-IS
Ralph Shell, TVA
Edward J. Vigluicci, TVA
John Lamberski, Troutman Sanders
Greg Gibson, Southern California Edison
Don Irwin, Hunton & Williams
Bill Baer, Morgan, Lewis & Bockius
Doug True, Erin Engineering
Draft: Regulatory Oversight Process Communication Plan

January 1999
1/14  Brief Regional DRP Directors
1/14  Meet with NEI to discuss Pilot Plan
1/20  Commission briefing on Process Recommendations
1/20  Enforcement Coordinators Briefing
1/22  Press Release to announce 30-day comment period
1/26  Brief ACRS on Final Recommendations
1/27  NEI/Public Meeting
1/28  Brief Industry Regulatory Compliance and Technology Group
1/28  Visit Salem

February 1999
2/2   NEI Meeting with Industry; Site VPs/Licensing Managers - East
2/3   R-I Town Meeting Conference Call
2/3   NEI Meeting with Industry; Site VPs/Licensing Managers - West
2/10  NEI/Public Meeting, coordinated with OE
2/11  NEI Task Force Briefing of NSIAC
2/17  R-II Resident Meeting
2/18  R-IV Resident Counterpart Meeting
2/23  Public Comment Period ends
2/24  NEI/Public Meeting
TBA   Regional Meetings (coincide with PPRs to describe new process)

March 1999
3/3-5 Regulatory Information Conference (introduce process concepts)
3/11  NEI/Public Meeting
3/24  NEI/Public Meeting
3/26  Draft IP and IMC 0610 & PIM Guidance for Pilot use issued for comment (made available to the public)

April 1999
4/6-8 Briefing for American Power Conference (Frank Gillespie, presenter)
4/7   NEI/Public Meeting
4/12-16 PI Workshop (R-3), public
4/22  NEI/Public Meeting
4/26-30 Inspector Workshop (R-2), NRC

May 1999
5/4-6 R-I Resident Meeting (tentative)
TBA   Joint NRC/NEI meeting to resolve issues prior to Pilot
5/10-14 Pilot Workshop, public, R-I/HQ (TBA)

June 1999
6/10  ANS Conference presentation (tentative)
6/15  Issue Press Release on Enforcement Revisions

July 1999
7/12  Present at MIT Course (Gillespie)
7/15-30 Conduct Regional Meetings with States on details of new process

September 1999
TBD   Brief Commission TAs on Progress

October 1999
10/11-25 (TBD) Conduct joint NRC/Industry 2-day Workshop (NRC/NEI); issue a Press Release regarding the Workshop

December 1999
Brief Commission TAs

January 2000
1/15  Press Release issued announcing full implementation and SALP deletion

May 2000
Commission Briefing on Assessment results; Press Release issued

Notes:
1. Change Coalition, Change Champion, and other internal communication vehicles to be ongoing.
2. Public information to be posted on NRC Web page.

NRC 2/99
DRAFT
OBJECTIVES OF THE REGULATORY OVERSIGHT PROCESS IMPROVEMENT PILOT PROGRAM

1. Limited-scale exercise of processes to evaluate whether they can function efficiently, including:
   - Performance indicator data collection and reporting by the industry
   - Risk-informed baseline inspection program implementation by the NRC
   - Evaluation of PI and inspection results and determination of appropriate actions through the assessment process
   - Enforcement process implementation

2. Identify problems with processes and implementing procedures and make appropriate changes prior to full implementation in January 2000:
   - Final PI collection and reporting guidance to the industry by October 1999
   - Inspection procedures, IMC 0610, IMC 2515, etc., issued by December 1999
   - Final enforcement policy revisions by December 1999
   - Assessment process management directive issued by February 2000

3. To the extent possible, evaluate the effectiveness of the processes to determine whether:
   - PIs and their thresholds provide an objective measure of plant performance and can accurately reflect changing trends in licensee performance
   - The baseline inspection program adequately supplements PIs so that the combination of PIs and inspection provides reasonable assurance that the cornerstone objectives are being met
   - The baseline inspection program is effective at independently verifying the accuracy of the PIs
   - Enforcement actions are taken more consistently
   - The assessment process and action matrix are sufficient to aid in making consistent action decisions for plants with varying levels of performance

DRAFT 2/10/99
DRAFT
REACTOR OVERSIGHT PILOT PROGRAM

Leads:
1. Pilot coordinator - Tim Frye
2. PI lead - Don Hickman
3. Inspection program lead - Steve Stein
4. Assessment lead - Dave Gamberoni
5. Enforcement lead - Dave Nelson
6. Information systems lead - Tom Boyce

Prerequisites:
1. Develop inspection finding significance screening process - February 1999
2. Perform process feasibility study - February 1999
3. Establish Transition Task Force (TTF) - February 1999
4. Select pilot plants - February 1999
5. Develop success criteria for all parts of pilot - February 1999
6. Develop PI procedure (Management Directive, administrative letter) - March 1999
7. Develop baseline inspection procedures, PI verification inspection procedure [also IMC 0610, IMC 2515 revisions] - March 1999
8. Develop assessment procedure (Management Directive) - April 1999
9. Develop enforcement procedure (guidance document) - April 1999
10. [item illegible in source] - April 1999
11. Train licensees - April 1999
12. Train BCs, SRIs, RIs, and PEs - April 1999
13. Joint NRC/Industry workshop - May 1999
14. Issue PI procedure, baseline inspection procedures, enforcement procedure - May 1999

Major pilot activities:
1. Licensee PI data collection and submittal - May 1999
2. Commence pilot program - June 1, 1999
3. PI verification inspection - July 1999
4. Periodic NRC/Industry meetings to review pilot results - Jul, Sep, Dec 1999
5. NRC baseline inspection trial and documentation in inspection reports - Jul, Sep, Nov 1999
6. Assessment, quarterly review - September 1999
7. Assessment, mid-cycle review - December 1999
8. Enforcement - as required

Analysis:
1. PI results - ongoing, December 1999
2. Baseline inspection procedure evaluations - ongoing, December 1999
3. Assessment process efficacy - December 1999
4. Enforcement process efficacy - December 1999
5. Evaluate success criteria - ongoing, December 1999

Final products:
1. PI procedure - October 1999
2. Baseline inspection procedures (& PI verification procedure) - December 1999
3. Assessment procedure - February 2000
4. Enforcement procedure - December 1999
5. Information systems - December 1999

DRAFT 2/10/99
DRAFT
PILOT PROGRAM GROUND RULES

- The pilot plants would receive the new baseline inspection program in lieu of the current core program.

- Pilot plants would be assessed under the new assessment process in lieu of the current PPR process (no August PPR for pilot plants). Assessment for pilot plants will occur under the mid-cycle review, scheduled for November.

- PI data collection for the pilot program will start in May 1999, with the first PI report due June 15, 1999. In addition to the pilot plants, the NEI task group plants will be asked to also participate in the PI reporting portion of the pilot. The participating plants will be asked to collect and report two years' worth of historical PI data to supplement the data collected during the pilot.

- Pilot plants will be handled under the new enforcement policy, in lieu of the current enforcement policy.

- Subsequent to the completion of the pilot program, pilot plants would continue under the new oversight processes if full implementation is delayed for the short term (less than 3 months). If it is expected that full implementation will be delayed for greater than three months, then staff will evaluate restoring pilot plants to the current regulatory oversight processes.

- The risk-informed baseline inspection program would be piloted as follows:
  - Inspection planning would be tested at all pilot plants.
  - Adjustments to the inspection schedule would be tested at all plants.
  - All new inspection procedures will be tested, but not necessarily at all plants. For example, the biannual problem identification and resolution inspection procedure might be tested at only 3 pilot plants.
  - The PI verification portion of the inspection program will be tested at all plants, but not all PIs would need to be verified at each plant.
  - As many inspectable areas as possible will be tested, based on their intended frequency and the availability of associated activities. Some inspectable areas will not be used because they will not be applicable to the pilot sites, such as the refueling and outage related activities and several of the occupational exposure inspectable areas.

- Regional inspection planning meetings, with program office oversight and assistance, will be held for each pilot plant in May 1999. At this time, previously scheduled regional initiative inspections will be re-evaluated to determine the continued need for the inspection under the new oversight framework. The need for additional regional initiative inspection during the pilot program will be determined based on PIs and baseline inspection findings.

- A mid-cycle review and inspection planning meeting will be held for each pilot plant, and a six-month inspection look-ahead letter will be issued for the pilot plants by the end of November 1999. These assessment and inspection planning activities will be based on the 5 months of pilot data collected by the end of October 1999.

- Pilot plants will be discussed as part of the April 2000 SMM. Performance review and discussion at the screening meetings will be based on PIs and baseline inspection results, and actions specified by the action matrix. The action matrix will be used to the extent practicable to determine those pilot plants that need to be discussed further at the SMM.

DRAFT 2/10/99
DRAFT
SUCCESS CRITERIA
REGULATORY OVERSIGHT PROCESS IMPROVEMENT PILOT PROGRAM

The following success criteria will be used to evaluate the results of the regulatory oversight process improvement pilot program. These criteria will determine whether the overall objectives of the pilot program have been met, and whether the new oversight processes: 1) ensure that plants continue to be operated safely, 2) increase the public confidence in regulatory oversight, 3) improve the efficiency and effectiveness of regulatory oversight by focusing agency and licensee resources on those issues with the most safety significance, and 4) reduce unnecessary regulatory burden on licensees as the processes become more efficient and effective.

Performance indicator data collection and reporting by the industry

- Can PI data be accurately reported by the industry, in accordance with reporting guidelines? Yes, if by the end of the pilot program, each PI is reported accurately for at least 8 out of the 9 pilot plants.

- Can PI data results be submitted by the industry in a timely manner? Yes, if by the end of the pilot program, all plants submit PI data within 1 business day of the due date.

Risk-informed baseline inspection program implementation by the NRC

- Can the inspection planning process be efficiently performed to support the assessment cycle? Yes, if the planning process supports issuing an inspection look-ahead letter within four weeks from the end of an assessment cycle.

- Are the inspection procedures clearly written so that the inspectors can consistently conduct the inspections as intended? Yes, if by the end of the pilot program, resources expended to perform each inspection procedure are within 25% of each other for at least 8 out of the 9 pilot plants.

- Are fewer NRC inspection resources required to perform the new risk-informed baseline inspection program? Yes, if the direct inspection resources expended to perform the baseline program are about 15% less than those expended for the core inspection program.

- Can the inspection finding evaluation guidance be used by inspectors and regional management to efficiently categorize inspection findings in a timely manner? Yes, if by the end of the pilot program, inspection reports and updated plant issues matrices (PIMs) can be issued within 30 days of the end of an inspection period.

- Can inspection findings be properly assigned a safety significance rating in accordance with established guidance? Yes, if by the end of the pilot program, at least 95% of the inspection findings were properly categorized and no risk-significant inspection findings were screened out. Success will be determined by an independent review by the pilot program evaluation panel.

- Are the scope and frequencies of the baseline inspection procedures adequate to address their intended cornerstone attributes? Success will be determined by an independent pilot program evaluation panel.

- Does the implementation of the entire baseline inspection program (planning, inspection, evaluation of findings, documentation) require no more resources than currently required? Yes, if by the end of the pilot program, baseline inspection program planning, inspection, evaluation of findings, and documentation of inspection results requires no additional resources than are currently allotted for these activities.

Periodic assessment of PI and inspection results to determine appropriate NRC actions

- Can the assessment process be performed within the scheduled time? Yes, if for all pilot plants, an assessment of the PIs and inspection findings can be completed, and an assessment letter can be prepared and issued, within four weeks of the last PI data submittal.

- Can the action matrix be used to take appropriate NRC actions in response to indications of licensee performance? Yes, if there is no more than one instance (with a goal of zero) where the independent pilot program evaluation panel concluded that the action required for a pilot plant is different than the range of actions specified by the action matrix.

- Do the PIs and inspection findings provide an adequate indication of licensee performance? Does the process provide a reasonable assurance that the cornerstone objectives are being met and safe plant operation is maintained? Success will be determined by an independent pilot program evaluation panel.

- Are the mid-cycle assessments performed, with inspection look-ahead letters issued, for the pilot plants in a manner that is consistent across the regions and that meets the objectives of the assessment program guidance? Success will be determined by an independent pilot program evaluation panel.

Enforcement process implementation

- Can the revised enforcement process be efficiently implemented by regional and HQ staff? Yes, if for at least 90% of the cases only one enforcement panel is needed to determine the significance and disposition of a violation.

- Are inspection findings appropriately dispositioned in accordance with the new enforcement policy? Yes, if 95% of the issues are handled in accordance with enforcement policy requirements, as determined by an independent pilot program evaluation panel.

- Are enforcement actions taken for individual findings consistent with the inspection finding evaluation guidance? Yes, if 95% of the enforcement actions taken are determined to be consistent with the inspection finding evaluation guidance, as determined by an independent pilot program evaluation panel.

Information Management Systems

- Are the assessment inputs and results readily available to the public? Yes, if by the end of the pilot program, the NRC information systems support receiving industry data, and PIs and inspection findings are publicly available on the Internet within 30 days of the data submittal.

- Are the time reporting and budget systems ready to support the process changes? Yes, if by the end of the pilot program, these information systems support the reporting of time expended for regulatory oversight, and discrepancies with the reporting of hours are less than 5%.

- Are the NRC information support systems, such as the Reactor Program System (RPS) and its associated modules, ready to support full implementation of the new oversight processes? Yes, as determined by evaluation by the pilot plant evaluation panel.

Overall

- Have inspectors and managers been provided adequate training to successfully implement the new oversight processes? Yes, as determined by a customer satisfaction survey evaluated by the pilot plant evaluation panel.

- Are the new regulatory oversight processes overall more efficient and effective? Yes, if by the end of the pilot program, overall agency resources required to implement the inspection, assessment, and enforcement programs are about 15% less than currently required.

- Do the new oversight processes remove unnecessary regulatory burden, as appropriate, for the pilot plants? Yes, as determined at the end of the pilot program, based on the results of a pilot plant licensee survey.

DRAFT 2/10/99
DRAFT
PILOT PLANT SELECTION

The following criteria were used to identify potential sites for the pilot plant program:

- To the maximum extent possible, licensees were chosen that had either volunteered to be a pilot plant or had participated in the NEI regulatory oversight process improvement task group. The number of different licensees chosen to participate was also maximized.
- Plants were chosen to represent a broad spectrum of performance levels, but did not include plants that were in extended shutdowns due to performance issues.
- A mix of both pressurized water reactors (PWRs) and boiling water reactors (BWRs) was chosen.
- A mix of plant vendors and ages was chosen.
- To the extent possible, two plants with different performance levels within each region were chosen.
- NRC regional office concerns, such as experience of NRC staff associated with pilot plants and transition issues (such as expected departure of key NRC personnel during the pilot study), were considered.
- Licensee concerns, such as their involvement with other significant NRC activities (license renewal, steam generator replacement, etc.), were considered.

DRAFT 2/10/99
DRAFT
Potential Pilot Plants

| Region | Plant | Licensee | Last SALP | PWR/BWR | Vendor / Age |
|---|---|---|---|---|---|
| 1 | Hope Creek | Public Service Electric & Gas (PSE&G) | 2221 | BWR | General Electric (GE) Type 4 / 13 years |
| 1 | Salem 1&2 | PSE&G | 1221 | PWR | 4 Loop Westinghouse (W) / 20 years |
| 1 | Fitzpatrick | Power Authority of the State of New York | 2222 | BWR | GE Type 4 / 24 years |
| 2 | Harris | Carolina Power & Light Company | 1121 | PWR | 3 Loop W / 12 years |
| 2 | Sequoyah 1&2 | Tennessee Valley Authority | 2221 | PWR | 4 Loop W / 18 years |
| 3 | Prairie Island 1&2 | Northern States Power Company | 2121 | PWR | 2 Loop W / 25 years |
| 3 | Quad Cities 1&2 | Commonwealth Edison Company | 2332 | BWR | GE Type 3 / 26 years |
| 4 | Ft. Calhoun | Omaha Public Power District | 2212 | PWR | Combustion Engineering (CE) / 26 years |
| 4 | Cooper | Nebraska Public Power District | 2231 | BWR | GE Type 4 / 25 years |

SUMMARY: 9 plants, 8 licensees; 5 PWRs, 4 BWRs; 4 W plants, 1 CE plant, 4 GE plants.

Note: Licensees for shaded plants are not on the NEI task group.

DRAFT 2/10/99
Process for Characterizing Risk Significance of Inspection Findings

Objectives

1) To characterize the risk significance of an inspection finding consistent with the regulatory response thresholds used for Performance Indicators in the NRC licensee performance assessment process (and for the enforcement process?), and

2) To provide a risk-informed framework for discussing and communicating the potential significance of inspection findings.

Entry Conditions

This process is designed to assess inspection findings within the cornerstones for initiating events, mitigation systems, and barrier integrity under the Reactor Safety Strategic Performance Area. There are certain types of possible inspection findings, both within and outside of this Strategic Performance Area, that cannot be assessed using this process. These findings either 1) must be evaluated using non-risk-based methods or 2) will require risk analysis methods beyond the scope of this relatively simple process. First among these are findings under the emergency preparedness cornerstone and the radiation safety and safeguards areas. The significance of these findings will be assessed by the process described in (XXXXXXX). Additionally, if the finding involves any of the following, it may be a candidate for "exceptional" treatment outside of this process: a significant programmatic weakness not yet manifested by actual degraded performance; multiple performance issues that individually would be considered minor but collectively point to an uncorrected underlying cause; the safety of ex-core reactor fuel (e.g., spent fuel); seismic qualification issues; or core safety during shutdown conditions. Finally, any actual event (e.g., a reactor trip) that is complicated by equipment malfunction or operator error will be assessed by NRC risk analysts outside of the process described here.
Defining Characteristic

The most important characteristic of this process is intended to be that it elevates potentially risk-significant issues early in the process and screens out those findings that have minimal or no risk significance. It is further intended that field inspectors and their management be able to efficiently use the basic accident scenario concepts in this process to categorize individual inspection findings by potential risk significance. It presumes the user has a basic understanding of risk analysis methods.

Introduction
The proposed overall licensee assessment process (as defined outside of this document) evaluates licensee performance using a combination of performance indicators (PIs) and inspections. Thresholds have been established for the PIs which, if exceeded, may prompt additional NRC action to evaluate licensee performance and to help understand and arrest a potential decline in performance. The finding assessment process described below evaluates the significance of individual inspection findings so that the overall licensee performance assessment process can compare and evaluate these findings on a significance scale similar to that of the PI information.
Inspection findings related to the reactor safety cornerstones (initiating events, mitigating systems, and barrier integrity) will be assessed differently than the remaining cornerstones (emergency planning, occupational exposure, public exposure, and physical security). For the reactor safety cornerstones, each finding is evaluated using a risk-informed framework that relates the finding to specific structures, systems, or components (SSCs), identifies the core damage scenarios to which the failure of the SSCs contributes, estimates how likely the initiating event for such scenarios might be, and finally determines what capability would remain to prevent core damage assuming the initiating events for the identified scenarios actually occurred. The emergency planning and the non-reactor cornerstones do not lend themselves directly to the risk-informed framework described by this finding assessment process. Therefore, each finding under these cornerstones will be characterized according to separately developed significance criteria.

Process Discussion

The inspection finding assessment process is a graduated approach using a three-phase process to differentiate inspection findings based on their potential risk significance. Findings that pass through a screening phase will generally proceed to be evaluated by the next phase.
Phase 1 - Definition and Initial Screening of Findings: Precise characterization of the finding and an initial screening-out of low-significance findings.

Phase 2 - Risk Significance Approximation and Basis: Initial approximation of the risk significance of the finding and development of the basis for this determination, for those findings that pass through the Phase 1 screening.

Phase 3 - Risk Significance Finalization and Justification: As-needed refinement of the risk significance of Phase 2 findings by an NRC risk analyst.

Phases 1 and 2 are intended to be primarily accomplished by field inspectors and their management. Until a user becomes practiced in its use, it is expected that an NRC risk analyst may be needed to assist with some of the assumptions used for the Phase 2 assessment. However, after inspection personnel become more familiar with the process, risk analyst involvement is expected to become more limited. The Phase 3 review is not mandatory and is only intended to confirm or modify the results of significant ("white" or above) or controversial findings from the Phase 2 assessment. Phase 3 analysis methods will utilize current PRA techniques and rely on the expertise of knowledgeable risk analysts.
Phase 1 - Definition and Initial Screening of Findings

Step 1.1 - Definition of the Inspection Finding

It is crucial that inspection findings be well-defined in order to consistently execute the logic required by this process. The process can be entered with inspection findings that involve one or more degraded conditions influencing equipment or operator reliability, or initiating event frequency. The definition of the finding should be strictly based on the known existing facts and should NOT include hypothetical failures such as the one single failure assumed for licensing basis design requirements. Further, any explicitly stated assumptions regarding the effect of the finding on the safety functions should initially be conservative (i.e., force a potentially higher risk significance), because the final result will always be viewed from the context of those assumptions. Subsequent information or analysis may reduce the significance of the finding, with appropriate explicit rationale. A well-defined finding must make all assumptions explicit, because these assumptions can be modified using this process to examine their influence on the results. Because of the range of possible findings, inclusive rules for defining all possible findings cannot be developed. However, the general rule is that the definition of the finding must address its safety function impact and any assumptions regarding other plant conditions. Some examples are:

1) A finding involving failure or degradation of equipment could be stated as follows: "Equipment/System/Component X does not perform its safety function of...". For example, a motor-operated valve (MOV) in a PWR auxiliary feedwater system that is found with hardened gearbox grease (i.e., degraded) and an MOV with a broken wire (i.e., non-functional) would both be characterized conservatively as "MOV does not perform its safety function of opening to provide flow to the steam generators."

2) A finding involving a deficiency in the design of the plant could be stated as follows: "Equipment/System/Component X would not perform its safety function of... under conditions...". For example, a remote shutdown panel that might be rendered uninhabitable during a cable spreading room fire that causes a loss of offsite power, due to inadequate HVAC dispersion of the resulting smoke, would be characterized as "plant cooldown not possible from control room or remote shutdown panel during a loss of offsite power caused by cable spreading room fire, due to uninhabitability from resulting smoke and loss of power to remote shutdown panel HVAC."

3) A finding involving degradation in operator performance could be stated as follows: "Operator action X would (or would not) be performed under the conditions of...". For example, an observation that operators mis-manipulated offsite power source breakers could be characterized conservatively as "operator error increases the likelihood of a loss-of-offsite-power initiating event and reduces the likelihood of recovery of offsite power sources."

Step 1.2 - Initial Screening of the Inspection Finding

For the purpose of efficiency, the guidelines below screen out those findings that have minimal or no impact on risk early in this process. The screening guidelines are linked to the cornerstones as follows: if there is arguably no significant impact on meeting the reactor safety cornerstone objectives, the finding can be identified as having minimal or no impact on risk, and thus is equivalent to a green PI. The process described in this document focuses on the impact to core damage frequency, and thus only the reactor safety initiating event and mitigating systems cornerstones are addressed. Findings related to barriers and non-reactor safety cornerstones will be addressed separately.

The decision logic is outlined below.

The finding should be analyzed by the Phase 2 assessment process

IF

1.2.1 - the finding and associated assumptions could simultaneously affect any two or more of the following:
  a. increase the estimated frequency of initiating events,
  b. degrade mitigating system reliability,
  c. degrade containment or RCS barrier performance.

OTHERWISE, continue.

The finding may be screened OUT (considered "green" without performing the Phase 2 assessment)

IF

1.2.2 - the finding and associated assumptions do NOT increase the estimated frequency of initiating events,

AND

1.2.3 - the finding and associated assumptions do NOT degrade mitigating system reliability,

AND

1.2.4 - the finding and associated assumptions do NOT degrade containment or RCS barrier performance.

OTHERWISE, continue.

Findings that ONLY affect mitigating systems or barriers may still be screened OUT

IF

1.2.5 - the finding and associated assumptions do NOT represent a loss of safety function of a single train of a multi-train system,

OR

1.2.6 - the finding and associated assumptions represent a loss of safety function of a single train of a multi-train system for LESS THAN the Allowed Outage Time prescribed by the LCO for Technical Specification equipment, or 24 hours for other equipment,

OR

1.2.7 - the finding and associated assumptions represent design or qualification errors or structural degradations that indicate less than expected performance or margin, but do not result in the system or barrier being unable to perform its safety function (e.g., meets NRC Generic Letter 91-18 criteria to remain operable).

Findings that ONLY affect initiating event frequency may still be screened OUT

IF

1.2.8 - the finding and associated assumptions have NO other impact than to increase the likelihood of an uncomplicated reactor trip.

Any inspection finding that is NOT screened out by the above decision logic should be assessed using the Phase 2 process described below, as illustrated in the sketch that follows.
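Taken together, criteria 1.2.1 through 1.2.8 reduce to a short sequence of yes/no tests over the defined finding. The following Python sketch is an illustration only, not part of the draft process; the Finding fields and the phase1_screen name are hypothetical encodings of the questions the inspector answers.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """Hypothetical encoding of a well-defined finding (Step 1.1)."""
    increases_initiator_freq: bool     # affects initiating event frequency
    degrades_mitigation: bool          # affects mitigating system reliability
    degrades_barrier: bool             # affects containment/RCS barrier
    loses_single_train_function: bool  # 1.2.5: loss of one train's safety function
    exceeds_allowed_outage_time: bool  # 1.2.6: longer than LCO AOT (or 24 h)
    design_error_still_operable: bool  # 1.2.7: degraded but still operable
    only_uncomplicated_trip: bool      # 1.2.8: only raises uncomplicated trip odds

def phase1_screen(f: Finding) -> str:
    """Return 'PHASE 2' if further assessment is needed, else 'GREEN'."""
    effects = [f.increases_initiator_freq, f.degrades_mitigation, f.degrades_barrier]
    if sum(effects) >= 2:        # 1.2.1: two or more simultaneous effects
        return "PHASE 2"
    if not any(effects):         # 1.2.2-1.2.4: no effect at all
        return "GREEN"
    if not f.increases_initiator_freq:  # mitigation/barrier-only findings
        if (not f.loses_single_train_function         # 1.2.5
                or not f.exceeds_allowed_outage_time  # 1.2.6
                or f.design_error_still_operable):    # 1.2.7
            return "GREEN"
        return "PHASE 2"
    # initiating-event-only findings
    return "GREEN" if f.only_uncomplicated_trip else "PHASE 2"  # 1.2.8
```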
Phase 2 - Risk Significance Approximation and Basis

Step 2.1 - Define the Applicable Scenarios

Once an inspection finding passes through the Phase 1 screening, it is evaluated in a more detailed manner using the Phase 2 process described here. Licensee-identified issues, when reviewed by NRC inspectors, are also candidates for inclusion in this three-phase process. The first step in Phase 2 is to ask the question "Under what core damage accident scenarios would the identified issue increase the risk to safe plant operation?"

Determining which scenarios make an inspection finding risk-important may be intuitive based on the knowledge and experience of individual inspectors. However, plant-specific PRA studies, safety analysis reports, technical specification bases, and/or emergency operating procedures (as examples) should be reviewed as needed to ensure that the most likely events and circumstances are considered. Specifically, the inspector must determine which core damage scenario(s) are adversely impacted by each specific finding.

During this phase of the process, inspectors may determine that several different scenarios are affected by a particular inspection finding. This can occur in one of two ways.

First, the finding may be related to an increase in the likelihood of an initiating event, which may require consideration of several dominating (i.e., most likely) scenarios resulting from this initiating event.

Second, a finding may be related to a system required to respond to several initiating events. For example, the discovery of a degraded instrument air system could affect plant response to both a loss of offsite power and a LOCA. Each of these two initiating events must be considered separately, so that the next step of the Phase 2 evaluation process can determine which scenario is potentially most significant.

The scenario resulting in the highest significance will be used to establish the initial relative risk significance of the finding. If a Phase 2 assessment of multiple applicable scenarios results in all "green" significance, the user should seek assistance via Phase 3 of this process, since the Phase 2 process cannot effectively "sum" the significance of multiple low-significance scenarios. Additionally, a particular inspection finding may affect multiple cornerstones by both increasing the probability of an initiating event and degrading the capability or reliability of a mitigating system. Again, each applicable scenario must be considered to determine which is the most significant, although scenarios in which both the affected initiating event and system failure contribute would be expected to produce the greatest risk significance.

In identifying possible core damage accident scenarios, consideration must also be given to the role of support systems as well as front-line systems. For example, if a particular initiating event can be mitigated by more than one system providing the same safety function, but all such systems are dependent on a single train of a support system (e.g., service water or emergency ac power), the limiting scenario may involve the failure of the single train of the support system rather than the individual front-line system trains.
/ is N
In step 2.1, the set of core damage accident scenarios or containment failures were determined that could be made more likely by the identified inspection finding (degraded condition). This should result in the identification of one or more initiating events each followed by various sequences of equipment failures or operator errors. To determine the most limiting scenario, perform the following analysis for3Eb set of scenarios with a common initiating event.
s
?
If the finding does not relate to an increased likelihood of an initiating event, the initiating events for which the affected SSC(s) are required are allocated to a frequency range in accordance with guidance provided in Table I below. Table 1 is entered from the left column using the initiating event frequency and from the bottom using the estimated time that the degraded condition occurred, to arrive at a likelihood rating (A - H) for the combination of the initiating event and the existence of the degraded condition.
f g
f-If the finding relates to an increased likeli ood of a specific initiating event, then the likelihood of that initiating event is increased according to the significance of the degradation. For example, if the inspection finding is that loose pans are found inside a steara generator, then the frequency of SGTR for that plant may increase to the next higher frequency category, and Table 1 is entered accordingly.
i Finally, remember that the definition of the finding and the selection of core damage accident scenarios should be strictly based on the known existing facts and should NOT include hypothetical failures, such j
as the one single failure assumed for licensing basis design requirements.
N-
~
,f3>
+
I l;
s
?
l
//
F
.A
.,/ ~~
3 assessmt.rev3.wpd Rev, February 10.1999 (9:28AM) i
Table 1 - Estimated Likelihood Rating for Initiating Event Occurrence During Degraded Period

| Approx. Freq. | Example Event Type | Exposure >30 days | Exposure 3-30 days | Exposure <3 days |
|---|---|---|---|---|
| >1 per 1-10 yr | Reactor trip; loss of main FW; loss of condenser | A | B | C |
| 1 per 10-10^2 yr | LOOP; SGTR; MSLB (outside cntmt); loss of 1 SR AC bus; loss of instr/cntrl air; fire | B | C | D |
| 1 per 10^2-10^3 yr | Small LOCA (PWR); stuck-open PORV/SV; MFLB; flood | C | D | E |
| 1 per 10^3-10^4 yr | Med LOCA (PWR); MSLB (inside cntmt); loss of all service water | D | E | F |
| 1 per 10^4-10^5 yr | Lg/Med LOCA (BWR) | E | F | G |
| 1 per 10^5-10^6 yr | ISLOCA; vessel rupture; severe earthquake | F | G | H |

Use of Table 1 should result in one or more initiating events of interest with an associated likelihood rating ("A" through "H") for each.
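Read as a lookup, Table 1 maps an initiating event's frequency band and the exposure time of the degraded condition to a letter rating. The sketch below is an illustration only, not part of the draft; the band keys and the function name are hypothetical.

```python
# Table 1 rows, most to least frequent; each tuple gives the rating for
# exposure times of >30 days, 3-30 days, and <3 days respectively.
TABLE_1 = {
    ">1 per 1-10 yr":   ("A", "B", "C"),
    "1 per 10-1e2 yr":  ("B", "C", "D"),
    "1 per 1e2-1e3 yr": ("C", "D", "E"),
    "1 per 1e3-1e4 yr": ("D", "E", "F"),
    "1 per 1e4-1e5 yr": ("E", "F", "G"),
    "1 per 1e5-1e6 yr": ("F", "G", "H"),
}

def likelihood_rating(freq_band: str, exposure_days: float) -> str:
    """Table 1 lookup: frequency band + exposure time -> rating A-H."""
    over_30, mid, under_3 = TABLE_1[freq_band]
    if exposure_days > 30:
        return over_30
    return mid if exposure_days >= 3 else under_3

# e.g., a LOOP-frequency scenario with a condition degraded for 10 days:
print(likelihood_rating("1 per 10-1e2 yr", 10))  # -> "C"
```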
Step 2.3 - Estimation of Remaining Mitigation Capability

The scenarios of interest have now been identified, and the likelihood of the frequency with which the initiating events associated with these scenarios occur has been estimated. Each scenario represents an initiating event followed by a series of system, component, or human failures. For the evaluation of the impact of each scenario on risk, the conditional probability of failure of redundant or diverse success paths will need to be assessed for each scenario. If an inspection finding involves a system or component in a failed state (e.g., pump rotor found seized), then the conditional probability would be assessed by combining the independent or common cause failures of the remaining trains or systems that constitute the remaining success paths at the time of the discovery. In order to perform this assessment, the inspector must determine how many alternate success paths were available for each core damage scenario.

Success paths may be redundant trains of components identical to the one discovered in a failed state, or they may be diverse systems that provide the same function. Thus, in reading Table 2, Risk Significance Estimation Matrix, the following interpretation should be made:

- "1 train" refers either to a remaining train that was redundant to the one assumed failed (it is assumed that if a CCF had occurred it would have been part of the finding), or it could be a diverse one-train system (e.g., RCIC).

- A "redundant system" refers to a multi-train system that is used to perform the same safety function as that of the failed system or component (even though it may achieve it in a different way). An example would be depressurization and low pressure injection as a diverse means of inventory control to high pressure injection (BWR).

If an inspection finding involved a potentially recoverable system failure, such as an automatic start feature found failed where indication exists and simple operator action would be able to start the equipment, then such operator action can be credited as one success path. In addition, recovery actions that establish alternate electrical or water supplies through the use of non-safety equipment can also be credited, provided operator training, written procedures, and necessary equipment are appropriately staged and available. Inspector judgements, with support from the SRAs (when crediting non-safety equipment), in estimating remaining mitigation capability will continue to be needed, with the basis for such judgements appropriately documented. When all scenarios have been assigned an associated likelihood and remaining mitigation capability has been estimated, the Table 2 matrix described in the next section (and sketched in code below) can be used to estimate the potential significance of the degraded condition.
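As a minimal illustration of the success-path counting above (not part of the draft; the function name and arguments are hypothetical, and the ordering of categories is assumed to follow the left-to-right column order of Table 2):

```python
def mitigation_category(trains: int, redundant_systems: int,
                        credited_recovery_actions: int = 0) -> str:
    """Map remaining success paths to a Table 2 column label (sketch).

    'trains' counts remaining redundant trains plus diverse one-train
    systems; per Step 2.3, a credited operator recovery action counts
    as one more success path and is folded into the train count here.
    """
    trains += credited_recovery_actions
    if redundant_systems >= 2 or trains >= 3:
        return ">=3 trains or 2 redundant systems"
    if redundant_systems == 1 and trains >= 1:
        return "1 redundant system + 1 train"
    if trains >= 2:
        return "2 trains"
    if redundant_systems == 1:
        return "1 redundant system"
    return "1 train" if trains == 1 else "0"
```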
Step 2.4 - Estimating the Risk Significance of Inspection Findings

The last step of the Phase 2 assessment process is to estimate the finding's relative risk significance. This risk estimation is performed by employing an evaluation matrix (see below) which utilizes the information gained from Steps 2.1 through 2.3 above. Simply stated, the matrix compares the scenario likelihood derived in Step 2.2 with the remaining mitigation capability determined in Step 2.3, and establishes an estimated risk significance for the particular finding. One of only four possible results can be obtained: Green, White, Yellow, or Red.
Table 2 - Risk Significance Estimation Matrix

Columns give the remaining mitigation capability (from Step 2.3); rows give the scenario likelihood (from Step 2.2).

| Scenario Likelihood | >=3 trains or 2 redundant systems | 1 redundant system + 1 train | 2 trains | 1 redundant system | 1 train | 0 |
|---|---|---|---|---|---|---|
| A | Green | White | Yellow | Red | Red | Red |
| B | Green | Green | White | Yellow | Red | Red |
| C | Green | Green | Green | White | Yellow | Red |
| D | Green | Green | Green | Green | White | Red |
| E | Green | Green | Green | Green | Green | Yellow |
| F | Green | Green | Green | Green | Green | White |
| G | Green | Green | Green | Green | Green | Green |
| H | Green | Green | Green | Green | Green | Green |

Example uses of the Phase 1 and 2 process are included in Appendix 1 of this document.
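A minimal transcription of Table 2 as a lookup follows (an illustration only, not part of the draft; the names COLUMNS, MATRIX, and risk_significance are hypothetical):

```python
COLUMNS = [">=3 trains or 2 redundant systems", "1 redundant system + 1 train",
           "2 trains", "1 redundant system", "1 train", "0"]

# Rows are scenario likelihood ratings (Step 2.2); columns follow the
# remaining-mitigation-capability order above (Step 2.3).
MATRIX = {
    "A": ["Green", "White", "Yellow", "Red",    "Red",    "Red"],
    "B": ["Green", "Green", "White",  "Yellow", "Red",    "Red"],
    "C": ["Green", "Green", "Green",  "White",  "Yellow", "Red"],
    "D": ["Green", "Green", "Green",  "Green",  "White",  "Red"],
    "E": ["Green", "Green", "Green",  "Green",  "Green",  "Yellow"],
    "F": ["Green", "Green", "Green",  "Green",  "Green",  "White"],
    "G": ["Green"] * 6,
    "H": ["Green"] * 6,
}

def risk_significance(likelihood: str, mitigation: str) -> str:
    """Estimate a finding's color from Table 2 (sketch)."""
    return MATRIX[likelihood][COLUMNS.index(mitigation)]

# Consistent with Appendix 1: Issue a (likelihood "E", no mitigation
# credited) is yellow; Issue b (likelihood "E", one CCP train credited)
# is green.
print(risk_significance("E", "0"))        # -> "Yellow"
print(risk_significance("E", "1 train"))  # -> "Green"
```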
Phase 3 - Risk Significance Finalization and Justification

If determined necessary, this phase is intended to confirm or modify the earlier screening results from Phases 1 and 2. Phase 3 analysis will utilize current PRA techniques and rely on the expertise of knowledgeable risk analysts. The Phase 3 assessment is not described in this document.
Work Remaining

- Continue to perform sensitivity tests of the issue screen and decision matrix process depicted in this paper, by evaluating known examples.

- The definitions of "remaining capability" need continued refinement.

- Define the threshold required to document this process for specific inspection findings.

- Identify how to address inspection findings that could increase risk due to: a significant programmatic weakness not yet manifested by actual degraded performance; multiple performance issues that individually would be considered minor but collectively point to an uncorrected underlying cause; the safety of ex-core reactor fuel (e.g., spent fuel); seismic qualification issues; and core safety during shutdown conditions.
Appendix 1
INSPECTION FINDING RISK ASSESSMENT PROCESS SENSITIVITY TEST

Background

The sensitivity of the inspection finding risk assessment screening in section ___ was measured by evaluating examples of reactor cornerstone items that represent a typical input to the process. Additionally, several previous ASP events were pushed backwards through the process to ensure that the screening criteria were set at the appropriate level. It is also expected that, because of the simplicity of the model, the process has the potential to overestimate the risk significance, requiring a more refined evaluation before a final assessment can be determined.

TYPES OF ISSUES REVIEWED

Issue a (Region issue): Appendix "R" Safe Shutdown Panel

This scenario involved a failure to meet 10 CFR Part 50 Appendix "R" requirements III.L.2.e and III.L.3. These sections of Appendix "R" require that support functions be capable of providing the process cooling necessary to permit the operation of equipment used for safe-shutdown functions, and that alternative shutdown capability accommodate post-fire conditions where offsite power is or is not available. The case study reviewed postulated that a fire in the control room or in the cable spreading room could cause a loss of HVAC due to a fire-induced loss of offsite power (LOOP). This would result in a loss of HVAC to the electrical equipment room and the adjacent hot shutdown control panel room.

When this item was screened by the risk assessment model, several assumptions were made. It was assumed that a fire in the cable spreading room could grow sufficiently to require evacuation of the main control room and cause a LOOP resulting in a loss of HVAC, which in turn results in the single safe shutdown panel room being inoperable. An additional assumption was that the loss of habitability of the alternative safe shutdown panel room resulted in the inability to shut down and maintain the plant safely shut down under the postulated conditions. When the staff asked the screening questions listed in Step 1.2, the item was considered risk important because of a loss of redundancy in the mitigating capability, and the item passed through the screen, thereby requiring additional review for risk significance.

Determining the risk importance of the Appendix "R" issue using the risk significance estimation matrix involved entering the matrix to determine the approximate frequency and estimated likelihood based on the duration of the condition. The IPEEE and the licensee's risk analysis were used at this step. In the case reviewed, the likelihood was rated as an "E" event (i.e., 1E-05). Once the likelihood was categorized, a review of remaining mitigation capability was performed. Since the scenario resulted in the loss of the one and only safe shutdown panel room, and administrative operator intervention was not assumed, mitigation was not allowed by the matrix. This item would therefore be categorized as a yellow item.
Issue b (LER): Charging/High-Head Safety Injection Pump Configuration Outside Plant Design Basis with All Three Pumps Technically Inoperable

This issue involved the licensee's discovery that all three charging/high-head safety injection pumps (CCPs) were in a configuration that was outside the plant design basis. Specifically, the system design included three CCPs, trains A and B, with the "C" pump being a swing pump that could be powered from either emergency electrical bus. The design included a breaker interlock and trip feature to prevent potential electrical bus overloading while being powered by an EDG. The system configuration resulted in the "C" pump being unable to automatically start because its hand switch was in pull-to-lock. The breaker interlock and trip feature would have tripped the "A" pump if a LOOP occurred, and the "B" pump was assumed to be the single failure. This condition existed for approximately ten days.

When the staff asked the screening questions listed in Step 1.2, the item was considered risk important because of a loss of redundancy, and the item passed through the screen, thereby requiring additional review for risk significance.

Determining the risk importance of the CCP system degradation using the risk significance estimation matrix involved entering the matrix to determine the approximate frequency and estimated likelihood based on a 10-day duration of the condition. In the case reviewed, the likelihood was rated as an "E" based on duration, a LOOP combined with a small LOCA or steam line break event. Once the likelihood was categorized, a review of remaining mitigation capability was performed. Credit for the "B" train CCP would mitigate the event to a "G", which is green. Additionally, if analysis demonstrated that sufficient time would allow operator intervention to start another pump based on EOP actions, the risk of the event would be further reduced.
Issue c (Dual Unit Issue): Two of Three Emergency Diesel Generators Technically Inoperable

This issue involved the discovery that, with Unit 1 at 100% power and Unit 2 in cold shutdown, two of the three EDGs were found to be technically inoperable. Plant design has EDG#1 dedicated to Unit 1 and EDG#2 dedicated to Unit 2, with EDG#3 being a swing diesel that will supply power to one or the other troubled unit. EDG#2 had been removed from service for maintenance three days before it was discovered that one of the two fuel oil transfer pumps for EDG#3 was erroneously tagged shut when EDG#2 was taken out of service. EDG#3 was considered inoperable because a single failure could have resulted in the loss of the ability to supply fuel oil to the limited-capacity day tank, and therefore sustained EDG operation was not assured.

When the staff asked the screening questions listed in Step 1.2, the item was considered of low risk consequence because there was no loss of redundancy other than the allowed outage of EDG#2. EDG#3 was in fact operable, and therefore the item did not pass through the screen and no further review was performed. This event would be considered green as to risk significance.

Issue d: [illegible in source]
Suggested Regulatory Information Conference Questions

1. What do you expect to learn from the pilot plant studies?
2. Previous NRC assessment approaches have resulted in plants being placed on the "Watch List" with little or no warning. How will the revised process provide earlier warning of a plant that needs increased attention?
3. The cornerstones appear to be weighted equally in the process, yet some cornerstones are more important than others in terms of public health and safety. How will the assessment roll-up account for this?
4. What will this process mean for my future as an NRC employee? Is this process just a reaction to fiscal pressures from Congress?
5. When will licensee self-assessments be credited as a further substitute for or reduction to the baseline inspection program?
6. What do you see as the future role of the NRC Regional Offices under this revised approach?
7. How will Chairman Jackson's departure in June affect this process?
8. How is all the effort the industry and NRC put into the maintenance rule being factored into this process?
9. At the September workshops there was a lot of discussion about the "rebuttable presumption" concept, where it would take considerable inspection findings to overturn the PIs. How will NRC management ensure that inspection does not default to a lead role in the process based on a lot of insignificant inspection findings?
10. This process is reported to be risk-informed. Are utility PRAs sufficient to implement this process?
11. Manual scrams are not counted in the WANO indicator based on concerns for inhibiting operator actions. Why does the proposed indicator include manual scrams?
12. The green-white thresholds are set at a level that is significantly above regulatory requirements. What is INPO's role in the new process if the NRC is monitoring performance well above standards?
13. This process appears to place a lot of responsibility on the Resident Inspectors. When and how much training will the Resident Inspectors get to be able to implement this process?
14. How will enforcement be revised to be consistent with the risk-informed thresholds set for PIs and inspection findings?
15. The SECY acknowledges elimination of N+1 residents for multi-unit sites with high performance. What about single-unit sites? Can the single-unit sites also expect some inspection reduction based on performance? Could the second resident inspector be assigned some other duties (e.g., assist the PM in licensing reviews)?
16. How will the assessment process accommodate a momentary red or yellow input that is restored quickly?
17. How will the overall performance results be communicated - colors? Numbers? Or just words?
18. Will the assessment report be based on the latest quarter's worth of PIs and inspection findings, or will it reflect earlier performance over the evaluation period?
19. What if a plant has one white PI or inspection finding each quarter, but all in different cornerstones? How would the assessment be characterized and reported?
NRC Proposed Plant Assessment Process Questions and Comments

# | Section | Area | Pg | Topic | Question or Comment
1 | A | P/I | - | Scrams | Why count manual scrams?
2 | A | P/I | - | - | How do P/I results on one unit affect dual unit sites?
3 | A | P/I | - | RS Scrams | Is retrofit of data required?
4 | A, D | P/I | - | Transients | Weekend criteria are confusing. Clarification required.
5 | A | P/I | - | SSPI | Does this reflect the Maintenance Rule in regard to numbers of systems?
6 | A | P/I | - | SSPI | Will the process consider EPIX reporting requirements?
7 | A | P/I | - | - | Links to other processes need to be identified and documented.
8 | A | P/I | - | SSPI | There are discontinuities between the WANO and NRC performance indicators.
9 | A | P/I | - | SSPI | A consistent definition of unavailability is needed.
10 | A | P/I | - | SSPI | The EDG P/I is not consistent with the extended out-of-service times in some licensees' Technical Specifications.
11 | A | P/I | - | SSF | There is disagreement with the current method of classifying SSFs. An example of this is the amount of time considered when a failure is classified.
12 | A | P/I | - | - | Is data reported monthly or quarterly?
13 | A | P/I | - | RCS leak | Why is the green/white at 50% TS when containment leakage is at 100% TS?
14 | A, D | P/I | - | - | What is the time frame required to be considered when a failure is identified?
15 | A | P/I | - | ERO DEP | What is an evaluated drill? Do we have to count all training?
16 | A | P/I | - | ERO DEP | Do all the measurable areas count towards the 75%?
17 | A | P/I | - | Security | Compensatory measures are capable of meeting the intended function of the security equipment.
18 | A | P/I | - | Security | Does the security equipment performance P/I deviate from the mission of protecting the health and safety of the general public?
19 | A | P/I | - | Security | SECY 99-007 has inconsistencies related to the security equipment P/I.
20 | A | P/I | - | Security | The definition of protected area security equipment includes all components. The use of the word "all" in this context raises the concern that insignificant components may be considered.
21 | A | P/I | - | - | We need to avoid the inclusion of P/Is just because they can be easily met.
22 | A | P/I | - | Security | Do we need to consider the potential effect of making public information concerning the health of security systems at nuclear power generating stations?
23 | A | P/I | 59 | Security | The personnel screening process P/I may create some perverse consequences due to different thresholds at various plants.
24 | A | P/I | 65 | - | Is there an effort to reduce reporting requirements?
25 | A | P/I | 11 | RS Scrams | RS Scrams appear to be too prescriptive. Was the use of ASP considered?
26 | A | P/I | 66 | - | Find out the frequency of the FEMA report related to reporting the status of ERO alert and notification systems.
27 | A | Inspect | 12 | - | Will the RIMs be public documents?
28 | A | Inspect | 12 | - | Will the RIMs be ready in time for the pilot plants?
29 | A | Inspect | 17 | - | Would the significance-of-findings concept be applied to all conditions? Example: would it be applied to planned maintenance configurations?
30 | A | Inspect | 19 | - | If a plant requires other inspections beyond the RIBLI, where will the inspectors come from?
31 | A | Inspect | 19 | - | During the pilots it will be very important to validate the number of inspection hours required to perform the RIBLI.
32 | A | Inspect | 19 | - | The determination of the impact of inspection on the ability of the cornerstone to meet its objectives appears to be the remaining subjective piece of the process.
33 | A | Inspect | 19 | - | Is the inspection significance assessment process expected to be part of the pilot process?
34 | A, D | Inspect | 19 | - | What happened to the cross-cutting issues?
35 | A | Assess | - | - | How does the NRC plan on performing all assessments simultaneously?
36 | A, D | Assess | - | - | Will the 6-week inspection reports continue?
37 | A | Assess | - | - | Will the PIM data still be sent to the licensee?
38 | A, D | Assess | - | - | Is the PIM under development?
39 | A, D | Assess | 10 | - | Will the annual assessment have a number, or color, or letter rating?
40 | A | Assess | 10 | - | Why are there subtle differences in language between licensee action in columns II and III?
41 | A, D | Assess | 10 | - | Are there going to be clear definitions regarding the difference between corrective action and self-assessment? What is the role of self-assessment in this process?
42 | A | Assess | 13 | - | How often will the assessment be reported?
44 | A, D | Assess | 13 | - | How will the corrective action program be assessed? Will it be risk based?
45 | A | Assess | 13 | - | Will there be credit for self-identification?
46 | A | Assess | 13 | - | Will there be a difference between how the process is implemented during extended outages versus normal plant operations?
47 | A | Assess | 13 | - | Does the Initiating Events cornerstone attribute chart represent a report card matrix for the cornerstone?
48 | A | Assess | 13 | - | Will INPO change any of their processes to reflect this process?
49 | A, D | Enf. | 10 | - | The criteria that propose to issue a NOV in instances where there is a failure to place an item in the corrective action system appear to be very subjective.
50 | A | Enf. | 10 | - | How will the NRC disposition the results of corrective action system assessments?
51 | A | Trans. | 11 | - | Will the pilot process be conducted with N or N+1 inspectors?
52 | A, D | Trans. | 14 | - | There is a lot of work remaining to develop the P/I manual. The manual design needs to ensure consistency.
53 | A | Trans. | 18 | - | Is the NRC still interested in certification of PRAs?
54 | A | Trans. | 18 | - | The NRC has not embraced the PRA certification process.
55 | A | Trans. | 18 | - | Has the NRC requested a standard process for corrective actions?
56 | A | P/I | - | Scrams | This measurement should be moved to transients.
57 | A | P/I | - | Security | Do pre-employment failures count?
58 | A, D | P/I | 6 | - | Will there be future development of the thresholds based on plant-specific PRAs?
59 | D | Over. | 10 | - | Are there different levels of significance of the cornerstones?
60 | D | Inspect | - | - | How does the finding significance assessment work? Will it be tied to safety significance?
61 | D | P/I | 3 | - | Are there separate conceptual risk models for the non-reactor areas?
62 | D | P/I | 4 | - | What is the extent of the green band, and will the thresholds shift as industry performance continues to improve?
63 | D | P/I | 7 | Scrams | How would failure to insert the rods IAW procedure be reflected in the process?
64 | D | P/I | 8 | Scrams | Scrams should be counted as transients; this would remove the controversy concerning manual versus automatic.
65 | D | P/I | 8 | Scrams | Why don't we count scrams directed by procedure?
66 | D | P/I | 13 | RS Scrams | This indicator appears to be a potential multiple-counting process when coupled with the P/Is for scrams and transients.
67 | D | P/I | 13 | RS Scrams | Is there data available on actual industry performance in this area?
68 | D | P/I | 13 | RS Scrams | The industry should expect additional NRC inspection if one of these events occurs.
69 | D | P/I | 14 | Trans. | How will this P/I handle response to weather-related events that require plant power reduction? An example of this would be a seaweed intrusion.
70 | D | P/I | 14 | Trans. | This P/I appears to potentially create a negative impact on the commercial decision-making responsibility of station management.
71 | D | P/I | 14 | Trans. | What is the difference between a load-follow transient and a transient required by a maintenance-related activity?
72 | D | P/I | 20 | SSPI | The RHR portion of the SSPI appears to be inconsistent between reactor types and also with the INPO indicators.
73 | D | P/I | 22 | SSFs | We need a list of the 26 systems monitored by this indicator.
74 | D | P/I | 30 | RCS leak | Is the P/I based on total leakage, and what is the green-to-white threshold based on?
75 | D | P/I | 33 | Cont. | If we meet the TS requirements, why do we need this indicator?
76 | D | P/I | 33 | Cont. | Why don't we look at the active containment mitigation systems?
77 | D | P/I | 37 | ERO | Is this negative reinforcement for implementation of less stringent critiques?
78 | D | P/I | 37 | ERO | The SECY document refers to timely and accurate notification; this requires more definition.
79 | D | P/I | 37 | ERO | If you watch your indicators in this area you can be successful in improving your performance.
80 | D | P/I | 39 | ERO | This indicator does not appear to drive us in the right direction. What are we trying to accomplish? The P/I needs more definition.
81 | D | P/I | 45 | RAD | The occupational exposure control P/I criteria need more definition in regard to what constitutes the various elements of the indicator.
82 | D | P/I | 51 | RAD | The offsite release P/I is the single most problematic area, based on plant design. Can we move to a percentage-based indicator?
83 | D | P/I | 55 | Sec. | In the security equipment performance areas, are weather-related events considered?
84 | D | P/I | - | - | Are the near-term P/Is still being worked?
85 | D | P/I | - | RCS leak | This indicator may create unintended consequences because it cuts in half the perceived permissible RCS leak rate.
86 | D | Inspect. | 11 | - | How is the RIM risk based?
87 | D | Inspect. | 13 | - | What is the relationship between RIM 1 and RIM 2?
88 | D | Inspect. | 14 | - | What is going to happen to the inspection module set?
89 | D | Inspect. | 15 | - | What is the difference in direct inspection hours for the specific inspection areas?
90 | D | Inspect. | - | - | Will the inspection process changes affect the current operator training inspection program?
91 | D | Enf. | - | - | Do civil penalties have any impact?
92 | D | Enf. | 10 | - | Is there a thought to eliminate the failure to abate the cause of the violation?
93 | D | Trans. | - | - | Why only run the pilots for 6 months?
NUCLEAR ENERGY INSTITUTE
INDUSTRY CALENDAR 1999

January 5-8, 1999 - EEI CEO Committee Meetings; EEI Board of Directors Meeting; EEI CEO Conference. Scottsdale Princess Hotel, Scottsdale, AZ. Tony Anthony (202) 508-5454
January 7, 1999 - NEI Governmental Affairs Advisory Committee (in conjunction with EEI CEO Conference). Scottsdale Princess Hotel, Salon C, Scottsdale, Arizona
January 19-20, 1999 - INPO Board of Directors; INPO Council meetings. Atlanta, GA
January 21, 1999 - NEI Executive Committee. Waldorf Astoria, New York, NY (Jan 22: Financial Analyst Briefing)
January 24-27, 1999 - Health Physics Society Symposium '99. Albuquerque, New Mexico, USA. Contact: Mr. J.M. Hylko, Fax: (505) 837-6870, email: jhylko@msn.com, website: www.tli.org/rgetitle.htm
January 25-27, 1999 - InfoCast Conference, "Opportunities in the Competitive Nuclear Power Industry." Washington, DC. Contact: Claire Schoor, (818) 902-5405 x36
January 27, 1999 - NEI Nuclear Fuel Supply Forum. Omni Shoreham, Washington, DC
February 7-10, 1999 - PIME '99: 11th International Workshop on Nuclear Public Information in Practice. Avignon, France. European Nuclear Society. Contact: Iris Riesen, Tel: 41-31-320-61-11, E-mail: iris.riesen@to.aey.ch
February 8-9, 1999 - CNA Nuclear Industry Winter Seminar. Ottawa, Ontario. Ms. Sylvie Caron, CNA/CNS Office, 144 Front Street West, Suite 475, Toronto, Ontario M5J 2L7, Canada. Tel: (416) 977-6152 x18, Fax: (416) 979-8356, email: carons@cna.ca
February 10, 1999 - NEI Strategic Issues Advisory Committee. The Wyndham, Washington, DC
February 18, 1999 - NEI Communications Advisory Committee. NEI Office, Washington, DC
February 21-24, 1999 - NEI Energy Info Centers. Sheraton West Palm Beach, West Palm Beach, FL
March 4-5, 1999 - NRC Regulatory Information Conference. The Capital Hilton
March 16-17, 1999 - INPO Annual Meeting; INPO Board of Directors Meeting; Committee Meetings. Atlanta, GA
March 24-25, 1999 - EEI CEO Committee Meetings; EEI Board of Directors; EEI CEO/Governmental Affairs Conference. Willard Hotel, Washington, DC. Tony Anthony (202) 508-5454
March 25, 1999 - NEI Governmental Affairs Advisory Committee (in conjunction with EEI CEO/Govt. Affairs Conference). Willard Hotel, Washington, DC
March 30, 1999 - NEI Executive Committee, 12-3 pm. NEI Offices, Washington, DC
April 6-8, 1999 - American Power Conference (sponsored by the Illinois Institute of Technology in Chicago). Marriott Downtown, Chicago. Bob Porter, Phone: (312) 567-3196, Email: apc@iit.edu
April 11-14, 1999 - NEI Fuel Cycle. Austin Renaissance, Austin, TX
April 12-14, 1999 - JAIF Annual Conference
April 13-14, 1999 - EPRI Board of Directors & Committees. Four Seasons Hotel, Washington, DC
April 15, 1999 - NEI Strategic Issues Steering Group. NEI Offices, Washington, DC
April 18, 1999 - International Conference on Nuclear Engineering (ICONE-7). Tokyo
April 20-22, 1999 - Electric Power 99 (sponsored by POWER). Baltimore Convention Center, Baltimore, MD. P: (713) 463-9595, F: (713) 463-9997, www.electricpowerexpo.com
April 26-29, 1999 - International Exhibition on Nuclear Power Industry. Shanghai Mart, China. Tel: 852-2827-6766, email: general@coastal.com.hk
May 2-5, 1999 - NEI Fire Protection Info Forum. Cleveland Renaissance, Cleveland, OH
May 11-12, 1999 - INPO Board of Directors Meeting; Advisory Council Meeting. Atlanta, GA
May 17-19, 1999 - International Symposium on MOX Fuel Cycle Technologies for Medium and Long Term Deployment: Experience, Advances, Trends. IAEA contact: P: (+43) 2600 (0) plus extension, F: (+43) 2600 7, E-mail: Official.Mail@iaea.org, web: www.iaea.org
May 19-21, 1999 - NEI Nuclear Energy Assembly. Monarch Hotel, Washington, DC. Lisa Steward (202) 739-8006
May 19, 1999 - NEI Executive Committee (in conjunction with the Nuclear Energy Assembly). Monarch Hotel, Washington, DC. Lisa Steward (202) 739-8006
May 19, 1999 - NEI Communications Advisory Committee (in conjunction with the Nuclear Energy Assembly). Monarch Hotel, Washington, DC. Lisa Steward (202) 739-8006
May 19, 1999 - NEI Strategic Issues Advisory Committee (in conjunction with the Nuclear Energy Assembly). Monarch Hotel, Washington, DC
May 20, 1999 - NEI Governmental Affairs Advisory Committee (in conjunction with the Nuclear Energy Assembly). Monarch Hotel, Washington, DC
May 21, 1999 - NEI Board of Directors (in conjunction with the Nuclear Energy Assembly). Monarch Hotel, Washington, DC
May 30 - June 2, 1999 - CNA/CNS Annual Conference. Montreal, Quebec. Ms. Sylvie Caron, CNA/CNS Office, 144 Front Street West, Suite 475, Toronto, Ontario M5J 2L7, Canada. Tel: (416) 977-6152 x18, F: (416) 979-8356, email: carons@cna.ca
June 6-10, 1999 - ANS Annual Meeting. Boston, MA
June 7-8, 1999 - NEI Emergency Planning Forum. Don CeSar Hotel, Saint Petersburg Beach, FL
June 13-15, 1999 - EEI CEO Committee Meetings; EEI Board of Directors Meeting. Broadmoor, Colorado Springs, CO. Tony Anthony (202) 508-5454
June 14-18, 1999 - International Conference on the Strengthening of Nuclear Safety in Eastern Europe. IAEA contact: P: (+43) 2600 (0), F: (+43) 2600 7, E-mail: Official.Mail@iaea.org, web: www.iaea.org
June 20-23, 1999 - NEI Health Physics. Indian River Plantation, Stuart, FL
July 14, 1999 - INPO Board of Directors Meeting. Atlanta, GA
July 20, 1999 - NEI Executive Committee. NEI Offices, Washington, DC
July 28, 1999 - Nuclear Fuel Supply Forum. Willard Hotel, Washington, DC
August 10-11, 1999 - EPRI Board of Directors & Committees. Coronado Island Marriott Resort, Coronado, CA
August 28, 1999 - NEI Strategic Issues Steering Group. NEI Offices, Washington, DC
August 30 - September 3, 1999 - International Symposium on Technologies for the Management of Radioactive Waste from Nuclear Power Plants and Back-end Nuclear Fuel Cycle Activities. IAEA contact: P: (+43) 2600 (0), F: (+43) 2600 7, E-mail: Official.Mail@iaea.org, web: www.iaea.org
September 7-9, 1999 - EEI Chief Executive Conference. Convention/Expo Center, Long Beach, CA
September 8-10, 1999 - Uranium Institute 24th Annual Symposium. London, U.K. Contact: 44-171-225-0303, Fax: 44-171-225-1308
September 14-15, 1999 - INPO Board Meeting; Advisory Council Meeting; INPO Nominating Committee. Atlanta, GA
September 19-21, 1999 - WANO Biennial General Meeting. Empress Hotel and Victoria Conference Center, Victoria, BC, Canada
September 26-29, 1999 - NEI Info 99 / Crisis Communications Workshop. San Antonio, TX
September 30, 1999 - NEI Governmental Affairs Advisory Committee, 9:30-11:30 am. NEI Offices, Washington, DC
September 30, 1999 - NEI Executive Committee, 12:00-3:00 pm. NEI Offices, Washington, DC
October 3-6, 1999 - NEI International Uranium Fuel Seminar. The Sagamore on Lake George, Bolton Landing, New York
October 17-20, 1999 - EEI Financial Conference. Disney Dolphin, Lake Buena Vista, FL. Tony Anthony (202) 508-5454
October 17-22, 1999 - NEI Fundamentals of Nuclear Communication - Training Seminar. Bethesda Hyatt, Bethesda, MD
October 18-21, 1999 - NEI Decommissioning Forum. Marriott at Sable Oaks, Portland, ME
October 18-21, 1999 - NEI Fire Protection Workshop. Don CeSar Hotel, Saint Petersburg Beach, FL
October 19-22, 1999 - International Conference on Irradiation to Ensure Safety and Quality of Food. Marrakesh, Morocco. IAEA contact: P: (+43) 2600 (0), F: (+43) 2600 7, E-mail: Official.Mail@iaea.org, web: www.iaea.org
October 21, 1999 - NEI Executive Committee; NEI Strategic Issues Advisory Committee. NEI Office, Washington, DC
October 28, 1999 - NEI Communications Advisory Committee. NEI Office, Washington, DC
November 3-5, 1999 - INPO Board of Directors/CEO Meeting. Atlanta, GA
November 10-12, 1999 - INPO CEO Conference; INPO Board of Directors. Atlanta, GA
November 14-18, 1999 - ANS Winter Meeting. Long Beach, California. Contact: ANS Office, 555 N. Kensington Avenue, La Grange Park, Ill. 60526. Tel: (708) 579-8258
December 8, 1999 - NEI Strategic Issues Steering Group. Washington, DC

2000
January 4-5, 2000 - INPO Advisory Council Meeting; INPO Personnel Development & Comp Mtg; INPO Board of Directors Meeting. Atlanta, GA
January 12-14, 2000 - EEI CEO Conference; EEI Board of Directors Meeting; EEI CEO Committee Meetings. Ritz Carlton, Rancho Mirage, CA. Tony Anthony (202) 508-5454
January 19, 2000 - NEI Nuclear Fuel Supply Forum. The Peabody Memphis, Memphis, TN
February 29, 2000 - INPO Audit Committee Meeting; INPO Investment Review Committee Meeting. Atlanta, GA
March 1, 2000 - INPO Annual Meeting; INPO Board of Directors Meeting. Atlanta, GA
April 2-6, 2000 - NEI Fuel Cycle 2000. The Peabody Memphis, Memphis, TN
May 8-9, 2000 - NRC Regulatory Information Conference
May 9-10, 2000 - INPO Advisory Council Meeting; INPO Board of Directors Meeting. Atlanta, GA
June 18-21, 2000 - EEI Convention. Palais des Congrès, Montreal, Canada
July 12, 2000 - INPO Board of Directors. Atlanta, GA
September 5-7, 2000 - EEI CEO Committee Meetings; EEI Board of Directors Meeting. Renaissance Chicago Hotel, Chicago, IL. Tony Anthony (202) 508-5454
September 12-13, 2000 - INPO Advisory Council Meeting; INPO Nominating Committee; INPO Board of Directors Meeting; WANO-AC Governing Board Meeting. Atlanta, GA
September 24-27, 2000 - NEI International Uranium Fuel Seminar. Resort at Squaw Creek, Olympic Valley, CA
October 29 - November 1, 2000 - EEI Financial Conference. San Francisco Hilton, San Francisco, CA. Tony Anthony (202) 508-5454
October 31, 2000 - INPO Board of Directors Meeting; WANO-AC Governing Board Meeting. Atlanta, GA
November 2-3, 2000 - INPO CEO Conference. Atlanta, GA

2001
January 12-14, 2001 - EEI Chief Executive Conference. Westin La Paloma, Tucson, AZ
April 1-4, 2001 - NEI Fuel Cycle. Grand Hyatt San Francisco, San Francisco, CA
June 3-6, 2001 - EEI Convention/Expo. Hyatt Regency, New Orleans, LA

2002
January 9-11, 2002 - EEI Chief Executive Conference. Scottsdale Princess Hotel, Scottsdale, AZ

2003
EEI Chief Executive Conference. Ritz Carlton, Naples, FL
DESCRIPTION OF ACRONYMS

ACORD - American Committee on Radwaste Disposal
AECL - Atomic Energy of Canada Limited
AEIC - Association of Edison Illuminating Companies
ANS - American Nuclear Society
APPA - American Public Power Association
CIP - Congressional Information Program
CNS - Canadian Nuclear Society
EEI - Edison Electric Institute
EIC - Electric Information Council
ENS - European Nuclear Society
EPRI - Electric Power Research Institute
GNS - German Nuclear Society
IAEA - International Atomic Energy Agency
INPO - Institute of Nuclear Power Operations
NEI - Nuclear Energy Institute
NARUC - National Association of Regulatory Utility Commissioners
NERC - North American Electric Reliability Council
NRC - U.S. Nuclear Regulatory Commission
NRECA - National Rural Electric Cooperative Association
NUFCOR - Nuclear Fuels Corporation of South Africa
SEE - Southeastern Electric Exchange
SNE - Spanish Nuclear Society
WANO - World Association of Nuclear Operators
PERFORMANCE INDICATOR
PILOT PROGRAM REPORTING MANUAL

February 1999

Office of Nuclear Reactor Regulation
U.S. Nuclear Regulatory Commission
Performance Indicator Pilot Program Reporting Manual
1. Introduction

The purpose of this manual is to provide guidance to reactor licensees for reporting the data necessary to support the NRC's performance assessment pilot program. This manual has been developed through a cooperative effort among the NRC, the Nuclear Energy Institute, the public, and other stakeholders during a series of public meetings. Performance indicators have been selected to monitor licensee performance in certain areas. Thresholds have been established for these indicators to provide clear boundaries between fully acceptable, declining, and unacceptable levels of performance. (For a detailed description of how the thresholds were established, see the appropriate appendix to Attachment 2 of SECY-99-007, "Recommendations for Reactor Oversight Process Improvements.") The purpose of the pilot program is to determine whether the performance indicators and their associated thresholds are appropriate for their intended use.
2. General Reporting Guidance

Licensees participating in the pilot program will continue to submit reports in accordance with 10 CFR 50.72 and 50.73. Any such reports involving a cornerstone performance indicator should clearly identify the name of the indicator. Some events, such as safety system failures, may require engineering analysis to determine if they are reportable. In these cases, it is best to make a timely, conservative decision regarding reportability, followed by a more thorough analysis. If it can be established that an event that has been reported is not required to be reported, licensees should retract the report by the same method used to send the original report; that is, if the report was made by a phone call to the NRC, it should be retracted by a phone call rather than by a letter.

During the pilot program, participating licensees should compile the cornerstone performance indicator data described in this manual on a monthly basis and submit the data electronically within __ days of the end of each month in the format described in Appendix A. Each monthly report should include the plant name and the date of the report. For each indicator that was reported in accordance with 10 CFR 50.72 and/or 50.73, include the applicable 50.72 event number and/or the LER number. Those indicators for which data are only available periodically (e.g., containment leakage or emergency response organization drill/exercise performance) need be reported only for those months when new data become available.
3. Cornerstone Performance Indicator Data Reporting Guidance

This section describes the cornerstone performance indicators and the data that should be reported monthly for each of them.
The scram rate is calculated per 7,000 critical hours because that value is representative of the critical hours of operation in a year for most plants, i.e., an availability factor of 0.80, which is the current industry average. This ensures that periods when the reactor was shut down are excluded from the calculation.

Critical means that the effective multiplication factor (keff) of the reactor at the time of the scram was essentially equal to one.

Data Elements: The following data are required to calculate this indicator:
- the number of automatic scrams while critical in the last 12 months
- the number of manual scrams while critical in the last 12 months
- the number of hours of critical operation in the last 12 months

Calculation: The unit and industry average values for this indicator are determined as follows:

unit value = (total manual and automatic scrams while critical) x 7,000 / (number of critical hours)

industry average = (total of industry automatic and manual scrams) x 7,000 / [(total of industry critical hours) x (number of operating plants)]

Thresholds: The following thresholds have been established for this indicator:
- Increased Regulatory Response (green-white) threshold - 3
- Required Regulatory Response (white-yellow) threshold - 6
- Unacceptable Performance (yellow-red) threshold - 25

Data Qualification Requirements: Because rate indicators can produce misleadingly high values when the denominator is small, this performance indicator will not be calculated when there are fewer than 2,400 critical hours in the last 12 months. Instead, performance will be assessed through supplemental inspection.

Data Reporting Requirements: The following data should be reported by licensees monthly:
- the number of automatic scrams in the last month
- the number of manual scrams in the last month
- the number of critical hours in the last month
- the indicator value for the last 12 months (if there are fewer than 2,400 critical hours in the last 12 months, report the indicator as N/A)
Fluctuations are transitory changes in reactor power that occur in response to changes in reactor or plant conditions. Unplanned fluctuations are those that are not an
)
expected part of a planned evolution or test.
The transient rate is calculated per 7,000 critical hours because that value is representative of the critical hours of operation in a year for most plants, i.e., an availability factor of 0.80, which is the current industry average. This ensures that periods of reactor shutdown are excluded from the calculation.
4 Data Elements: The following data are required to calculat this ind UEr: p
'6fD.
[S MC%
the number of transients in the last 12 months
' ~.%Q Y;h.my the number of hours of critical operation in th t 12 e
+,,
Calculation: The unit and industry average values for dind'iIiator are determined as follows:
I 4
,e g g w:n unit value = (number of transients) x 7,000
'=&.
i e
$e.ip (number of critical hours)
W
- M3~
Ng
\\
(total ofindustry transients)- ' 7,000
.mdustry average =
(-- l of industry criticil hours) x (niunber of operating plants) tota
[
[
)
Threshold: The following thresho d
, beenpstablished r this indicator. No Required Regulatory Response or Unacce
_1e Perfo nce thrediolds have been established because this indicator is not related to rip'k.'
d,j/
A f.f f +.?
}
- Increased RegulatoryResponse (gig-white) threshold - 8 Data Oualification Reau\\@ts:kSh.,
Nr -A iremen LBecau.se rate i;xlicators can produce misleadingly high values when tfie. denominator isbil this performance indicator will not be calculated when i
there are fe er'tha(n 2,400 critical %, ion
~
rs in the last 12 months. Instead, performance will be assessed ugh supplemental > inspect
}':iy Data enortine Requirementsi The following data should be reported by licensees monthly:
4
}l ients in the last month e4 the number of ns
-mwthe number,#of critical hours in the last month
' ( f,,, se
,3 Me' mdicator value for the last 12 months (if there are fewer than 2,400 critice hours in the last 12 months, report the indicator as N/A) 5
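For illustration (with assumed numbers): a unit with 5 unplanned transients over 6,100 critical hours in the last 12 months would report a unit value of 5 x 7,000 / 6,100 ≈ 5.7, which is below the green-white threshold of 8. With only 2,000 critical hours, the indicator would instead be reported as N/A and performance assessed through supplemental inspection.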
Radiation Monitoring Instrumentation
Reactor Coolant System Safety Valves
Reactor Core Isolation Cooling System
Reactor Trip System and Instrumentation
Recirc. Pump Trip Actuation Instrumentation
Residual Heat Removal Systems
Spent Fuel Systems
Standby Liquid Control System

Attributes of Licensee Performance: The following attributes of licensee performance are monitored by this indicator:
- the availability and reliability of the monitored SSCs
- the adequacy of maintenance and test procedures for those SSCs
- the performance of plant personnel in maintaining the capability of the SSCs to fulfill their safety functions

Definition: The number of actual or potential failures of the safety function of the monitored SSCs during the past year.

Actual failures are those that occur upon a valid demand during operation or test. They do not include those that occur during post-maintenance testing before the SSC is declared operable and returned to service.

Potential failures include those that could have occurred upon a valid demand for the SSC to perform its safety function, assuming the necessary conditions were in place to cause the potential SSC failure to occur. For example, a safety system failure would be counted if a component was found to be environmentally unqualified so that, should a high energy line break occur in the worst-case location, the component could fail, which would render the system incapable of performing its safety function.

The safety function of an SSC includes any function(s) for which credit was taken in an accident analysis.

Data Elements: The following data are required to calculate this indicator:
- the number of safety system failures in the last 12 months

Calculation: The unit and industry average values for this indicator are calculated as follows:

unit value = the number of safety system failures

industry average = (total of industry safety system failures) / (number of operating plants)

Events involving the loss of one mode of operation of an SSC are considered to be safety system failures if the loss of that mode affects the system's ability to complete the safety function. For example, an event in which the ability to manually start the high-pressure coolant injection system is lost would be a failure even if the system could be started automatically. The loss of manual speed control of the same system, however, would not be a failure as long as the required flow rate could still be attained.

When multiple occurrences of an SSC failure occur, the determination of the number of failures to be counted will depend upon whether the SSC was declared operable, and whether the licensee knew a problem existed and tried to correct it, between occurrences. If the licensee knew that a problem existed and considered the system to be operable, but the system was subsequently found to have been inoperable the entire time, multiple failures will be counted. But if the plant knew that a potential problem existed and declared the system inoperable, additional failures of the SSC for the same problem would not be counted if the system was not declared operable in the interim. Similarly, in situations where the licensee did not realize that a problem existed (and thus could not have intentionally declared the system inoperable or corrected the problem), only one failure is counted.

A failure leading to an evaluation in which additional failures are found is only counted as one failure; new problems found during the evaluation are not counted, even if the causes or failure modes are different. The intent is to not count additional events when problems are discovered while resolving the original problem.

Train failures are not counted as safety system failures as long as a completely redundant train of the same system is capable of performing the safety function. (Note that one consequence of this rule is that failures of single-train systems are counted as safety system failures.)

When a potential failure by an identified mechanism (i.e., not a random single failure) could incapacitate all trains of the system, a safety system failure is counted. That is, if it is discovered that "redundant" trains rely on a single component or are unintentionally or incorrectly cross-connected, and a mechanism is found that could incapacitate all trains, a safety system failure is counted.

When a single train fails while the other train is inoperable for any reason (including surveillance), resulting in both trains being simultaneously inoperable, a safety system failure is counted. Similarly, when a problem affecting one train is identified, and it is determined that the other train was inoperable at the time the problem existed, a safety system failure is counted.

In the absence of an identified potential failure mechanism, it is not necessary to consider a single random failure. Licensees are not required to satisfy the single failure criterion for purposes of determination of a safety system failure. That is, events involving only a single train of a multi-train system are not counted as safety system failures as long as the other train always remained operable.

Conditions in which missile shields are determined to be inadequate (for example, the turbine building walls may not be able to withstand a tornado-generated missile) are not necessarily safety system failures. Such conditions should be analyzed for reporting as a failure.
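The multiple-occurrence counting rules above can be summarized in a short sketch. The record type and all names below are hypothetical; only the decision logic is drawn from the text, and it simplifies the guidance to three yes/no facts per occurrence.

    # Sketch of the failure-counting rules for repeated occurrences of one
    # SSC problem (assumed data model; requires Python 3.9+).
    from dataclasses import dataclass

    @dataclass
    class Occurrence:
        same_problem_as_previous: bool
        licensee_knew_of_problem: bool
        declared_operable_in_interim: bool

    def count_failures(occurrences: list[Occurrence]) -> int:
        """A repeat of the same problem adds a failure only when the licensee
        knew of the problem and still considered the system operable in the
        interim; otherwise the recurrence is not counted again."""
        count = 0
        for i, occ in enumerate(occurrences):
            if i == 0 or not occ.same_problem_as_previous:
                count += 1  # a new problem is always counted once
            elif occ.licensee_knew_of_problem and occ.declared_operable_in_interim:
                count += 1  # knowingly considered operable between occurrences
            # else: system not declared operable in the interim, or the problem
            # was unknown, so the recurrence is not counted again
        return count

    # Example: one problem recurring twice after the system was each time
    # considered operable in between -> three counted failures.
    history = [Occurrence(False, False, False),
               Occurrence(True, True, True),
               Occurrence(True, True, True)]
    print(count_failures(history))  # -> 3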
3.3 Barrier Integrity Cornerstone

Reactor Coolant System (RCS) Activity

Purpose: This indicator monitors the integrity of the fuel cladding, first of the three barriers to the release of fission products. It measures radioactivity in the reactor coolant as an indication of the functionality of the cladding.

Attributes of Licensee Performance: The following attributes of licensee performance are monitored by this indicator:
- the adequacy of the design control of fuel pins and assemblies and the nuclear core through physics testing
- the performance of plant personnel in minimizing challenges to the integrity of the fuel cladding
- the adequacy of procedures that direct activities that have the potential to affect fuel cladding integrity
- the adequacy of configuration control programs for handling, storing, and positioning nuclear fuel, for maintaining proper control rod patterns, and for maintaining proper RCS water chemistry
- the performance (integrity) of the fuel cladding

Definition: The maximum RCS activity each month as calculated per technical specifications.

Data Elements: The following data are required to calculate this indicator:
- all RCS activity calculations for the last month

Calculation: The unit and industry average values for this indicator are calculated as follows:

unit value = the maximum value of calculated activity

industry average = (total of industry unit values) / (number of operating plants)
Required Regulatory Response (white-yellow) threshold - 100 percent of the technical specification limit

Data Reporting Requirements: The following data should be reported by licensees monthly:
- the maximum value of the calculated RCS leakage in the last month
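For illustration (with assumed numbers): if the technical specification limit for the monitored RCS leakage were 10 gpm, a maximum calculated leakage of 5 gpm in a month would be 50 percent of the limit (the green/white level questioned in the comment table earlier in this package), while 10 gpm would reach the 100 percent white-yellow threshold stated above.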
Containment Leakage

Purpose: This indicator monitors the integrity of the containment, third of the three barriers to the release of fission products. It measures containment leakage as a percentage of the technical specification allowable leakage to provide an indication of containment integrity.

Attributes of Licensee Performance: The following attributes of licensee performance are monitored by this indicator:
- the originally designed containment structural integrity and operational capability
- the performance (integrity) of the containment barrier

Definition: The estimated "as found" integrated leak rate as a percentage of the design basis leak rate (La).

The "as found" leak rate is the result of the latest integrated leak rate test modified by the results of subsequent local leak rate tests.

Local leak rate (Type B and C) tests should be performed at the beginning of each refueling outage to reflect the leak rate that existed during the previous cycle.

Data Elements: The following data are required to calculate this indicator:
- the result of the latest integrated leak rate test
- the results of subsequent local leak rate tests
- the design basis leak rate, La

Calculation: The unit and industry average values for this indicator are calculated as follows:

unit value = ("as found" leakage) / La

where the "as found" leakage is the result of the latest integrated leak rate test modified by the results of subsequent local leak rate tests.
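For illustration (with assumed numbers): an "as found" integrated leak rate of 0.6 La yields a unit value of 0.6 La / La = 0.6, i.e., 60 percent of the design basis leak rate, and therefore below the 100 percent of allowable leakage noted elsewhere in this package as the applicable technical specification level.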
DRAFT 2/10/99

INSPECTION NON-CONFORMANCE EVALUATION MATRIX

Emergency Preparedness Cornerstone: Ensure that the licensee is capable of implementing adequate measures to protect the public health and safety in the event of a radiological emergency.

Key Attributes:
- OBSERVATION: NRC-identified non-conformance that has little or no immediate impact on EP capability.
- FINDING: NRC- or licensee-identified non-conformance that, if left uncorrected, compromises EP capability.
- SIGNIFICANT FINDING: NRC- or licensee-identified non-conformance that, if left uncorrected, would significantly challenge EP capability.

ERO Readiness
- Observation: individuals fail augmentation test; duty roster qualification lapses.
- Finding: failure to meet or implement a planning standard (other than the risk significant planning standards), e.g., 50.47(b) 1, 2, 3, 6, 7, 8, 11, 12, 13, 14, 15, & 16. Examples: problem ID and resolution program failure; failure to conduct required drills; failure to conduct 50.54(t) audit; ongoing failure of problem ID and resolution program.
- Significant Finding: failure to meet or implement a risk significant planning standard, e.g., 50.47(b) 4, 5, 9, & 10. Examples: ERO is unable (e.g., during an actual emergency) to classify emergency conditions, perform notifications, perform assessment actions, or implement PAR procedures; CR, TSC, or EOF cannot be activated IAW the Plan due to augmentation test failures or lack of qualified personnel; failure to staff CR, TSC, or EOF during an actual event; systematic failure of problem ID and resolution program.

Facilities and Equipment
- Observation: missed surveillance; equipment not IAW the Plan; communications channels not IAW the Plan; ANS surveillance missed; problem ID and resolution program lapses; equipment or communications channel lapses; ANS testing program does not meet guidance.
- Finding: failure to meet or implement a planning standard (other than the risk significant planning standards), e.g., 50.47(b) 1, 2, 3, 6, 7, 8, 11, 12, 13, 14, 15, & 16. Examples: degradation of equipment is such that the licensee cannot perform assessment activities; degradation of equipment is such that the notification functions cannot be performed; ongoing failure of problem ID and resolution program.
- Significant Finding: failure to meet or implement a risk significant planning standard, e.g., 50.47(b) 4, 5, 9, & 10. Examples: systematic surveillance program failure; equipment degradation renders the CR, TSC, or EOF unable to perform functions IAW the E Plan; systematic failure of problem ID and resolution program.

Procedure Quality
- Observation: EPIP or supporting procedure error; EPIP change not routed to NRC IAW 10 CFR Part 50, Appendix E; superseded revision of EPIP found in ERF.
- Finding: failure to meet or implement a planning standard (other than the risk significant planning standards), e.g., 50.47(b) 1, 2, 3, 6, 7, 8, 11, 12, 13, 14, 15, & 16. Examples: EPIP errors result in failure to activate facility; problem ID and resolution program failure; EPIP errors result in failure to augment ERO properly; EPIP errors result in failure in prompt communications among ERFs; EAL or Plan changes not IAW 50.54(q); ongoing failure of problem ID and resolution program.
- Significant Finding: failure to meet or implement a risk significant planning standard, e.g., 50.47(b) 4, 5, 9, & 10. Examples: EPIP errors result in failure to notify; EPIP errors result in loss of the ability to properly classify emergency conditions; systematic failure of problem ID and resolution program.

ERO Performance
- Observation: inspector follow-up items; failure of the licensee critique to identify poor exercise performance, such as failure to implement a planning standard (other than the risk significant planning standards); prompting exercise participants; drill control or scenario problems; problem ID and resolution program failure. (Please note: areas of poor exercise performance, including failure to implement planning standards, that are identified and corrected by the licensee are below the threshold of an observation. This would include weaknesses.)
- Finding: failure to meet or implement a planning standard (other than the risk significant planning standards), e.g., 50.47(b) 1, 2, 3, 6, 7, 8, 11, 12, 13, 14, 15, & 16. Examples: failure to conduct required drills; failure in an actual event to activate facilities IAW the E Plan; failure of the licensee critique to identify poor exercise performance, such as failure to implement a risk significant planning standard, e.g., 50.47(b) 4, 5, 9, & 10; ongoing failure of problem ID and resolution program.
- Significant Finding: failure to meet or implement a risk significant planning standard, e.g., 50.47(b) 4, 5, 9, & 10. Examples: failure in an actual event to perform appropriate and timely classification, notification, assessment, or PAR activities; degradation of ERO performance is such that the licensee cannot perform classification, notification, assessment, or PAR activities; systematic failure of problem ID and resolution program.

[The source contains one further page of this matrix, scanned rotated; beyond the matrix title and column headings, its text is unrecoverable.]