ML20195H631

Pilot Program Guidelines
Issue date: 05/20/1999
From:
NRC
To:
Shared Package
ML20195H628 List:
References
PROC-990520, NUDOCS 9906170105


Text

Pilot Program Guidelines

Attachment 1

TABLE OF CONTENTS

1 INTRODUCTION .................................................... 1
  1.1 Purpose ..................................................... 1
  1.2 Scope ....................................................... 1
  1.3 Objectives .................................................. 1
2 PILOT PROGRAM OVERVIEW .......................................... 2
  2.1 Objectives of the Pilot Program ............................. 2
  2.2 Pilot Program Major Milestones .............................. 3
3 PILOT PROGRAM GROUND RULES ...................................... 4
4 CONDUCT OF OVERSIGHT PROCESSES DURING PILOT ..................... 6
5 PILOT PROGRAM SUPPORT ORGANIZATION .............................. 10
  5.1 Pilot Plant Support Staff ................................... 10
  5.2 Procedure Development Group ................................. 11
  5.3 Pilot Program Evaluation Panel .............................. 12
6 PILOT PROGRAM SUCCESS CRITERIA .................................. 15
  6.1 Performance Indicator Reporting ............................. 15
  6.2 Risk-Informed Baseline Inspection Program ................... 15
  6.3 Assessment .................................................. 16
  6.4 Enforcement ................................................. 16
  6.5 Information Management Systems .............................. 17
  6.6 Overall ..................................................... 17
7 PILOT PROGRAM RITS GUIDANCE ..................................... 19
8 PILOT PLANT SELECTION ........................................... 23


1 INTRODUCTION

1.1 Purpose

The purpose of the pilot program is to apply the proposed new regulatory oversight process described in Commission papers SECY-99-007 and SECY-99-007A to a select number of plants. Performance indicator (PI) data reporting and the revised inspection, assessment, and enforcement processes will be exercised at the pilot plants. Lessons learned from this pilot effort will allow the processes and procedures to be refined and revised as necessary prior to full implementation.

1.2 Scope

The pilot program will be a 6-month effort involving two sites from each region. The plants selected, as shown in Table 8.1, represent a cross-section of design and licensee performance across the industry. The pilot plants will collect and report PI data, be inspected by the NRC under the new risk-informed baseline inspection program, have enforcement action taken under the new enforcement policy, and be assessed under the new streamlined assessment process.

1.3 Objectives

The objectives of the pilot program are to (1) exercise the new regulatory oversight processes to evaluate whether they can function efficiently, (2) identify process and procedure problems and make appropriate changes prior to full implementation, and (3) to the extent possible, evaluate the effectiveness of the new processes. The pilot program will also measure the agency and licensee resources required to implement the new PI reporting, inspection, assessment, and enforcement processes in order to quantify the resource changes. The results of the pilot program will be evaluated against pre-established success criteria. Full implementation of the new oversight process will commence pending successful completion of the pilot program, as measured against these success criteria.


2 PILOT PROGRAM OVERVIEW

2.1 Objectives of the Pilot Program

The objectives of the pilot program are to apply the new PI, inspection, assessment, and enforcement processes to a limited number of plants in order to (1) exercise the new regulatory oversight processes and evaluate whether they can function efficiently, (2) identify process and procedure problems and make appropriate changes prior to full implementation, and (3) to the extent possible, evaluate the effectiveness of the new processes. The pilot program will also measure the agency and licensee resources required to implement the new PI reporting, inspection, assessment, and enforcement processes in order to quantify the resource changes. Ground rules for how these new processes will be applied to the pilot plants are discussed in Section 3. The conduct of the oversight processes during the pilot is described in Section 4. As described in Section 6, pilot program success criteria have been established to measure the ability to meet these objectives. Full implementation of the new oversight process will commence pending successful completion of the pilot program, as measured against these success criteria. Specific objectives of the pilot program are as follows:

1. Perform a limited-scale exercise of the following processes to evaluate whether they can function efficiently:

   • PI data reporting by the industry
   • Performance of a risk-informed baseline inspection program by the NRC
   • Evaluation of PI and inspection results and determination of appropriate actions through the assessment process
   • Implementation of a revised enforcement process that is integrated with the other new oversight processes
   • NRC time reporting and information management systems

2. Identify problems with processes and implementing procedures and make appropriate changes to support full implementation, including:

   • Issuing final PI collection and reporting guidance to the industry
   • Issuing new or revised inspection program documentation (e.g., inspection procedures, inspection manual chapters)
   • Issuing the final significance determination process (SDP)
   • Issuing final enforcement policy revisions
   • Making NRC time reporting and information management systems ready
   • Issuing the assessment process management directive

3. To the extent possible, evaluate the effectiveness of the new regulatory oversight processes to determine whether:

   • The PIs and their thresholds provide an appropriate, objective measure of plant performance
   • The baseline inspection program adequately supplements and complements the PIs so that the combination of PIs and inspection provides reasonable assurance that the cornerstone objectives are being met
   • The baseline inspection program is effective at independently verifying the accuracy of the PIs
   • The SDP thresholds provide an appropriate, objective measure of the safety significance of inspection findings

   • Enforcement actions are consistent with the safety significance resulting from the significance determination process
   • The use of the new assessment process and action matrix results in more consistent and predictable NRC action decisions for plants with varying levels of performance

2.2 Pilot Program Major Milestones

Attachment 6 to Commission paper SECY-99-007 provided the plan that the NRC would use to transition through the implementation of the revised oversight processes. The following provides a summary of those transition plan activities related to the pilot program and an updated schedule based on continued development work and coordination with the industry and public.

2.2.1 Pilot Activities

May 1999 - Pilot plants start PI data collection
June 1, 1999 - Commence pilot program
July 1999 - Periodic NRC/industry public meetings to review pilot results
September 1999 - Industry/NRC public meeting on pilot program results
November 1999 - Mid-cycle assessment review
December 1999 - Analysis of pilot results against success criteria

3 PILOT PROGRAM GROUND RULES

The following ground rules define how the pilot program will be performed for the participating sites; they were developed to ensure that the objectives of the pilot program would be met. These ground rules were developed in conjunction with the regions and with headquarters program offices; comments from the industry and the public were considered and incorporated as appropriate.

The pilot program ground rules are as follows:

  • The pilot plants will receive the new baseline inspection program in lieu of the current core program.
  • The pilot plants will be assessed under the new assessment process in lieu of the current processes (i.e., no August plant performance review (PPR) for the pilot plants). The pilot plants will undergo a periodic assessment at the mid-cycle review, scheduled to take place at the end of November 1999.

  • Pilot plants will be subject to the new enforcement policy in lieu of the current enforcement policy.

  • All inspectors leading an inspection (senior resident and specialist inspectors) will receive training on the new oversight processes prior to conducting the inspection.
  • PI data collection for the pilot program will start in April 1999, and the first PI report will be due from the participating licensees by May 14, 1999. The pilot plants will be asked to collect and report two years' worth of historical PI data (when possible) to supplement the data collected during the pilot program.


  • The historical PI data submittal by the pilot plants will represent a "best effort" by the participating licensees to collect and report the data. This is intended to acknowledge the difficulties inherent in collecting some of the new data required for the PIs that has not been previously maintained by the licensees. While the NRC will use this historical data to assess performance, the requirements of baseline inspection procedure 71151, "Performance Indicator Verification," will not apply.

  • The risk-informed baseline inspection program will be conducted at the pilot plants as follows:

    - The regions will plan the new baseline inspection program to be conducted for all pilot plants prior to commencing the pilot.
    - Periodic adjustments will be made to the inspection schedule to add or remove initiative inspections.
    - All new baseline inspection procedures will be performed in each region, where practical, during the pilot program. However, each procedure need not be performed at each plant. For example, the periodic problem identification and resolution inspection procedure might be tested at only four pilot plants, one in each region.
    - The PI verification portion of the baseline inspection program will be tested at all of the pilot plants.
    - As many inspectable areas as possible will be inspected based on their intended frequency and the availability of associated activities. Some inspectable areas, such as refueling and outage-related activities, may not be covered because they will not be applicable to the pilot sites.

  • Regional inspection planning meetings, with program office oversight and assistance, will be held for each pilot plant in May 1999. At that time, previously scheduled regional initiative inspections will be reevaluated to determine the continued need for the inspection under the new oversight framework. The following criteria will be used to determine the need for any regional initiative inspection at the start of the pilot:

    - The initial submittal of PIs, based on historical data, indicates tripped thresholds and the need for regional initiative follow-up.
    - Significant licensee performance concerns are not appropriately indicated in the initial submittal of PIs due to the limitations in collecting the historical data.
    - A significant inspection finding within the last year indicates a licensee performance concern that requires additional initiative inspection.

  • The need for additional regional initiative inspection during the pilot program will be determined based on a periodic review of the PI results and baseline inspection findings.

  • The applicable portions of the inspection procedures contained in IMC 2515, Appendix B, will be used to perform any regional initiative and reactive inspections required during the pilot program. The supplemental inspection program will be used in lieu of the current regional initiative program when it is developed and ready to be piloted.

  • Subsequent to the completion of the pilot program, pilot plants will continue under the new oversight processes if full implementation is delayed for the short term (3 months or less). If it is expected that full implementation will be delayed for more than 3 months, the staff will evaluate restoring pilot plants to the current regulatory oversight processes.


4 CONDUCT OF OVERSIGHT PROCESSES DURING PILOT

Each of the new oversight processes will be implemented and performed as described by its associated procedure. In addition, the following will apply while implementing these processes during the pilot program.

  • The pilot plants will report PIs to the NRC by the 14th calendar day of each month during the pilot program. The PIs will be forwarded to the appropriate regional Division of Reactor Projects branch chief for review, and then posted on the NRC Internet homepage within two weeks of submittal by the licensees.

  • Routine resident inspections will continue to be conducted over approximately a 6-week time period. PIMs will be updated and posted on the NRC Internet homepage within 30 days of the end of each of these inspection periods.

  • A Significance Determination Process and Enforcement Review Panel will be established during the pilot program to ensure that the SDP is implemented in a consistent manner. This panel will meet biweekly, as necessary, to discuss all proposed potential risk-significant issues before they are issued in their final form. In addition, an Operational Support Team from the Probabilistic Safety Assessment Branch, NRR, will be assembled to support the regions in performing SDP evaluations. Details of these support organizations are provided in draft inspection manual chapter 06XX, "Significance Determination Process."

  • Outstanding NRC open inspection items should be reviewed and closed out during the pilot program using the following guidance. Each open item should be evaluated for significance by the significance determination process. Any issue that is screened out or results in a green finding should be entered in the licensee's corrective action program as appropriate, documented in an inspection report as appropriate, and administratively closed. Any issue that results in a white, yellow, or red finding should be handled as a significant finding by the established processes.

  • A mid-cycle review and inspection planning meeting, including the issuance of a 6-month inspection look-ahead letter, will be held for each pilot plant in early December 1999. The assessment cycle for this mid-cycle review will end approximately November 13, 1999, and will include the 5 months of pilot data (PIs and inspection findings) collected by that time.

  • The pilot plants will be discussed as part of the April 2000 senior management meeting (SMM) process. At the SMM screening meetings, the pilot plant performance review and discussion of agency action will be based on the PI results and baseline inspection findings, as applied to the action matrix. The action matrix will be used to the extent practicable to determine which pilot plants need to be discussed further at the SMM. This effort will exercise the end-of-cycle assessment process for the pilot plants, with lessons learned incorporated into the assessment process procedures.


  • All NRC process and procedure users should use the feedback form, shown as Figure 4.1, to record lessons learned, comments, and feedback. This form should be used to capture comments on any of the new oversight processes, including the inspection program, assessment process, and the SDP. As described below, comments on the inspection procedures should be recorded on Figure 4.2. This form should be filled out after the initial use of a procedure, and again on subsequent uses to capture additional comments and feedback. Completed forms should be forwarded to the NRC regional points of contact for further evaluation.

  • Feedback, comments, or lessons learned while performing the inspection procedures should be recorded on the Inspection Procedure Comment Form, shown as Figure 4.2. These forms should be filled out after the initial use of each inspection procedure or inspectable area attachment, and again on subsequent uses to capture additional comments and feedback. Completed forms should be forwarded to the NRC regional points of contact for further evaluation.

  • Any feedback or comments received from a licensee regarding the pilot of the new regulatory oversight process should be recorded on a Regulatory Impact Report, NRC Form 649, or equivalent. Completed forms regarding the pilot of the new oversight process should be forwarded to the NRC regional points of contact for further evaluation.


OVERSIGHT PROCESS FEEDBACK/COMMENT FORM

Document Title:
Document No. (if applicable):          Attachment No. (if applicable):
Inspection Report No. (if applicable):
Inspection Type (circle one): Resident / Specialist / Team
Plant Name:          Region:          Date(s) of use:

Address the following statements using the following rating scale: 1 = Disagree, 2 = Moderately Disagree, 3 = Neutral, 4 = Moderately Agree, 5 = Agree. Please identify errors and inconsistencies in the procedure and provide comments and suggestions for improving the procedure.

1. The procedure requirements, and the way they were implemented, met the stated objectives. (Disagree 1 2 3 4 5 Agree) What changes, if any, are required?

2. The procedure guidance is clear and easy to use. (Disagree 1 2 3 4 5 Agree) What changes, if any, are required?

3. The actions specified by the procedure are performed at a sufficient frequency. (Disagree 1 2 3 4 5 Agree) What changes, if any, are required?

4. Excessive resources were not required to implement the procedure as written. (Disagree 1 2 3 4 5 Agree) What changes, if any, are required?

5. While implementing the significance determination process, were any significant inspection issues inappropriately screened out?

6. Other Comments/Suggestions:

Procedure User (Print) / Date

(If comments are extensive, please mark them on a copy of the procedure and attach.)

Figure 4.1 - Oversight Process Feedback Form

INSPECTION PROCEDURE COMMENT FORM

Inspection Procedure No.:          Attachment No.:
Title:
Inspection Report No.:          Inspection Type: Resident / DRS / Team
Plant Name:          Region:          Date(s):

Address the following statements using the following rating scale: 1 = Disagree, 2 = Moderately Disagree, 3 = Neutral, 4 = Moderately Agree, 5 = Agree. Please identify errors and inconsistencies in the procedure and provide comments and suggestions for improving the procedure.

1. The level of effort required for the inspection was consistent with that in the IP. (Disagree 1 2 3 4 5 Agree) If not, what is the appropriate level of effort and basis for IP revision?

2. The procedure met the inspection objectives. (Disagree 1 2 3 4 5 Agree) If applicable, describe how the procedure (or how it was implemented) failed to meet the inspection objectives.

3. The inspection requirements were appropriate based on risk. (Disagree 1 2 3 4 5 Agree) What inspection requirements should be added, deleted, or revised based on risk? Based on implementing the Significance Determination Process for inspection findings, how should the inspection requirements be revised to better identify risk-significant findings?

4. The inspection procedure was clear and easy to use. (Disagree 1 2 3 4 5 Agree) If not, how should the inspection guidance be revised to better clarify the inspection requirements?

5. The IP does not result in an unreasonable impact on licensees. (Disagree 1 2 3 4 5 Agree) How should the IP be revised to preclude unreasonable licensee impact?

Inspector or Inspection Area Lead / Date

Figure 4.2 - Inspection Procedure Comment Form

5 PILOT PROGRAM SUPPORT ORGANIZATION

5.1 Pilot Plant Support Staff

The transition task force (TTF) will provide support to the pilot plant sites and regions throughout the pilot program. As shown in Table 5.1, two TTF members will be assigned to each region as the points of contact during the pilot program. These pilot plant support staff members will be the focal point for regional questions on program implementation, will make periodic site visits to monitor NRC and licensee implementation of the program, and will solicit NRC staff and licensee comments on program effectiveness. Specifically, the responsibilities of the support staff are to:

  • Visit each site at least once per inspection period to obtain feedback from the residents and licensee staff, and observe performance of inspection procedures.

  • Observe a sampling of resident inspector and specialist inspector exit meetings with licensees.

  • Observe regional assessment and planning meetings for the pilot plants to provide guidance and assistance as necessary.

  • Collect feedback and forward it to the appropriate task lead for evaluation and incorporation.

The insights gained by the pilot program support staff will be part of the input considered by the Pilot Program Evaluation Panel.

Table 5.1 - Points of Contact

Region      Primary Point of Contact              Backup Point of Contact
Region I    Tim Frye, 301-415-1287                Peter Koltay, 301-415-2957
Region II   Dave Gamberoni, 301-415-1144          Jeff Jacobson, 301-415-2977
Region III  Morris Branch, 301-415-1279           Roy Mathew, 301-415-2965
Region IV   Armando Masciantonio, 301-415-1290    Sam Malur, 301-415-2963

Oversight process development will continue throughout the pilot program as lessons are learned. Table 5.2 shows the TTF task leads who may be contacted to answer questions regarding any of the oversight processes and work in progress. Many of these task leads, such as the PI task lead, will also make frequent visits to the pilot sites and regions to monitor the implementation of the processes.

Table 5.2 - Task Leads

Task                                  Lead             Phone         E-mail
Transition Task Force                 Alan Madison     301-415-1490  ALM
Pilot Program Coordinator             Tim Frye         301-415-1287  TJF
PI Reporting                          Don Hickman      301-415-6829  DEH2
Inspection Program                    Steve Stein      301-415-1296  SRS
Significance Determination Process    Morris Branch    301-415-1279  MXB2
Assessment                            Dave Gamberoni   301-415-1144  DLG2
Enforcement                           Barry Westreich  301-415-3456  BCW

5.2 Procedure Development Group

A procedure development group will be established during the pilot to revise the baseline inspection procedures to incorporate lessons learned during their initial implementation. Each inspection procedure and inspection program document will have an assigned procedure owner. This procedure owner will be responsible for collecting problems and lessons learned from the procedure users, evaluating the need for a procedure change, and then updating the procedures as necessary. The procedure owners for each baseline inspection procedure are shown in Table 5.3.

Table 5.3 - Baseline Inspection Procedure Owners

IP No.    Title                                              Contact          Phone         E-Mail
--        Overall Coordination of Procedure Development      Steve Stein      301-415-1296  SRS
          Group Efforts
71111     Reactor Safety - Initiating Events, Mitigating     Sam Malur        301-415-2963  SKM
          Systems, Barrier Integrity
71114     Emergency Preparedness                             Roy Mathew       301-415-2965  RKM
71121     Occupational Radiation Safety                      A. Masciantonio  301-415-1290  ASM1
71122     Public Radiation Safety                            A. Masciantonio  301-415-1290  ASM1
71130     Physical Security                                  A. Masciantonio  301-415-1290  ASM1
71150     Plant Status                                       Peter Koltay     301-415-2957  PSK
71151     PI Verification                                    Jeff Jacobson    301-415-2977  JBJ
71152     Problem Identification and Resolution              Jeff Jacobson    301-415-2977
71153     Event Follow-up                                    Peter Koltay     301-415-2957  PSK
IMC 2515  Light-Water Reactor Inspection Program -           Steve Stein      301-415-1296  SRS
          Operations Phase
IMC 0610  Inspection Reports                                 Steve Stein      301-415-1296  SRS

The procedure development group will be coordinated by the TTF during the pilot program. The TTF will ensure that procedure comments and lessons learned are collected and evaluated in a timely manner. The TTF will coordinate procedure revisions by the procedure development group and ensure that revised procedures are forwarded to the procedure users.

5.3 Pilot Program Evaluation Panel

5.3.1 Purpose

The Pilot Program Evaluation Panel (PPEP) will function as a management-level, cross-disciplinary oversight group to monitor and evaluate the success of the pilot effort. The purpose of the PPEP is to allow the individual panel members to provide an objective evaluation as to whether the success criteria have been met. The PPEP will not provide recommendations or advice to the agency, as a group, regarding the readiness to proceed with full implementation of the new oversight processes. However, the PPEP members are welcome to submit advice, comments, or recommendations regarding the readiness for full implementation on an individual basis, separate from the PPEP effort.

It is important to note that Federal Advisory Committee Act (FACA) requirements restrict the ability of the PPEP to provide an overall group review of the success of the pilot program. The staff is working to establish the PPEP as a FACA-compliant panel, in accordance with 10 CFR Part 7, which will allow panel recommendations on the success of the new oversight processes to be forwarded to the Commission for consideration.

5.3.2 Scope

The PPEP will meet periodically during the pilot program to review the implementation of the oversight processes and the results generated by the PI reporting, baseline inspection, assessment, and enforcement activities. These meetings will be publicly announced in advance and open to the public, and all material reviewed will be placed in the public document room. A meeting summary will be prepared following each meeting to document the results of the meeting.

5.3.3 Objectives

The objective of the PPEP is to monitor and evaluate the implementation of the new regulatory oversight processes at the pilot sites. The PPEP will evaluate the pilot program results against pre-established pilot program success criteria. For those success criteria that are intended to measure the effectiveness of the processes, and that generally do not have a quantifiable performance measure, the PPEP will serve as an "expert panel" to review the results and evaluate how well the success criteria were met. At the end of the pilot program, the individual PPEP members will provide an objective evaluation as to whether each of the success criteria has been met. The staff will use the PPEP evaluation to determine the need for any additional process development or improvements prior to full implementation.

5.3.4 Organization

The PPEP will be a cross-disciplinary group of about 12 people, with membership anticipated to be as follows:

  • PPEP Chairman - Deputy Director, Division of Inspection Program Management, NRR
  • Three regional division directors (combination of Division of Reactor Safety and Division of Reactor Projects division directors)
  • TTF Executive Forum Chairman
  • Office of Enforcement representative
  • One Nuclear Energy Institute (NEI) representative
  • Four pilot plant licensee representatives
  • One member of the public
  • One State regulatory agency representative

5.3.5 Schedule

The PPEP will meet approximately every six weeks during the pilot program to review and monitor the implementation of the new regulatory oversight processes. A tentative schedule for PPEP meetings is as follows:

June 1999 - PPEP kick-off meeting to review the purpose and objectives of the panel
July 1999 - First PPEP meeting to discuss and review the results of the pilot program
September 1999 - PPEP meeting to discuss and review pilot program results
December 1999 - Final PPEP meeting to evaluate the pilot program against the success criteria

5.3.6 Reports

The results of PPEP meetings will be recorded in a meeting summary placed in the public document room. These meeting summaries will include all material handed out at the meetings. The final PPEP evaluation of the pilot program against the success criteria will be included as part of the staff recommendation to the Commission regarding full implementation of the new regulatory oversight processes.

6 PILOT PROGRAM SUCCESS CRITERIA

The following success criteria will be used to evaluate the results of the regulatory oversight process improvement pilot program. These criteria will determine whether the overall objectives of the pilot program have been met, and whether the new oversight processes (1) ensure that plants continue to be operated safely, (2) enhance public confidence by increasing the predictability, consistency, and objectivity of the oversight process so that all constituents will be well served by the changes taking place, (3) improve the efficiency and effectiveness of regulatory oversight by focusing agency and licensee resources on those issues with the most safety significance, and (4) reduce unnecessary regulatory burden on licensees as the processes become more efficient and effective.

6.1 Performance Indicator Reporting

The following criteria will measure the efficiency and effectiveness of PI reporting.

  • Can PI data be accurately reported by the industry, in accordance with reporting guidelines? They can, if by the end of the pilot program, each PI is being reported accurately for at least 8 out of the 9 pilot plants.

  • Can PI data results be submitted by the industry in a timely manner? They can, if by the end of the pilot program, all plants submit PI data within one business day of the due date.
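The timeliness criterion above is mechanical enough to check programmatically. The following sketch is illustrative only: the guideline does not define how weekends (or federal holidays, which are ignored here) interact with the one-business-day grace period, so that interpretation, along with the sample dates, is an assumption.

```python
from datetime import date, timedelta

def next_business_day(d):
    """Return the first weekday on or after d (federal holidays ignored)."""
    while d.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        d += timedelta(days=1)
    return d

def submission_timely(submitted, year, month):
    """PI data are due by the 14th calendar day of the month; the pilot
    success criterion allows submission within one business day of that
    due date (one plausible reading of the guideline)."""
    due = date(year, month, 14)
    grace = next_business_day(due + timedelta(days=1))
    return submitted <= grace

# Hypothetical June 1999 report: due June 14, 1999 (a Monday)
print(submission_timely(date(1999, 6, 15), 1999, 6))  # True: one business day late
print(submission_timely(date(1999, 6, 16), 1999, 6))  # False
```

A production check would also need a holiday calendar and per-plant aggregation across the 9 pilot sites, neither of which the guideline specifies.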

6.2 Risk-Informed Baseline Inspection Program

The following criteria will measure the efficiency and effectiveness of the baseline inspection program, including inspection planning, conduct of inspections, inspection finding evaluation, and inspection finding documentation.

  • Can the inspection planning process be efficiently performed to support the assessment cycle? It can, if the planning process supports issuing a 6-month inspection look-ahead letter within 4 weeks from the end of an assessment cycle for at least 8 out of the 9 pilot plants.

  • Are the inspection procedures clearly written so that inspectors can consistently conduct the inspections as intended? They are, if by the end of the pilot program, the resources expended to perform each routinely performed inspection procedure are within 25% of each other for at least 8 out of the 9 pilot plants. Similar data will be assessed for less frequently performed procedures (e.g., engineering and design inspection). Inspection procedure quality will also be determined by a PPEP evaluation of feedback from the procedure users.
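The "within 25% of each other" criterion leaves the comparison baseline unstated. One plausible reading, sketched below, is that at least 8 of the 9 plants' per-procedure inspection hours must fall within ±25% of the group median; the plant names, hour figures, and the choice of median as baseline are all assumptions for illustration.

```python
from statistics import median

def meets_consistency_criterion(hours_by_plant, tolerance=0.25, min_plants=8):
    """Check whether at least `min_plants` plants' inspection hours fall
    within `tolerance` (as a fraction) of the group median.  This is one
    plausible reading of the pilot success criterion, which does not
    itself define the comparison baseline."""
    mid = median(hours_by_plant.values())
    within = [p for p, h in hours_by_plant.items()
              if abs(h - mid) <= tolerance * mid]
    return len(within) >= min_plants

# Hypothetical direct-inspection hours for one procedure at the nine pilot plants
hours = {"plant_a": 40, "plant_b": 42, "plant_c": 38, "plant_d": 45,
         "plant_e": 41, "plant_f": 39, "plant_g": 60, "plant_h": 43,
         "plant_i": 44}
print(meets_consistency_criterion(hours))  # True: one outlier (60 h) still passes 8 of 9
```

Using the median rather than the mean keeps a single outlier plant from shifting the baseline it is being judged against.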

  • Are fewer NRC resources required to provide adequate oversight of licensee activities through inspection? They are, if the direct inspection effort expended to perform baseline and regional initiative inspection activities is less than the resources expended under the current inspection program over the same time period. The review may be based on either the average for all plants or a paired comparison between similar plants (e.g., one under the baseline and one in the core program).

•  Can the SDP be used by inspectors and regional management to efficiently categorize inspection findings in a timely manner? It can, if by the end of the pilot program, inspection reports and updated plant issues matrices (PIMs) can be issued within 30 days of the end of an inspection period for at least 8 out of the 9 pilot plants.


•  Can inspection findings be properly assigned a safety significance rating in accordance with established guidance? They can, if a sample of inspection findings chosen for 95% assurance demonstrates that at least 95% of the findings were properly categorized by the SDP. Additionally, this sample review should confirm that no risk-significant inspection findings were screened out. The results will be reviewed by the PPEP.
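The sample check above has two parts: a 95% categorization-accuracy threshold and a hard requirement that no risk-significant finding was screened out. A minimal sketch follows; the field names are invented for illustration, and the actual review is a PPEP judgment, not a script.

```python
def sdp_sample_passes(findings, threshold=0.95):
    """`findings` is the reviewed sample; each entry records whether the
    reviewer agreed with the SDP categorization, and whether the finding
    was a risk-significant one that the SDP incorrectly screened out."""
    # Any risk-significant finding screened out fails the criterion outright.
    if any(f["risk_significant_screened_out"] for f in findings):
        return False
    correct = sum(1 for f in findings if f["properly_categorized"])
    return correct / len(findings) >= threshold
```

For instance, 19 properly categorized findings out of a 20-finding sample exactly meets the 95% threshold, provided none were risk-significant findings that the SDP screened out.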

•  Are the scope and frequencies of the baseline inspection procedures adequate to address their intended cornerstone attributes? Success will be determined by an independent evaluation by the PPEP.

6.3 Assessment

The following criteria will measure the efficiency and effectiveness of the new assessment processes.

•  Can the assessment process be performed within the scheduled time? It can, if for at least 8 out of the 9 pilot plants, a mid-cycle assessment of the PIs and inspection findings can be completed, with a letter forwarding the results and a 6-month inspection look-ahead schedule, within 4 weeks of the end of the assessment cycle.

•  Can the action matrix be used to take appropriate NRC actions in response to indications of licensee performance? It can, if there is no more than one instance (with a goal of zero) in which an independent review by the PPEP concludes that the action required for a pilot plant differs from the range of actions specified by the action matrix.


•  Does the combination of PI results and inspection findings provide an adequate indication of licensee performance? Does the process provide reasonable assurance that the cornerstone objectives are being met and safe plant operation is maintained? Success will be determined by an independent evaluation by the PPEP.


•  Are assessments of licensee performance for the pilot plants performed in a manner that is consistent across the regions and that meets the objectives of the assessment program guidance? Success will be determined by an independent evaluation by the PPEP.

6.4 Enforcement

The following criteria will measure the effectiveness of the new enforcement policy.



•  Are enforcement actions taken in a manner consistent with the assessment of inspection findings that results from the SDP? They are, as determined by the SDP operational support team and reviewed by the PPEP.

6.5 Information Management Systems

The following criteria will determine whether the NRC's information management systems are ready to support full implementation of the new regulatory oversight processes.

•  Are the assessment data and results readily available to the public? They are, if by the end of the pilot program, the NRC information systems support receiving industry data, and if PIs and inspection findings are publicly available on the Internet within 30 days of the end of the data period for at least 8 out of the 9 pilot plants.

•  Are the time reporting and budget systems, such as the Regulatory Information Tracking System (RITS), ready to support the process changes? They are, if by the end of the pilot program, the time expended for regulatory oversight activities can be accurately recorded at least 95% of the time.

•  Are the NRC information support systems, such as the Reactor Program System (RPS) and its associated modules, ready to support full implementation of the new oversight processes? They are, as determined by an independent evaluation by the PPEP.

6.6 Overall

The following criteria will measure the overall success of the pilot program, including an evaluation of the training provided and an evaluation of the regulatory burden imposed on licensees by the new processes.

•  Have inspectors and managers been adequately trained to successfully implement the new oversight processes? They have, as determined by a training effectiveness evaluation reviewed by the PPEP.

•  Are the new regulatory oversight processes more efficient overall? They are, if by the end of the pilot program, the agency resources required to implement the inspection, assessment, and enforcement programs are about 15% less than currently required. Review may be based on either the average for all plants, or a paired comparison between similar plants (e.g., one under the baseline and one in the core program).
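The 15% target admits either an average-based or a paired reading, as noted above. A hypothetical sketch of both follows; the function names and plant-hour figures are illustrative assumptions, not actual program data.

```python
def average_reduction_met(current_hours, pilot_hours, target=0.15):
    """Average reading: mean direct-effort hours under the pilot program
    are at least `target` (15%) below the mean under the current program."""
    cur = sum(current_hours) / len(current_hours)
    new = sum(pilot_hours) / len(pilot_hours)
    return (cur - new) / cur >= target

def paired_reduction_met(pairs, target=0.15):
    """Paired reading: each pair holds (pilot_plant_hours, core_plant_hours)
    for two similar plants; met if the mean fractional reduction relative
    to the core-program plant reaches the target."""
    cuts = [(core - pilot) / core for pilot, core in pairs]
    return sum(cuts) / len(cuts) >= target
```

The paired form controls for plant-to-plant differences (age, design, performance level) that the simple average ignores.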


•  Do the new oversight processes remove unnecessary regulatory burden, as appropriate, from the licensees? They do, based on pilot plant licensee feedback obtained by both the NRC and industry, and reviewed by the PPEP.

•  Do the new oversight processes result in NRC assessments of licensee performance and NRC actions that are more understandable, predictable, consistent, and objective as perceived by both the industry and the general public? This may be accomplished by a survey of stakeholders, including both the industry and the public, with the results reviewed by the PPEP.


7 PILOT PROGRAM RITS GUIDANCE

The following tables provide guidance for NRC staff to charge time related to pilot program activities. Table 7.1 describes how direct inspection effort under the baseline inspection program should be charged by the staff. Table 7.2 describes how staff time spent performing activities related to inspection preparation, documentation, assessment, and enforcement should be charged. Table 7.3 describes how time spent on non-docket related activities should be charged.

Time spent on any activities not covered by Tables 7.1, 7.2, or 7.3 should be charged using existing RITS guidance.

Table 7.1 - Baseline Inspectable Area RITS Guidance (See Note 1)

IP/IA NO.   TITLE

71111.01    Adverse Weather Preparations
71111.02    Changes to License Conditions and Safety Analysis Reports
71111.03    Emergent Work
71111.04    Equipment Alignment
71111.05    Fire Protection
71111.06    Flood Protection Measures
71111.07    Heat Sink Performance
71111.08    Inservice Inspection Activities
71111.09    Inservice Testing of Pumps and Valves
71111.10    Large Containment Isolation Valve Leak Rate and Status Verification
71111.11    Licensed Operator Requalifications
71111.12    Maintenance Rule Implementation
71111.13    Maintenance Work Prioritization and Control
71111.14    Nonroutine Plant Evolutions
71111.15    Operability Evaluations
71111.16    Operator Workarounds
71111.17    Permanent Plant Modifications
71111.18    Not Used
71111.19    Post Maintenance Testing
71111.20    Refueling and Outage Activities


71111.21    Safety System Design and Performance Capability
71111.22    Surveillance Testing
71111.23    Temporary Plant Modifications

71114.01    Drill and Exercise Inspection
71114.02    Alert Notification System Testing
71114.03    Emergency Response Organization Augmentation
71114.04    Emergency Action Level Changes
71121.01    Access Control to Radiologically Significant Areas
71121.02    ALARA Planning and Controls
71121.03    Radiation Monitoring Instrumentation
71122.01    Gaseous and Liquid Effluent Treatment Systems
71122.02    Radioactive Material Processing and Shipping
71122.03    Radiological Environmental Monitoring Program
71130.01    Access Authorization
71130.02    Access Control
71130.03    Response to Contingency Events
71150       Plant Status
71151       Performance Indicator Verification

71152       Problem Identification and Resolution of Problems

71153       Event Follow Up

Note 1 - Use "OA" as the IPE code for the above inspection related activities

Table 7.2 - Non-Inspection Activity RITS Guidance

IMI CODE   TITLE                                          DESCRIPTION
TBD        Baseline Inspection Preparation                Time spent preparing for the conduct of baseline inspection activities.
TBD        Baseline Inspection Documentation              Time spent documenting the results of baseline inspections, preparation of inspection reports, and the preparation/update of the plant issues matrix.
TBD        Reactive/Initiative Inspection Preparation     Time spent preparing for the conduct of regional initiative and reactive inspection activities.
TBD        Reactive/Initiative Inspection Documentation   Time spent documenting the results of regional initiative and reactive inspections, preparation of inspection reports, and the preparation/update of the plant issues matrix.
TBD        Significance Determination Process             Time spent evaluating the significance of all inspection findings using the significance determination process.
TBD        Enforcement                                    Time spent dispositioning inspection findings under the enforcement policy.
TBD        Assessment                                     Time spent preparing for and conducting all assessment activities, including quarterly, mid-cycle, and end-of-cycle reviews.

TBD - To Be Determined

Table 7.3 - Non-Docket Activity RITS Guidance

TAC #     PA #    TITLE                                      DESCRIPTION
MA5132    131F    TTF Change Coalition Activities            Covers time spent in meetings, teleconferences, video conferences, etc., associated with the TTF change coalition.
MA5104    9A1E    New Regulatory Oversight Process Training  Includes both staff time spent in actual training courses/workshops and self-study time spent at sites, regions, or HQ.
MA3792    131F    Development of Training                    HQ staff time spent to develop and present training.

8 PILOT PLANT SELECTION

The following criteria were used to identify potential sites for the pilot program:

•  To the maximum extent possible, licensees were chosen that had either volunteered to participate in the pilot program or had participated in the NEI task group working on improving the regulatory oversight processes.

•  A number of different licensees were chosen to participate in order to maximize industry exposure to the new processes.

•  Plants were chosen to represent a broad spectrum of performance levels, but plants that were in extended shutdowns because of performance issues were not considered.

•  A mix of pressurized-water reactors (PWRs) and boiling-water reactors (BWRs) was chosen.

•  A mix of plant vendors and plant ages was chosen.

•  To the extent possible, two plants with different performance levels within each region were chosen.

•  NRC regional office concerns, such as the experience of NRC staff associated with pilot plants and transition issues (such as the expected departure of key NRC personnel during the pilot program), were considered.

•  Licensee concerns, such as their involvement with other significant NRC activities (license renewal, steam generator replacement, etc.), were considered.

These criteria, and potential candidate plants, were discussed with NRC headquarters and regional management, and with NEI. All potential plants selected to participate were first contacted by NEI, and all agreed to participate in the pilot program. Before publicly announcing which sites were participating in the pilot program, the NRC staff contacted each of the appropriate State organizations to notify them of the site's participation. After the State notifications were completed, a press release was issued on February 22, 1999, to announce the pilot program and the participating sites. In conjunction with the pilot program, the staff has offered to participate in public meetings with State and local representatives to discuss the pilot program and the revised oversight processes.

The following table summarizes the sites that the NRC and the industry agreed would participate in the pilot program. It is important to note that there are actually nine pilot plants, since Public Service Electric & Gas Company (PSE&G) requested that both Salem and Hope Creek participate in the pilot program. NRC headquarters and Region I management agreed with this request.

Table 8.1 - Pilot Plants

REGION  PLANT              LICENSEE                              SALP SCORES   TYPE   VENDOR / AGE
                                                                 (Note 1)
I       Hope Creek         Public Service Electric & Gas         2/2/2/1       BWR    General Electric (GE) Type 4 / 13 years
                           (PSE&G)
I       Salem 1&2          PSE&G                                 1/2/2/1       PWR    4 Loop Westinghouse (W) / 20 years
I       FitzPatrick        New York Power Authority              2/2/2/2       BWR    GE Type 4 / 24 years
II      Harris             Carolina Power & Light Company        1/1/2/1       PWR    3 Loop W / 12 years
II      Sequoyah 1&2       Tennessee Valley Authority            2/2/2/1       PWR    4 Loop W / 18 years
III     Prairie Island 1&2 Northern States Power Company         2/1/2/1       PWR    2 Loop W / 25 years
III     Quad Cities 1&2    Commonwealth Edison Company           2/3/3/2       BWR    GE Type 3 / 26 years
IV      Ft. Calhoun        Omaha Public Power District           2/2/1/2       PWR    Combustion Engineering (CE) / 26 years
IV      Cooper             Nebraska Public Power District        2/2/3/1       BWR    GE Type 4 / 25 years

SUMMARY 9 Plants           8 Licensees                           1/1/2/1 to    5 PWRs 4 W plants, 1 CE plant
                                                                 2/3/3/2       4 BWRs 4 GE plants

Note 1 - SALP scores correspond to the following SALP functional areas:
Operations / Maintenance / Engineering / Plant Support

Inspection Manual Chapter 2515

Attachment 2