ML18114A049

NEI White Paper, Performance Indicators for Adjusting the Frequency of Emergency Preparedness Program Reviews
ML18114A049
Person / Time
Site: Nuclear Energy Institute
Issue date: 03/29/2018
From: Young D
Nuclear Energy Institute
To: Robert Kahler
Office of Nuclear Security and Incident Response
References
Download: ML18114A049 (14)


Text

DAVID YOUNG
Technical Advisor, Nuclear Security and Incident Preparedness
1201 F Street, NW, Suite 1100, Washington, DC 20004
P: 202.739.8127
dly@nei.org
nei.org

March 29, 2018

Mr. Robert Kahler
Chief, Regulatory Policy and Oversight Branch
Division of Preparedness and Response
Office of Nuclear Security and Incident Response
U.S. Nuclear Regulatory Commission
Washington, DC 20555-0001

Subject:

NEI White Paper, Performance Indicators for Adjusting the Frequency of Emergency Preparedness Program Reviews
Project Number: 689

Dear Mr. Kahler:

The Nuclear Energy Institute1 and representatives of member companies have developed the attached white paper to provide a set of performance indicators that may be used by a licensee to conduct periodic emergency preparedness program reviews at a 24-month frequency as allowed by 10 CFR 50.54(t)(1)(ii). We request a review of the white paper by the NRC staff followed by a discussion of any comments in a public meeting. The white paper will be revised to address NRC staff comments and then submitted for endorsement.

1 The Nuclear Energy Institute (NEI) is the organization responsible for establishing unified industry policy on matters affecting its members, including the regulatory aspects of generic operational and technical issues. NEI's members include entities licensed to operate commercial nuclear power plants in the United States, nuclear plant designers, major architect/engineering firms, fuel cycle facilities, suppliers and nuclear materials licensees, nuclear medicine and radiopharmaceutical companies, companies using nuclear technologies in the agricultural, food, and industrial sectors, universities and research laboratories, law firms, labor unions, and international electric utilities.

If you have questions or require additional information, please contact me at (202) 739-8127 or dly@nei.org.

Sincerely,

David L. Young

c: Sue Perkins-Grew, NEI

WHITE PAPER

PERFORMANCE INDICATORS FOR ADJUSTING THE FREQUENCY OF EMERGENCY PREPAREDNESS PROGRAM REVIEWS

REV. A
March 2018


This document was prepared by the Nuclear Energy Institute (NEI)1 and representatives of member companies.

NEI Lead: David Young

Member Representatives:

Steve Barr - PSEG Nuclear
John Costello - Dominion Energy
Doug Walker - Exelon Nuclear

Copyright Notice

© NEI 2018. All rights reserved. This material is protected by copyright law. It may not be copied, transmitted, stored, distributed, or excerpted, electronically or by other means, without NEI's advance written permission. This material may contain confidential information and is intended for distribution solely to NEI members, for their use. If you are not an NEI member, you may send a permission request to CopyrightAgent@nei.org. All copies must contain the NEI copyright notice and acknowledge that the use is with permission of NEI.

1 The Nuclear Energy Institute (NEI) is the organization responsible for establishing unified industry policy on matters affecting the nuclear energy industry, including the regulatory aspects of generic operational and technical issues. NEI's members include entities licensed to operate commercial nuclear power plants in the United States, nuclear plant designers, major architect/engineering firms, fuel cycle facilities, nuclear materials licensees, and other organizations and entities involved in the nuclear energy industry.


Purpose

This white paper provides a set of performance indicators that may be used by a licensee to adopt the voluntary option of conducting periodic emergency preparedness (EP) program reviews at a 24-month frequency as allowed by 10 CFR 50.54(t)(1)(ii).

Background

Section 50.54(t) of Title 10 of the Code of Federal Regulations (CFR) requires that each nuclear power reactor licensee provide for a periodic independent review of its EP program. The entire section is presented below.

(t)(1) The licensee shall provide for the development, revision, implementation, and maintenance of its emergency preparedness program. The licensee shall ensure that all program elements are reviewed by persons who have no direct responsibility for the implementation of the emergency preparedness program either:

(i) At intervals not to exceed 12 months, or
(ii) As necessary, based on an assessment by the licensee against performance indicators, and as soon as reasonably practicable after a change occurs in personnel, procedures, equipment, or facilities that potentially could adversely affect emergency preparedness, but no longer than 12 months after the change. In any case, all elements of the emergency preparedness program must be reviewed at least once every 24 months.

(2) The review must include an evaluation for adequacy of interfaces with State and local governments and of licensee drills, exercises, capabilities, and procedures. The results of the review, along with recommendations for improvements, must be documented, reported to the licensee's corporate and plant management, and retained for a period of 5 years. The part of the review involving the evaluation for adequacy of interface with State and local governments must be available to the appropriate State and local governments.

This Section was revised into its current form by Final Rule RIN 3150-AF63, Frequency of Reviews and Audits for Emergency Preparedness Programs, Safeguards Contingency Plans, and Security Programs for Nuclear Power Reactors.2 The rule amended U.S. Nuclear Regulatory Commission (NRC) regulations to give licensees the option to change the frequency of independent reviews and audits of their EP programs, safeguards contingency plans, and security programs. The amendment allows licensees to elect to conduct program reviews and audits either at intervals not to exceed 12 months, or as necessary, based on an assessment by the licensee against performance indicators, and as soon as reasonably practicable after a change occurs in personnel, procedures, equipment, or facilities that potentially could adversely affect the EP program, the safeguards contingency plan, and security program, but no longer than 12 months after the change. In all cases, each element of the EP program, the safeguards contingency plan, and the security program must be reviewed at least every 24 months.

2 Refer to 64 Fed. Reg. 14,814 (March 29, 1999) and a subsequently revised Final Rule in 64 Fed. Reg. 17,947 (April 13, 1999) that corrected erroneous citations. Additional background information is available in the Proposed Rule, 62 Fed. Reg. 40,978 (July 31, 1997).

The performance indicators described in this paper were developed to support a licensee decision to implement the 24-month review option allowed by 10 CFR 50.54(t) and are thus applicable to EP programs only.

EP Program Performance Indicators

As stated in the Comment Resolution section of the Final Rule, performance indicators are numerical parameters, generally derived from quantitative data, that are used to monitor the performance and gain insight into the effectiveness of the emergency preparedness and security programs, and to provide a measurement of success in a summary fashion. If indicated performance falls below a prescribed level, then a review of the affected EP program area would be required. The Final Rule Statements of Consideration provided the following example performance indicators3 for an EP program:

1. Emergency response facility availability
2. Completeness of emergency preparedness duty roster personnel training
3. Quality of response to declared plant emergencies
4. Timeliness of corrective actions
5. Measures of state and local interface
6. Percentage of drill objectives successfully demonstrated

From these examples, the following set of performance indicators was developed:
1. Emergency Response Facility and Equipment Readiness
2. Emergency Response Organization Staffing
3. Quality of Response to Actual Emergency
4. Timeliness of EP Corrective Actions
5. State and Local ORO Engagement
6. Emergency Response Organization Performance

To ensure consistent implementation by NEI members and promote inspection predictability, each performance indicator includes the following attributes:
1. Purpose
2. Reporting Frequency
3. Indicator Definition
4. Data Reporting Elements
5. Calculation
6. EP Review Trigger Threshold
7. Clarifying Notes

3 For historical context, Final Rule RIN 3150-AF63 was published approximately 1 year prior to the implementation of the three EP performance indicators found in the NRC's Reactor Oversight Process (ROP).


The performance indicators and associated attributes are presented in Attachment 1.

Each performance indicator has a defined EP Review Trigger Threshold. If the actual measured level of performance exceeds a specified trigger threshold level, then a review of the affected EP program area is required. It is important to note that the threshold criteria do not represent boundaries between adequate and inadequate levels of program performance from a broader perspective; that is not the purpose of these indicators. Rather, the trigger threshold criteria establish levels of performance that are degraded sufficiently to warrant an accelerated independent review of the affected area.

Implementation

The use of the performance indicators described in this document is voluntary and other approaches to performance monitoring may also be acceptable.

The monitoring of the performance indicators should be controlled by a procedure. The procedure should describe the performance indicators, responsibilities for data reporting and assessment, and the actions to take if an EP Review Trigger Threshold is exceeded.
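For illustration only, the following sketch shows one way the quarterly assessment step described above could be structured. The indicator names and trigger values are taken from this paper, but the data structure, reported values, and the comparison direction assumed for each indicator are hypothetical; the site procedure remains the controlling document.

# Illustrative sketch only; not an NEI or NRC tool. Threshold values are from this
# paper, but the comparison sense for each indicator is an assumed reading.
INDICATORS = {
    # name: (trigger threshold, comparison sense that means "trigger exceeded")
    "Emergency Response Facility and Equipment Readiness": (2, "at or above"),   # reports per quarter
    "Emergency Response Organization Performance": (92.0, "below"),              # percent of objectives
}

def trigger_exceeded(value, threshold, sense):
    """Return True when measured performance reaches the EP Review Trigger Threshold."""
    return value >= threshold if sense == "at or above" else value < threshold

# Hypothetical quarterly assessment results.
measured = {
    "Emergency Response Facility and Equipment Readiness": 1,
    "Emergency Response Organization Performance": 90.5,
}

for name, value in measured.items():
    threshold, sense = INDICATORS[name]
    if trigger_exceeded(value, threshold, sense):
        print(f"{name}: EP Review Trigger Threshold exceeded - review the affected program area")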


EP Program Performance Indicator #1

INDICATOR NAME:

EMERGENCY RESPONSE FACILITY AND EQUIPMENT READINESS

PURPOSE:

This indicator measures the ability to maintain emergency facilities and equipment in a state of functional readiness.

REPORTING FREQUENCY:

Quarterly

INDICATOR DEFINITION:

The number of reports made pursuant to the requirements of 10 CFR 50.72(b)(3)(xiii) during the quarter.

DATA REPORTING ELEMENTS:

The following data are required to calculate this indicator:

The number of reports made pursuant to the requirements of 10 CFR 50.72(b)(3)(xiii) during the quarter.

CALCULATION:

Count the number of reports made pursuant to the requirements of 10 CFR 50.72(b)(3)(xiii) during the quarter.

EP REVIEW TRIGGER THRESHOLD:

2 reports made in one quarter

CLARIFYING NOTES:

NUREG-1022, Revision 3, "Event Reporting Guidelines: 10 CFR 50.72 and 50.73," contains guidelines that the NRC staff considers acceptable for use in meeting the requirements of 10 CFR 50.72, "Immediate Notification Requirements for Operating Nuclear Power Reactors," and 50.73, Licensee Event Report System.

Section 3.2.13, Loss of Emergency Preparedness Capabilities, of NUREG-1022, Revision 3, contains guidance for reporting under 10 CFR 50.72(b)(3)(xiii). Regulations in 10 CFR 50.72(b)(3)(xiii) require reports for a major loss of emergency assessment capability, offsite response capability, or communications capability. Much of the guidance found in Section 3.2.13 of NUREG-1022, Revision 3, is subject to engineering judgment. Supplement 1 to NUREG-1022, Revision 3, endorses Nuclear Energy Institute (NEI) 13-01, "Reportable Action Levels for Loss of Emergency Preparedness Capabilities," dated July 2014. NEI 13-01 provides specific guidance for reporting under 10 CFR 50.72(b)(3)(xiii) and, as a result, reduces the need for engineering judgment. Guidance found in NEI 13-01 provides for an acceptable alternative to guidance found in Section 3.2.13 of NUREG-1022, Revision 3.


EP Program Performance Indicator #2

INDICATOR NAME:

EMERGENCY RESPONSE ORGANIZATION STAFFING

PURPOSE:

This indicator measures the ability to maintain staffing of the Emergency Response Organization (ERO) with qualified personnel.

REPORTING FREQUENCY:

Quarterly

INDICATOR DEFINITION:

The number of ERO positions that were not staffed with a sufficient number of qualified personnel to enable extended ERO activation at any time during the quarter.

DATA REPORTING ELEMENTS:

The following data are required to calculate this indicator:

Staffing levels for ERO positions that perform an emergency response function listed in Table B-1 of NUREG-0654/FEMA-REP-1, as described in the emergency plan.

CALCULATION:

Count the staffing level for each ERO position.

EP REVIEW TRIGGER THRESHOLD:

1 ERO position staffed 2 deep, OR
1 ERO position staffed 3 deep in two consecutive quarters

CLARIFYING NOTES:

An ERO position is a position that performs an emergency response function listed in Table B-1 of NUREG-0654/FEMA-REP-1 (for the revision applicable to the site licensing basis) as described in the emergency plan.

Qualified means the individual assigned to a position meets all the applicable training and qualification requirements for that position.

For sites that use pooling to staff an ERO position (e.g., an OSC Radiation Protection Technician position), the EP Review Trigger Threshold should also include 1 pooled position is staffed < N where N is the number of position holders needed to support 24-hour coverage using a 12-hour shift schedule.

A staffing shortfall of less than 24 hours should not be counted.

To provide documentation for this indicator, a site procedure should require the generation of a condition/issue report when an ERO position is not adequately staffed.
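As an illustration of the pooled-position check described in the clarifying notes above, the sketch below computes N for 24-hour coverage on a 12-hour shift schedule and flags positions for evaluation against the trigger criteria. The position names, staffing depths, pool sizes, and per-shift needs are hypothetical, and the "fewer than 2 deep" comparison is only one possible reading of the trigger threshold.

def pooled_minimum(per_shift_need, shifts_per_day=2):
    """N = qualified holders needed for 24-hour coverage on a 12-hour shift schedule."""
    return per_shift_need * shifts_per_day

# Hypothetical staffing snapshot; actual positions are those performing Table B-1 functions
# as described in the emergency plan.
position_depth = {"Offsite Communicator (EOF)": 3, "Dose Assessor (TSC)": 2}
pooled = {"OSC Radiation Protection Technician": (7, 4)}   # (qualified pool size, per-shift need)

# Assumed reading of the trigger: flag any position staffed fewer than 2 deep and any
# pooled position whose qualified pool is below N.
flagged = [p for p, depth in position_depth.items() if depth < 2]
flagged += [p for p, (pool, need) in pooled.items() if pool < pooled_minimum(need)]
print("Positions to evaluate against the EP Review Trigger Threshold:", flagged or "none")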


EP Program Performance Indicator #3

INDICATOR NAME:

QUALITY OF RESPONSE TO AN ACTUAL EMERGENCY

PURPOSE:

This indicator measures the effectiveness of a site response to an actual emergency.

REPORTING FREQUENCY:

Quarterly

INDICATOR DEFINITION:

For a given event, the number of instances in which one or more of the following emergency response functions were performed incorrectly or in an untimely manner:

Emergency declarations
Emergency notifications
Protective Action Recommendations (PAR)
Required ERO staff augmentation
Required activation of Emergency Response Facilities (ERFs)

DATA REPORTING ELEMENTS:

The following data are required to calculate this indicator:

The number of response functions performed incorrectly or in an untimely manner during an actual emergency.

CALCULATION:

Count the number of response functions performed incorrectly or in an untimely manner during an actual emergency.

EP REVIEW TRIGGER THRESHOLD:

2 instances of inaccurate or untimely emergency declaration, emergency notification, or PARs, OR
3 augmenting ERO staff not responding within the required time, OR
1 ERF not activated within the required time

CLARIFYING NOTES:

This indicator is assessed only in quarters during which there was a declared emergency; it is not assessed in other quarters.

Accurate and timely performance of emergency declarations, emergency notifications and PARs is determined by comparing actual performance to requirements in the emergency plan and implementing procedures. Emergency declaration downgrades and terminations, and associated notifications, are not counted. It is expected that the assessment of accurate and timely performance of emergency declarations, emergency notifications and PARs for this indicator would match the assessments performed for the Reactor Oversight Process (ROP) Drill/Exercise Performance (DEP) indicator.

The augmenting ERO staff are those positions performing a function listed in Table B-1 of NUREG-0654/FEMA-REP-1 (for the revision applicable to the site licensing basis) as described in the emergency plan.

The ERFs are the Technical Support Center (TSC), Operational Support Center (OSC), and Emergency Operations Facility (EOF).

A required ERO callout or ERF activation is one required by the emergency plan for the event conditions, not one initiated on a discretionary or optional basis. A required time is the time specified in the emergency plan or, in the absence of a plan-specified value, in a procedure.

Missed emergency response functions should be documented in a corrective action program.
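For illustration, the sketch below applies the trigger criteria above to a single hypothetical event record. The field names and counts are assumptions, and the comparisons reflect one reading of the threshold wording.

# Hypothetical counts for one declared emergency; assumed reading of the trigger criteria.
event = {
    "declaration_notification_par_misses": 1,   # inaccurate or untimely declarations, notifications, PARs
    "late_augmenting_responders": 0,            # augmenting ERO staff not responding within the required time
    "late_erf_activations": 0,                  # TSC, OSC, or EOF not activated within the required time
}

trigger_exceeded = (
    event["declaration_notification_par_misses"] >= 2
    or event["late_augmenting_responders"] >= 3
    or event["late_erf_activations"] >= 1
)
print("EP Review Trigger Threshold exceeded:", trigger_exceeded)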


EP Program Performance Indicator #4

INDICATOR NAME:

TIMELINESS OF EP CORRECTIVE ACTIONS

PURPOSE:

This indicator measures the timeliness of corrective actions to address weaknesses in the Emergency Preparedness (EP) Program.

REPORTING FREQUENCY:

Quarterly

INDICATOR DEFINITION:

The number of instances where an EP Program weakness was not corrected in a timely manner.

DATA REPORTING ELEMENTS:

The following data are required to calculate this indicator:

EP Program weaknesses not corrected > 90 days after identification.

CALCULATION:

Identify which EP Program weaknesses are associated with a Risk Significant Planning Standard (RSPS) or a Planning Standard (PS).

EP REVIEW TRIGGER THRESHOLD:

Without an acceptable basis for delay:

1 weakness associated with an RSPS is not corrected within 90 days after identification, OR

1 weakness associated with a PS is not corrected within 180 days after identification.

CLARIFYING NOTES:

EP Program weaknesses should be documented in a corrective action program (CAP).

To maintain a consistent basis for this indicator, the definitions for the terms weakness, RSPS and PS are as presented in NRC Inspection Manual Chapter 609, Appendix B, Emergency Preparedness Significance Determination Process.

Correction of an issue means that the permanent corrective action has been implemented; interim compensatory measures are not credited. A planned post-implementation effectiveness review is not a corrective action for purposes of this indicator.

Documentation of an acceptable basis for delay is required and should include a summary of corrective actions taken or planned.

In many cases, this information may already be tracked in a CAP; if so, there is no need to create duplicate documentation.
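For illustration, the sketch below counts untimely EP corrective actions using the 90-day (RSPS) and 180-day (PS) criteria above. The weakness records and assessment date are hypothetical, and the example assumes that any weakness with an acceptable, documented basis for delay has already been screened out.

from datetime import date

# Hypothetical CAP entries: (date identified, date corrected or None, associated standard)
weaknesses = [
    (date(2017, 12, 1), None, "RSPS"),
    (date(2017, 9, 1), date(2018, 1, 15), "PS"),
]
limits_days = {"RSPS": 90, "PS": 180}
assessment_date = date(2018, 3, 31)

untimely = 0
for identified, corrected, standard in weaknesses:
    age = ((corrected or assessment_date) - identified).days
    if age > limits_days[standard]:
        untimely += 1
print("Weaknesses not corrected within the applicable time:", untimely)   # 1 in this example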


EP Program Performance Indicator #5

INDICATOR NAME:

STATE AND LOCAL ORO ENGAGEMENT

PURPOSE:

This indicator measures the level of interface between the licensee's Emergency Preparedness (EP) Department and key offsite response organizations (OROs).

REPORTING FREQUENCY:

Quarterly

INDICATOR DEFINITION:

The frequency of meetings between members of State and local emergency management agencies and the fleet/site EP staff.

DATA REPORTING ELEMENTS:

The following data are required to calculate this indicator:

The number and date of meetings between members of State and local emergency management agencies, and the fleet/site EP staff.

CALCULATION:

Count the number of meetings between the members of the lead State emergency management agency and the fleet/site EP staff within the past 6 months, and the number of meetings between the members of each local emergency management agency and the fleet/site EP staff within the past 12 months.

EP REVIEW TRIGGER THRESHOLD:

< 1 meeting occurred between the members of the lead State emergency management agency and the fleet/site EP staff within the past 6 months, OR

< 1 meeting occurred between the members of each local emergency management agency and the fleet/site EP staff within the past 12 months.

CLARIFYING NOTES:

The occurrence of a meeting should be documented, including the meeting date, time, location, attendees and purpose. Meeting minutes are not required.

Local means a jurisdiction within the 10-mile Emergency Planning Zone (e.g., a town or county emergency management agency).
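For illustration, the sketch below checks whether the lead State agency has met with the fleet/site EP staff within the past 6 months and whether each local agency has met within the past 12 months. The agency names, meeting dates, and the use of 183- and 365-day windows as stand-ins for 6 and 12 months are hypothetical assumptions.

from datetime import date

assessment_date = date(2018, 3, 31)

# Hypothetical most recent meeting date and required window (days) for each agency.
last_meeting = {
    "Lead State emergency management agency": (date(2017, 11, 14), 183),
    "County A emergency management agency": (date(2017, 6, 2), 365),
    "Town B emergency management agency": (date(2016, 12, 20), 365),
}

overdue = [agency for agency, (met_on, window) in last_meeting.items()
           if (assessment_date - met_on).days > window]
print("Agencies with no meeting within the required window:", overdue or "none")
# Any entry in this list would exceed the EP Review Trigger Threshold for this indicator.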


EP Program Performance Indicator #6

INDICATOR NAME:

EMERGENCY RESPONSE ORGANIZATION PERFORMANCE

PURPOSE:

This indicator measures the proficiency of the emergency response organization in performing emergency plan functions, as demonstrated through drill and exercise objectives.

REPORTING FREQUENCY:

Quarterly

INDICATOR DEFINITION:

The ratio of the number of emergency response drill and exercise objectives assessed as acceptably demonstrated to the total number of objectives for all emergency response drills and exercises conducted during the previous four quarters.

DATA REPORTING ELEMENTS:

The following data are required to calculate this indicator:

The number of emergency response drill and exercise objectives assessed as acceptably demonstrated during the quarter.

The total number of objectives for drills and exercises conducted during the quarter.

CALCULATION:

(Number of emergency response drill and exercise objectives assessed as acceptably demonstrated within the previous four quarters) ÷ (Total number of objectives for drills and exercises conducted during the previous four quarters) × 100

EP REVIEW TRIGGER THRESHOLD:

< 92%

CLARIFYING NOTES:

Credited drills and exercises must be performance enhancing experiences as discussed in NEI 99-02, Regulatory Assessment Performance Indicator Guideline.

Objectives to be counted are those related to the performance of emergency response functions described in the emergency plan.

The assessment of an objective is typically performed by determining whether the associated demonstration criteria were met, and the result is documented in a drill or exercise report.
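For illustration, the sketch below computes the indicator from four quarters of hypothetical objective counts and compares it to the < 92% trigger.

# Hypothetical quarterly counts for the previous four quarters.
demonstrated = [34, 28, 31, 30]     # objectives assessed as acceptably demonstrated
total_objectives = [36, 30, 34, 32]  # total objectives for drills and exercises conducted

percent = 100.0 * sum(demonstrated) / sum(total_objectives)
print(f"ERO Performance indicator: {percent:.1f}%")             # 93.2% in this example
print("EP Review Trigger Threshold exceeded:", percent < 92.0)  # False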