ML20205N853

Requests Approval of NRC Plans on Performance Indicators
Person / Time
Issue date: 05/05/1986
From: Stello V
NRC OFFICE OF THE EXECUTIVE DIRECTOR FOR OPERATIONS (EDO)
To:
References
TASK-PINV, TASK-SE SECY-86-144, NUDOCS 8605150350


Text

"o' a' PUBUCLY g rao RELEASED

_ 4

?. 4 N;v/ *****

May 5, 1986

SECY-86-144

POLICY ISSUE
(Notation Vote)

For: The Commissioners

From: Victor Stello, Jr.

Executive Director for Operations

Subject:

PERFORMANCE INDICATORS

Purpose:

To obtain approval for the staff's plans on performance indicators.

Summary: Several staff elements are using and developing performance indicators for various purposes. For purposes of this discussion, the term performance indicator reflects a set of information and data that may be correlated with individual plant overall regulatory and safety performance.

Recently, the staff was directed to submit a proposal on performance indicators to the Commission for approval. The staff proposes a plan that continues using the evolving program of systematic assessment of licensee performance (SALP) as the cornerstone of its efforts to evaluate licensee performance. To augment the SALP process and to provide for a more timely identification of declining performance, a set of performance indicators will be trended on a more frequent basis to detect symptoms of declining performance and provide more objective input into SALP. The results will be fed into a decision process to determine appropriate NRC response to declining performance.

IE will be responsible for coordinating development of a set of regulatory/safety performance indicators by an interoffice task group. The task group will conduct a trial program to attempt to validate a minimum set of performance indicators. The task group will also develop a framework to aid in decisionmaking utilizing plant regulatory performance history and performance indicator trends.

An initial effort to discuss the management decision process for licensees with declining performance was conducted last month at the EDO's meeting with office directors and regional administrators.

Contact:

R. Singh, IE

Existing AEOD data systems and analysis processes will be utilized to the extent possible. AEOD will be responsible for revising its Trend and Pattern program to collect appropriate information on a timely basis and to provide performance indicator data derived from operational experience data that is needed to support the decisionmaking process described above. Redundancies and overlap in data systems and performance analysis activities between the offices will be minimized.

There are a number of ongoing NRR activities that will also be integrated into this effort to the extent practicable.

These include, for example, knowledge gained through the more than two dozen probabilistic assessments of plants, information regarding design and operation that will be gained through safety assessments developed as part of severe accident policy implementation, and the maintenance performance monitoring program that is being conducted as part of the Human Factors Program Plan.

IE will be responsible and accountable for assuring that the decisionmaking process is initiated when appropriate and will work in close coordination with the Regions and NRR in implementing needed regulatory actions. In addition, IE will coordinate the trending program for individual plants, which will use as part of its input the performance indicator data supplied by AEOD, the Regions, and NRR. RES will continue working on development of risk-based indicators. Other offices will continue to employ programmatic performance indicators, as appropriate.

Program implementation is contingent on a successful trial program by the task group and Commission approval of a final program plan. The final program plan is scheduled to be provided to the Commission as a recommendation by September 1986.

Background:

In the Secretary's memorandum of March 10, 1986 (COMLZ-86-3), the Commission directed the staff to submit a proposal on performance indicators for final approval. Specifically, the memorandum required:

1. Status of the current use of performance indicators by various staff elements including the use of specific indicators by specific offices.
2. Status of work the staff is doing on the development and study of performance indicators.

3. A plan and schedule which lists the performance indicators the staff believes should be monitored and a recommendation for a single staff element to be responsible and accountable for evaluating and monitoring those indicators.

The status of current use and development efforts is mentioned below and described in detail in Enclosure 1.

The staff is not able to recommend a set of performance indicators at this time. However, an interoffice group has been established to review previous work, select indicators, conduct a trial program, and provide a final recommendation by the end of September 1986. These plans are summarized below and described in detail in Enclosure 2.

Discussion: In response to the accident at Three Mile Island, the NRC developed the SALP program to aid in the identification of those licensees that were more likely than others to have safety problems. The SALP process has generally been effective in providing an integrated evaluation of the performance of all licensees and bringing into focus the poor performance of some licensees. There have, however, been instances where the SALP evaluations missed poor performance in some areas, and we have not always been effective in assuring that problems identified in the SALP are corrected by the licensees. Secondly, SALP appraisals are generally conducted every 12 to 18 months, and there can be changes in performance between SALP appraisals. Finally, the SALP assessment is inherently subjective, and it is believed that additional objective data would help to improve SALP.

The staff believes the development of objective indicators of a licensee's safety performance would be very beneficial.

Ideally, the indicators would reflect safety performance.

One logical approach would be to base the indicators on Probabilistic Risk Assessment (PRA) and use indicators that would be directly relatable to severe accident probability.

However, the technology has not yet been extended to this application. Therefore, the indicators will have to be measures of performance in areas that are believed to correlate with safety performance. Use of such indicators will serve to remove some of the subjectivity inherent in the SALP process.

Current Status of Use and Development

Several staff elements are using a number of performance indicators for various purposes. The regional offices, NRR, and IE use several indicators to assist in evaluating licensee performance for SALP. NRR also uses a number of indicators to assist in specific evaluation of a licensee's operation and maintenance. AEOD has several programs for analysis of trends and patterns in operational experience data, provides input into SALP, and closely monitors the first two years of experience for new plants. RES is evaluating the feasibility and effectiveness of quantitative risk-based performance indicators and alert levels.

Future Plans

The staff considers the evolving SALP program the cornerstone of its efforts to evaluate licensee performance. Parallel staff efforts are underway to improve the SALP program. These include a reevaluation of functional areas, evaluation criteria, office participation, and the integration of performance indicators from this study. In order to supplement the regular SALP evaluations, the staff plans to conduct a more structured trending of several performance indicators on a periodic basis between SALP assessments. This should assist in more rapidly recognizing poor performance and initiating more timely actions in response. This trending also is expected to contribute toward improving the SALP process by providing more objective input into the subsequent assessments. The trending results will be fed into a decision process on about a quarterly basis. Senior management will determine from this process what NRC actions are appropriate in response to lagging performance.

The decision process for determining what NRC actions are appropriate in response to lagging performance is a critical element in this program. It is generally easier to identify problems than to decide on the appropriate response.

Trending and analysis of performance indicators are not likely to yield answers, but rather to raise questions for management consideration. Initial discussions regarding the decision process were conducted last month. The regional offices, IE, and NRR identified plants of most concern based on information and experience at hand. The problems at these plants were discussed with the EDO, office directors, and regional administrators at the NRC management meeting on April 23 and 24.

An interoffice task group has been formed to select performance indicators, develop a trial program of performance trending, identify improvements needed, and provide a final proposal by the end of September 1986. In this process the group will consider a number of issues such as the current status of NRC efforts, elimination of duplication, and validation of indicators. The group also will discuss coordination with industry groups such as INPO.


IE is the staff element responsible and accountable for coordinating and implementing the performance indicator program discussed above. Other offices are still very much involved. For example, NRR gathers several categories of information in connection with operation and maintenance and inputs this information to trending programs. AEOD is responsible for analysis of trends and patterns, maintaining operating experience data, and providing performance indicator data that is derived from operational experience data. RES will continue to develop and validate risk-based indicators and alert levels. IE is responsible and accountable for collecting and evaluating individual plant performance indicator data from AEOD, the Regions, and NRR and for determining if trends seen in the performance indicators are the result of lagging performance or other factors. Additionally, IE will coordinate with the regional and headquarters offices to determine what NRC actions are appropriate in response to lagging performance.

Enclosure 2 provides the charter of the interoffice task group on performance indicators.

Recommendation: That the Commission

1. Approve the staff plan on performance indicators as described in this paper.

Scheduling: I have directed the staff to proceed with the initial steps of this plan pending Commission review. Commission direction can be incorporated into the plan before initiation of a trial program.

Victor Stello, Jr.

Executive Director for Operations

Enclosures:

1. Current Status
2. Charter of the Interoffice Task Group on Performance Indicators


Commissioners' comments or consent should be provided directly to the Office of the Secretary by c.o.b. Wednesday, May 21, 1986.

Commission Staff Office comments, if any, should be submitted to the Commissioners NLT Wednesday, May 14, 1986, with an information copy to the Office of the Secretary. If the paper is of such a nature that it requires additional time for analytical review and comment, the Commissioners and the Secretariat should be apprised of when comments may be expected.

DISTRIBUTION:

Commissioners
OGC
OPE
OI
OCA
OIA
OPA
REGIONAL OFFICES
EDO
ELD
ACRS
ASLBP
ASLAP
SECY

ENCLOSURE 1

CURRENT STATUS

This enclosure provides the status of the current use of performance indicators by various staff elements and the status of work the staff has been doing on the development or study of performance indicators. Some of the work the various staff elements have been doing on the development or study of performance indicators may be redirected as a result of efforts by the interoffice group on performance indicators described in Enclosure 2.

A specific listing of those performance indicators being monitored and under development by each staff element is contained in Table 1. The listing includes categories of information, basic causes or areas of weakness associated with problems, plant measures, and maintenance measures. The table distinguishes between indicators that have been used for some time in varying degrees and those that are under development or have only recently come into use. The table also indicates some areas of apparent overlap or duplication of effort.

It should be recognized that the table does not imply that each of the indicators is used in exactly the same way by each office. For example, one office may examine all scrams per 1,000 hours of operation while another may look at scrams per calendar month above a particular power level. Some staff elements use performance indicators for certain specific concerns such as maintenance performance, and other staff elements use performance indicators for broad indications of overall plant performance. For many offices, the use of performance indicators has been a continually evolving process, with the types and uses of performance indicators changing as experience dictates. Any areas of real overlap or duplication will be addressed by the interoffice group discussed in Enclosure 2 to ensure that the staff's efforts are coordinated.
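As a concrete illustration of the normalization differences just described, the following sketch computes both conventions from the same invented data; the function names and figures are hypothetical, since the paper prescribes no formula.

```python
# Hypothetical illustration of the two scram-rate conventions described
# above; the data and function names are invented for this example.

def scrams_per_1000_critical_hours(scram_count: int, critical_hours: float) -> float:
    """Scram rate normalized to 1,000 hours of critical operation."""
    if critical_hours <= 0:
        raise ValueError("critical_hours must be positive")
    return 1000.0 * scram_count / critical_hours

def scrams_per_calendar_month(monthly_scrams: list[int]) -> float:
    """Average scrams per calendar month; the caller is assumed to have
    already filtered to scrams above the chosen power level."""
    return sum(monthly_scrams) / len(monthly_scrams)

# The same plant, two conventions: 4 scrams in 5,500 critical hours,
# versus a six-month tally of scrams above some threshold power.
print(scrams_per_1000_critical_hours(4, 5500.0))      # ~0.727
print(scrams_per_calendar_month([1, 0, 0, 2, 0, 1]))  # ~0.667
```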

Regions

In general, all regions have had a continuing program over the past few years for review of plant performance indicators in support of the SALP program.

Examples of indicators typically used in the SALP evaluation are unplanned reactor trips, engineered safety feature (ESF) actuations, entries into limiting conditions for operation (LCOs), and unplanned exposures and radioactive releases.

The general indicators are the SALP ratings and the recent and long-term trends in each SALP category, which identify plants that warrant a closer look. In most instances, a closer look is accomplished through the normal inspection program. In certain instances, where the SALP evaluation indicates serious problems, special inspection programs are employed.

More recently, some regions have undertaken ambitious programs for monthly tracking of a wide variety of performance indicators such as backlog of work requests. These indicators go beyond those traditionally used in the SALP process. The efficacy of some of the new indicators is still being evaluated.

Regardless of the number or type of performance indicators used in each region, the significant feature is how the regions use data in SALP evaluations to determine the root cause of an event or problem, which is then related to one or more SALP functional areas. For example, reactor trips can result from a variety of causes, e.g., personnel error, equipment failure, or design error; however, each of these can be related to a SALP functional area. Personnel errors can be related to operations, maintenance, or surveillance. The number of trips is secondary in importance to the underlying reasons that they are occurring. In this regard, it is not sufficient to address solely the volume or quality of licensee event reports (LERs); rather, the regions find that extensive discussions regarding each reactor trip and ESF actuation provide the needed perspective. Frequently, the regions find the "real" root cause to be different from that stated in the LER.

IE - Division of Inspection Programs (DI) and Division of Quality Assurance, Vendor, and Technical Training Center Programs (DQAVT)

IE is responsible for the SALP program and monitors the regional activities discussed above. Recently, changes have been made to the SALP program to indicate that Category 3 performance is acceptable only on a short-term basis.

Letters transmitting Category 3 SALP ratings will require licensees to respond and provide planned corrective actions to achieve improved performance in Category 3 areas. Further actions in response to Category 3 ratings are under study.

In September 1984, IE proposed a pilot program at WNP-2 to evaluate whether performance trend analysis could substitute for elements of current NRC licensing and inspection practices and provide a more effective indication of plant performance. That proposal was a response to a recommendation in the QA report to Congress (NUREG-1055). WPPSS agreed to participate. The NRC pilot program efforts subsequently were dropped in deference to industry efforts. However, WNP-2 has developed an extensive performance indicator and quality trend analysis program. That program reports monthly on about 90 performance indicators in the categories of operations, maintenance, technical, HP/chemistry, administration, training, QA, support services, and material management. The program reports quarterly on trend analysis of about 41 corrective action processes in the categories of plant quality surveillance, nonconformance, plant deficiency, QA audit, and NRC inspection reports.

IE has underway a test of a methodology for assessing licensee performance by review and analysis of trends in selected available data. The information source initially being used is the IE Statistical Data Reporting System (766 Computer System) file from 1975 to 1985. A three-tier coding system is applied to the data to quantify and analyze each individual plant performance problem by SALP functional area, by the activity which was flawed, and by problem cause category. The information is categorized, summarized, and presented graphically.

The objective of this effort is to provide usable, objective performance data for the SALP review process.
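A minimal sketch of a three-tier tally of this kind is shown below. The tier values and record layout are assumptions for illustration; the 766 system's actual codes are not reproduced in this paper.

```python
# Illustrative three-tier coding: each plant performance problem is
# tagged by SALP functional area (tier 1), the activity that was flawed
# (tier 2), and a problem cause category (tier 3), then tallied so the
# results can be summarized graphically. All values are hypothetical.
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class ProblemRecord:
    plant: str
    salp_area: str   # tier 1, e.g. "maintenance", "operations"
    activity: str    # tier 2, e.g. "valve repair", "surveillance test"
    cause: str       # tier 3, e.g. "procedure", "personnel error"

def tally_problems(records: list[ProblemRecord], plant: str) -> Counter:
    """Count one plant's problems by (SALP area, activity, cause)."""
    return Counter((r.salp_area, r.activity, r.cause)
                   for r in records if r.plant == plant)

records = [
    ProblemRecord("Plant A", "maintenance", "valve repair", "procedure"),
    ProblemRecord("Plant A", "maintenance", "valve repair", "procedure"),
    ProblemRecord("Plant A", "operations", "startup", "personnel error"),
]
for key, count in tally_problems(records, "Plant A").most_common():
    print(count, key)
```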

IE - Division of Emergency Preparedness and Engineering Response (DEPER)

Over the past few years, IE has had an ongoing program of screening and evaluating operating reactor events. In cooperation with NRR-ORAS, IE has screened and evaluated the 50.72 and regional morning reports on a daily basis to identify generic problems. Increasing emphasis in this review process has been placed on assessing trends in plant safety performance and identifying problem plants and areas requiring improvements within the plants. IE also has evaluated data to detect trends in performance for some plants that may not have been apparent during the short-term event screening or indepth investigation into a single event.

For this type of evaluation, significant events or problems and the associated basic causes or areas of weakness are identified through a systematic review of several types of documents, such as event reports (10 CFR 50.72, 10 CFR 50.73, 10 CFR 50.55e, 10 CFR Part 21, regional daily reports, preliminary notifications), inspection reports, enforcement actions, operational status reports (gray books), and the nuclear plant reliability data system (NPRDS). The significant events or problems include scrams, ESF actuations, forced outages, LCO action statements, enforcement actions, and safety system or component failures. The basic causes or areas of weakness include design, installation, random equipment failure, fabrication or manufacturing, procedure, maintenance, personnel error, and management controls.

The results of these analyses are maintained on computerized data bases for trending the types of events or problems and the associated basic causes or areas of weakness. In a typical program, the analyses for a preselected period, such as one fuel cycle or one year, are used as the baseline data. New data from the review of incoming reports are entered and compared against the baseline data to determine whether the performance is improving or declining.
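The baseline-comparison logic can be sketched as follows, under assumed numbers; the 25 percent dead band and the classification labels are illustrative choices, not values from the program.

```python
# Sketch of baseline trending: event counts from a preselected baseline
# period (one fuel cycle or one year) are converted to a rate, and each
# new review period is compared against that rate. The 25% dead band is
# an assumed tolerance, not a value from this paper.

def event_rate(count: int, hours: float) -> float:
    return count / hours if hours > 0 else 0.0

def classify_trend(base_count: int, base_hours: float,
                   new_count: int, new_hours: float,
                   band: float = 0.25) -> str:
    """Label the new period relative to baseline, ignoring changes
    smaller than the dead band."""
    base = event_rate(base_count, base_hours)
    new = event_rate(new_count, new_hours)
    if base == 0.0:
        return "stable" if new == 0.0 else "declining"
    ratio = new / base
    if ratio > 1.0 + band:
        return "declining"   # more events per operating hour
    if ratio < 1.0 - band:
        return "improving"
    return "stable"

# One fuel cycle of ESF actuations as baseline vs. the latest quarter.
print(classify_trend(base_count=12, base_hours=8000,
                     new_count=6, new_hours=2000))   # declining
```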

Data bases for this type of trending have already been established for Browns Ferry 1, 2, and 3 and Sequoyah 1 and 2, and one for Davis-Besse is currently being established. These are intended for monitoring performance after restart. In conjunction with the Corporate Data Network (CDN) prototype events program, larger data bases with capabilities for performance trending are being compiled for Browns Ferry 1, 2, and 3; Sequoyah 1 and 2; Palisades; Monticello; Farley 1 and 2; Oconee 1, 2, and 3; and St. Lucie 1 and 2.

NRR - Operating Reactors Assessment Staff (ORAS) and Licensing Divisions

In addition to the normal screening of 50.72 and regional morning reports with IE for significant problems, the NRR-ORAS has monitored several indicators such as the number of scrams and engineered safety feature actuations. Reviews of the 50.72 reports and LERs for recurring problems, their corrective actions, and adequacy of reporting are used with these indicators to provide input to SALP and to trend low power and other early operating experience for new plants and plants starting up after a long outage. Such reviews may lead to increased regulatory attention by NRR, IE, and the regions. Another use for this information has been periodic Operating Reactors Events Briefings for NRR and IE management. These briefings are intended to highlight significant safety issues and coordinate assignments of followup actions, such as IE Notices or Generic Letters.


More recently, NRR-ORAS, in conjunction with the three Licensing Divisions of NRR, has undertaken an effort to evaluate all plants for the 1984-1985 period using several additional performance indicators such as scrams with additional equipment failures or operator errors, loss of safety system function, and technical specification required shutdowns. The additional indicators were selected with an effort to focus on the more significant events, including challenges to safety systems and safety system failures. In addition, a significant effort has been expended by the project manager and NRR Licensing Divisions' management to identify poor performance. This effort was undertaken to support the EDO Senior Management Meeting held on April 23 and 24, 1986. Following this effort, the efficacy of these indicators will be evaluated for potential future use in the trending program discussed in Enclosure 2.

NRR - Division of Human Factors Technology (DHFT)

DHFT monitors performance indicators that reflect on plant maintenance. Currently, DHFT is examining 31 measures on a trial basis (see Table 1). These measures, which vary from "percentage of unit availability time" to "mean time between equipment forced outages," were selected from an initial inventory of 133 discrete performance data elements. The indicators, which are based on publicly available data, are intended to identify maintenance performance trends in the industry, demonstrate the impact of maintenance performance on plant safety, and relate plant performance to maintenance performance. A computerized maintenance and surveillance data base has been developed to support the indicators analysis effort.
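As an illustration of the arithmetic behind such measures, the sketch below computes two of them from invented inputs; the DHFT data base layout is not described in this paper.

```python
# Hypothetical computation of two DHFT-style maintenance measures from
# a window of operating data; all inputs are invented for illustration.

def mean_hours_between_forced_outages(num_outages: int,
                                      operating_hours: float) -> float:
    """Operating hours in the window divided by the number of
    equipment-failure forced outages observed in it."""
    if num_outages == 0:
        return operating_hours  # no outage observed in the window
    return operating_hours / num_outages

def percent_forced_outage_time(forced_outage_hours: float,
                               window_hours: float) -> float:
    """Percent of the calendar window spent in forced outages."""
    return 100.0 * forced_outage_hours / window_hours

# Three equipment-failure forced outages totaling 400 hours in a year.
print(mean_hours_between_forced_outages(3, 8760.0 - 400.0))  # ~2786.7
print(percent_forced_outage_time(400.0, 8760.0))             # ~4.57
```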

NRR - Division of Safety Review and Oversight (DSRO)

Within NRR, DSRO has the lead in developing a plan and procedures by which NRR will improve the way it performs safety reviews and assures plant safety through the conduct of integrated assessments of plant systems in the decisionmaking process.

In integrated assessments, the positive and potentially negative effects of design requirements will be considered. In addition, these integrated assessments will address balance-of-plant systems and other non-safety-related systems since challenge rates and mitigation system reliability are related issues. In this regard, PRA was designed, in part, to provide a more fully integrated treatment of complex safety issues, and it will continue to be considered as a factor in conjunction with engineering judgement and operational data in the development of this "integrated" approach to safety reviews.

The use of performance indicators to identify the need for changes at specific plants is one of the several triggers being considered that could initiate an integrated assessment of a particular safety issue, including alternative system modifications, both on the system requiring the change and on interactive systems.


AEOD

NRC Manual Chapter 0515-032, Section 031(e) states that AEOD "develops and implements a program for the analysis of trends and patterns in operational experience data, including transient initiators, event frequencies, and component failure data." NRC Policy and Planning Guidance for 1985, item 2, states that "the NRC will continue to closely monitor the first 2 years of operation of new plants coming on line, particularly those of licensees who have no prior experience with nuclear plants." To meet these requirements, AEOD has undertaken several studies which include performance indicators.

Beginning in 1984, AEOD initiated studies of reactor protection system (RPS) actuations and engineered safety feature (ESF) actuations. The principal data sources utilized for both studies are the reports filed under 10 CFR 50.72 and the LERs submitted under 10 CFR 50.73. The performance indicators included in these analyses have been included in AEOD's input to the SALP report for each plant and in AEOD's periodic reports to the Commission on the results of analysis of operational data.

In 1985, AEOD initiated a study of the unavailability of systems designed to fulfill safety functions. Data analysis, based primarily on LERs, is being performed. This effort continues and expands on a recently completed AEOD study (AEOD/C504) of the loss of safety system function which covered the 1981-83 period. The results of the study of 1984 data are scheduled to be issued in April 1986 and will consider safety system unavailability as a performance indicator.

A continuing study of technical specification violations has recently been initiated with preliminary results due in June 1986. This study continues the analysis of 1984 data conducted by NRR as part of the Technical Specification Improvement Program (TSIP). The scope of this study includes Gray Book data on power reductions required by technical specifications, which are not reported in LERs, in order to continue the TSIP review of the impact of technical specifications on unit performance. AEOD expects that the frequency of technical specification violations and, possibly, the frequency and magnitude of power reductions required by technical specifications would constitute meaningful performance indicators.

AEOD also has initiated a study of the performance of units that have just received their operating license. The initial report on the findings of this study is scheduled to be released in June 1986. One of the prime objectives of this study has been to determine, from existing data sources, operational trends which may be used in the analysis of the performance of a nuclear unit or utility. All units that received a license for power operation in the 1983-85 period are included in the study. This has permitted the establishment of a 2-year baseline of performance data for recently licensed units. The data sources have been restricted to computerized reactor event data bases for immediate notifications (50.72), LERs, and monthly operating reports (Gray Book). Some of the items being tracked from these sources which relate to unit performance include RPS and ESF actuations, LCO and security events, and causal factors including human factors.

As the studies described above proceed from the pilot-study phase to a "production" mode, AEOD expects the timeliness of the reports to improve substantially.

In addition, AEOD is considering establishing a multitier program for data analysis. In such a program, very preliminary indicators would be produced frequently (e.g., monthly) and within a few days of the end of the period. Such indicators may not include analysis of factors such as root cause, which are generally not available in the preliminary report from the licensee (e.g., the 50.72 phone call). Subsequently, AEOD would produce more detailed reports, comparable to the reports described above, which would include the more detailed data contained in the LERs and in the NPRDS, and would include analysis of the data for such factors as the root cause. Finally, the Accident Sequence Precursor (ASP) program, which is administered by AEOD, assigns a quantitative probabilistic significance measure to operational events, which may serve as a source of a more sophisticated indicator, or may provide some insight into possible weighting schemes for existing indicators.
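One way such a weighting scheme could work is sketched below: each event contributes its estimated probabilistic significance rather than a simple count, so a single important precursor outweighs many routine reports. The numbers and the summation rule are assumptions for illustration, not the ASP program's actual calculation.

```python
# Illustrative event weighting in the spirit of an ASP-style measure:
# each event carries an estimated probabilistic significance, and a
# plant's score is the sum over the trending period. All numbers are
# invented; this is not the ASP program's actual methodology.

def weighted_event_score(significances: list[float]) -> float:
    """Sum of per-event significance measures for one plant."""
    return sum(significances)

plant_a = [1e-6] * 10   # ten low-significance events
plant_b = [5e-4]        # one significant precursor

# A plain count ranks Plant A worse; the weighted score ranks Plant B worse.
print(len(plant_a) > len(plant_b))                                    # True
print(weighted_event_score(plant_a) < weighted_event_score(plant_b))  # True
```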

AEOD's trend and pattern analysis program is documented in the Trend and Pattern Program Plan (AEOD/P601), which was issued in January 1986.

In summary, AEOD, in meeting its program planning requirements, has developed an integrated program for the analysis of trends and patterns in operational data. A principal focus of this analysis has been measures of plant performance. To date, AEOD has completed several studies and has several ongoing studies which include performance indicators. These indicators include RPS and ESF actuation rates, technical specification violation data, system unavailability data, and security events. The results of these studies are being supplied to all concerned NRC offices so that prompt and informed decisions may be made when trends indicate that corrective action is required to improve the performance of specific plants.

RES RES is evaluating the feasibility and effectiveness of risk-based performance indicators and associated alert levels. This includes: (1) developing a methodology to evaluate currently used performance indicators' and associated alert levels' relationship to risk, and (2) evaluating the feasibility of integrated, risk-based performance indicators and alert levels. These risk-based indicators will reflect both the frequency of initiating events and the unavail-ability of safety systems, and will integrate available information from various types of failures throughout the plant. The purpose is to supplement other faster, simpler performance indicators and alert levels to assist in the evaluation and decisionmakimg process.

Earlier research on performance indicators evaluated an approach for assessing organizational performance at operating plants (NUREG/CR-3215 and NUREG/CR-4378). Other research on operational safety reliability also performed a preliminary evaluation of an approach to reliability- and risk-based performance indicators and alert levels. The research discussed above to support the development of risk-based performance indicators will extend appropriate parts of these earlier research activities.


Table 1. Current Use and Development of Performance Indicators

Office/Region columns: IE, NRR, RES, AEOD, RI, RII, RIII, RIV, RV
X = performance indicator is being utilized
0 = performance indicator is under development or has only recently come into use

CATEGORY OF INFORMATION
SALP ratings: X X X X X X X X
Scrams: X X 0 X X X X X
ESF actuations: X X 0 X 0 X
Safety system failures and challenges: X X 0 X
Component failures in safety systems: X 0 0 X
LCO action statements entered: X 0 0 X
Forced outages: X X 0 X 0 X
50.72 reports: X X X X
Enforcement actions: X X X X X X X
Licensee event reports: X 0 X X X X X X
Tech spec violations: X X
Security events: X X X
Allegations: X X X
Unplanned releases: 0 X
No. of unplanned exposure incidents: 0 X 0 X
Unit availability: 0 X 0 X 0 X
Plant chemistry problems: 0
Power reductions: X
Maintenance backlog: 0 X
Inoperable and continuously alarming control room annunciators: 0 X X
No. of items out of service: 0
No. of out-of-service tags in the control room: X
Use of reliability techniques to identify design weaknesses: 0
Monitor and analyze operational reliability data (NPRDS): X 0 X 0
Reliability importance analysis: 0 0
Quantitative comparison with alert levels: 0 0
Management involvement in evaluation of technical and operating issues: X X
Plant management changes: 0
Treatment of NRC requirements by operators, supervisors, and managers involved in operation: X
Planning detail exercised by licensee: X
Licensed operator exam performance (pass/fail): 0 X X X
Resident inspector input: X X X
Site visits by region management: X X
Personnel exposures: 0 X X X 0
Personnel contaminations: X
Contaminated areas of the plant: X
Radwaste produced/processed for shipping: 0 X 0
Gaseous and liquid activity released per year: X 0
Construction deficiency reports [50.55(e)]: X X X
50.59 reports: X
Significant events selected for operating reactors briefings: X

BASIC CAUSE/AREA OF WEAKNESS
Maintenance: X X 0 X
Personnel error: X X 0 X X
Procedure: X X 0 X X
Management control: X X
Design: X X 0 X
Random equipment failure: X 0 X
Fabrication/manufacturing: X X
Installation: X X

PERFORMANCE INDICATORS USED BY THE NRR DIVISION OF HUMAN FACTORS TECHNOLOGY (DHFT)*

Plant Measures
Percent of unit availability time: X
Total number of forced outages: X
Adjusted percent of forced outage time per year: X
Number of scrams per 1,000 hours critical: X
Number of ESF actuations: X
Number of major violations: X
Number of minor violations: X
Gross heat rate: X
Total number of LERs: X
Total man-rems of exposure: X
Quality programs SALP rating: X
Operations SALP rating: X

Maintenance Measures
Number of component failure forced outages: X
Percent of total forced outage time due to component failures: X
Number of scrams due to component failures: X
Number of component failure related scrams per 1,000 hours reactor critical: X
Number of scrams due to maintenance and testing: X
Number of maintenance and testing related scrams per 1,000 hours reactor critical: X
Number of ESF actuations due to maintenance or surveillance or component failure: X
Percent of component failure related LERs: X
Number of component failure LERs: X
Number of maintenance related LERs: X
Maintenance SALP rating: X
Surveillance SALP rating: X
Percent of maintenance related violations of total violations: X
Number of maintenance related violations: X
Mean time between equipment failure forced outages: X
Mean time to return unit to service from equipment failure forced outages: X
Man-rems exposure due to maintenance: X

* The detailed performance indicators that appear on this list are encompassed by the general categories of performance indicators provided above. However, they are also listed here for clarity.
ENCLOSURE 2 CHARTER OF THE INTEROFFICE TASK GROUP ON PERFORMANCE INDICATORS An interoffice task group on licensee performance has been created by the staff.

The group's task will be to develop a list of safety performance indicators and recommend methods for using this information. The principal assumptions of the task group are:

1. Continued use of the evolving systematic assessment of licensee performance (SALP) program every 12 to 18 months is the cornerstone of the effort to evaluate overall licensee performance.
2. Trending a set of regulatory and safety performance indicators on a periodic basis for specific plants is necessary to detect symptoms of declining overall licensee performance in between SALP evaluations as well as to provide additional objective input into the SALP process.
3. Structured decisionmaking in response to declining performance is necessary for a coherent and effective program.
4. The development and use of programmatic performance indicators (i.e., indicators to monitor performance in specific program areas such as maintenance) will remain with the responsible program office.

At the present time, the SALP program represents the staff's best efforts to assess licensees' regulatory performance. The current SALP program is not set in concrete; changes and improvements are expected in the future. Feedback from the interoffice group on performance indicators is expected to contribute to improvements. Nevertheless, the SALP process is expected to be the cornerstone of the staff's future efforts to evaluate overall licensee performance. The SALP generally covers a period from 1 year to 18 months and represents an integrated evaluation of licensee performance. The SALP process is not structured to identify relatively rapid changes in performance, and changes in performance occurring between SALP evaluations may not always be recognized by normal methods such as inspection, enforcement action, and subjective review of reported events.

Therefore, a more structured trending of several performance indicators on a periodic basis between SALP evaluations is necessary.

More frequent trending should assist in more rapidly recognizing lagging performance and in initiating more timely actions in response. The results of the trending will be fed into a decision process where the senior staff management determines what NRC actions are appropriate in response to lagging performance and informs the Commission. This trending also is expected to contribute toward improving the SALP process by providing additional objective input into the subsequent assessments. The added effort for real time data collection and periodic analysis may be offset by reduced effort required in preparation for each SALP.

The staff recognizes that the above trending and evaluation of performance indicators is not likely to provide answers on what regulatory actions are appropriate. Subjective judgments must be made using the results of the trending as one of the inputs for determining the actions needed. The decisionmaking is an important element of the staff plan.

The selection of a set of performance indicators for trending is expected to be derived from many indicators already in use or being developed by different staff elements for various purposes. The use and developmental work at different offices are described in detail in Enclosure 1.

Many of the programs conducted by various staff elements for trending performance indicators are newly developed and have been only recently instituted. These programs are expected to undergo refinements, and a reliable set of indicators will emerge from the process as more experience is gained. The interoffice task group was established to expedite the development of a set of individual plant performance indicators and coordinate the efforts of all staff elements.

The interoffice task group will consider the following matters as part of its activities:

1. Review of previous NRC efforts on performance trending.
2. Review of the current activities by all offices including development of risk-based indicators.
3. Development of a trial program of trending a number of indicators, selectively utilizing data from current activities to the extent possible.

4. Validation or confirmation of the minimum set of performance indicators.

5. Identification of needed improvements in indicators and their confirmation or validation.
6. Coordination of staff efforts and elimination of unnecessary work or wasteful duplication.
7. Development of a framework for decisions when declining performance is detected.
8. Coordination with industry programs such as those of INPO.

Figure 1 shows the process of selecting and confirming or validating performance indicators.

The interoffice task group will conclude its activities by the end of September 1986 and will develop a final plan on performance indicators that meets the following commitments:

1. Complete the review of existing performance indicators by May 1986.
2. Conduct a trial program of performance trending during June-July 1986.
3. Recommend the management decision process for responding to declining trends during June-September 1986.



4. Validate or confirm the results of the trial program against SALP, the results of the NRC management meeting of April 23 and 24, 1986, and other performance assessment techniques, as appropriate, by August 1986.

5. Recommend a minimum set of performance indicators and identify data collection and input tasks for trending by September 1986.

6. Recommend the method of periodic analysis of the trends and responsibilities for computer program development by September 1986.


7. Recommend plans for improvements and additional confirmation or validation of risk-based indicators and alert levels by September 1986.

8. Recommend changes in licensee reporting requirements, if necessary, by September 1986.

9. Identify the resources and costs for implementing the trending programs by September 1986.

10. Develop plans for the coordination of staff efforts and elimination of unnecessary work or wasteful duplication by September 1986.

One of the most important elements of the staff's overall program on licensee performance is the decisionmaking in response to deficient or declining performance. Initial discussions regarding the management decision process for determining what NRC actions are appropriate were conducted at the EDO management meeting on April 23 and 24, 1986. The regional offices, IE, and NRR identified the plants of most concern based on information and experience at hand. The problems at these plants were discussed with the EDO, regional administrators, and office directors. The major portion of the meeting was dedicated to this decision process.

The primary emphasis of the decision process is a structured method of bringing up substantial problems for decision. The staff does not expect to develop a set of rules to prescribe specific actions because the circumstances of each case will be different. However, it is expected that some general guidelines will evolve. For example, if the latest SALP rating in maintenance is Category 3 and the trending indicates that maintenance problems are increasing, there would be a clear signal that some action is needed. The specific action would likely depend strongly on past history. For instance, if a problem is longstanding and previous improvement efforts have not been effective, the staff would be considering stronger enforcement action.

Completion of the interoffice group's work by September 1986 does not mean an end to further improvements. For example, RES may continue to work on risk-based indicators. However, it is expected that the staff will have accomplished what the Commission has directed; i.e., the staff will provide a list of indicators that the staff believes should be monitored and a concrete plan for monitoring and evaluating these indicators.


Figure 1. Development of Performance Indicator System (flowchart)

Establish charter for interoffice group
Review: previous efforts; current use
Develop trial program
In parallel: initial efforts at decision process; validate or confirm minimum set; coordinate staff efforts
Identify needed improvements; consider coordination with industry efforts
Identify costs and resources
Recommend: selected indicators; data collection and input tasks; method of periodic analysis; responsibilities for computer programs; plans for improvement and additional validation; changes in reporting requirements; elimination of duplication