=Text=
{{#Wiki_filter:
UNITED STATES
NUCLEAR REGULATORY COMMISSION
ADVISORY COMMITTEE ON REACTOR SAFEGUARDS
WASHINGTON, D.C. 20555
ACRS-R-1834
June 10, 1999
Dr. William D. Travers
Executive Director for Operations
U.S. Nuclear Regulatory Commission
Washington, DC 20555-0001
 
==Dear Dr. Travers:==
 
==SUBJECT:==
PILOT APPLICATION OF THE REVISED INSPECTION AND ASSESSMENT PROGRAMS, RISK-BASED PERFORMANCE INDICATORS, AND PERFORMANCE-BASED REGULATORY INITIATIVES AND RELATED MATTERS

During the 463rd meeting of the Advisory Committee on Reactor Safeguards, June 2-4, 1999, we heard briefings by and held discussions with representatives of the NRC staff regarding the pilot applications of the revised inspection and assessment programs, risk-based performance indicators (PIs), and performance-based regulatory initiatives and related matters. Our Subcommittees on Reliability and Probabilistic Risk Assessment and on Regulatory Policies and Practices also met on April 21, 1999, to discuss performance-based regulatory initiatives. We had the benefit of the documents referenced.
In February 1999, we reviewed proposed revisions to the inspection and assessment programs, including the proposed use of PIs, and provided a report to the Commission dated February 23, 1999. We previously reviewed staff efforts to develop risk-based PIs as part of the Program for Risk-Based Analysis of Reactor Operating Experience of the former Office for Analysis and Evaluation of Operational Data. In April 1998, we reviewed staff plans to increase the use of performance-based approaches in regulatory activities (SECY-98-132) and issued a report dated April 9, 1998.
Recommendations
: 1. The PI thresholds should be plant- or design-specific.
: 2. The staff should explain the technical basis for the choice of sampling intervals of PIs used to select a value for comparison with the thresholds.
: 3. Prior to implementation of the pilot applications of the revised inspection and assessment programs, the pilot applications should be reviewed to make explicit what information will be collected and what hypotheses will be tested.
: 4. The staff should examine domestic and international studies to determine whether it is possible to develop useful PIs for safety culture.
: 5. The action levels should be related explicitly to risk metrics such as core damage frequency (CDF) and large early release frequency (LERF), where possible.
: 6. The current performance-based initiatives program should document the lessons learned from current NRC activities in order to focus the diverse NRC activities related to performance-based regulation.
Discussion

A major lesson learned from probabilistic risk assessments (PRAs) is that the risk profile of each plant is unique. The major accident sequences and their contributions to the various risk metrics vary from plant to plant. A consequence of this lesson is that the importance of a PRA parameter, e.g., the unavailability of a system train, with respect to PIs can be assessed only in the context of the integrated risk profile that the PRA provides.
The intent of PIs is to provide objective measures for monitoring and assessing system, facility, and licensee performance. The performance metrics of the chosen set of PIs should assist in making better informed decisions regarding deviations in licensee performance from expectations. This information, combined with the PRA lesson noted above, leads us to the conclusion that the PI thresholds must be plant-specific or design-specific, where practicable. The staff has recognized this in at least one instance, the white-yellow threshold (substantially declining performance) for emergency diesel generator unavailability (SECY-99-007).
In the proposed reactor oversight process, however, most of the thresholds are based on generic industry averages. For example, the 95th percentile of the plant-to-plant variability curve for a given parameter, e.g., system unavailability, is defined as the green-white threshold (declining performance). There are two fundamental problems with this approach:
: 1. Selection of this criterion automatically results in about five plants being above the threshold. This creates an impetus for the licensee to bring the PI below the threshold simply because other plants are doing "better." This may, in effect, create the perception that new regulatory requirements are being imposed on licensees. We do not believe that the oversight process should ratchet expectations for plants that already meet the requirements for adequate protection. We note that this potential for ratcheting, whether actual or perceived, deviates from the intent of identifying declining plant performance.
: 2. Establishing generic thresholds would not account for plant-specific features that may compensate for the risk impact of any particular parameter. For example, setting the threshold for the unavailability of a system on a generic basis without looking at each plant to understand why a particular value is achieved is contrary to the PRA lesson mentioned above.
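The percentile effect described in item 1 is purely mechanical, as a short simulation can show. This is a sketch only: the fleet size of about 100 plants and the lognormal spread of unavailabilities are illustrative assumptions, not figures from this letter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical plant-to-plant variability in a PI such as system
# unavailability; the distribution shape and fleet size are assumptions.
n_plants = 100
unavailability = rng.lognormal(mean=-6.0, sigma=0.8, size=n_plants)

# Generic green-white threshold set at the 95th percentile of the
# observed plant-to-plant distribution.
threshold = np.percentile(unavailability, 95)

# By construction, about 5% of plants exceed the threshold,
# no matter how good fleet performance is in absolute terms.
flagged = int((unavailability > threshold).sum())
print(flagged)  # 5 of 100 plants
```

However much the whole fleet improves, refitting the threshold to the new distribution flags roughly the same number of plants, which is the ratcheting concern in absolute-performance terms.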
The staff has acknowledged that there are both epistemic and aleatory uncertainties in the PIs and that the threshold values must account for both. It is not clear how the staff intends to account for these uncertainties. How does the aleatory variability in an unavailability enter into an assessment? What is the sample that is used to calculate this unavailability? Is it calculated every month? Is the average value computed over a year? How does the sampling method affect the establishment of threshold values? We believe that the staff should prepare technical bases for these choices and develop alternative sampling methods to be tested in the pilot applications of the revised inspection and assessment programs.
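The sampling questions above can be made concrete with a small sketch. The outage record, the 720-hour month, and the threshold value are all hypothetical; the letter does not state how the staff actually computes unavailability.

```python
# Illustrative only: compare monthly vs. annual estimates of train
# unavailability computed from the same hypothetical outage record.
hours_per_month = 720  # simplified month length (assumption)
# Hypothetical unavailable hours logged in each of 12 months.
outage_hours = [0, 0, 36, 0, 0, 0, 0, 72, 0, 0, 0, 0]

# Monthly sampling: one unavailability estimate per month.
monthly = [h / hours_per_month for h in outage_hours]

# Annual sampling: a single average over the whole year.
annual = sum(outage_hours) / (12 * hours_per_month)

threshold = 0.04  # hypothetical green-white threshold
months_over = sum(1 for u in monthly if u > threshold)
print(months_over, round(annual, 4))  # 2 0.0125
```

The same record crosses the threshold in two individual months yet stays well below it on an annual average, which is exactly why the choice of sampling interval needs a stated technical basis.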
This latter observation leads us to the issue of designing pilot applications. We would like to see a well-defined set of questions to be answered and hypotheses to be tested before the pilot applications of the revised inspection and assessment programs are implemented. For example, we would like to see in the pilot applications a staff evaluation of the administrative burden placed on inspectors. Although we agree that the proposed revisions to the assessment program are intended to enhance safety decisions and allocation of inspection resources, we are concerned that the proposed changes may adversely affect in-plant inspection time.
The staff has told us that it does not plan to develop PIs for the "cross-cutting" issue of safety conscious work environment (safety culture). The principal reason stated by the staff is that "if a licensee had a poor safety conscious work environment, problems and events would continue to occur at that facility to the point where either they would result in exceeding thresholds for various performance indicators, or they would be surfaced during NRC baseline inspection activities, or both." We believe that more justification is required for this argument. Safety culture has been recognized as an important determinant of good plant performance. For example, the International Atomic Energy Agency has developed an inspection manual that includes indicators of safety culture. Also, the Swedish Nuclear Power Inspectorate recently published a report describing a systematic procedure using elicitation of expert judgment to produce PIs for safety culture.
The values of the PIs that trigger regulatory action seem to be only qualitatively related to risk metrics (CDF and LERF). We believe that action levels should have a more quantitative relationship to risk metrics consistent with the guidelines in Regulatory Guide 1.174.
The NRC has several activities in the area of performance-based regulation that are either completed or ongoing. We believe that it would be useful to collect the lessons learned from these activities and develop a set of principles and recommendations for future programs. The staff should document these results. This should be the objective of the current program on performance-based approaches to regulation.
We commend the staff for its progress on these challenging matters.
Sincerely,
Dana A. Powers
Chairman
 
References:
: 1. Memorandum dated March 22, 1999, SECY-99-007A, from William D. Travers, Executive Director for Operations, NRC, for the Commissioners,
 
==Subject:==
 
Recommendations for Reactor Oversight Process Improvements (Follow-up to SECY-99-007).
: 2. Memorandum dated January 8, 1999, SECY-99-007, from William D. Travers, Executive Director for Operations, NRC, for the Commissioners,
 
==Subject:==
Recommendations for Reactor Oversight Process Improvements.
: 3. Memorandum dated April 16, 1999, from Annette L. Vietti-Cook, Secretary of the Commission, to William D. Travers, Executive Director for Operations, NRC,
 
==Subject:==
 
Staff Requirements - SECY-99-086 - Recommendations Regarding the Senior Management Meeting Process and Ongoing Improvements to Existing Licensee Performance Assessment Processes.
: 4. Report dated February 23, 1999, from Dana A. Powers, Chairman, ACRS, to Shirley Ann Jackson, Chairman, NRC,
 
==Subject:==
Proposed Improvements to the NRC Inspection and Assessment Programs.
: 5. Draft paper entitled "Development of Risk-Based Performance Indicators," by Patrick W. Baranowsky, Steven E. Mays, and Thomas R. Wolf, NRC, received May 26, 1999 (Predecisional).
: 6. Draft memorandum from William D. Travers, Executive Director for Operations, NRC, for the Commissioners,
 
==Subject:==
Plans for Pursuing Performance-Based Initiatives, received May 12, 1999 (Predecisional).
: 7. Memorandum dated February 11, 1999, from Annette L. Vietti-Cook, Secretary of the Commission, to William D. Travers, Executive Director for Operations, NRC,
 
==Subject:==
 
Staff Requirements - SECY-98-132 - Plans to Increase Performance-Based Approaches in Regulatory Activities.
: 8. Report dated April 9, 1998, from R. L. Seale, Chairman, ACRS, to L. Joseph Callan, Executive Director for Operations, NRC,
 
==Subject:==
Plans to Increase Performance-Based Approaches in Regulatory Activities.
: 9. U.S. Nuclear Regulatory Commission, Regulatory Guide 1.174, "An Approach for Using Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis," July 1998.
: 10. International Atomic Energy Agency, IAEA-TECDOC-743, "ASCOT Guidelines: Guidelines for organizational assessment of safety culture and for reviews by the Assessment of Safety Culture in Organizations Team," March 1994.
: 11. Swedish Nuclear Power Inspectorate, SKI Report 99:19, "Research Project Implementation of a Risk-Based Performance Monitoring System for Nuclear Power Plants: Phase II - Type-D Indicators," February 1999.
}}

Revision as of 11:27, 16 December 2020

Informs That During 463rd Meeting of ACRS on 990602-04, Committee Heard Briefings by & Held Discussions with Representatives of NRC Staff Re Pilot Applications of Revised Insp & Assessment Programs
ML20195G294
Person / Time
Issue date: 06/10/1999
From: Powers D
Advisory Committee on Reactor Safeguards
To: Travers W
NRC OFFICE OF THE EXECUTIVE DIRECTOR FOR OPERATIONS (EDO)
References
ACRS-R-1834, NUDOCS 9906150314
Download: ML20195G294 (4)

