ML19011A425

Lecture 2-3 Point-Counterpoint 2019-01-16
Issue date: 01/16/2019
From: Office of Nuclear Regulatory Research
To: Nathan Siu, 415-0744
Shared Package: ML19011A416
Download: ML19011A425 (34)


Text

PRA and RIDM Point-Counterpoint, Lecture 2-3

Key Topics

  • Progress in RIDM: slow pace and some reasons
  • Sample concerns - point/counterpoint/comment

- Matching operational experience

- Matching intuition

- Subjectivity

- Complexity

- Uncertainty

Overview

Resources

T.R. Wellock, A figure of merit: quantifying the probability of a nuclear reactor accident, Technology and Culture, 58, No. 3, 678-721, July 2017.

J. Downer, The unknowable ceilings of safety: three ways that nuclear accidents escape the calculus of risk assessments, The Ethics of Nuclear Energy: Risk, Justice, and Democracy in the Post-Fukushima Era, B. Taebi and S. Roeser (eds.), Cambridge University Press, 2015.

J. Downer, Disowning Fukushima: managing the credibility of nuclear reliability assessment in the wake of disaster, Regulation and Governance, Wiley, 2013.

A.R. Pietrangelo, Nuclear Energy Institute, Industry support and use of PRA and risk-informed regulation, letter to A.M. Macfarlane, Chairman, U.S. Nuclear Regulatory Commission, December 19, 2013. (ADAMS ML13354B997)

D.A. Ward, The Consistent Use of Probabilistic Risk Assessment, letter from U.S. Nuclear Regulatory Commission Advisory Committee on Reactor Safeguards to Chairman I. Selin, July 19, 1991. (Provided as an exhibit in NUREG-1489, 1994.)

Overview

Other References

U.S. Nuclear Regulatory Commission, Achieving modern risk-informed regulation, SECY-18-0060, May 23, 2018. (ADAMS ML18110A187)

U.S. Nuclear Regulatory Commission, A Review of NRC Staff Uses of Probabilistic Risk Assessment, NUREG-1489, March 1994.

B. Fischhoff, P. Slovic, and S. Lichtenstein, Fault trees: sensitivity of estimated failure probabilities to problem representation, Journal of Experimental Psychology: Human Perception and Performance, 4, No. 2, 330-344, 1978.

D. Kahneman, P. Slovic, and A. Tversky (eds.), Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, 1982.

M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, Proceedings of the National Academy of Sciences (PNAS), 111, No. 20, 7176-7184, May 20, 2014.

D. Lochbaum, Nuclear Plant Risk Studies: Then and Now, September 29, 2017. Available from: https://allthingsnuclear.org/dlochbaum/nuclear-plant-risk-studies

Overview

Other References (cont.)

N. Siu et al., Probabilistic Risk Assessment and Regulatory Decisionmaking: Some Frequently Asked Questions, NUREG-2201, U.S. Nuclear Regulatory Commission, September 2016.

N. Siu, K. Coyne, and N. Melly, Fire PRA maturity and realism: a technical evaluation, U.S. Nuclear Regulatory Commission, March 2017. (ADAMS ML17089A537)

I. Flatow, Truth, Deception, and the Myth of the One-Handed Scientist, October 18, 2012. Available from: https://thehumanist.com/magazine/november-december-2012/features/truth-deception-and-the-myth-of-the-one-handed-scientist

M. Drouin et al., Guidance on the Treatment of Uncertainties Associated with PRAs in Risk-Informed Decisionmaking, NUREG-1855, Rev. 1, March 2017.

M. Fuchida and M. Okumiya, Midway: The Battle That Doomed Japan, the Japanese Navy's Story, U.S. Naval Institute, Annapolis, MD, 1955.


Slow Progress

1975 - WASH-1400
1979 - Commission withdraws endorsement of the (WASH-1400) Executive Summary; TMI-2 accident
1981-1982 - Zion and Indian Point hearings
1991 - ACRS letter identifies "a few examples of problems with the use of PRA in NRC, all common enough to be disturbing, and increasing in frequency"
1995 - PRA Policy Statement
1998 - White paper on risk-informed decision making (SECY-98-144); Regulatory Guide 1.174
2002 - Risk-informed climate survey
2013 - NEI letter: progress in applying risk-informed insights is "discouragingly slow"
2018 - Staff paper on NRC transformation identifies need for "systematic and expanded use of risk and safety insights in decisionmaking" (SECY-18-0060)

RIDM Progress

10 CFR 50.59 - An Example

"We are considering, both internally and in joint efforts with the nuclear power industry, how we could make our reactor regulations in 10 CFR Part 50 more risk-informed, with particular emphasis on developing a more risk-informed 10 CFR 50.59 process."

- Chairman Jackson (1998)

"... the staff recommends that the Commission direct the staff to revise Title 10 of the Code of Federal Regulations (10 CFR) 50.59, Changes, Tests, and Experiments, and comparable sections, as needed, to allow licensees additional flexibility to make facility changes without prior NRC approval while ensuring safety and security."

- EDO McCree (2018)

Influencing Factors

  • Technical challenges (Lecture 9-1)
  • Changing targets (biggest bang for buck, low hanging fruit), organizational inertia
  • Disruptive events

- 9/11

- Fukushima

  • Culture

RIDM Progress

On Culture

Nuclear Energy Institute (2013): risk-informed regulatory activities have stagnated

- Licensees have picked low-hanging fruit

- Risk-informed ROP biased by desired outcomes

- Cultural issues (deterministic thinking) persist

- PRA standards and associated activities (e.g., peer review) do not return commensurate value

NRC (2018): there is a need for "systematic and expanded use of risk and safety insights in decisionmaking" to:

- Focus resources on the most safety-significant issues

- Consider performance of system as a whole

- Acknowledge potential of new technologies lacking operational experience

Cultural change is needed to enable/support transformation

RIDM Progress

On Culture (cont.)

  • In the context of organizations, culture includes attitudes, beliefs, core values, and behaviors
  • Observations

- Sparseness/lack of objective data => need for analyst judgment => potential for cultural influences on performance and acceptance of PRA and RIDM

- RIDM involves a large number and variety of stakeholders, and PRA is a multidisciplinary enterprise => cultural challenges are multiplied

- Organizational literature indicates that efforts to change culture face very large challenges, need time and resources, and are frequently unsuccessful

On Culture (cont.)

  • Potential direct effects of culture on PRA/RIDM

- Technical, e.g.,

  • Biases on likely sources of risk
  • Use of conservative modeling assumptions
  • Use of subjective probability and Bayesian methods
  • Use of non-objective evidence
  • Validation/good enough standards

- More fundamental, e.g.,

  • Use of risk (vs. possibility)
  • Enterprise risk in use of risk (slippery slope)
  • Balancing risk/efficiency vs. other regulatory principles
  • Distrust of entire system (including performers)

Principles of Good Regulation*

Principle - Commentary

Independence - Final decisions must be based on objective, unbiased assessments of all information, and must be documented with reasons explicitly stated.

Openness - The public must be informed about and have the opportunity to participate in the regulatory processes. Open channels of communication must be maintained with all stakeholders.

Efficiency - The highest technical and managerial competence must be a constant agency goal. Regulatory activities should be consistent with the degree of risk reduction they achieve. Regulatory decisions should be made without undue delay.

Clarity - Regulations should be coherent, logical, and practical. There should be a clear nexus between regulations and agency goals and objectives, whether explicitly or implicitly stated.

Reliability - Regulations should be based on the best available knowledge. Once established, regulation should be perceived to be reliable and not unjustifiably in a state of transition. Regulatory actions should be promptly, fairly, and decisively administered so as to lend stability to the nuclear operational and planning processes.

On Culture (cont.)

  • Potential effects of culture on organizational climate

- Positive, e.g.,

  • Encouragement of risk-informed thinking
  • Willingness to act on findings

- Negative

  • Pressure against contrary results
  • Requirements to justify use of PRA
  • Requirements to include solutions as well as problems

Criticisms of PRA/RIDM Principles and Practice

Myriad sources:

  • Extensive external literature - see Wellock (2017)
  • Industry and NRC reviews

Principal sources for this lecture:
  • Downer (2013)
  • Pietrangelo (2013)
  • Ward (1991)
  • Personal experience

Typical Issues:
- Matching operational experience
- Matching intuition
- Subjectivity
- Complexity
- Uncertainties

Matching Operational Experience

Critic: Current NPP PRAs are not realistic

- PRA didn't predict the Fukushima accident. Rationalizations for continued use are based on "disowning" the event (Downer, 2013):

  • Interpretive defense - event was not a failure within analysis scope
  • Relevance defense - event is not applicable to NPPs of interest
  • Compliance defense - event involved non-compliant behaviors outside scope of analysis
  • Redemption defense - event revealed problems that have been (or will be) fixed

- PRA accident rate estimates (e.g., for CDF) are lower than those directly estimated from global experience (a numerical sketch of such a comparison follows this list)

- Some estimates for intermediate damage states (e.g., fire-induced losses of equipment) don't match operational experience
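To make the comparison between PRA results and pooled operating experience concrete, the following minimal sketch (not part of the original slides) shows one common way an empirical frequency and its uncertainty can be estimated from an event count and exposure: a Jeffreys-prior Bayesian update of a Poisson rate. The event count, reactor-year exposure, and PRA value below are placeholders chosen purely for illustration, not actual industry statistics.

    # Minimal sketch (illustrative only): frequency estimated from operational
    # experience vs. a PRA point estimate. All numbers are placeholders.
    from scipy import stats

    n_events = 5              # assumed number of observed events (hypothetical)
    reactor_years = 10000.0   # assumed accumulated exposure (hypothetical)

    # Jeffreys prior for a Poisson rate updated with (n_events, reactor_years)
    # gives a Gamma(n_events + 0.5, scale=1/reactor_years) posterior for the frequency.
    posterior = stats.gamma(a=n_events + 0.5, scale=1.0 / reactor_years)
    lo, hi = posterior.ppf([0.05, 0.95])
    print(f"Operational-experience estimate: mean {posterior.mean():.1e}/ry, "
          f"90% interval [{lo:.1e}, {hi:.1e}]/ry")

    pra_point_estimate = 2.0e-5  # hypothetical PRA result, for comparison only
    print(f"Hypothetical PRA point estimate: {pra_point_estimate:.1e}/ry")

A gap between the two numbers frames the critic's point; the supporter's counterpoint (next slide) is that the pooled-experience estimate is itself a model that ignores plant-specific features and post-accident improvements.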

Matching Operational Experience (cont.)

Supporter: PRAs are sufficiently realistic for their intended use

- PRAs don't predict - they identify and rank possibilities. Possibilities considered unlikely by the broad technical community (not just the PRA community) can occur.*

- PRAs and global-statistics estimates are both models. The latter typically ignore plant- and organization-specific features that are important to risk, including improvements since past accidents.

- Current PRAs are not aimed at intermediate damage states.


* See Lecture 7-2 for further discussion of the Fukushima Dai-ichi reactor accidents.

Matching Operational Experience (cont.)

Additional Comment: Don't undervalue comparisons with OpE

- Comparisons:
  • Provide alerts against complacency and hubris
  • Identify potential modeling improvements (scenarios/mechanisms not addressed, scenarios/mechanisms that have been overemphasized)
  • Aren't meant to predict future performance ("kiddie soccer team")

- Plants have made significant improvements but past lessons can still be relevant today (see Lecture 7-2).

- Monitoring and adjustment is a fundamental function for nuclear safety in general and RIDM in particular.


Matching Intuition

Critic: PRA-identified contributors to risk don't match expectations

- View is typically not stated directly, but is implicit in (sometimes derogatory) comments, complaints about lack of realism

- Typically applied to internal fires and external hazards (overly conservative analyses)

Supporter: Prior expectations should not govern results

- For complex systems, risk comes from unlikely events or combinations of events (e.g., beyond design basis, overlooked interactions); intuition-based expectations are not necessarily correct

- Classic example: assumption that designing for a worst case event addresses the risk from less severe events (e.g., LLOCA)

- R&D can address topics where analyses are demonstrably unrealistic.


Matching Intuition (cont.)

Additional Comments

- Biases regarding expected risk contributors (which might be well founded given current knowledge) can affect project direction and resource allocations (e.g., other external hazards)

- Intuition and expectations vary with technical discipline and expertise; PRA and the treatment of rare events provide an extreme challenge

- The use of conservatism to address uncertainties is a long-standing practice in engineering; reducing this use is a cultural issue for analysts and other stakeholders

- Subjectivity is a fundamental feature of PRA and RIDM (see later slides), but analysts and users need to be mindful of the limitations of intuition

Intuition - An Everyday Example

[Image: ABS hydraulic unit]

The Anti-lock Braking System (ABS) hydraulic unit was included in the Lecture 2-2 fault tree. Do you agree with this? Why or why not?

Factors:

a) Personal knowledge
b) Desired level of conservatism
c) Decision problem
   i. Risk metrics
   ii. Assumptions about driver(s)

Do you expect that the addition of ABS has reduced the risk of run-off-road fatal crashes?

Intuition

Intuition - A Flooding Example

[Figure: photograph annotated "~6 ft" and "~50 ft"]

Flooding (cont.)

Flood Crests (ft)

Date          Harpers Ferry, WV*   Great Falls, VA**
3/19/1936     36.5                 50.0
6/1/1889      34.8
10/16/1942    33.8                 49.6
10/1/1896     33.0
11/6/1985     30.1                 41.6
9/8/1996      29.8                 42.5
1/21/1996     29.4                 42.5
11/25/1877    29.2
4/27/1937     29.0                 44.4
6/23/1972     27.7                 46.0

* From www.weather.gov/media/marfc/Top20/POT/HarpersFerry.pdf
** Estimated
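A simple peaks-over-threshold count connects the table to likelihood language: divide the number of recorded crests exceeding a level by the record length. The sketch below is not part of the original slides; it assumes, for illustration only, that the listed Harpers Ferry crests are the complete set of exceedances above about 27 ft and that the record spans roughly 120 years (1877-1996).

    # Minimal sketch (illustrative only): crude annual exceedance frequencies for
    # Harpers Ferry flood crests, using the values listed in the table above.
    # Assumes the tabulated crests are complete above ~27 ft over a ~120-year record.
    crests_ft = [36.5, 34.8, 33.8, 33.0, 30.1, 29.8, 29.4, 29.2, 29.0, 27.7]
    record_years = 120.0  # assumed record length (illustrative)

    for threshold in (30.0, 35.0):
        n_exceed = sum(1 for c in crests_ft if c >= threshold)
        freq = n_exceed / record_years
        print(f"Crest >= {threshold:.0f} ft: {n_exceed} events in ~{record_years:.0f} yr "
              f"=> ~{freq:.3f}/yr (return period ~{record_years / n_exceed:.0f} yr)")

Even this rough count indicates that crests of 30 ft or more recur on roughly decadal time scales at this location - the kind of result that can conflict with intuition shaped by everyday experience.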

Flooding (cont.)

[Figure: the same photograph, annotated "~6 ft" and "~50 ft"]

Subjectivity

Critic: PRAs are inherently subjective and shouldn't be used to support decision making

- Normal human biases

- Biases from socio-political environment => can get any answer desired

- Numbers provide a false veneer of objectivity

Supporter: RIDM should use best available information (including technical judgments of appropriate experts)

- PRA provides a structured and (in principle) traceable scheme to integrate available information; supports what-if analyses of subjective judgments.

- Numerical analysis provides discipline; insights are more important than bottom line numbers*

- Weaknesses in PRA are addressed by RIDM backstops (e.g., peer reviews, NRC review, safety margins, defense-in-depth, monitoring)

* View commonly stated but not always followed in practice.

Subjectivity

Subjectivity (cont.)

Additional Comments

- No argument: biases can influence practical PRA and RIDM. Need to:

  • Be aware of sources of bias
  • Bring in/use objective information to the extent practicable*
  • Try to limit influence of organizational/societal goals on analysis

- A long-standing lesson that's still ignored by many: comparisons with safety goals to "prove plants are safe" don't work with critics
  • Uncertainties in analysis
  • Uncertainties outside of analyses (completeness uncertainties)
  • Distrust of source

- Another ignored lesson: overstatement of numerical precision

- Subjectivity + proper attitude (searching for failures) enables departures from cookbook exercises, use of imagination => improved understanding of safety

* Decision support context => extent of effort subject to value-added considerations (also subjective)

Subjectivity

On Biases - A Fault Tree Example

Experiments: effect of modeling on the judged likelihood of a car failing to start (Fischhoff et al., 1978)

- Comparison of results from a full fault tree and a pruned fault tree (see the numerical sketch below)
- Experiment 6: experienced auto mechanics
  • Strongly affected by availability heuristic: omission of major branches triggered only minimal awareness ("out of sight = out of mind")
  • Similar to results for lay subjects (non-experts)

Heuristics for dealing with uncertainty (Kahneman, Slovic, and Tversky, 1982):
- Representativeness
- Availability
- Anchoring and adjustment

Adapted from: B. Fischhoff, P. Slovic, and S. Lichtenstein, Fault trees: sensitivity of estimated failure probabilities to problem representation, Journal of Experimental Psychology: Human Perception and Performance, 4, No. 2, 330-344, 1978.
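The quantitative side of the pruning effect is easy to see for an OR-gate top event ("car fails to start"): the top-event probability is one minus the product of the branch non-occurrence probabilities, so dropping branches necessarily lowers the computed value unless the catch-all "all other problems" branch is inflated to compensate - which, per the slide, subjects largely did not do. The branch probabilities in this sketch are made up for illustration and are not taken from Fischhoff et al. (1978).

    # Minimal sketch (illustrative only): effect of pruning branches from an
    # OR-gate fault tree for "car fails to start". Branch probabilities are
    # invented for illustration; they are not from Fischhoff et al. (1978).
    import math

    full_tree = {
        "battery": 0.03,
        "starting system": 0.02,
        "fuel system": 0.02,
        "ignition system": 0.015,
        "engine": 0.01,
        "mischief/vandalism": 0.005,
        "all other problems": 0.01,
    }

    def top_event_probability(branches):
        # OR gate over independent branches: P(top) = 1 - prod(1 - p_i)
        return 1.0 - math.prod(1.0 - p for p in branches.values())

    # Pruned tree: omit three branches without inflating "all other problems",
    # mimicking the "out of sight = out of mind" behavior noted on the slide.
    pruned_tree = {name: p for name, p in full_tree.items()
                   if name not in ("fuel system", "ignition system", "mischief/vandalism")}

    print(f"Full tree:   P(fails to start) ~ {top_event_probability(full_tree):.3f}")
    print(f"Pruned tree: P(fails to start) ~ {top_event_probability(pruned_tree):.3f}")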

On Biases: A Non-PRA Example

Battle of Midway (June 1942, World War II)

Pre-battle war game: umpire rules that land-based American bombers scored 9 hits on 2 Japanese aircraft carriers, sinking both. Presiding admiral overrules the umpire, declaring only 3 hits, sinking 1 carrier and only slightly damaging the other.

Actual battle: U.S. Navy bombers (with considerable sacrifice and luck) sink all 4 Japanese carriers. One U.S. carrier is lost. Battle is turning point in the Pacific theater.

[Photo: Carrier Hiryu after dive bombing. From S. Fukui, in M. Fuchida and M. Okumiya, Midway: The Battle That Doomed Japan, the Japanese Navy's Story, U.S. Naval Institute, Annapolis, MD, 1955.]

Subjectivity

Complexity

Critic: PRA is too complex
- Requires specialized knowledge ("PRA is for my PhDs")
- Product is inscrutable
- Disenfranchises stakeholders, establishes gatekeepers ("I don't want to have to go through you guys")

Supporter: PRA provides structure to deal with complexities in a fundamentally complex problem
- Not an "easy button"
- Documentation provides information to aid understanding (data used, key assumptions, intermediate as well as final results)
- RIDM processes are open to stakeholder input (internal and external); stakeholders need to engage
- Apparent complexity can be reduced through training

[Image: NUREG-1150]

Complexity (cont.)

Critic: PRAs are not complex enough

- Don't adequately address key issues

  • Implicit modeling - no knobs and dials to adjust
  • Generic models
  • Intermediate modeling assumptions (e.g., success criteria)

- Don't address important gaps requiring advanced methods (completeness uncertainty)

Supporter: Level of modeling has proven useful in applications; proposed improvements need to balance benefits with costs and risks

[Diagram: More detail => more realistic PRA? => better information for decision? => appropriate resource diversion?]

Complexity (cont.)

  • Additional Comments

- Complexities in approaches to deal with technical issues are real, not always easy to understand

  • Rare events
  • Myriads of combinations of events
  • Forces and phenomena beyond common experience

- Perception of complexity is subjective.

  • Artificial measures based on system state counts are useful for some applications, but humans organize ("chunk") information in personal ways.
  • Addressing complexity: need to account for personal information needs and preferences => no one-size-fits-all solution; not just a matter of education on probability

- Lack of understanding is a concern

  • Lack of trust, blind trust, frustration
  • Preference for simple solutions that may/may not be appropriate, inhibition of efforts to improve analysis

Complexity (cont.)

  • Additional Comments (cont.)

- Documentation is difficult to scrutinize and understand

  • Thousands of pages.
  • Models (event trees, fault trees) spanning several displays (hard copy, electronic)
  • Linking text (e.g., modeling assumptions) with models
  • Search tools: hits on facts but not synthesis
  • Separated documents increase difficulty in tracing; document design reflects a model of what users want to see and how they want to see it.
  • Underlying models and data might not be easily accessible

- Fundamental trust is affected by complexity but also by other factors. See Lecture 8-4.


Uncertainty

Critic: PRA results are too uncertain to be useful
- Very large magnitudes
- What's left out is more important than what's analyzed
- Should use worst-case/possibilistic approaches

Supporter: PRAs don't create uncertainty, they illuminate it
- Large uncertainties are a condition of the current state of knowledge; PRAs represent them rather than hide them
- In many practical situations, uncertainties aren't a driving factor in choosing between options
- RIDM includes considerations and processes to account for unmodeled concerns
- There is no such thing as a worst case; actual "worst cases" have implicit considerations of likelihood
- Possibilistic approaches can work for narrowly defined decision problems but be wrong for broader situations

"Will somebody find me a one-handed scientist?!"

- Senator Edmund Muskie (Concorde hearings, 1976)

I. Flatow, Truth, Deception, and the Myth of the One-Handed Scientist, October 18, 2012. Available from: https://thehumanist.com/magazine/november-december-2012/features/truth-deception-and-the-myth-of-the-one-handed-scientist

"If a fire can occur, it will. Deal with it."

Uncertainty (cont.)

Additional Comments

- Decision making under large uncertainty is a challenge; available guidance (e.g., NUREG-1855) is useful but not a cookbook

- Uncertainty analysis has long been (see Ward, 1991), and remains, underemphasized in many risk-informed applications

- Unquantified model uncertainties can be as important as displayed uncertainties

- Be careful about implied model of best estimate plus uncertainty (BEPU)

- Uncertainties are subjective
  • Views on importance of sources vary with individuals and technical disciplines
  • Perceptions depend on framing (e.g., absolute vs. relative)
- Be sparing with stock phrases; PRA (the process as well as the results) affects the state of knowledge

Importance of Uncertainty

[Figure: uncertainty plots scaled from historical plots; source: https://www.sfwmd.gov/]