ML19011A430

Lecture 5-1 Evidence 2019-01-18
Issue date: 01/16/2019
From:
Office of Nuclear Regulatory Research
To:
Nathan Siu 415-0744
Shared Package: ML19011A416



Lecture 5-1: Evidence and Estimation

Schedule

Course Overview

Days (columns): Wednesday 1/16 | Thursday 1/17 | Friday 1/18 | Tuesday 1/22 | Wednesday 1/23

Morning modules: 1: Introduction | 3: Characterizing Uncertainty | 5: Basic Events | 7: Learning from Operational Events | 9: The PRA Frontier

9:00-9:45: L1-1: What is RIDM? | L3-1: Probabilistic modeling for NPP PRA | L5-1: Evidence and estimation | L7-1: Retrospective PRA | L9-1: Challenges for NPP PRA

9:45-10:00: Break

10:00-11:00: L1-2: RIDM in the nuclear industry | L3-2: Uncertainty and uncertainties | L5-2: Human Reliability Analysis (HRA) | L7-2: Notable events and lessons for PRA | L9-2: Improved PRA using existing technology

11:00-12:00: W1: Risk-informed thinking | W2: Characterizing uncertainties | W4: Bayesian estimation | W6: Retrospective Analysis | L9-3: The frontier: grand challenges and advanced methods

12:00-1:30: Lunch

Afternoon modules: 2: PRA Overview | 4: Accident Sequence Modeling | 6: Special Technical Topics | 8: Applications and Challenges | 10: Recap

1:30-2:15: L2-1: NPP PRA and RIDM: early history | L4-1: Initiating events | L6-1: Dependent failures | L8-1: Risk-informed regulatory applications and L8-2: PRA and RIDM infrastructure | L10-1: Summary and closing remarks

2:15-2:30: Break

2:30-3:30: L2-2: NPP PRA models and results | L4-2: Modeling plant and system response | L6-2: Spatial hazards and dependencies | L8-3: Risk-informed fire protection | Discussion: course feedback

3:30-4:30: L2-3: PRA and RIDM: point-counterpoint | W3: Plant systems modeling | L6-3: Other operational modes and L6-4: Level 2/3 PRA: beyond core damage | L8-4: Risk communication | Open Discussion

4:30-4:45: Break

4:45-5:30: Open Discussion | W3: Plant systems modeling (cont.) | W5: External Hazards modeling | Open Discussion | (none listed)

5:30-6:00: Open Discussion (listed for two days)

Learning Objectives

  • Range of evidence used in NPP PRA
  • Sources of operational data
  • Bayesian estimation
  • Treatment of model predictions and expert judgment

Overview

Resources

J. Lane, U.S. NRC Operational Experience Data Collection Program, NEA Workshop on the Use of Operational Experience in PSA, Boulogne-Billancourt, France, April 26-27, 2018. (ADAMS ML18123A479)

U.S. Nuclear Regulatory Commission, Reliability and Availability Data System (RADS) https://nrcoe.inl.gov/resultsdb/RADS/

U.S. Nuclear Regulatory Commission, Industry Average Parameter Estimates, https://nrcoe.inl.gov/resultsdb/AvgPerf/

N. Siu and D.L. Kelly, "Bayesian parameter estimation in probabilistic risk assessment," Reliability Engineering and System Safety, 62, 89-116, 1998.

C.L. Atwood, et al., Handbook of Parameter Estimation for Probabilistic Risk Assessment, NUREG/CR-6823, September 2003.

R. J. Budnitz, et al., Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, NUREG/CR-6372, 1997.

Overview

Other References

IEEE Guide to the Collection and Presentation of Electrical, Electronic, Sensing Component, and Mechanical Equipment Reliability for Nuclear-Power Generating Stations, IEEE Std 500-1984, Institute of Electrical and Electronics Engineers, New York, 1983.

Center for Chemical Process Safety, Guidelines for Process Equipment Reliability Data with Data Tables, American Institute of Chemical Engineers, New York, 1989.

G.E.P. Box and G.C. Tiao, Bayesian Inference in Statistical Analysis, Addison-Wesley, Reading, MA, 1973.

D. Kahneman, P. Slovic, and A. Tversky (eds.), Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, Cambridge, 1982.

M. Granger Morgan, "Use (and abuse) of expert elicitation in support of decision making for public policy," Proceedings of the National Academy of Sciences, 111, No. 20, 7176-7184, May 20, 2014.

J. Xing and S. Morrow, White Paper: Practical Insights and Lessons Learned on Implementing Expert Elicitation, U.S. Nuclear Regulatory Commission, October 13, 2016. (ADAMS ML16287A734)

Overview

Basic event probabilities reflect state of knowledge: P{X|C,H}, where

P = probability
X = proposition of concern (e.g., SI pump failure rate < 10^-3 per demand)
C = conditions of assessment (e.g., key assumptions)
H = state of knowledge (dependent on the assessor)

Introduction

State of Knowledge (About X)

Affected by evidence; common forms:

- Data (operational, tests, simulator exercises, experiments)

- Generic estimates

- Model predictions

- Expert judgment

Changes in H lead to changes in the probability distributions for model parameters. Bayes' Theorem is the formal tool for updating.

Introduction

On Data

  • Data (plural of datum)

- Facts, information, statistics, or the like, either historical or derived by calculation or experimentation

- Any facts assumed to be a matter of direct observation

  • PRA community uses both views:

- System analyst: input parameters for PRA model

  • NPP-specific

- Data analyst: empirical observations used to estimate input parameters

Data

Operational Data

See Lane (2018): history of NRC operational experience data collection and current programs.

Key sources

- Licensee Event Reports (see https://lersearch.inl.gov)

  • >54,000 records (1980-)
  • Rich source, but level of detail and scope can vary

- INPO Consolidated Events Database (ICES) - proprietary

  • Required (MSPI program) + voluntary reporting
  • Includes some demand and runtime data

Data

NRC Data Summaries

Industry average estimates, trends, and summary data for PRA model parameters:

https://nrcoe.inl.gov/resultsdb/AvgPerf/

- Initiating events

- Component reliabilities

Common cause failures:

https://nrcoe.inl.gov/resultsdb/ParamEstSpar/

Event reports often require interpretation:

- Plant-specific terminology

- Severity (was it truly a failure?)

Data

Operating Experience Data

(Graphic not reproduced in this text version; source: J. Lane, U.S. NRC Operational Experience Data Collection Program, NEA Workshop on the Use of Operational Experience in PSA, Boulogne-Billancourt, France, April 26-27, 2018. ADAMS ML18123A479)

Bayesian Estimation

Bayesian Estimation - Principles

Bayes' Theorem: an expression of conditional probability.

$$\pi_1(\theta \mid E) = \frac{1}{k}\, L(E \mid \theta)\, \pi_0(\theta), \qquad k = \int L(E \mid \theta)\, \pi_0(\theta)\, d\theta$$

where the prior distribution $\pi_0(\theta)$ quantifies belief before new evidence E, the likelihood function $L(E \mid \theta)$ quantifies the probability of seeing E given $\theta$, the posterior distribution $\pi_1(\theta \mid E)$ quantifies belief given E, and k is the normalization constant.
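As a numerical illustration of the update formula (not part of the lecture; the prior, evidence, and grid values below are assumptions chosen for illustration), a minimal grid-based sketch in Python:

```python
import numpy as np

# Grid-based Bayes update for an event frequency lambda (per year).
# Prior, evidence, and grid are illustrative assumptions, not lecture values.
lam = np.linspace(1e-4, 2.0, 5001)                 # candidate frequencies (/yr)
dlam = lam[1] - lam[0]

prior = np.exp(-lam / 0.5) / 0.5                   # assumed exponential prior, mean 0.5/yr
n, t = 2, 10.0                                     # assumed evidence: 2 events in 10 years
likelihood = (lam * t) ** n * np.exp(-lam * t)     # Poisson likelihood (constant n! omitted)

unnorm = likelihood * prior
k = np.sum(unnorm) * dlam                          # normalization constant
posterior = unnorm / k

print("posterior mean (/yr):", np.sum(lam * posterior) * dlam)
```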

Likelihood Functions - Examples

  • General: $L(E \mid \theta) = P\{E \mid \theta\}$
  • Poisson process (frequency = λ)

- Evidence: n events in time t

- Likelihood function: $L(n, t \mid \lambda) = \dfrac{(\lambda t)^n e^{-\lambda t}}{n!}$

- Evidence: occurrence times $\{t_1, \ldots, t_n\}$

- Likelihood function: $L(t_1, \ldots, t_n \mid \lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda (t_i - t_{i-1})} = \lambda^n e^{-\lambda t_n}$ (with $t_0 = 0$)

  • Bernoulli process (probability = φ)

- Evidence: n events in m trials

- Likelihood function: $L(n, m \mid \varphi) = \binom{m}{n} \varphi^{n} (1-\varphi)^{m-n}$

(Both likelihood forms are evaluated in the sketch below.)
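A minimal sketch evaluating the two likelihood forms above; the event counts, exposure time, and parameter values are illustrative assumptions:

```python
from math import comb, exp, factorial

def poisson_likelihood(n, t, lam):
    """L(n, t | lambda): probability of n events in time t for a Poisson process."""
    return (lam * t) ** n * exp(-lam * t) / factorial(n)

def binomial_likelihood(n, m, phi):
    """L(n, m | phi): probability of n events in m trials for a Bernoulli process."""
    return comb(m, n) * phi ** n * (1.0 - phi) ** (m - n)

# Illustrative evaluations (all numbers assumed):
print(poisson_likelihood(n=3, t=20.0, lam=0.1))   # 3 events in 20 years at 0.1/yr
print(binomial_likelihood(n=1, m=50, phi=0.02))   # 1 failure in 50 demands at 0.02/demand
```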

Likelihood Functions - Another Example

Expert judgment

- Evidence = estimate λ* for failure frequency λ

- A possible likelihood function: lognormal, with B = bias and σ = measure of confidence in the expert (see the sketch below)

$$L(\lambda^* \mid \lambda) = \frac{1}{\sqrt{2\pi}\,\sigma\,\lambda^*} \exp\left[-\frac{1}{2}\left(\frac{\ln \lambda^* - \ln \lambda + B}{\sigma}\right)^2\right]$$

σ = 0, B = 0 ⇒ L(λ*|λ) is a delta function about λ ⇒ perfect expert
σ → ∞ ⇒ L(λ*|λ) is flat ⇒ completely non-expert

Likelihood function = model of the evidence-generating process.
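A sketch of this lognormal likelihood, assuming the reconstructed form above (consistent with Siu and Kelly, 1998); the bias, sigma, and frequency values are illustrative assumptions:

```python
import numpy as np

def expert_likelihood(lam_star, lam, B=0.0, sigma=1.0):
    """Lognormal likelihood L(lambda* | lambda) for an expert estimate lambda*.
    B is a (log-scale) bias; sigma measures confidence in the expert.
    sigma -> 0 with B = 0 approaches a delta function about lambda (perfect expert);
    large sigma gives an essentially flat likelihood (completely non-expert)."""
    z = (np.log(lam_star) - np.log(lam) + B) / sigma
    return np.exp(-0.5 * z ** 2) / (np.sqrt(2.0 * np.pi) * sigma * lam_star)

# Illustrative: support lent by an expert estimate of 1e-3/yr to candidate lambdas
for lam in (1e-4, 1e-3, 1e-2):
    print(f"lambda = {lam:.0e}  L = {expert_likelihood(1e-3, lam, B=0.0, sigma=0.7):.3g}")
```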

Prior Distributions

  • General: characterizes state of knowledge regarding uncertain parameter(s)
  • Informative

- Preferred in principle

- Takes effort to develop

  • Non-informative

- Objective

- Low effort

- Conservative but can be good enough

Bayesian Estimation

Informative Prior Distribution Construction Methods

Direct quantification (see the sketch after this list)

- Percentiles

- Parametric form + select characteristics (e.g., moments, percentiles)

Hierarchical Bayes (notably two-stage Bayes)

- Model plant-to-plant (population) variability

- Use population variability result as prior for plant of interest

Reverse engineering: make judgments about generated samples (vs. model parameters)

Be cognizant of biases from common heuristics (Lecture 2-3)

- Representativeness

- Availability

- Anchoring and adjustment
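A sketch of the "parametric form + selected characteristics" route to an informative prior: fit a lognormal to an elicited 5th/95th percentile pair. The percentile values are assumptions for illustration, and a lognormal is only one possible parametric choice:

```python
from math import exp, log
from scipy.stats import norm, lognorm

# Direct quantification sketch: lognormal prior matched to elicited percentiles.
# The elicited values below are illustrative assumptions.
p05, p95 = 0.05, 0.5                         # elicited 5th/95th percentiles (/yr)
z95 = norm.ppf(0.95)                         # ~1.645
mu = 0.5 * (log(p05) + log(p95))             # log-scale mean
sigma = (log(p95) - log(p05)) / (2.0 * z95)  # log-scale standard deviation

prior = lognorm(s=sigma, scale=exp(mu))
print("prior median (/yr):", prior.median())
print("prior mean (/yr):  ", prior.mean())
print("check 5th/95th:    ", prior.ppf(0.05), prior.ppf(0.95))
```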

Non-Informative Prior Distributions

Based upon mathematical definitions of relative ignorance (relative to data); not generally flat/uniform.

Examples (Jeffreys Rule priors):

- Bernoulli process: $\pi(\varphi) \propto \dfrac{1}{\sqrt{\varphi(1-\varphi)}}, \quad 0 \le \varphi \le 1$

- Poisson process: $\pi(\lambda) \propto \dfrac{1}{\sqrt{\lambda}}, \quad \lambda > 0$

Other non-informative distributions: maximum entropy, constrained non-informative. See Siu and Kelly (1998) and Atwood et al. (NUREG/CR-6823) for further discussion of forms used in NPP PRAs.

Computational caution: non-informative prior distributions are often unbounded at one or both extremes; care is needed when using numerical integration (see the sketch below).

Bayesian Estimation
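A brief sketch of the two Jeffreys-rule forms and the computational caution above; the grid truncation value is an arbitrary assumption:

```python
import numpy as np

def jeffreys_bernoulli(phi):
    """Unnormalized Jeffreys prior for a Bernoulli process: phi^-1/2 (1-phi)^-1/2."""
    return 1.0 / np.sqrt(phi * (1.0 - phi))

def jeffreys_poisson(lam):
    """Unnormalized Jeffreys prior for a Poisson process: lambda^-1/2."""
    return 1.0 / np.sqrt(lam)

# Computational caution: avoid the unbounded endpoints when building a grid.
eps = 1e-9                                    # assumed truncation
phi = np.linspace(eps, 1.0 - eps, 100001)
weights = jeffreys_bernoulli(phi)
weights /= np.sum(weights)                    # normalize numerically on the grid
print("largest single-cell prior mass:", weights.max())
```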

Knowledge Check

If λ is distributed according to a Jeffreys prior, what is the mean value for λ?


Conjugate Likelihood-Prior Pairs

  • Result in analytical solution of Bayes Theorem => often used for computational convenience
  • Can be informative or non-informative
  • Examples:*

- Binomial likelihood and beta prior => beta posterior

  • Assume n failures in m trials, and a beta prior with parameters a and b
  • Posterior distribution is beta with parameters a' = a + n and b' = b + m - n

- Poisson likelihood and gamma prior => gamma posterior

  • Assume n failures in time t and a gamma prior with parameters a and b
  • Posterior distribution is gamma with parameters a' = a + n and b' = b + t (both updates are illustrated in the sketch below)
* See Probability Math background slides for more information on the beta and gamma distributions

Bayesian Estimation
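A short sketch of both conjugate updates, using Jeffreys priors and assumed evidence (counts and exposure values are illustrative, not lecture data):

```python
from scipy.stats import beta, gamma

# Binomial likelihood + beta prior -> beta posterior
a0, b0 = 0.5, 0.5                      # Jeffreys beta prior (assumed)
n, m = 1, 50                           # assumed evidence: 1 failure in 50 demands
post_phi = beta(a0 + n, b0 + (m - n))
print("failure probability: posterior mean =", post_phi.mean())

# Poisson likelihood + gamma prior -> gamma posterior
a0, b0 = 0.5, 0.0                      # Jeffreys gamma prior (improper; assumed)
n, t = 10, 100.0                       # assumed evidence: 10 events in 100 years
post_lam = gamma(a0 + n, scale=1.0 / (b0 + t))
print("event frequency: posterior mean (/yr) =", post_lam.mean(),
      " 90% interval =", post_lam.interval(0.90))
```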

Sample Results

[Figure: two panels of probability density function vs. frequency (/yr), each comparing prior and posterior distributions for a non-informative prior, an informed prior (λ05 = 0.05, λ95 = 0.5), and an overconfident prior (λ05 = 0.2, λ95 = 0.4). Left panel: actual data (10 events in 100 years); right panel: weak data (1 event in 10 years). The maximum likelihood estimate is marked in each panel.]

Comments

  • When data are plentiful (strong), posterior is relatively insensitive to reasonable priors
  • When data are weak, prior is important => important to construct carefully

- Overly broad (undervaluing state of knowledge) => overly conservative results

- Overly narrow (too confident in state of knowledge) => insensitive to data when obtained

- Need to be wary of biases from heuristics

(The sketch below illustrates this sensitivity for the strong and weak data sets above.)
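The following sketch mirrors the comparison with gamma priors as stand-ins; the "informed" and "overconfident" priors are assumptions chosen to be roughly consistent with the percentile ranges quoted for the figure, not the lecture's actual distributions:

```python
from scipy.stats import gamma

# Posterior sensitivity to the prior for strong vs. weak data.
# The gamma priors are assumed stand-ins (shape, rate), not the lecture's priors.
priors = {
    "non-informative (Jeffreys)": (0.5, 0.0),
    "informed (assumed)":         (2.0, 8.0),     # roughly 5th/95th of 0.04/0.6 per yr
    "overconfident (assumed)":    (30.0, 100.0),  # roughly 5th/95th of 0.21/0.39 per yr
}
datasets = {"strong data (10 events in 100 yr)": (10, 100.0),
            "weak data (1 event in 10 yr)":      (1, 10.0)}

for data_label, (n, t) in datasets.items():
    print(data_label)
    for name, (a0, b0) in priors.items():
        post = gamma(a0 + n, scale=1.0 / (b0 + t))
        print(f"  {name:27s} posterior mean = {post.mean():.3f} /yr")
```

With the strong data set, the non-informative and informed posteriors nearly coincide near the maximum likelihood estimate of 0.1/yr, while the overconfident prior continues to dominate its posterior; with the weak data set, all three posteriors are driven largely by their priors.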

Evidence from Models

Model Predictions

Common aleatory model: stress vs. strength, i.e., P{failure} = P{capacity < load}, for example:

- Time-reliability

- Fire-induced damage

- Seismically-induced damage

Uncertainties in parameters can be quantified and propagated through models; uncertainties in model outputs can also be quantified (Lecture 3-2). A Monte Carlo sketch of this propagation follows.
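A minimal Monte Carlo sketch of the stress-vs-strength idea, propagating an assumed epistemic uncertainty in one capacity parameter through the aleatory model; all distributions and parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2019)

def failure_probability(cap_median, cap_beta, load_median, load_beta, n=100_000):
    """Aleatory model: P{failure} = P{capacity < load}, lognormal capacity and load."""
    capacity = rng.lognormal(np.log(cap_median), cap_beta, n)
    load = rng.lognormal(np.log(load_median), load_beta, n)
    return np.mean(capacity < load)

# Epistemic uncertainty: the capacity median itself is uncertain (assumed lognormal).
cap_medians = rng.lognormal(np.log(2.0), 0.2, 200)
p_fail = np.array([failure_probability(cm, 0.3, 1.0, 0.4) for cm in cap_medians])

print("mean P(failure):", p_fail.mean())
print("5th-95th percentile:", np.percentile(p_fail, [5, 95]))
```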

Mechanistic Modeling - Comments

Results and even models used in NPP PRAs

- Component behavior (e.g., reactor coolant pump seals)

- Success criteria

- Internal and external hazards Physics of Failure models proposed for various issues (e.g.,

aging, CCF)

Appealing approach to better account for current, relevant state of knowledge (what we know)

Need to account for parameter and model uncertainties

Increases vulnerability to completeness uncertainty

Fire Model Uncertainty Example

Fire Model Uncertainty - Results

Evidence from Models

Cautionary Examples

  • Multi-unit trip due to loss of communication
  • Capacitor failure from operation at below-design voltage (non-nuclear)
  • Increased accident consequences of a stronger pressure vessel (non-nuclear)
  • Reactivity accident from rocking

Expert Judgment

  • Fundamental component of PRA

- Modeling

- Data selection and analysis

- Direct elicitation (qualitative and quantitative)

  • Justification: decision support

P{X|C,H}, where P = what we believe, X = proposition/event of concern, C = conditions of the probability statement, and H = what we know. For RIDM, "we" = the informed technical community (not just the analyst/analysis team).

Evidence from Experts

Direct Elicitation

Aim

- Take advantage of human ability to consider/integrate complex information

- Engage a wide variety of expert viewpoints

Key PRA applications

- Problem formulation, planning of experiments and analyses (Phenomena Identification Ranking Tables)

- Scenario development (logic trees)

- Estimation of model parameters

Key guidance documents

- Processes designed to address known sources of biases

- NUREG/CR-6372 and subsequent documents (Senior Seismic Hazard Analysis Committee: SSHAC)

Evidence from Experts

SSHAC Overview

Designed with recognition of potential individual and social biases, e.g.,

- Underestimation of uncertainties from heuristics (availability, anchoring and adjustment, representativeness, etc.)

- Social influences on individual judgment (e.g., group dynamics, organizational influences)

Emphasizes characterizing the full community point of view (center, body, and range); documentation (and ownership) of the bases for judgments

Different levels for different needs:

Level 1: TI only (literature review, personal experience)
Level 2: TI interacts with proponents and resource experts
Level 3: TI brings together proponents and resource experts
Level 4: TFI organizes expert panel to develop estimates

TI = Technical Integrator; TFI = Technical Facilitator/Integrator

Evidence from Experts

Level 4 SSHAC

General Process

1) Preparation
2) Piloting/Training
3) Interactions (Workshops)
   a) Evaluate evidence
   b) Develop, defend, and revise judgments
   c) Integrate judgments
4) Participatory Peer Review

Adapted from: R. J. Budnitz, et al., Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, NUREG/CR-6372, 1997.

Evidence from Experts

Direct Elicitation - Cautions

  • Experts

- Key experts might not be available during project

- Difficult to get undivided attention for extended periods of time

- Different perspectives/frameworks => considerable effort to develop common understanding of problem

- Inclination to provide the right answer based on individual point of view

- Subject to usual human biases

  • Results can be viewed as the final solution

A serious, important activity, but not the Easy Button

Evidence from Experts