
ML19011A430: Lecture 5-1, Evidence and Estimation (2019-01-18)

Issue date: 01/16/2019
From: Office of Nuclear Regulatory Research (Nathan Siu, 415-0744)
Shared package: ML19011A416



Evidence and Estimation Lecture 5-1

Course Overview Schedule

Wednesday 1/16 - Modules 1 (Introduction) and 2 (PRA Overview)
  9:00-9:45   L1-1: What is RIDM?
  10:00-11:00 L1-2: RIDM in the nuclear industry
  11:00-12:00 W1: Risk-informed thinking
  1:30-2:15   L2-1: NPP PRA and RIDM: early history
  2:30-3:30   L2-2: NPP PRA models and results
  3:30-4:30   L2-3: PRA and RIDM: point-counterpoint
  5:30-6:00   Open Discussion

Thursday 1/17 - Modules 3 (Characterizing Uncertainty) and 4 (Accident Sequence Modeling)
  9:00-9:45   L3-1: Probabilistic modeling for NPP PRA
  10:00-11:00 L3-2: Uncertainty and uncertainties
  11:00-12:00 W2: Characterizing uncertainties
  1:30-2:15   L4-1: Initiating events
  2:30-3:30   L4-2: Modeling plant and system response
  3:30-4:30   W3: Plant systems modeling
  4:45-5:30   W3: Plant systems modeling (cont.)
  5:30-6:00   Open Discussion

Friday 1/18 - Modules 5 (Basic Events) and 6 (Special Technical Topics)
  9:00-9:45   L5-1: Evidence and estimation
  10:00-11:00 L5-2: Human Reliability Analysis (HRA)
  11:00-12:00 W4: Bayesian estimation
  1:30-2:15   L6-1: Dependent failures
  2:30-3:30   L6-2: Spatial hazards and dependencies
  3:30-4:30   L6-3: Other operational modes; L6-4: Level 2/3 PRA: beyond core damage
  4:45-5:30   W5: External Hazards modeling

Tuesday 1/22 - Modules 7 (Learning from Operational Events) and 8 (Applications and Challenges)
  9:00-9:45   L7-1: Retrospective PRA
  10:00-11:00 L7-2: Notable events and lessons for PRA
  11:00-12:00 W6: Retrospective Analysis
  1:30-2:15   L8-1: Risk-informed regulatory applications; L8-2: PRA and RIDM infrastructure
  2:30-3:30   L8-3: Risk-informed fire protection
  3:30-4:30   L8-4: Risk communication
  4:45-5:30   Open Discussion

Wednesday 1/23 - Modules 9 (The PRA Frontier) and 10 (Recap)
  9:00-9:45   L9-1: Challenges for NPP PRA
  10:00-11:00 L9-2: Improved PRA using existing technology
  11:00-12:00 L9-3: The frontier: grand challenges and advanced methods
  1:30-2:15   L10-1: Summary and closing remarks
  2:30-3:30   Discussion: course feedback
  3:30-4:30   Open Discussion
  4:45-5:30   Open Discussion

(Breaks 9:45-10:00, 2:15-2:30, and 4:30-4:45; lunch 12:00-1:30.)

Overview Learning Objectives

  • Range of evidence used in NPP PRA
  • Sources of operational data
  • Bayesian estimation
  • Treatment of model predictions and expert judgment

Overview Resources

  • J. Lane, U.S. NRC Operational Experience Data Collection Program, NEA Workshop on the Use of Operational Experience in PSA, Boulogne-Billancourt, France, April 26-27, 2018. (ADAMS ML18123A479)
  • R. J. Budnitz, et al., Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, NUREG/CR-6372, 1997.


Overview Other References

  • IEEE Guide to the Collection and Presentation of Electrical, Electronic, Sensing Component, and Mechanical Equipment Reliability Data for Nuclear-Power Generating Stations, IEEE Std 500-1984, Institute of Electrical and Electronics Engineers, New York, 1983.
  • Center for Chemical Process Safety, Guidelines for Process Equipment Reliability Data with Data Tables, American Institute of Chemical Engineers, New York, 1989.
  • G.E.P. Box and G.C. Tiao, Bayesian Inference in Statistical Analysis, Addison-Wesley, Reading, MA, 1973.
  • D. Kahneman, P. Slovic, and A. Tversky (eds.), Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, Cambridge, UK, 1982.

  • M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, Proceedings of the National Academy of Sciences (PNAS), 111, No. 20, pp. 7176-7184, May 20, 2014.
  • J. Xing and S. Morrow, White Paper: Practical Insights and Lessons Learned on Implementing Expert Elicitation, U.S. Nuclear Regulatory Commission, October 13, 2016. (ADAMS ML16287A734)

Introduction Basic event probabilities reflect state of knowledge: $P\{X \mid C, H\}$

  • P = Probability
  • X = Proposition of concern (e.g., SI pump failure rate < $10^{-3}$ per demand)
  • C = Conditions of assessment (e.g., key assumptions)
  • H = State of knowledge (dependent on assessor)

Introduction State of Knowledge (About X)

  • Affected by evidence; common forms:

- Data (operational, tests, simulator exercises, experiments)

- Generic estimates

- Model predictions

- Expert judgment

  • Changes in H lead to changes in the probability distributions for model parameters: $\pi_0(\theta) \rightarrow \pi_1(\theta)$
  • Bayes Theorem is the formal tool for updating

Data On Data

  • Data (plural of datum)

- Facts, information, statistics, or the like, either historical or derived by calculation or experimentation

- Any facts assumed to be a matter of direct observation

  • PRA community uses both views:

- System analyst: input parameters for PRA model

  • NPP-specific

- Data analyst: empirical observations used to estimate input parameters

Data Operational Data

  • See Lane (2018): history of NRC operational experience data collection and current programs.
  • Key sources

- Licensee Event Reports

  • >54,000 records (1980-)
  • Rich source, but level of detail and scope can vary

- INPO Consolidated Events Database (ICES) - proprietary

  • Required (MSPI program) + voluntary reporting
  • Includes some demand and runtime data

Data NRC Data Summaries

  • Industry average estimates, trends, and summary data for PRA model parameters:

https://nrcoe.inl.gov/resultsdb/AvgPerf/

- Initiating events

- Component reliabilities

  • Common cause failures:

https://nrcoe.inl.gov/resultsdb/ParamEstSpar/

Event reports often require interpretation

  • Plant-specific terminology
  • Severity - truly a failure?


Data Operating Experience Data

J. Lane, U.S. NRC Operational Experience Data Collection Program, NEA Workshop on the Use of Operational Experience in PSA, Boulogne-Billancourt, France, April 26-27, 2018. (ADAMS ML18123A479)

Bayesian Estimation Bayesian Estimation - Principles

$\pi_1(\theta \mid E) = k \, L(E \mid \theta) \, \pi_0(\theta)$
(posterior distribution = normalization constant x likelihood function x prior distribution)

  • Bayes Theorem: an expression of conditional probability
  • Prior distribution quantifies belief before new evidence E
  • Likelihood function quantifies probability of seeing E, given θ
  • Posterior distribution quantifies belief given E
  • k = normalization constant: $k^{-1} = \int_0^{\infty} L(E \mid \theta)\, \pi_0(\theta)\, d\theta$ (a numerical sketch follows below)
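A minimal numerical sketch of this update, discretizing θ on a grid; the flat prior and the Poisson evidence (3 events in 30 years) are illustrative assumptions, not values from the lecture.

```python
import numpy as np
from scipy.integrate import trapezoid

def bayes_update(theta, prior, likelihood):
    """Posterior density on a grid: k * L(E|theta) * pi_0(theta)."""
    unnormalized = likelihood * prior
    k = 1.0 / trapezoid(unnormalized, theta)   # normalization constant
    return k * unnormalized

theta = np.linspace(1e-4, 1.0, 2000)           # frequency grid (per year)
prior = np.ones_like(theta)                    # flat prior, for illustration only
likelihood = (theta * 30.0)**3 * np.exp(-theta * 30.0)  # 3 events in 30 years
posterior = bayes_update(theta, prior, likelihood)
```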

Bayesian Estimation Likelihood Functions - Examples

  • General: $L(E \mid \theta) = P\{E \mid \theta\}$
  • Poisson process (frequency = λ)

- Evidence: n events in time t

- Likelihood function: $L(n, t \mid \lambda) = \dfrac{(\lambda t)^n e^{-\lambda t}}{n!}$

- Evidence: occurrence times $\{t_1, \ldots, t_n\}$

- Likelihood function: $L(t_1, \ldots, t_n \mid \lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda t_i}$

  • Bernoulli process (probability = φ)

- Evidence: n events in m trials

- Likelihood function: $L(n, m \mid \phi) = \dbinom{m}{n} \phi^n (1-\phi)^{m-n}$ (see the sketch below)
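As referenced above, a short sketch evaluating these likelihood functions with scipy; all parameter values and evidence counts are made up for illustration.

```python
import numpy as np
from scipy.stats import poisson, binom

lam = 0.1                                   # assumed frequency (per year)
n, t = 2, 30.0                              # evidence: 2 events in 30 years
L_pois = poisson.pmf(n, lam * t)            # (lam*t)^n * exp(-lam*t) / n!

inter_arrival = np.array([3.0, 12.0, 9.0])  # hypothetical inter-arrival times
L_times = np.prod(lam * np.exp(-lam * inter_arrival))  # product of exponentials

phi = 0.01                                  # assumed probability per demand
k, m = 1, 50                                # evidence: 1 failure in 50 trials
L_bern = binom.pmf(k, m, phi)               # C(m,k) * phi^k * (1-phi)^(m-k)
```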

Bayesian Estimation Likelihood Functions - Another Example

  • Expert judgment

- Evidence = estimate $\lambda^*$ for failure frequency λ

- A possible likelihood function: lognormal

$L(\lambda^* \mid \lambda) = \dfrac{1}{\sqrt{2\pi}\,\sigma\,\lambda^*} \exp\left\{ -\dfrac{1}{2} \left[ \dfrac{\ln(\lambda^*/\lambda) + B}{\sigma} \right]^2 \right\}$

- B = bias

- σ = measure of confidence in the expert:
  σ = 0, B = 0 => $L(\lambda^* \mid \lambda)$ is a delta function about λ => perfect expert
  σ = ∞ => $L(\lambda^* \mid \lambda)$ is flat => completely non-expert

Likelihood function = model of the evidence-generating process
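A sketch of this lognormal likelihood as a Python function; the default B and σ values are assumptions chosen only to show the behavior described above (smaller σ means more peaked about the true λ).

```python
import numpy as np

def expert_likelihood(lam_star, lam, B=0.0, sigma=0.7):
    """Lognormal density of the expert's estimate lam_star, given true lam.
    B shifts (biases) the estimate; sigma encodes confidence in the expert."""
    z = (np.log(lam_star / lam) + B) / sigma
    return np.exp(-0.5 * z**2) / (np.sqrt(2.0 * np.pi) * sigma * lam_star)

# As sigma shrinks (with B = 0), the density concentrates about lam:
print(expert_likelihood(1e-3, 1e-3, sigma=0.7))    # broad
print(expert_likelihood(1e-3, 1e-3, sigma=0.05))   # much more peaked
```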

Bayesian Estimation Prior Distributions

  • General: characterizes state of knowledge regarding uncertain parameter(s)
  • Informative

- Preferred in principle

- Takes effort to develop

  • Non-informative

- Objective

- Low effort

- Conservative but can be good enough

Bayesian Estimation Informative Prior Distribution Construction Methods

  • Direct quantification

- Percentiles

- Parametric form + select characteristics (e.g., moments, percentiles)

  • Hierarchical Bayes (notably two-stage Bayes)

- Model plant-to-plant (population) variability

- Use population variability result as prior for plant of interest

  • Reverse engineering: make judgments about generated samples (vs. model parameters)
  • Be cognizant of biases from common heuristics (Lecture 2-3)

- Representativeness

- Availability

- Anchoring and adjustment
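For the direct quantification route above, one common recipe is to assume a parametric form and back out its parameters from assessed percentiles. A minimal sketch for a lognormal fitted to 5th/95th percentiles; the percentile values echo the informed prior in the sample results later, but are otherwise arbitrary.

```python
import numpy as np
from scipy.stats import norm, lognorm

lam05, lam95 = 0.05, 0.5                  # assessed 5th and 95th percentiles
z95 = norm.ppf(0.95)                      # ~1.645
sigma = np.log(lam95 / lam05) / (2.0 * z95)
median = np.sqrt(lam05 * lam95)           # lognormal median = geometric mean
prior = lognorm(s=sigma, scale=median)    # scipy's lognormal parameterization

print(prior.ppf([0.05, 0.95]))            # recovers [0.05, 0.5]
```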

Bayesian Estimation Non-Informative Prior Distributions

  • Based upon mathematical definitions of relative ignorance (relative to data) - not generally flat/uniform
  • Examples (Jeffreys Rule priors)

- Bernoulli process: $\pi_0(\phi) \propto \phi^{-1/2}(1-\phi)^{-1/2}$

- Poisson process: $\pi_0(\lambda) \propto \lambda^{-1/2}$

  • Other non-informative distributions: maximum entropy, constrained non-informative
  • See Siu and Kelly (1998) and Atwood et al. (NUREG/CR-6823) for further discussion of forms used in NPP PRAs
  • Computational caution: non-informative prior distributions are often unbounded at one or both extremes; need to be careful if using numerical integration (see the sketch below)
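The sketch below illustrates both the Jeffreys prior for the Poisson process and the computational caution: the density is unbounded at λ = 0, so the grid starts strictly above zero. The grid bounds and the evidence (1 event in 10 years) are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import trapezoid

lam = np.linspace(1e-6, 10.0, 200_000)   # start above 0, where the prior blows up
prior = lam**-0.5                        # Jeffreys prior for a Poisson process

# The prior is improper, but the posterior is proper once multiplied by a
# Poisson likelihood, e.g., n = 1 event in t = 10 years:
n, t = 1, 10.0
post = prior * (lam * t)**n * np.exp(-lam * t)
post /= trapezoid(post, lam)             # normalize numerically
```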

Knowledge Check If λ is distributed according to a Jeffreys prior, what is the mean value for λ?


Bayesian Estimation Conjugate Likelihood-Prior Pairs

  • Result in analytical solution of Bayes Theorem => often used for computational convenience
  • Can be informative or non-informative
  • Examples:*

- Binomial likelihood and beta prior => beta posterior

  • Assume n failures in m trials, and a beta prior with parameters α and β
  • Posterior distribution is beta with parameters α' = α + n and β' = β + (m - n)

- Poisson likelihood and gamma prior => gamma posterior

  • Assume n failures in time t and a gamma prior with parameters α and β
  • Posterior distribution is gamma with parameters α' = α + n and β' = β + t
  • See Probability Math background slides for more information on the beta and gamma distributions (a sketch of both updates follows)
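The sketch below applies both conjugate updates with scipy.stats; the Jeffreys priors and the data are illustrative choices, not values prescribed by the lecture.

```python
from scipy.stats import beta, gamma

# Binomial likelihood + beta prior => beta posterior
a0, b0 = 0.5, 0.5             # Jeffreys prior for the Bernoulli process
n, m = 1, 50                  # evidence: 1 failure in 50 trials
phi_post = beta(a0 + n, b0 + (m - n))
print(phi_post.mean(), phi_post.ppf([0.05, 0.95]))

# Poisson likelihood + gamma prior => gamma posterior
alpha0, beta0 = 0.5, 0.0      # Jeffreys prior for the Poisson process (improper)
n, t = 10, 100.0              # evidence: 10 events in 100 years
lam_post = gamma(alpha0 + n, scale=1.0 / (beta0 + t))
print(lam_post.mean(), lam_post.ppf([0.05, 0.95]))
```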

Bayesian Estimation Sample Results

[Figures: probability density functions vs. frequency (/yr), 0.00 to 0.50. Curves in each panel: non-informative prior and posterior; informed prior (λ05 = 0.05, λ95 = 0.5) and posterior; overconfident prior (λ05 = 0.2, λ95 = 0.4) and posterior; the maximum likelihood estimate is marked.]

- Upper panel: Actual Data (10 events in 100 years)
- Lower panel: Weak Data (1 event in 10 years)
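The informed curves above pair a lognormal prior with a Poisson likelihood, which is not a conjugate pair, so the update must be done numerically. A sketch for the actual-data case, reusing the percentile fit shown earlier (grid bounds are arbitrary):

```python
import numpy as np
from scipy.stats import norm, lognorm
from scipy.integrate import trapezoid

n, t = 10, 100.0                          # "actual data": 10 events in 100 years
lam = np.linspace(1e-4, 0.5, 5000)        # frequency grid (per year)

sigma = np.log(0.5 / 0.05) / (2.0 * norm.ppf(0.95))    # from lam05, lam95
prior = lognorm(s=sigma, scale=np.sqrt(0.05 * 0.5)).pdf(lam)

post = prior * (lam * t)**n * np.exp(-lam * t)         # Poisson likelihood
post /= trapezoid(post, lam)
print(lam[np.argmax(post)])               # posterior mode, near the MLE n/t = 0.1
```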

Bayesian Estimation Comments

  • When data are plentiful (strong), posterior is relatively insensitive to reasonable priors
  • When data are weak, prior is important => important to construct carefully

- Overly broad (undervaluing state of knowledge) => overly conservative results

- Overly narrow (too confident in state of knowledge) => insensitive to data when obtained

- Need to be wary of biases from heuristics

Evidence from Models Model Predictions

  • Common aleatory model: stress vs. strength

- Time-reliability: $P\{\text{failure}\} = P\{t_{\text{available}} < t_{\text{required}}\}$

- Fire-induced damage: $P\{\text{damage}\} = P\{t_{\text{damage}} < t_{\text{suppression}}\}$

- Seismically-induced damage: $P\{\text{damage}\} = P\{\text{capacity} < \text{seismic demand}\}$

  • Uncertainties in parameters can be quantified and propagated through models; uncertainties in model outputs can also be quantified (Lecture 3-2)
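A Monte Carlo sketch of the generic stress-vs-strength calculation; the lognormal distributions and their parameters are purely illustrative assumptions, not values from any of the three applications above.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
stress = rng.lognormal(mean=0.0, sigma=0.5, size=N)    # demand samples
strength = rng.lognormal(mean=1.0, sigma=0.3, size=N)  # capacity samples

p_fail = np.mean(strength < stress)   # P{failure} = P{strength < stress}
print(p_fail)
```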

Evidence from Models Mechanistic Modeling - Comments

  • Results and even models used in NPP PRAs

- Component behavior (e.g., reactor coolant pump seals)

- Success criteria

- Internal and external hazards

  • Physics of Failure models proposed for various issues (e.g., aging, CCF)
  • Appealing approach to better account for current, relevant state of knowledge (what we know)
  • Need to account for parameter and model uncertainties
  • Increases vulnerability to completeness uncertainty

Evidence from Models Fire Model Uncertainty Example

Evidence from Models Fire Model Uncertainty - Results

Evidence from Models Cautionary Examples

  • Multi-unit trip due to loss of communication
  • Capacitor failure from operation at below-design voltage (non-nuclear)
  • Increased accident consequences of a stronger pressure vessel (non-nuclear)
  • Reactivity accident from rocking

Evidence from Experts Expert Judgment

  • Fundamental component of PRA

- Modeling

- Data selection and analysis

- Direct elicitation (qualitative and quantitative)

  • Justification: decision support requires moving from what we know to what we believe, expressed as the probability statement $P\{X \mid C, H\}$ (X = proposition/event of concern; C = conditions of assessment; H = state of knowledge)

For RIDM, "we" = the informed technical community (not just the analyst/analysis team)

Evidence from Experts Direct Elicitation

  • Aim

- Take advantage of human ability to consider/integrate complex information

- Engage a wide variety of expert viewpoints

  • Key PRA applications

- Problem formulation, planning of experiments and analyses (Phenomena Identification and Ranking Tables)

- Scenario development (logic trees)

- Estimation of model parameters

  • Key guidance documents

- Processes designed to address known sources of biases

- NUREG/CR-6372 and subsequent documents (Senior Seismic Hazard Analysis Committee: SSHAC)

Evidence from Experts SSHAC Overview

  • Designed with recognition of potential individual and social biases, e.g.,

- Underestimation of uncertainties from heuristics (availability, anchoring and adjustment, representativeness, etc.)

- Social influences on individual judgment (e.g., group dynamics, organizational influences)

  • Emphasizes characterizing the full community point of view (center, body, and range); documentation (and ownership) of bases for judgments
  • Different levels for different needs

Level | Characteristics
1     | TI only (literature review, personal experience)
2     | TI interacts with proponents and resource experts
3     | TI brings together proponents and resource experts
4     | TFI organizes expert panel to develop estimates

TI = Technical Integrator; TFI = Technical Facilitator/Integrator

Evidence from Experts General Process

Level 4 SSHAC:

1) Preparation
2) Piloting/Training
3) Interactions (Workshops)
   a) Evaluate evidence
   b) Develop, defend, and revise judgments
   c) Integrate judgments
4) Participatory Peer Review

Adapted from: R. J. Budnitz, et al., Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, NUREG/CR-6372, 1997.

Evidence from Experts Direct Elicitation - Cautions

  • Experts

- Key experts might not be available during project

- Difficult to get undivided attention for extended periods of time

- Different perspectives/frameworks => considerable effort to develop common understanding of problem

- Inclination to provide "the right answer" based on individual point of view

- Subject to usual human biases

  • Results can be viewed as "the final solution"

A serious, important activity, but not the "Easy Button"