ML19011A427

Lecture 3-2 Uncertainties 2019-01-17
Person / Time
Issue date: 01/16/2019
From:
Office of Nuclear Regulatory Research
To:
Nathan Siu 415-0744
Shared Package: ML19011A416



Uncertainty and Uncertainties (Lecture 3-2)

Key Topics

  • Types of uncertainty (as treated in NPP PRA): aleatory and epistemic

  • Epistemic uncertainty

- Characterization (subjective probability)

- Magnitude for typical parameters

- Types: completeness, model, parameter

- Characterization for overall system (propagation)

Overview

Resources

G. Apostolakis, Probability and risk assessment: the subjectivistic viewpoint and some suggestions, Nuclear Safety, 19, 305-315, 1978.

G. Apostolakis, The concept of probability in safety assessments of technological systems, Science, 250, 1359-1364, 1990.

N. Siu, et al., Probabilistic Risk Assessment and Regulatory Decisionmaking: Some Frequently Asked Questions, NUREG-2201, U.S. Nuclear Regulatory Commission, September 2016.

U.S. Nuclear Regulatory Commission, Guidance on the Treatment of Uncertainties Associated with PRAs in Risk-Informed Decision Making, NUREG-1855, Revision 1, March 2013.

M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, Proceedings of the National Academy of Sciences (PNAS), 111, No. 20, 7176-7184, May 20, 2014.


Other References

A. Mosleh, et al., Model Uncertainty: Its Characterization and Quantification, Center for Reliability Engineering, University of Maryland, College Park, MD, 1995. (Also available as NUREG/CP-0138)

N. Siu, D. Karydas, and J. Temple, "Bayesian Assessment of Modeling Uncertainty: Application to Fire Risk Assessment," in Analysis and Management of Uncertainty: Theory and Application, B.M. Ayyub, M.M. Gupta, and L.N. Kanal, eds., North-Holland, 1992, pp. 351-361.

E. Droguett and A. Mosleh, Bayesian methodology for model uncertainty using model performance data, Risk Analysis, 28, No. 5, 1457-1476, 2008.

U.S. Nuclear Regulatory Commission, Reliability and Availability Data System (RADS) https://nrcoe.inl.gov/resultsdb/RADS/

U.S. Nuclear Regulatory Commission, Industry Average Parameter Estimates, https://nrcoe.inl.gov/resultsdb/AvgPerf/


Other References (cont.)

W.E. Vesely, et al., Measures of Risk Importance and Their Applications, NUREG/CR-3385, July 1983.

N. Siu and D.L. Kelly, "On the Use of Importance Measures for Prioritizing Systems, Structures, and Components," Proceedings 5th International Topical Meeting on Nuclear Thermal Hydraulics, Operations, and Safety (NUTHOS-5), Beijing, China, April 14-18, 1997, pp. L.4-1 through L.4-6.

American Nuclear Society and the Institute of Electrical and Electronics Engineers, PRA Procedures Guide, NUREG/CR-2300, January 1983.

E. Zio and N. Pedroni, How to effectively compute the reliability of a thermal-hydraulic passive system, Nuclear Engineering and Design, 241, 310-327, 2011.


The Big Picture (Level 1)

CDF is a measure of uncertainty.

Random variables:

- number of core damage events

- time to a core damage event

How well do we know CDF?

[Equations relating CDF to these random variables did not survive conversion.]

Motivation

Types of Uncertainty Addressed by NPP PRA/RIDM

  • Aleatory (random, stochastic)

- Irreducible

- Examples:

  • How long it takes for operators to initiate a fire response procedure

- Modeling addressed in Lecture 3-1

  • Epistemic (state-of-knowledge)

- Reducible

- Examples:

  • True value of SRV failure rate
  • Best model for operator response

Types of Uncertainty

Indicators of Uncertainty

  • Percentiles

- Bounds (e.g., 5th, 95th)

  • Moments

- General

- Mean

- Variance

The general moments, mean, and variance of an epistemic density \pi(x):

m_k = \int x^k \, \pi(x)\, dx, \qquad \mu = m_1 = \int x \, \pi(x)\, dx, \qquad \sigma^2 = \int (x - \mu)^2 \, \pi(x)\, dx
Epistemic Uncertainty: The Probability of Frequency Model

[Figure: epistemic probability density function for the frequency λ, with the 50% and 95% cumulative points marked.]

λ could easily be above or below 4.6E-5 (the median).

Very confident λ is below 7.8E-5 (the 95th percentile).
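If one assumes a lognormal shape for this epistemic distribution (a common PRA convention; the assumption is mine, not stated on the slide), the two quoted percentiles fix the whole distribution:

```python
import math

# Median and 95th percentile of lambda quoted on the slide
median, p95 = 4.6e-5, 7.8e-5

z95 = 1.6449                          # standard normal 95th percentile
mu = math.log(median)                 # lognormal location parameter
sigma = math.log(p95 / median) / z95  # lognormal shape parameter
error_factor = p95 / median           # 95th/50th percentile ratio
mean = math.exp(mu + 0.5 * sigma**2)  # lognormal mean exceeds the median
print(f"EF={error_factor:.2f}  sigma={sigma:.3f}  mean={mean:.2e}")
```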

Subjective Probability

Probability is an internal measure of degree of belief in a proposition.

To be useful, need:

- Consistent scale for comparing judgments

- Convergence with classical (relative frequency) probabilities given enough data

Coherent analyst: probabilities conform to the laws of probability.

Epistemic Uncertainty: Characterization

Qualitative to Quantitative: Examples

M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, Proceedings of the National Academy of Sciences (PNAS), 111, No. 20, 7176-7184, May 20, 2014.

NUREG/CR-6771

Aleatory vs. Epistemic - Relationship

Subjective view: all uncertainties are epistemic. The aleatory model brings in the analyst's understanding of underlying processes, supports quantification, and supports communication regarding model parameters.

Coin toss example

Uncertain proposition X: observation of n heads on next m tosses (trials).

P{X|C,H} can include consideration of conditions: fairness of coin, tossing, measurement; whether outcomes are truly binary.

If conditions are ideal, epistemic uncertainty (for a coherent analyst) is given by the aleatory (binomial) distribution:

P\{X \mid f\} = \binom{m}{n} f^n (1-f)^{m-n}

If there is uncertainty in the conditions, as reflected by a probability distribution for f, the total probability is the expectation of the aleatory result:

P\{X \mid C, H\} = \int_0^1 \binom{m}{n} f^n (1-f)^{m-n} \, \pi(f)\, df

where \pi(f) is the epistemic distribution for f.

Types of Uncertainty
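The expectation step can be checked numerically with the standard library; the uniform epistemic density used below is only an illustration:

```python
from math import comb

def binom_pmf(n, m, f):
    """Aleatory (binomial) probability of n heads in m tosses, given f."""
    return comb(m, n) * f**n * (1.0 - f)**(m - n)

def total_prob(n, m, pi, grid=10_000):
    """Expectation of the aleatory result over an epistemic density pi(f),
    via midpoint integration on [0, 1]."""
    df = 1.0 / grid
    return sum(binom_pmf(n, m, (k + 0.5) * df) * pi((k + 0.5) * df) * df
               for k in range(grid))

# Ideal conditions: f is known to be exactly 0.5
p_ideal = binom_pmf(5, 10, 0.5)

# Uncertain conditions: uniform epistemic density on [0, 1] (illustrative)
p_total = total_prob(5, 10, lambda f: 1.0)
print(f"ideal = {p_ideal:.4f}, total probability = {p_total:.4f}")
```

With a flat epistemic density the total probability of 5 heads in 10 tosses drops from about 0.246 to 1/11: epistemic uncertainty in f materially changes the answer.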

Aleatory vs. Epistemic Uncertainty - Commentary and Observations

Distinction depends on frame of reference (model of the world).

Example: Weld crack propagation depends on local material conditions (e.g., Cu content, flaw geometry).

- Aleatory model: conditions have a statistical distribution.

- Epistemic model: local conditions are knowable (in principle).

- Appropriate model depends on question being asked (e.g., probability of conditions at a randomly sampled location vs. conditions at a specific location).

Practice of distinction is generally accepted and has proven useful in RIDM.

NPP PRA community (particularly Level 1) uses the probability of frequency modeling convention (used in these lectures).

Terminology has been (and remains) a source of angst.

Some arguments in the broader risk community for alternate measures of epistemic uncertainty (e.g., possibility, degree of membership).

PRA Uncertainties - Various Studies

Epistemic Uncertainty: Magnitude

NUREG-1150 Uncertainties - Another View

[Figure: probability density functions of CDF (/ry) on a logarithmic scale (1E-11 to 1E-03) for Surry internal events, seismic (EQ), fire, and estimated total.]

NUREG-1150 Uncertainties - Another View

[Figure: the same probability density functions of CDF (/ry) on a linear scale (0 to 1E-04).]

PRA Uncertainties - Indian Point (1980s)

[Figure not reproduced in text conversion.]

Knowledge Check

  • Why do the NUREG-1150 curves get progressively taller with smaller values of CDF?
  • What does this tell you about the Indian Point graph?


Types of Epistemic Uncertainty

  • Completeness

- Recognized (known unknowns)

- Unrecognized (unknown unknowns)

  • Model

- Competing models

- Difference between predictions and reality

  • Parameter

- Conditioned on model

- Estimated (quantified) based on available evidence

Epistemic Uncertainty: Types

Parameter Uncertainty

  • Typically characterized using parametric distributions (e.g., beta, gamma, lognormal)
  • Parameters estimated using Bayes' Theorem (see Lecture 5-1)

- Statistical evidence (operational experience, test results);

engineering judgment in processing

- Large amounts of data => subjective probabilities converge with evidence

- Small amounts of data => large uncertainties

  • Note: pooling data increases confidence in estimates for population, but not necessarily for NPP studied.


Generic Demand Failure Probabilities

2015 industry-wide estimates from: https://nrcoe.inl.gov/resultsdb/AvgPerf/

- Explosive valves: 3 failures in 713 trials

- SBO EDGs: 12 failures in 419 trials

- EDGs: 214 failures in 75,452 trials

[Figures: probability density functions for the demand failure probabilities, plotted on linear and logarithmic scales.]
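These counts can be turned into epistemic distributions. A sketch under a Jeffreys Beta(0.5, 0.5) prior (a common noninformative choice; whether the published curves used exactly this prior is an assumption here), sampled with the standard library:

```python
import random
import statistics

random.seed(1)

def demand_posterior(failures, demands, n_samp=100_000):
    """Posterior for a demand failure probability under a Jeffreys prior:
    Beta(failures + 0.5, demands - failures + 0.5), summarized by
    sampling (a stats library would give exact quantiles)."""
    a, b = failures + 0.5, demands - failures + 0.5
    s = sorted(random.betavariate(a, b) for _ in range(n_samp))
    return statistics.fmean(s), s[n_samp // 20], s[n_samp - n_samp // 20]

# Counts from the slide (2015 industry-wide estimates)
for name, x, n in [("Explosive valves", 3, 713),
                   ("SBO EDGs", 12, 419),
                   ("EDGs", 214, 75_452)]:
    mean, p05, p95 = demand_posterior(x, n)
    print(f"{name}: mean={mean:.2e}  90% interval=({p05:.2e}, {p95:.2e})")
```

The EDG interval is far tighter than the explosive-valve interval: small amounts of data mean large uncertainties, exactly as the previous slide states.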

Generic Runtime Failure Rates

2015 industry-wide estimates from: https://nrcoe.inl.gov/resultsdb/AvgPerf/

- Service Water Pumps: 2 failures in 16,292,670 hours

- Normally Running Pumps: 225 failures in 59,582,350 hours

- Standby Pumps (1st hour of operation): 48 failures in 437,647 hours

[Figure: normalized probability density functions for the runtime failure rates (/hr), logarithmic scale.]

Note: point values won't alert the user to potential irregularities.
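For runtime failures the analogous Bayesian update uses a gamma distribution for the rate; the Jeffreys prior is again my illustrative assumption:

```python
import random
import statistics

random.seed(2)

def rate_posterior(failures, hours, n_samp=100_000):
    """Posterior for a failure rate (/hr) under a Jeffreys prior:
    Gamma(failures + 0.5) with scale 1/hours, summarized by sampling."""
    s = sorted(random.gammavariate(failures + 0.5, 1.0 / hours)
               for _ in range(n_samp))
    return statistics.fmean(s), s[n_samp // 20], s[n_samp - n_samp // 20]

# Service water pumps: 2 failures in 16,292,670 hours (from the slide)
mean, p05, p95 = rate_posterior(2, 16_292_670)
print(f"Service water: mean={mean:.2e}/hr  90% interval=({p05:.2e}, {p95:.2e})")
```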

Model Uncertainty

  • Some arbitrariness in distinction from parameter uncertainty
  • Currently only requirements for characterization (not quantification)
  • Typically addressed through consensus models, sensitivity studies
  • If appropriately defined, uncertainties can be quantified and brought into the PRA

- Focus on observables, difference between prediction and reality (vs. what is the best or correct model)

- Limited applications in practice

Thought Exercise Under what conditions could a flawed consensus model be good enough for RIDM?


Example Quantification of Model Uncertainty

Estimating an uncertainty factor in Deterministic Reference Model (DRM) estimates based on comparisons with experiments.

- DRM for cable fire propagation velocity

- Uncertainty factor ε (a random variable)

[Equations defining ε and its distribution, and the figure showing its probability density function, did not survive conversion.]

Percentile:  5th    50th   95th   E[ε]
Value:       3.8    14.8   56.9   18.6

Note: bias in results indicates need for a better DRM.

N. Siu, D. Karydas, and J. Temple, "Bayesian Assessment of Modeling Uncertainty: Application to Fire Risk Assessment," in Analysis and Management of Uncertainty: Theory and Application, B.M. Ayyub, M.M. Gupta, and L.N. Kanal, eds., North-Holland, 1992, pp. 351-361.

Epistemic Uncertainty: Types

Another Example Quantification of Model Uncertainty

Time (s)   Experiment (K)   DRM (K)
180        400              450
360        465              510
720        530              560
840        550              565

E. Droguett and A. Mosleh, Bayesian methodology for model uncertainty using model performance data, Risk Analysis, 28, No. 5, 1457-1476, 2008.

Notes:

1) Bayesian methodology accounts for possibility of inhomogeneous data.

2) Very large uncertainty bands might be unrealistic.
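A crude cross-check on the table (not the paper's full Bayesian treatment): treat each experiment/prediction ratio as a sample of a multiplicative model uncertainty factor:

```python
import statistics

# (experiment, DRM prediction) temperature pairs from the table above, in K
pairs = [(400, 450), (465, 510), (530, 560), (550, 565)]

# Simple illustration: the ratio experiment/prediction as a sampled
# multiplicative model uncertainty factor
ratios = [exp_k / drm_k for exp_k, drm_k in pairs]
mean_r = statistics.fmean(ratios)
sd_r = statistics.stdev(ratios)
print(f"mean ratio = {mean_r:.3f}, sample sd = {sd_r:.3f}")
# All ratios fall below 1: the DRM consistently overpredicts here, the
# kind of bias a fuller Bayesian treatment would expose.
```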


Model Uncertainty Quantification - Other Considerations

  • Perspective matters

- Model developer - ability of model to address phenomena

- PRA user - include user effects

  • Uncertainties in unmeasured parameters
  • Sub-model limits of applicability

M.H. Salley and A. Lindeman, Verification and Validation of Selected Fire Models for Nuclear Power Plant Applications, NUREG-1824 Supplement 1/EPRI 3002002182, November 2016.


Completeness Uncertainty - Example Known Sources

- Intentional acts (sabotage/terrorism)

- Operator errors of commission

- Organizational influences

  - Operation outside approved conditions (the licensing basis envelope)

  - Safety culture

  - External influences during an accident

- Design errors

- Phenomenological complexities

  - Combinations of severe hazards

  - Newly recognized hazards

Propagation of Uncertainties

Eventual aim: state uncertainty in risk metric estimates.

Quantitative aspects determined by propagating uncertainty in contributing parameters through the PRA model.

[Equation did not survive conversion; π notation is used to indicate an epistemic distribution.]

Epistemic Uncertainty: Overall Characterization

Propagation of Uncertainties - Methods

  • Method of moments
  • Sampling

- Direct Monte Carlo

- Latin Hypercube

  • Advanced methods for computationally challenging situations

- Importance sampling (line sampling, subset sampling, ...)

- Surrogate models (response surface, Gaussian process, neural nets, ...)
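Direct Monte Carlo, the first sampling method listed, is a few lines of stdlib code; the three-parameter risk model and its epistemic distributions below are hypothetical stand-ins for a real PRA model:

```python
import math
import random
import statistics

random.seed(3)

def risk_metric(lam_a, lam_b, p_c):
    """Hypothetical risk model R = (lam_a + lam_b) * p_c: two initiator
    frequencies times a conditional failure probability (illustration only)."""
    return (lam_a + lam_b) * p_c

# Direct Monte Carlo: draw each parameter from its epistemic distribution
# and push every draw through the model
samples = sorted(
    risk_metric(random.lognormvariate(math.log(1e-2), 0.5),
                random.lognormvariate(math.log(5e-3), 0.8),
                random.betavariate(2, 50))
    for _ in range(100_000)
)
mean = statistics.fmean(samples)
p05, p95 = samples[5_000], samples[95_000]
print(f"mean={mean:.2e}  90% interval=({p05:.2e}, {p95:.2e})")
```

Latin Hypercube sampling follows the same push-through structure but stratifies each parameter's draws to cover its range with fewer samples.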

Sensitivity Analysis

  • Typical approach for treating model uncertainties
  • In PRA/RIDM context, need to ensure analysis provides useful information

- Plausible variations

- Specific insights supporting decision problem (e.g., whether variation could lead to different decision)

  • Importance measures, routinely available from current PRA software tools, provide results from one-at-a-time sensitivity calculations
  • Advanced tools (e.g., global sensitivity analysis) available but not routinely used

Importance Measures

Mathematical measures depend on the definition of importance.

Some notation: R = risk metric of interest; P_i = probability of event i; R(P_i = x) = metric evaluated with P_i set to x.

Common measures:

Measure                        Definition                                    Notes
Fussell-Vesely (F-V)           FV_i = [R - R(P_i = 0)] / R                   Same rankings as RRW and RRR
Birnbaum                       B_i = R(P_i = 1) - R(P_i = 0) = ∂R/∂P_i       Nearly same rankings as RAW and RIR
Risk Achievement Worth (RAW)   RAW_i = R(P_i = 1) - R                        Nearly same rankings as Birnbaum
Risk Increase Ratio (RIR)      RIR_i = R(P_i = 1) / R                        Nearly same rankings as Birnbaum
Risk Reduction Worth (RRW)     RRW_i = R - R(P_i = 0)                        Same rankings as F-V
Risk Reduction Ratio (RRR)     RRR_i = R / R(P_i = 0)                        Same rankings as F-V
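The table's definitions can be exercised on a toy model; R = P1·P2 + P3 and its event probabilities are hypothetical, chosen only to make the measures easy to verify by hand:

```python
def risk(p, i=None, v=None):
    """Hypothetical risk model R = p[0]*p[1] + p[2], with event i's
    probability optionally pinned to value v."""
    q = list(p)
    if i is not None:
        q[i] = v
    return q[0] * q[1] + q[2]

p = [0.01, 0.02, 1.0e-4]
R = risk(p)

for i in range(len(p)):
    fv = (R - risk(p, i, 0.0)) / R           # Fussell-Vesely
    birnbaum = risk(p, i, 1.0) - risk(p, i, 0.0)
    raw = risk(p, i, 1.0) - R                # Risk Achievement Worth
    rir = risk(p, i, 1.0) / R                # Risk Increase Ratio
    print(f"event {i}: FV={fv:.3f}  B={birnbaum:.3g}  "
          f"RAW={raw:.3g}  RIR={rir:.1f}")
```

Each measure is a one-at-a-time recalculation of R, which is why current PRA software can report them routinely.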

Once uncertainties are characterized, then what?

  • See Lecture 8-1