ML19011A427
Issue date: 01/16/2019
From: Office of Nuclear Regulatory Research
Contact: Nathan Siu, 415-0744
Shared Package: ML19011A416
Download: ML19011A427 (32 pages)
Text
Uncertainty and Uncertainties (Lecture 3-2)
Overview - Key Topics
- Types of uncertainty (as treated in NPP PRA): aleatory and epistemic
- Epistemic uncertainty
  - Characterization (subjective probability)
  - Magnitude for typical parameters
  - Types: completeness, model, parameter
  - Characterization for overall system (propagation)
Overview - Resources
- G. Apostolakis, Probability and risk assessment: the subjectivistic viewpoint and some suggestions, Nuclear Safety, 9, 305-315, 1978.
- G. Apostolakis, The concept of probability in safety assessments of technological systems, Science, 250, 1359-1364, 1990.
- N. Siu, et al., Probabilistic Risk Assessment and Regulatory Decisionmaking: Some Frequently Asked Questions, NUREG-2201, U.S. Nuclear Regulatory Commission, September 2016.
- U.S. Nuclear Regulatory Commission, Guidance on the Treatment of Uncertainties Associated with PRAs in Risk-Informed Decision Making, NUREG-1855, Revision 1, March 2013.
- M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, Proceedings of the National Academy of Sciences (PNAS), 111, No. 20, 7176-7184, May 20, 2014.
Overview - Other References
- A. Mosleh, et al., Model Uncertainty: Its Characterization and Quantification, Center for Reliability Engineering, University of Maryland, College Park, MD, 1995. (Also available as NUREG/CP-0138)
- N. Siu, D. Karydas, and J. Temple, "Bayesian Assessment of Modeling Uncertainty: Application to Fire Risk Assessment," in Analysis and Management of Uncertainty: Theory and Application, B.M. Ayyub, M.M. Gupta, and L.N. Kanal, eds., North-Holland, 1992, pp. 351-361.
- E. Droguett and A. Mosleh, Bayesian methodology for model uncertainty using model performance data, Risk Analysis, 28, No. 5, 1457-1476, 2008.
- U.S. Nuclear Regulatory Commission, Reliability and Availability Data System (RADS), https://nrcoe.inl.gov/resultsdb/RADS/
- U.S. Nuclear Regulatory Commission, Industry Average Parameter Estimates, https://nrcoe.inl.gov/resultsdb/AvgPerf/
Overview - Other References (cont.)
- W.E. Vesely, et al., Measures of Risk Importance and Their Applications, NUREG/CR-3385, July 1983.
- N. Siu and D.L. Kelly, On the Use of Importance Measures for Prioritizing Systems, Structures, and Components, Proceedings 5th International Topical Meeting on Nuclear Thermal Hydraulics, Operations, and Safety (NUTHOS-5), Beijing, China, April 14-18, 1997, pp. L.4-1 through L.4-6.
- American Nuclear Society and the Institute of Electrical and Electronics Engineers, PRA Procedures Guide, NUREG/CR-2300, January 1983.
- E. Zio and N. Pedroni, How to effectively compute the reliability of a thermal-hydraulic passive system, Nuclear Engineering and Design, 241, 310-327, 2011.
Motivation - The Big Picture (Level 1)

[CDF equation graphic not recoverable from extraction.]

- CDF is a measure of uncertainty
- Random variables:
  - number of core damage events
  - time to a core damage event

How well do we know CDF?
Types of Uncertainty - Types Addressed by NPP PRA/RIDM
- Aleatory (random, stochastic)
  - Irreducible
  - Examples:
    - How many times a safety relief valve (SRV) operates before failing
    - How long it takes for operators to initiate a fire response procedure
  - Modeling addressed in Lecture 3-1
- Epistemic (state-of-knowledge)
  - Reducible
  - Examples:
    - True value of SRV failure rate
    - Best model for operator response
Indicators of Uncertainty
- Percentiles
  - x_alpha defined by P{X <= x_alpha} = alpha (e.g., x_0.05, x_0.50, x_0.95)
  - Bounds (e.g., 5th, 95th)
- Moments
  - General: E[X^n]
  - Mean: mu = E[X]
  - Variance: sigma^2 = E[(X - mu)^2]
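Since the slide lists these indicators without numbers, here is a minimal sketch (not from the lecture) computing them for an assumed lognormal epistemic distribution of a failure rate; the median of 1E-5/hr, the error factor of 3, and the sample size are all invented for illustration.

```python
# Illustrative sketch: empirical percentiles and moments as indicators of
# uncertainty, for an assumed lognormal distribution (all numbers invented).
import math
import random
import statistics

random.seed(42)

# Lognormal with median 1e-5/hr and error factor ~3 (sigma = ln(3)/1.645).
sigma = math.log(3.0) / 1.645
sample = sorted(math.exp(random.gauss(math.log(1e-5), sigma))
                for _ in range(100_000))

def percentile(xs, alpha):
    """Empirical x_alpha: value with P{X <= x_alpha} ~= alpha."""
    return xs[min(len(xs) - 1, int(alpha * len(xs)))]

x05, x50, x95 = (percentile(sample, a) for a in (0.05, 0.50, 0.95))
mean = statistics.fmean(sample)
var = statistics.pvariance(sample)

print(f"5th={x05:.2e}  median={x50:.2e}  95th={x95:.2e}")
print(f"mean={mean:.2e}  variance={var:.2e}")
```

The long right tail is why the mean exceeds the median here, a pattern typical of PRA parameter distributions.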
Epistemic Uncertainty: Characterization - The Probability of Frequency Model; Subjective Probability
- Probability is an internal measure of degree of belief in a proposition
- To be useful, need:
  - Consistent scale for comparing judgments
  - Convergence with classical (relative frequency) probabilities given enough data
- Coherent analyst: probabilities conform to the laws of probability

[Figure: example epistemic distribution for a frequency lambda. Annotations: "lambda could easily be above or below 4.6E-5 (the median)" (50%); "Very confident lambda is below 7.8E-5 (the 95th percentile)" (95%).]
Epistemic Uncertainty: Characterization - Qualitative to Quantitative: Examples
Sources: M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, Proceedings of the National Academy of Sciences (PNAS), 111, No. 20, 7176-7184, May 20, 2014; NUREG/CR-6771.
Types of Uncertainty - Aleatory vs. Epistemic: Relationship
- Subjective view: all uncertainties are epistemic. The aleatory model
  - brings in the analyst's understanding of underlying processes
  - supports quantification
  - supports communication regarding model parameters
- Coin toss example
  - Uncertain proposition X: observation of n heads on next m tosses (trials)
  - P{X | C, H} can include consideration of conditions C: fairness of coin, tossing, measurement; whether outcomes are truly binary
  - If conditions are ideal, epistemic uncertainty (for a coherent analyst) is given by the aleatory (binomial) distribution:

    P{X | f} = [m! / (n! (m - n)!)] f^n (1 - f)^(m - n)

  - If there is uncertainty in the conditions, as reflected by a probability distribution for f, the total probability is the expectation of the aleatory result:

    P{X | C, H} = Integral from 0 to 1 of [m! / (n! (m - n)!)] f^n (1 - f)^(m - n) dF(f)

    where F(f) is the epistemic distribution for f.
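The expectation over f can be checked numerically. In the sketch below, a Beta(a, b) epistemic distribution for f is assumed purely for illustration (the lecture does not specify one); with that choice the integral has a closed form, the beta-binomial distribution.

```python
# Coin-toss total probability: expectation of the binomial aleatory model
# over an assumed Beta(a, b) epistemic distribution for f (illustrative).
import math

def binom_pmf(n, m, f):
    """Aleatory model: probability of n heads in m tosses given f."""
    return math.comb(m, n) * f**n * (1.0 - f)**(m - n)

def total_prob(n, m, a, b, steps=20_000):
    """Numerically integrate binom_pmf against a Beta(a, b) density."""
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    acc = 0.0
    for i in range(steps):
        f = (i + 0.5) / steps            # midpoint rule on (0, 1)
        acc += binom_pmf(n, m, f) * f**(a - 1) * (1 - f)**(b - 1) / B
    return acc / steps

def beta_binomial(n, m, a, b):
    """Closed-form expectation: the beta-binomial pmf."""
    B = lambda x, y: math.gamma(x) * math.gamma(y) / math.gamma(x + y)
    return math.comb(m, n) * B(n + a, m - n + b) / B(a, b)

p_num = total_prob(3, 10, a=2.0, b=2.0)
p_cf = beta_binomial(3, 10, a=2.0, b=2.0)
print(p_num, p_cf)   # the two should agree closely
```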
Types of Uncertainty - Aleatory vs. Epistemic Uncertainty: Commentary and Observations
- Distinction depends on frame of reference (model of the world).
- Example: Weld crack propagation depends on local material conditions (e.g., Cu content, flaw geometry).
- Aleatory model: conditions have a statistical distribution.
- Epistemic model: local conditions are knowable (in principle).
- Appropriate model depends on question being asked (e.g., probability of conditions at a randomly sampled location vs. conditions at a specific location).
- Practice of distinction is generally accepted, has proven useful in RIDM.
- NPP PRA community (particularly Level 1) uses the probability-of-frequency modeling convention (used in these lectures).
- Terminology has been (and remains) a source of angst.
- Some arguments in the broader risk community for alternate measures of epistemic uncertainty (e.g., possibility, degree of membership)
Epistemic Uncertainty: Magnitude - PRA Uncertainties from Various Studies
[Figure only.]
Epistemic Uncertainty: Magnitude - NUREG-1150 Uncertainties, Another View
[Figure: probability density functions vs. CDF (/ry) on a log scale (1E-11 to 1E-3); series: Surry Internal, Surry EQ, Surry Fire, Surry Total Estimated.]
Epistemic Uncertainty: Magnitude - NUREG-1150 Uncertainties, Another View
[Figure: probability density functions vs. CDF (/ry) on a linear scale (0 to 1E-4); series: Surry Internal, Surry EQ, Surry Fire, Surry Total Estimated.]
Epistemic Uncertainty: Magnitude - PRA Uncertainties, Indian Point (1980s)
[Figure only.]
Epistemic Uncertainty: Characterization - Knowledge Check
- Why do the NUREG-1150 curves get progressively taller with smaller values of CDF?
- What does this tell you about the Indian Point graph?
Epistemic Uncertainty: Types - Types of Epistemic Uncertainty
- Completeness
  - Recognized (known unknowns)
  - Unrecognized (unknown unknowns)
- Model
  - Competing models
  - Difference between predictions and reality
- Parameter
  - Conditioned on model
  - Estimated (quantified) based on available evidence
Epistemic Uncertainty: Types - Parameter Uncertainty
- Typically characterized using parametric distributions (e.g., beta, gamma, lognormal)
- Parameters estimated using Bayes' Theorem (see Lecture 5-1)
  - Statistical evidence (operational experience, test results); engineering judgment in processing
  - Large amounts of data => subjective probabilities converge with evidence
  - Small amounts of data => large uncertainties
- Note: pooling data increases confidence in estimates for the population, but not necessarily for the NPP studied.
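The convergence point can be illustrated with a small sketch (the counts are invented; a Jeffreys Beta(0.5, 0.5) prior is assumed): holding the observed failure fraction fixed while growing the amount of evidence shrinks the relative width of the Beta posterior.

```python
# Sketch: relative epistemic uncertainty of a Beta posterior for a demand
# failure probability shrinks as evidence grows. Counts invented; a
# Jeffreys Beta(0.5, 0.5) prior is assumed.
import math

def beta_posterior_stats(failures, demands, a0=0.5, b0=0.5):
    """Posterior mean and standard deviation of Beta(a0+x, b0+n-x)."""
    a, b = a0 + failures, b0 + demands - failures
    mean = a / (a + b)
    sd = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return mean, sd

for x, n in [(1, 20), (10, 200), (100, 2000)]:   # same observed ratio
    mean, sd = beta_posterior_stats(x, n)
    print(f"{x}/{n}: mean={mean:.3f}, sd/mean={sd / mean:.2f}")
```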
Epistemic Uncertainty: Magnitude - Generic Demand Failure Probabilities
- 2015 industry-wide estimates from: https://nrcoe.inl.gov/resultsdb/AvgPerf/
- Explosive valves: 3 failures in 713 trials
- SBO EDGs: 12 failures in 419 trials
- EDGs: 214 failures in 75,452 trials

[Figure: demand failure probability density functions on linear (0 to 0.06) and log (1E-4 to 1E-1) failure-probability scales; series: Explosive Valves, SBO EDG, EDG.]
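A sketch of how point estimates could be formed from the counts above: a Jeffreys Beta(0.5, 0.5) prior is assumed here (my assumption, not stated on the slide), which gives a posterior mean of (x + 0.5)/(n + 1).

```python
# Hedged sketch: Bayesian point estimates for the quoted demand failure
# counts, assuming a Jeffreys Beta(0.5, 0.5) prior.
def jeffreys_demand_mean(failures, demands):
    """Posterior mean of failure probability under a Jeffreys prior."""
    return (failures + 0.5) / (demands + 1.0)

components = {
    "Explosive valves": (3, 713),
    "SBO EDGs": (12, 419),
    "EDGs": (214, 75_452),
}
for name, (x, n) in components.items():
    print(f"{name}: {jeffreys_demand_mean(x, n):.2e} per demand")
```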
Epistemic Uncertainty: Magnitude - Generic Runtime Failure Rates
- 2015 industry-wide estimates from: https://nrcoe.inl.gov/resultsdb/AvgPerf/
- Service Water Pumps: 2 failures in 16,292,670 hours
- Normally Running Pumps: 225 failures in 59,582,350 hours
- Standby Pumps (1st hour of operation): 48 failures in 437,647 hours

[Figure: normalized probability density functions vs. failure rate (/hr), 1E-9 to 1E-3; series: Service Water, Normally Running, Standby. Note: point values won't alert the user to potential irregularities.]
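The same idea applies to runtime rates: assuming a Jeffreys prior on a Poisson rate (again my assumption), the posterior is Gamma(x + 0.5, T) with mean (x + 0.5)/T.

```python
# Hedged sketch: Bayesian point estimates for the quoted runtime failure
# counts, assuming a Jeffreys prior on a Poisson rate.
def jeffreys_rate_mean(failures, hours):
    """Posterior mean of the failure rate: Gamma(x + 0.5, T) mean."""
    return (failures + 0.5) / hours

pumps = {
    "Service Water": (2, 16_292_670),
    "Normally Running": (225, 59_582_350),
    "Standby (1st hour)": (48, 437_647),
}
for name, (x, T) in pumps.items():
    print(f"{name}: {jeffreys_rate_mean(x, T):.2e} /hr")
```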
Epistemic Uncertainty: Types - Model Uncertainty
- Some arbitrariness in distinction from parameter uncertainty
- Currently only requirements for characterization (not quantification)
- Typically addressed through consensus models, sensitivity studies
- If appropriately defined, uncertainties can be quantified and brought into the PRA
  - Focus on observables, difference between prediction and reality (vs. what is the best or correct model)
  - Limited applications in practice
Epistemic Uncertainty: Types - Thought Exercise
Under what conditions could a flawed consensus model be good enough for RIDM?
Epistemic Uncertainty: Types - Example Quantification of Model Uncertainty
- Estimating an uncertainty factor in Deterministic Reference Model (DRM) estimates based on comparisons with experiments
- DRM for cable fire propagation velocity [model equation not recoverable from extraction]
- Uncertainty factor (a random variable); from the figure: 5th percentile 3.8, median 14.8, 95th percentile 56.9, mean 18.6
- Note: bias in results indicates need for a better DRM

N. Siu, D. Karydas, and J. Temple, "Bayesian Assessment of Modeling Uncertainty: Application to Fire Risk Assessment," in Analysis and Management of Uncertainty: Theory and Application, B.M. Ayyub, M.M. Gupta, and L.N. Kanal, eds., North-Holland, 1992, pp. 351-361.
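A quick consistency check (my arithmetic, not the cited paper's): if the uncertainty factor were lognormal, its median would equal the geometric mean of the 5th and 95th percentiles, and that agrees well with the tabulated values.

```python
# Consistency check (this arithmetic is mine, not the cited paper's):
# for a lognormal variable, the median equals the geometric mean of the
# 5th and 95th percentiles. Slide values: p05 = 3.8, p95 = 56.9,
# reported median = 14.8.
import math

p05, p95, median = 3.8, 56.9, 14.8
geo_mean = math.sqrt(p05 * p95)          # lognormal-implied median
error_factor = math.sqrt(p95 / p05)      # lognormal error factor
print(f"sqrt(p05*p95) = {geo_mean:.1f} (vs. reported median {median})")
print(f"error factor  = {error_factor:.1f}")
```

The reported mean (18.6) is somewhat below what an exact lognormal with these percentiles would give, so the fitted distribution is evidently not exactly lognormal.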
Epistemic Uncertainty: Types - Another Example Quantification of Model Uncertainty

| Time (s) | Experiment (K) | DRM (K) |
|---|---|---|
| 180 | 400 | 450 |
| 360 | 465 | 510 |
| 720 | 530 | 560 |
| 840 | 550 | 565 |

Notes:
1) Bayesian methodology accounts for the possibility of inhomogeneous data.
2) Very large uncertainty bands might be unrealistic.
E. Droguett and A. Mosleh, Bayesian methodology for model uncertainty using model performance data, Risk Analysis, 28, No. 5, 1457-1476, 2008.
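One simple way to read the table (my calculation, not the paper's Bayesian methodology) is through the DRM-to-experiment temperature ratios, which show a consistent multiplicative overprediction that shrinks at later times.

```python
# Sketch (my calculation, not the cited paper's method): DRM-to-experiment
# temperature ratios from the table above.
data = [(180, 400, 450), (360, 465, 510), (720, 530, 560), (840, 550, 565)]
ratios = [drm / exp for _, exp, drm in data]      # multiplicative bias
mean_ratio = sum(ratios) / len(ratios)
print([round(r, 3) for r in ratios], round(mean_ratio, 3))
```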
Epistemic Uncertainty: Types - Model Uncertainty Quantification: Other Considerations
- Perspective matters
  - Model developer: ability of model to address phenomena
  - PRA user: include user effects
- Uncertainties in unmeasured parameters
- Sub-model limits of applicability

M.H. Salley and A. Lindeman, Verification and Validation of Selected Fire Models for Nuclear Power Plant Applications, NUREG-1824 Supplement 1 / EPRI 3002002182, November 2016.
Epistemic Uncertainty: Types - Completeness Uncertainty: Example Known Sources
- Intentional acts (sabotage/terrorism)
- Operator errors of commission
- Organizational influences
  - Operation outside approved conditions (the licensing basis envelope)
  - Safety culture
  - External influences during an accident
- Design errors
- Phenomenological complexities
  - Combinations of severe hazards
  - Newly recognized hazards
Epistemic Uncertainty: Overall Characterization - Propagation of Uncertainties
- Eventual aim: state uncertainty in risk metric estimates
- Quantitative aspects determined by propagating uncertainty in contributing parameters through the PRA model (pi notation used to indicate an epistemic distribution)

[Propagation equation graphic not recoverable from extraction.]
Epistemic Uncertainty: Overall Characterization - Propagation of Uncertainties: Methods
- Method of moments
- Sampling
  - Direct Monte Carlo
  - Latin Hypercube
- Advanced methods for computationally challenging situations
  - Importance sampling (line sampling, subset sampling, etc.)
  - Surrogate models (response surface, Gaussian process, neural nets, etc.)
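Direct Monte Carlo, the first sampling method above, can be sketched in a few lines (the risk expression and the distributions are invented for illustration; nothing here comes from the lecture): lognormal epistemic distributions on basic-event parameters are sampled and pushed through a toy risk expression R = p1*p2 + p3.

```python
# Sketch of direct Monte Carlo propagation through an invented toy risk
# model R = p1*p2 + p3, with lognormal epistemic distributions on the
# parameters (medians and error factors invented for illustration).
import math
import random

random.seed(0)

def lognormal(median, error_factor):
    """One draw from a lognormal given its median and error factor."""
    sigma = math.log(error_factor) / 1.645
    return math.exp(random.gauss(math.log(median), sigma))

N = 50_000
samples = sorted(
    lognormal(1e-2, 3) * lognormal(1e-1, 3) + lognormal(1e-4, 10)
    for _ in range(N)
)
mean = sum(samples) / N
p05, p50, p95 = (samples[int(a * N)] for a in (0.05, 0.50, 0.95))
print(f"mean={mean:.2e}  5th={p05:.2e}  median={p50:.2e}  95th={p95:.2e}")
```

The resulting percentiles of R are exactly the kind of statement of uncertainty in a risk metric that the previous slide calls for.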
Epistemic Uncertainty: Overall Characterization - Sensitivity Analysis
- Typical approach for treating model uncertainties
- In PRA/RIDM context, need to ensure analysis provides useful information
  - Plausible variations
  - Specific insights supporting decision problem (e.g., whether variation could lead to a different decision)
- Importance measures, routinely available from current PRA software tools, provide results from one-at-a-time sensitivity calculations
- Advanced tools (e.g., global sensitivity analysis) available but not routinely used
Epistemic Uncertainty: Overall Characterization - Importance Measures
- Mathematical measures depend on definition of importance
- Some notation: R = risk metric of interest; P_i = probability of event i; R(P_i = x) = risk metric evaluated with P_i set to x
- Common measures:
| Measure | Definition | Notes |
|---|---|---|
| Fussell-Vesely (F-V) | FV_i = [R - R(P_i = 0)] / R | Same rankings as RRW and RRR |
| Birnbaum | B_i = R(P_i = 1) - R(P_i = 0) = dR/dP_i | Nearly same rankings as RAW and RIR |
| Risk Achievement Worth (RAW) | RAW_i = R(P_i = 1) - R | Nearly same rankings as Birnbaum |
| Risk Increase Ratio (RIR) | RIR_i = R(P_i = 1) / R | Nearly same rankings as Birnbaum |
| Risk Reduction Worth (RRW) | RRW_i = R - R(P_i = 0) | Same rankings as F-V |
| Risk Reduction Ratio (RRR) | RRR_i = R / R(P_i = 0) | Same rankings as F-V |
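A toy illustration of these measures (the risk expression R = P1*P2 + P3 and the event probabilities are invented; the definitions here use interval forms for the "worth" measures and ratios for the "ratio" measures, which is one common convention):

```python
# Toy illustration of importance measures on an invented risk model
# R = P1*P2 + P3 (a two-event cut set plus a single-event cut set).
def R(P):
    return P["1"] * P["2"] + P["3"]

base = {"1": 0.01, "2": 0.1, "3": 1e-4}
R0 = R(base)

def with_p(i, value):
    """Risk metric with event i's probability set to the given value."""
    p = dict(base)
    p[i] = value
    return R(p)

for i in base:
    fv = (R0 - with_p(i, 0.0)) / R0           # Fussell-Vesely
    birnbaum = with_p(i, 1.0) - with_p(i, 0.0)
    raw = with_p(i, 1.0) - R0                 # Risk Achievement Worth
    rir = with_p(i, 1.0) / R0                 # Risk Increase Ratio
    rrw = R0 - with_p(i, 0.0)                 # Risk Reduction Worth
    rrr = R0 / with_p(i, 0.0)                 # Risk Reduction Ratio
    print(f"event {i}: FV={fv:.3f} B={birnbaum:.3f} RAW={raw:.3f} "
          f"RIR={rir:.1f} RRW={rrw:.2e} RRR={rrr:.2f}")
```

In this toy model the single-event cut set (event 3) dominates the Birnbaum and RAW rankings even though events 1 and 2 carry most of the Fussell-Vesely importance, illustrating why the different measures answer different questions.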
Epistemic Uncertainty: Overall Characterization - Once uncertainties are characterized, then what?
- See Lecture 8-1