ML19011A427: Difference between revisions

=Text=
{{#Wiki_filter:Uncertainty and Uncertainties Lecture 3-2 1
 
Overview Key Topics
* Types of uncertainty (as treated in NPP PRA):
aleatory and epistemic
* Epistemic uncertainty
  - Characterization (subjective probability)
  - Magnitude for typical parameters
  - Types: completeness, model, parameter
  - Characterization for overall system (propagation) 2
 
Overview Resources
* G. Apostolakis, Probability and risk assessment: the subjectivistic viewpoint and some suggestions, Nuclear Safety, 9, 305-315, 1978.
* G. Apostolakis, The concept of probability in safety assessments of technological systems, Science, 250, 1359-1364, 1990.
* N. Siu, et al., Probabilistic Risk Assessment and Regulatory Decisionmaking: Some Frequently Asked Questions, NUREG-2201, U.S.
Nuclear Regulatory Commission, September 2016.
* U.S. Nuclear Regulatory Commission, Guidance on the Treatment of Uncertainties Associated with PRAs in Risk-Informed Decision Making, NUREG-1855, Revision 1, March 2013.
* M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, National Academy of Sciences Proceedings (NASP), 111, No. 20, 7176-7184, May 20, 2014.
3
 
Overview Other References
* A. Mosleh, et al., Model Uncertainty: Its Characterization and Quantification, Center for Reliability Engineering, University of Maryland, College Park, MD, 1995. (Also available as NUREG/CP-0138)
* N. Siu, D. Karydas, and J. Temple, "Bayesian Assessment of Modeling Uncertainty: Application to Fire Risk Assessment," in Analysis and Management of Uncertainty: Theory and Application, B.M. Ayyub, M.M.
Gupta, and L.N. Kanal, eds., North-Holland, 1992, pp. 351-361.
* E. Droguett and A. Mosleh, Bayesian methodology for model uncertainty using model performance data, Risk Analysis, 28, No. 5, 1457-1476, 2008.
* U.S. Nuclear Regulatory Commission, Reliability and Availability Data System (RADS) https://nrcoe.inl.gov/resultsdb/RADS/
* U.S. Nuclear Regulatory Commission, Industry Average Parameter Estimates, https://nrcoe.inl.gov/resultsdb/AvgPerf/
4
 
Overview Other References (cont.)
* W.E. Vesely, et al., Measures of Risk Importance and Their Applications, NUREG/CR-3385, July 1983.
* N. Siu and D.L. Kelly, On the Use of Importance Measures for Prioritizing Systems, Structures, and Components, Proceedings 5th International Topical Meeting on Nuclear Thermal Hydraulics, Operations, and Safety (NUTHOS-5), Beijing, China, April 14-18, 1997, pp. L.4-1 through L.4-6.
* American Nuclear Society and the Institute of Electrical and Electronics Engineers, PRA Procedures Guide, NUREG/CR-2300, January 1983.
* E. Zio and N. Pedroni, How to effectively compute the reliability of a thermal-hydraulic passive system, Nuclear Engineering and Design, 241, 310-327, 2011.
5
 
Motivation The Big Picture (Level 1)
* CDF is a measure of uncertainty
* Random variables:
  - number of core damage events
  - time to a core damage event
How well do we know CDF?
6
 
Types of Uncertainty Types of Uncertainty Addressed by NPP PRA/RIDM
* Aleatory (random, stochastic)
  - Irreducible
  - Examples:
* How many times a safety relief valve (SRV) operates before failing
* How long it takes for operators to initiate a fire response procedure
  - Modeling addressed in Lecture 3-1
* Epistemic (state-of-knowledge)
  - Reducible
  - Examples:
* True value of SRV failure rate
* Best model for operator response 7
 
Indicators of Uncertainty
* Percentiles
  - x_alpha defined by P{X <= x_alpha} = alpha
  - Bounds (e.g., 5th, 95th)
* Moments
  - General: E[X^n]
  - Mean: mu = E[X]
  - Variance: sigma^2 = E[(X - mu)^2]
8
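These indicators can be computed directly once a distributional form is chosen. A minimal sketch for a lognormal epistemic distribution, specified by its median and 95th percentile (illustrative values echoing the example on the next slide):

```python
import math

# Lognormal epistemic distribution specified by its median and 95th
# percentile (illustrative values echoing the next slide's example).
median = 4.6e-5   # 50th percentile
p95 = 7.8e-5      # 95th percentile

# For a lognormal, ln(X) ~ Normal(mu, sigma); the 95th percentile lies
# 1.645 standard deviations above the median on the log scale.
mu = math.log(median)
sigma = (math.log(p95) - mu) / 1.645

p05 = math.exp(mu - 1.645 * sigma)              # 5th-percentile bound
mean = math.exp(mu + sigma**2 / 2)              # first moment
variance = (math.exp(sigma**2) - 1) * mean**2   # second central moment

print(f"5th percentile: {p05:.2e}")
print(f"mean:           {mean:.2e}")
print(f"variance:       {variance:.2e}")
```

Note that the mean exceeds the median, as it must for any right-skewed distribution.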
 
Epistemic Uncertainty: Characterization Epistemic Uncertainty: The Probability of Frequency Model - Subjective Probability
* Probability is an internal measure of degree of belief in a proposition
* To be useful, need:
  - Consistent scale for comparing judgments
  - Convergence with classical (relative frequency) probabilities given enough data
* Coherent analyst: probabilities conform to the laws of probability
(Figure: epistemic distribution for a frequency λ - "λ could easily be above or below 4.6E-5 (the median)" [50%]; "Very confident λ is below 7.8E-5 (the 95th percentile)" [95%]) 9
 
Epistemic Uncertainty: Characterization Qualitative to Quantitative: Examples M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, National Academy of Sciences Proceedings (NASP), 111, No. 20, 7176-7184, May 20, 2014.
NUREG/CR-6771 10
 
Types of Uncertainty Aleatory vs. Epistemic - Relationship
* Subjective view: all uncertainties are epistemic. Aleatory model
  - brings in the analyst's understanding of underlying processes
  - supports quantification
  - supports communication regarding model parameters
* Coin toss example
  - Uncertain proposition X: observation of n heads on next m tosses (trials)
  - P{XlC,H} can include consideration of conditions: fairness of coin, tossing, measurement; whether outcomes are truly binary
  - If conditions are ideal, epistemic uncertainty (for a coherent analyst) is given by the aleatory (binomial) distribution:

    P{X | f} = C(m,n) f^n (1 - f)^(m-n)

  - If there is uncertainty in the conditions, as reflected by a probability distribution for f, the total probability is the expectation of the aleatory result:

    P{X | C, H} = integral from 0 to 1 of C(m,n) f^n (1 - f)^(m-n) dF(f),

    where F(f) is the epistemic distribution for f. 11
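The total-probability integral above has a closed form when the epistemic distribution for f is a Beta distribution (an assumption made here purely for illustration): the result is the beta-binomial law. A minimal sketch:

```python
import math

def beta_fn(x, y):
    # Euler beta function, via log-gamma for numerical stability
    return math.exp(math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y))

def total_prob_heads(n, m, a, b):
    """P{n heads in m tosses} when the heads probability f has an epistemic
    Beta(a, b) distribution: integral of C(m,n) f^n (1-f)^(m-n) dF(f)."""
    return math.comb(m, n) * beta_fn(n + a, m - n + b) / beta_fn(a, b)

m = 10
# Ideal conditions, f known to be exactly 0.5 -> plain binomial:
certain = math.comb(m, 5) * 0.5**10
# Uncertain conditions, f ~ Beta(2, 2) (mean 0.5, spread over [0, 1]):
uncertain = total_prob_heads(5, m, 2, 2)
print(certain, uncertain)  # epistemic spread lowers P{exactly 5 of 10}
```

The epistemic spread over f widens the predictive distribution for the number of heads, so the probability of the single most likely outcome drops.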
 
Types of Uncertainty Aleatory vs. Epistemic Uncertainty -
Commentary and Observations
* Distinction depends on frame of reference (model of the world).
* Example: Weld crack propagation depends on local material conditions (e.g., Cu content, flaw geometry).
  - Aleatory model: conditions have a statistical distribution.
  - Epistemic model: local conditions are knowable (in principle).
  - Appropriate model depends on question being asked (e.g., probability of conditions at a randomly sampled location vs. conditions at a specific location).
* Practice of distinction is generally accepted, has proven useful in RIDM.
* NPP PRA community (particularly Level 1) uses probability of frequency modeling convention (used in these lectures).
* Terminology has been (and remains) a source of angst.
* Some arguments in the broader risk community for alternate measures of epistemic uncertainty (e.g., possibility, degree of membership) 12
 
Epistemic Uncertainty: Magnitude PRA Uncertainties - Various Studies 13
 
Epistemic Uncertainty: Magnitude NUREG-1150 Uncertainties - Another View
(Figure: probability density functions of CDF (/ry) on a logarithmic axis, 1.E-11 to 1.E-03: Surry Internal, Surry EQ, Surry Fire, Surry Total Estimated) 14
 
Epistemic Uncertainty: Magnitude NUREG-1150 Uncertainties - Another View
(Figure: the same probability density functions of CDF (/ry) on a linear axis, 0 to 1.E-04: Surry Internal, Surry EQ, Surry Fire, Surry Total Estimated) 15
 
Epistemic Uncertainty: Magnitude PRA Uncertainties - Indian Point (1980s) 16
 
Epistemic Uncertainty: Characterization Knowledge Check
* Why do the NUREG-1150 curves get progressively taller with smaller values of CDF?
* What does this tell you about the Indian Point graph?
17
 
Epistemic Uncertainty: Types Types of Epistemic Uncertainty
* Completeness
  - Recognized (known unknowns)
  - Unrecognized (unknown unknowns)
* Model
  - Competing models
  - Difference between predictions and reality
* Parameter
  - Conditioned on model
  - Estimated (quantified) based on available evidence 18
 
Epistemic Uncertainty: Types Parameter Uncertainty
* Typically characterized using parametric distributions (e.g., beta, gamma, lognormal)
* Parameters estimated using Bayes Theorem (see Lecture 5-1)
  - Statistical evidence (operational experience, test results);
engineering judgment in processing
  - Large amounts of data => subjective probabilities converge with evidence
  - Small amounts of data => large uncertainties
* Note: pooling data increases confidence in estimates for population, but not necessarily for NPP studied.
19
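As a sketch of how the amount of evidence drives the size of the uncertainty, consider a conjugate Beta update for a demand failure probability, using the demand data quoted on the next slide. The Jeffreys Beta(0.5, 0.5) prior assumed here is one common choice, not necessarily the one underlying the cited estimates:

```python
def beta_posterior_stats(k, N, a0=0.5, b0=0.5):
    """Beta(a0, b0) prior updated with k failures in N demands gives a
    Beta(a0 + k, b0 + N - k) posterior; return its mean and coefficient
    of variation (relative standard deviation)."""
    a, b = a0 + k, b0 + N - k
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var ** 0.5 / mean

# Small amounts of data => large uncertainties:
ev_mean, ev_cov = beta_posterior_stats(3, 713)        # explosive valves
# Large amounts of data => estimate converges with the evidence:
edg_mean, edg_cov = beta_posterior_stats(214, 75452)  # EDGs

print(f"explosive valves: mean {ev_mean:.1e}, rel. std. dev. {ev_cov:.0%}")
print(f"EDGs:             mean {edg_mean:.1e}, rel. std. dev. {edg_cov:.0%}")
```

With three failures the relative standard deviation exceeds 50%; with a few hundred failures it falls below 10%, regardless of the prior's details.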
 
Epistemic Uncertainty: Magnitude Generic Demand Failure Probabilities
* 2015 Industry-wide estimates from: https://nrcoe.inl.gov/resultsdb/AvgPerf/
* Explosive valves: 3 failures in 713 trials
* SBO EDGs: 12 failures in 419 trials
* EDGs: 214 failures in 75,452 trials
(Figures: probability density functions of demand failure probability for Explosive Valves, SBO EDG, and EDG, on linear (0 to 0.06) and logarithmic (1.00E-04 to 1.00E-01) axes) 20
 
Epistemic Uncertainty: Magnitude Generic Runtime Failure Rates
* 2015 Industry-wide estimates from: https://nrcoe.inl.gov/resultsdb/AvgPerf/
* Service Water Pumps: 2 failures in 16,292,670 hours
* Normally Running Pumps: 225 failures in 59,582,350 hours
* Standby Pumps (1st hour operation): 48 failures in 437,647 hours
(Figure: normalized probability density functions of failure rate (/hr), 1.00E-09 to 1.00E-03: Service Water, Normally Running, Standby)
* Note: point values won't alert user to potential irregularities 21
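The rate analogue of the demand-failure case is a conjugate Gamma update (again with an assumed Jeffreys shape of 0.5), using the runtime data above. Even two failures in sixteen million hours pin down the posterior mean, but not the relative spread:

```python
def gamma_posterior_stats(k, T, a0=0.5):
    """Gamma prior with shape a0 (Jeffreys, assumed here) updated with k
    failures in T hours gives a Gamma(a0 + k, T) posterior for the rate
    (/hr); return its mean and coefficient of variation."""
    a = a0 + k
    return a / T, 1 / a ** 0.5   # Gamma CoV = 1/sqrt(shape)

sw_mean, sw_cov = gamma_posterior_stats(2, 16_292_670)      # service water
run_mean, run_cov = gamma_posterior_stats(225, 59_582_350)  # normally running
sby_mean, sby_cov = gamma_posterior_stats(48, 437_647)      # standby, 1st hr

print(f"service water:    {sw_mean:.1e}/hr (rel. std. dev. {sw_cov:.0%})")
print(f"normally running: {run_mean:.1e}/hr (rel. std. dev. {run_cov:.0%})")
print(f"standby:          {sby_mean:.1e}/hr (rel. std. dev. {sby_cov:.0%})")
```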
 
Epistemic Uncertainty: Types Model Uncertainty
* Some arbitrariness in distinction from parameter uncertainty
* Currently only requirements for characterization (not quantification)
* Typically addressed through consensus models, sensitivity studies
* If appropriately defined, uncertainties can be quantified and brought into the PRA
  - Focus on observables, difference between prediction and reality (vs. what is best or correct model)
  - Limited applications in practice 22
 
Epistemic Uncertainty: Types Thought Exercise Under what conditions could a flawed consensus model be good enough for RIDM?
23
 
Epistemic Uncertainty: Types Example Quantification of Model Uncertainty
* Estimating uncertainty factor in Deterministic Reference Model (DRM) estimates based on comparisons with experiments
* DRM for cable fire propagation velocity
* Uncertainty factor ε (a random variable)
  - ε05 = 3.8, ε50 = 14.8, ε95 = 56.9, E[ε] = 18.6
* Note: bias in results indicates need for better DRM
(Figure: probability density function of the model uncertainty factor ε, 0 to 100)
N. Siu, D. Karydas, and J. Temple, "Bayesian Assessment of Modeling Uncertainty: Application to Fire Risk Assessment," in Analysis and Management of Uncertainty: Theory and Application, B.M. Ayyub, M.M. Gupta, and L.N. Kanal, eds., North-Holland, 1992, pp. 351-361.
24
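The reported 5th/95th percentiles (3.8 and 56.9) are nearly reproduced by a lognormal fit. The lognormal form and the DRM velocity below are assumptions for illustration, not the distribution actually derived in the cited paper:

```python
import math
import random

# Fit a lognormal to the reported 5th/95th percentiles of the model
# uncertainty factor; the lognormal form is assumed for illustration.
p05, p95 = 3.8, 56.9
mu = (math.log(p05) + math.log(p95)) / 2       # log of the median
sigma = (math.log(p95) - mu) / 1.645
print(f"implied median: {math.exp(mu):.1f}")   # close to the reported 14.8

# Applying the factor: actual velocity = factor * DRM prediction
random.seed(1)
v_drm = 1.0e-3   # hypothetical DRM propagation velocity (m/s)
v = sorted(random.lognormvariate(mu, sigma) * v_drm for _ in range(50_000))
print(f"90% band on actual velocity: [{v[2500]:.1e}, {v[47500]:.1e}] m/s")
```

A 90% band spanning more than an order of magnitude around the prediction is what "large model uncertainty" means in practice.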
 
Epistemic Uncertainty: Types Another Example Quantification of Model Uncertainty

Time (s)   Experiment (K)   DRM (K)
180        400              450
360        465              510
720        530              560
840        550              565

Notes:
: 1) Bayesian methodology accounts for possibility of inhomogeneous data.
: 2) Very large uncertainty bands might be unrealistic.
E. Droguett and A. Mosleh, Bayesian methodology for model uncertainty using model performance data, Risk Analysis, 28, No. 5, 1457-1476, 2008.
25
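The table's prediction errors make the point about focusing on observables: the DRM consistently over-predicts the measured temperature, with the discrepancy shrinking over time. A quick check:

```python
# (time s, experiment K, DRM K) -- values from the table above
data = [(180, 400, 450), (360, 465, 510), (720, 530, 560), (840, 550, 565)]

errors = [drm - meas for _, meas, drm in data]   # DRM minus experiment
bias = sum(errors) / len(errors)
print("errors (K):", errors)    # [50, 45, 30, 15]: over-prediction shrinks
print(f"mean bias: {bias:+.1f} K")
```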
 
Epistemic Uncertainty: Types Model Uncertainty Quantification - Other Considerations
* Perspective matters
  - Model developer - ability of model to address phenomena
  - PRA user - include user effects
* Uncertainties in unmeasured parameters
* Sub-model limits of applicability
M.H. Salley and A. Lindeman, Verification and Validation of Selected Fire Models for Nuclear Power Plant Applications, NUREG-1824 Supplement 1/EPRI 3002002182, November 2016.
26
 
Epistemic Uncertainty: Types Completeness Uncertainty - Example Known Sources
* Intentional acts (sabotage/terrorism)
* Operator errors of commission
* Organizational influences
  - Operation outside approved conditions (the licensing basis envelope)
  - Safety culture
  - External influences during an accident
* Design errors
* Phenomenological complexities
  - Combinations of severe hazards
  - Newly recognized hazards 27
 
Epistemic Uncertainty: Overall Characterization Propagation of Uncertainties
* Eventual aim: state uncertainty in risk metric estimates
* Quantitative aspects determined by propagating uncertainty in contributing parameters through PRA model
  - π notation used to indicate epistemic distribution
28
 
Epistemic Uncertainty: Overall Characterization Propagation of Uncertainties - Methods
* Method of moments
* Sampling
  - Direct Monte Carlo
  - Latin Hypercube
* Advanced methods for computationally challenging situations
  - Importance sampling (line sampling, subset sampling, ...)
  - Surrogate models (response surface, Gaussian process, neural nets, ...)
29
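Direct Monte Carlo sampling, the first of the methods above, can be sketched on a toy model (hypothetical numbers): a core damage frequency formed as the product of an initiating event frequency and two train failure probabilities, each with a lognormal epistemic distribution:

```python
import math
import random

random.seed(0)

def sample_lognormal(median, error_factor):
    """One sample; error_factor = 95th percentile / median."""
    sigma = math.log(error_factor) / 1.645
    return random.lognormvariate(math.log(median), sigma)

N = 100_000
cdf = sorted(sample_lognormal(1e-2, 3)       # initiating event freq (/ry)
             * sample_lognormal(1e-2, 5)     # train A failure probability
             * sample_lognormal(5e-2, 5)     # train B failure probability
             for _ in range(N))

mean = sum(cdf) / N
print(f"median CDF:      {cdf[N // 2]:.1e}/ry")
print(f"mean CDF:        {mean:.1e}/ry")
print(f"95th percentile: {cdf[int(0.95 * N)]:.1e}/ry")
# The mean sits well above the median: the epistemic distribution of the
# product is strongly right-skewed.
```

The same sampling loop, with the product replaced by the full PRA logic model, is what "propagation" means in practice.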
 
Epistemic Uncertainty: Overall Characterization Sensitivity Analysis
* Typical approach for treating model uncertainties
* In PRA/RIDM context, need to ensure analysis provides useful information
  - Plausible variations
  - Specific insights supporting decision problem (e.g., whether variation could lead to different decision)
* Importance measures, routinely available from current PRA software tools, provide results from one-at-a-time sensitivity calculations
* Advanced tools (e.g., global sensitivity analysis) available but not routinely used 30
 
Epistemic Uncertainty: Overall Characterization Importance Measures
* Mathematical measures depend on definition of importance
* Some notation: R = risk metric of interest, Pi = probability of event i
* Common measures:
Measure                         Definition               Alternate    Notes
Fussell-Vesely (F-V)            [R - R(Pi = 0)] / R      1 - 1/RRR    Same rankings as RRW and RRR
Birnbaum                        R(Pi = 1) - R(Pi = 0)    dR/dPi       Nearly same rankings as RAW and RIR
Risk Achievement Worth (RAW)    R(Pi = 1) - R                         Nearly same rankings as Birnbaum
Risk Increase Ratio (RIR)       R(Pi = 1) / R                         Nearly same rankings as Birnbaum
Risk Reduction Worth (RRW)      R - R(Pi = 0)                         Same rankings as F-V
Risk Reduction Ratio (RRR)      R / R(Pi = 0)                         Same rankings as F-V
31
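A sketch of the measures for a toy risk model with two minimal cut sets, R = P1·P2 + P3 (hypothetical basic-event probabilities), using one common convention (interval forms for RAW/RRW, ratio forms for RIR/RRR):

```python
# Importance measures for a toy model with two minimal cut sets:
# R = P1*P2 + P3 (hypothetical basic-event probabilities).
P = {1: 1e-2, 2: 1e-1, 3: 1e-4}

def risk(p):
    return p[1] * p[2] + p[3]

def set_to(p, i, v):
    q = dict(p)
    q[i] = v
    return q

base = risk(P)
for i in P:
    r1 = risk(set_to(P, i, 1.0))   # event i certain to occur
    r0 = risk(set_to(P, i, 0.0))   # event i eliminated
    fv = (base - r0) / base        # Fussell-Vesely
    birnbaum = r1 - r0
    raw, rir = r1 - base, r1 / base   # achievement (interval) / increase ratio
    rrw, rrr = base - r0, base / r0   # reduction (interval) / reduction ratio
    print(f"event {i}: F-V={fv:.3f} B={birnbaum:.2e} "
          f"RIR={rir:.1f} RRR={rrr:.2f}")
```

Events 1 and 2 share one cut set, so they carry the same F-V; the rare event 3, sitting alone in its cut set, has a small F-V but the largest Birnbaum and RIR.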
 
Epistemic Uncertainty: Overall Characterization Once uncertainties are characterized, then what?
* See Lecture 8-1 32}}

Revision as of 07:54, 20 October 2019

Lecture 3-2 Uncertainties 2019-01-17
ML19011A427
Person / Time
Issue date: 01/16/2019
From:
Office of Nuclear Regulatory Research
To:
Nathan Siu 415-0744
Shared Package
ML19011A416 List:
References
Download: ML19011A427 (32)


Text

Uncertainty and Uncertainties Lecture 3-2 1

Overview Key Topics

  • Types of uncertainty (as treated in NPP PRA):

aleatory and epistemic

  • Epistemic uncertainty

- Characterization (subjective probability)

- Magnitude for typical parameters

- Types: completeness, model, parameter

- Characterization for overall system (propagation) 2

Overview Resources

  • G. Apostolakis, Probability and risk assessment: the subjectivistic viewpoint and some suggestions, Nuclear Safety, 9, 305-315, 1978.
  • G. Apostolakis, The concept of probability in safety assessments of technological systems, Science, 250, 1359-1364, 1990.

Nuclear Regulatory Commission, September 2016.

  • U.S. Nuclear Regulatory Commission, Guidance on the Treatment of Uncertainties Associated with PRAs in Risk-Informed Decision Making, NUREG-1855, Revision 1, March 2013.
  • M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, National Academy of Sciences Proceedings (NASP), 111, No. 20, 7176-7184, May 20, 2014.

3

Overview Other References

  • A. Mosleh, et al., Model Uncertainty: Its Characterization and Quantification, Center for Reliability Engineering, University of Maryland, College Park, MD, 1995. (Also available as NUREG/CP-0138)
  • N. Siu, D. Karydas, and J. Temple, "Bayesian Assessment of Modeling Uncertainty: Application to Fire Risk Assessment," in Analysis and Management of Uncertainty: Theory and Application, B.M. Ayyub, M.M.

Gupta, and L.N. Kanal, eds., North-Holland, 1992, pp. 351-361.

  • E. Droguett and A. Mosleh, Bayesian methodology for model uncertainty using model performance data, Risk Analysis, 28, No. 5, 1457-1476, 2008.

4

Overview Other References (cont.)

  • W.E. Vesely, et al., Measures of Risk Importance and Their Applications, NUREG/CR-3385, July 1983.
  • Siu, N. and D.L. Kelly, On the Use of Importance Measures for Prioritizing Systems, Structures, and Components, Proceedings 5th International Topical Meeting on Nuclear Thermal Hydraulics, Operations, and Safety (NUTHOS-5), Beijing, China, April 14-18, 1997, pp. L.4-1 through L.4-6.
  • American Nuclear Society and the Institute of Electrical and Electronics Engineers, PRA Procedures Guide, NUREG/CR-2300, January 1983.
  • E. Zio and N. Pedroni, How to effectively compute the reliability of a thermal-hydraulic passive system, Nuclear Engineering and Design, 241, 310-327, 2011.

5

Motivation The Big Picture (Level 1)

=

=

= 2 + 4 +

=

  • CDF is a measure of uncertainty
  • Random variables:

- number of core damage events

- time to a core damage event How well do we know CDF?

6

Types of Uncertainty Types of Uncertainty Addressed by NPP PRA/RIDM

  • Aleatory (random, stochastic)

- Irreducible

- Examples:

  • How long it takes for operators to initiate a fire response procedure

- Modeling addressed in Lecture 3-1

  • Epistemic (state-of-knowledge)

- Reducible

- Examples:

  • True value of SRV failure rate
  • Best model for operator response 7

Indicators of Uncertainty

  • Percentiles
= = =

- Bounds (e.g., 5th, 95th)

  • Moments

- General

- Mean

- Variance 2

8

Epistemic Uncertainty: Characterization Epistemic Uncertainty: The Probability of Frequency Model Subjective Probability

  • Probability is an internal l could easily be above or below 4.6E-5 (the median) measure of degree of belief in a proposition
  • To be useful, need:

50%

- Consistent scale for comparing judgments

- Convergence with classical (relative frequency)

Very confident l is below probabilities given enough 7.8E-5 (the 95th percentile) data

  • Coherent analyst:

95% probabilities conform to the laws of probability 9

Epistemic Uncertainty: Characterization Qualitative to Quantitative: Examples M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, National Academy of Sciences Proceedings (NASP), 111, No. 20, 7176-7184, May 20, 2014.

NUREG/CR-6771 10

Types of Uncertainty Aleatory vs. Epistemic - Relationship

  • Subjective view: all uncertainties are epistemic. Aleatory model

- brings in analysts understanding of underlying processes

- supports quantification

- supports communication regarding model parameters

  • Coin toss example

- Uncertain proposition X: observation of n heads on next m tosses (trials)

- P{XlC,H} can include consideration of conditions: fairness of coin, tossing, measurement; whether outcomes are truly binary

- If conditions are ideal, epistemic uncertainty (for a coherent analyst) is given by aleatory (binomial) distribution:

= lf = f 1f

- If there is uncertainty in the conditions, as reflected by a probability distribution for f, the total probability is the expectation of the aleatory result:

1

= lC, H = f 1f F f f 0

epistemic distribution for f 11

Types of Uncertainty Aleatory vs. Epistemic Uncertainty -

Commentary and Observations

  • Distinction depends on frame of reference (model of the world).
  • Example: Weld crack propagation depends on local material conditions (e.g., Cu content, flaw geometry).

- Aleatory model: conditions have a statistical distribution.

- Epistemic model: local conditions are knowable (in principle).

- Appropriate model depends on question being asked (e.g., probability of conditions at a randomly sampled location vs. conditions at a specific location).

  • The practice of making the distinction is generally accepted and has proven useful in RIDM.
  • The NPP PRA community (particularly Level 1) uses the probability of frequency modeling convention (used in these lectures).
  • Terminology has been (and remains) a source of angst.
  • Some arguments in the broader risk community for alternate measures of epistemic uncertainty (e.g., possibility, degree of membership).

Epistemic Uncertainty: Magnitude PRA Uncertainties - Various Studies

Epistemic Uncertainty: Magnitude NUREG-1150 Uncertainties - Another View

[Figure: probability density functions of CDF (/ry) on a logarithmic scale (1E-11 to 1E-3) for Surry Internal, Surry EQ, Surry Fire, and Surry Total Estimated]

Epistemic Uncertainty: Magnitude NUREG-1150 Uncertainties - Another View

[Figure: probability density functions of CDF (/ry) on a linear scale (0 to 1E-4) for Surry Internal, Surry EQ, Surry Fire, and Surry Total Estimated]

Epistemic Uncertainty: Magnitude PRA Uncertainties - Indian Point (1980s)

Epistemic Uncertainty: Characterization Knowledge Check

  • Why do the NUREG-1150 curves get progressively taller with smaller values of CDF?
  • What does this tell you about the Indian Point graph?


Epistemic Uncertainty: Types Types of Epistemic Uncertainty

  • Completeness

- Recognized (known unknowns)

- Unrecognized (unknown unknowns)

  • Model

- Competing models

- Difference between predictions and reality

  • Parameter

- Conditioned on model

- Estimated (quantified) based on available evidence

Epistemic Uncertainty: Types Parameter Uncertainty

  • Typically characterized using parametric distributions (e.g., beta, gamma, lognormal)
  • Parameters estimated using Bayes' Theorem (see Lecture 5-1)

- Statistical evidence (operational experience, test results); engineering judgment in processing

- Large amounts of data => subjective probabilities converge with evidence

- Small amounts of data => large uncertainties

  • Note: pooling data increases confidence in estimates for the population, but not necessarily for the NPP studied.

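The "converge with evidence" bullet can be sketched with the conjugate Beta-binomial pair often used for demand failure probabilities. The two priors here, Beta(0.5, 0.5) (Jeffreys) and Beta(5, 5), are illustrative assumptions, not values from the lecture.

```python
# Minimal sketch of conjugate Bayesian updating for a demand failure
# probability p: Beta prior + binomial evidence -> Beta posterior.
# Priors Beta(0.5, 0.5) and Beta(5, 5) are illustrative only.
def posterior_mean(a0, b0, failures, demands):
    """Mean of the Beta(a0 + failures, b0 + demands - failures) posterior."""
    return (a0 + failures) / (a0 + b0 + demands)

# Small amounts of data: the prior still matters, estimates differ.
small = [posterior_mean(a0, b0, 1, 10) for a0, b0 in [(0.5, 0.5), (5, 5)]]

# Large amounts of data: both priors converge to the observed frequency (0.1).
large = [posterior_mean(a0, b0, 1000, 10_000) for a0, b0 in [(0.5, 0.5), (5, 5)]]
```

With 1 failure in 10 demands the two priors give noticeably different posterior means; with 1000 in 10,000 both are essentially the observed frequency.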

Epistemic Uncertainty: Magnitude Generic Demand Failure Probabilities

  • Explosive valves: 3 failures in 713 trials
  • SBO EDGs: 12 failures in 419 trials
  • EDGs: 214 failures in 75,452 trials

[Figure: probability density functions of demand failure probability for explosive valves, SBO EDGs, and EDGs, shown on linear and logarithmic scales]
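The different widths of the plotted densities can be approximated from the three datasets on this slide. This sketch assumes a Jeffreys Beta(0.5, 0.5) prior, which is an assumption and not stated on the slide.

```python
# Posterior mean and relative spread for the slide's demand-failure
# datasets, assuming a Jeffreys Beta(0.5, 0.5) prior (illustrative).
from math import sqrt

def beta_summary(failures, demands, a0=0.5, b0=0.5):
    a, b = a0 + failures, b0 + demands - failures
    mean = a / (a + b)
    std = sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return mean, std

results = {}
for name, f, d in [("Explosive valves", 3, 713),
                   ("SBO EDGs", 12, 419),
                   ("EDGs", 214, 75_452)]:
    mean, std = beta_summary(f, d)
    results[name] = (mean, std / mean)   # (posterior mean, relative std)
```

The relative standard deviation shrinks as the evidence grows: the EDG estimate (214/75,452) is far tighter than the explosive-valve estimate (3/713), matching the plotted curves.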

Epistemic Uncertainty: Magnitude Generic Runtime Failure Rates

  • Service Water Pumps: 2 failures in 16,292,670 hours
  • Normally Running Pumps: 225 failures in 59,582,350 hours
  • Standby Pumps (1st hour of operation): 48 failures in 437,647 hours

[Figure: normalized probability density functions of failure rate (/hr) for service water, normally running, and standby pumps. Note: point values won't alert the user to potential irregularities.]

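The runtime data can be treated analogously to the demand data, here assuming a Poisson aleatory model with a Jeffreys prior so the rate posterior is Gamma(failures + 0.5, hours); the prior choice is an assumption for illustration, not from the lecture.

```python
# Posterior mean and std of runtime failure rates (/hr) for the slide's
# data, assuming a Poisson model with a Jeffreys prior (illustrative).
from math import sqrt

def rate_summary(failures, hours):
    """Rate ~ Gamma(failures + 0.5, hours): returns (mean, std) in /hr."""
    shape = failures + 0.5
    return shape / hours, sqrt(shape) / hours

service_water = rate_summary(2, 16_292_670)       # sparse data, wide spread
normally_running = rate_summary(225, 59_582_350)  # abundant data, narrow spread
standby = rate_summary(48, 437_647)
```

The service water estimate (2 failures) carries a much larger relative uncertainty than the normally running estimate (225 failures), which is what the normalized density plot shows.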

Epistemic Uncertainty: Types Model Uncertainty

  • Some arbitrariness in the distinction from parameter uncertainty
  • Current requirements address only characterization (not quantification)
  • Typically addressed through consensus models, sensitivity studies
  • If appropriately defined, uncertainties can be quantified and brought into the PRA

- Focus on observables, i.e., the difference between prediction and reality (vs. which model is "best" or "correct")

- Limited applications in practice

Epistemic Uncertainty: Types Thought Exercise

Under what conditions could a flawed consensus model be "good enough" for RIDM?


Epistemic Uncertainty: Types Example Quantification of Model Uncertainty

  • Estimating an uncertainty factor in Deterministic Reference Model (DRM) estimates based on comparisons with experiments
  • DRM for cable fire propagation velocity
  • Uncertainty factor γ (a random variable) relating actual behavior to the DRM prediction

Percentiles of γ: 5th = 3.8, 50th = 14.8, 95th = 56.9; E[γ] = 18.6

[Figure: probability density function of the model uncertainty factor γ, plotted from 0 to 100]

  • Note: bias in results indicates need for a better DRM

N. Siu, D. Karydas, and J. Temple, "Bayesian Assessment of Modeling Uncertainty: Application to Fire Risk Assessment," in Analysis and Management of Uncertainty: Theory and Application, B.M. Ayyub, M.M. Gupta, and L.N. Kanal, eds., North-Holland, 1992, pp. 351-361.
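As a consistency check on the quoted percentiles, a lognormal fitted to the 5th and 95th percentiles (3.8 and 56.9) roughly reproduces the quoted median of 14.8. The lognormal form is an assumption here; the resulting mean (~20.6) differs somewhat from the slide's E[γ] = 18.6, which presumably reflects the actual fitted distribution rather than this two-percentile fit.

```python
# Fit a lognormal to the slide's 5th/95th percentiles of gamma and
# compare the implied median and mean to the quoted values.
# Assumption: gamma is lognormal; z_0.95 ~ 1.645.
from math import log, exp

p05, p95 = 3.8, 56.9
z95 = 1.6449
sigma = log(p95 / p05) / (2 * z95)   # lognormal shape parameter
mu = (log(p95) + log(p05)) / 2       # lognormal scale parameter (log of median)

median = exp(mu)                # ~14.7, close to the slide's 14.8
mean = exp(mu + sigma**2 / 2)   # ~20.6, in the range of the slide's 18.6
```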

Epistemic Uncertainty: Types Another Example Quantification of Model Uncertainty

Time (s) | Experiment (K) | DRM (K)
180      | 400            | 450
360      | 465            | 510
720      | 530            | 560
840      | 550            | 565

Notes:

1) Bayesian methodology accounts for possibility of inhomogeneous data.
2) Very large uncertainty bands might be unrealistic.

E. Droguett and A. Mosleh, "Bayesian methodology for model uncertainty using model performance data," Risk Analysis, 28, No. 5, 1457-1476, 2008.
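The comparison table invites a simple bias check of the kind the previous slide's note mentions; a sketch using the tabulated values:

```python
# Ratio of DRM prediction to experiment at each time point
# (values taken directly from the slide's table).
data = [(180, 400, 450), (360, 465, 510), (720, 530, 560), (840, 550, 565)]
ratios = [drm / measured for _, measured, drm in data]
mean_ratio = sum(ratios) / len(ratios)
# Every ratio exceeds 1: the DRM consistently over-predicts temperature,
# i.e., the model is biased (conservatively, in this case).
```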

Epistemic Uncertainty: Types Model Uncertainty Quantification - Other Considerations

  • Perspective matters

- Model developer - ability of model to address phenomena

- PRA user - include user effects

  • Uncertainties in unmeasured parameters
  • Sub-model limits of applicability

M.H. Salley and A. Lindeman, Verification and Validation of Selected Fire Models for Nuclear Power Plant Applications, NUREG-1824 Supplement 1/EPRI 3002002182, November 2016.


Epistemic Uncertainty: Types Completeness Uncertainty - Example Known Sources

  • Intentional acts (sabotage/terrorism)
  • Operator errors of commission
  • Organizational influences

- Operation outside approved conditions (the licensing basis envelope)

- Safety culture

- External influences during an accident

  • Design errors
  • Phenomenological complexities

- Combinations of severe hazards

- Newly recognized hazards

Epistemic Uncertainty: Overall Characterization Propagation of Uncertainties

  • Eventual aim: state uncertainty in risk metric estimates
  • Quantitative aspects determined by propagating uncertainty in contributing parameters through the PRA model (π notation used to indicate epistemic distributions)

π(R) is obtained from the epistemic distributions π(x₁), π(x₂), … of the contributing parameters through the model R = f(x₁, x₂, …)
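Direct Monte Carlo propagation can be sketched for a toy model. The model CDF = λ_IE · P_fail and its lognormal parameter distributions are illustrative assumptions, not values from the lecture.

```python
# Monte Carlo propagation sketch: sample the epistemic distributions of
# two parameters and push each sample through a toy risk model.
# All parameter values below are illustrative assumptions.
import math
import random

random.seed(0)

def sample_lognormal(median, error_factor):
    """One lognormal sample; error_factor = 95th percentile / median."""
    sigma = math.log(error_factor) / 1.6449
    return median * math.exp(random.gauss(0.0, sigma))

samples = sorted(
    sample_lognormal(1e-2, 3.0) * sample_lognormal(1e-3, 10.0)  # toy CDF model
    for _ in range(10_000)
)
mean = sum(samples) / len(samples)
median = samples[len(samples) // 2]
p95 = samples[int(0.95 * len(samples))]
# mean > median: the propagated distribution is right-skewed, as in the
# NUREG-1150 curves shown earlier in this lecture.
```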

Epistemic Uncertainty: Overall Characterization Propagation of Uncertainties - Methods

  • Method of moments
  • Sampling

- Direct Monte Carlo

- Latin Hypercube

  • Advanced methods for computationally challenging situations

- Importance sampling (line sampling, subset sampling, ...)

- Surrogate models (response surface, Gaussian process, neural nets, ...)
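The Latin hypercube idea in the list above can be sketched in a few lines. This assumes uniform marginals on [0, 1]; real PRA parameters would be mapped through the inverse CDFs of their epistemic distributions.

```python
# Latin hypercube sampling sketch: each dimension is split into n
# equal-probability strata and exactly one sample is drawn per stratum,
# then the strata are shuffled independently across dimensions.
import random

random.seed(0)

def latin_hypercube(n, dims):
    """Return n points in [0, 1]^dims with one point per stratum per dim."""
    columns = []
    for _ in range(dims):
        col = [(i + random.random()) / n for i in range(n)]  # one per stratum
        random.shuffle(col)                                  # decorrelate dims
        columns.append(col)
    return list(zip(*columns))

pts = latin_hypercube(10, 2)
# Each dimension of pts has exactly one value in each of the 10 strata,
# giving better coverage than 10 direct Monte Carlo samples.
```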

Epistemic Uncertainty: Overall Characterization Sensitivity Analysis

  • Typical approach for treating model uncertainties
  • In PRA/RIDM context, need to ensure analysis provides useful information

- Plausible variations

- Specific insights supporting decision problem (e.g., whether variation could lead to different decision)

  • Importance measures, routinely available from current PRA software tools, provide results from one-at-a-time sensitivity calculations
  • Advanced tools (e.g., global sensitivity analysis) available but not routinely used

Epistemic Uncertainty: Overall Characterization Importance Measures

  • Mathematical measures depend on the definition of importance
  • Some notation: R = risk metric of interest; Pi = probability of event i; R(Pi = 1) and R(Pi = 0) = R evaluated with event i certain or impossible
  • Common measures:

Measure                      | Definition             | Notes
Fussell-Vesely (F-V)         | [R - R(Pi = 0)] / R    | Same rankings as RRW and RRR
Birnbaum                     | R(Pi = 1) - R(Pi = 0)  | Nearly same rankings as RAW and RIR; equals dR/dPi for R linear in Pi
Risk Achievement Worth (RAW) | R(Pi = 1) / R          | Nearly same rankings as Birnbaum
Risk Increase Ratio (RIR)    | [R(Pi = 1) - R] / R    | Nearly same rankings as Birnbaum
Risk Reduction Worth (RRW)   | R / R(Pi = 0)          | Same rankings as F-V
Risk Reduction Ratio (RRR)   | [R - R(Pi = 0)] / R    | Same rankings as F-V
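These measures can be computed directly for a toy risk model; the model R = P1 + P2·P3 and its event probabilities are illustrative stand-ins for a PRA top-event expression, not a model from the lecture.

```python
# Toy importance-measure calculation for R = P1 + P2*P3 (illustrative).
def R(p):
    """Toy risk metric: event 1 alone, or events 2 and 3 together."""
    return p[0] + p[1] * p[2]

base = [1e-3, 1e-2, 1e-1]   # illustrative event probabilities

def measures(i):
    """Fussell-Vesely, Birnbaum, RAW, and RRW for event i."""
    hi, lo = base.copy(), base.copy()
    hi[i], lo[i] = 1.0, 0.0
    r, r_hi, r_lo = R(base), R(hi), R(lo)
    fv = (r - r_lo) / r                              # Fussell-Vesely
    birnbaum = r_hi - r_lo                           # Birnbaum
    raw = r_hi / r                                   # Risk Achievement Worth
    rrw = r / r_lo if r_lo > 0 else float("inf")     # Risk Reduction Worth
    return fv, birnbaum, raw, rrw

fv1, b1, raw1, rrw1 = measures(0)   # importance of event 1
```

For event 1 the base risk is 2e-3 and removing the event halves it, so F-V = 0.5 and RRW = 2, while setting P1 = 1 drives the risk to ~1, giving a very large RAW.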

Epistemic Uncertainty: Overall Characterization Once uncertainties are characterized, then what?

  • See Lecture 8-1