ML19011A430
=Text=
Evidence and Estimation Lecture 5-1 1
 
Course Overview Schedule

{| class="wikitable"
|-
! Time !! Wednesday 1/16 !! Thursday 1/17 !! Friday 1/18 !! Tuesday 1/22 !! Wednesday 1/23
|-
| Module || 1: Introduction || 3: Characterizing Uncertainty || 5: Basic Events || 7: Learning from Operational Events || 9: The PRA Frontier
|-
| 9:00-9:45 || L1-1: What is RIDM? || L3-1: Probabilistic modeling for NPP PRA || L5-1: Evidence and estimation || L7-1: Retrospective PRA || L9-1: Challenges for NPP PRA
|-
| 9:45-10:00 || Break || Break || Break || Break || Break
|-
| 10:00-11:00 || L1-2: RIDM in the nuclear industry || L3-2: Uncertainty and uncertainties || L5-2: Human Reliability Analysis (HRA) || L7-2: Notable events and lessons for PRA || L9-2: Improved PRA using existing technology
|-
| 11:00-12:00 || W1: Risk-informed thinking || W2: Characterizing uncertainties || W4: Bayesian estimation || W6: Retrospective Analysis || L9-3: The frontier: grand challenges and advanced methods
|-
| 12:00-1:30 || Lunch || Lunch || Lunch || Lunch || Lunch
|-
| Module || 2: PRA Overview || 4: Accident Sequence Modeling || 6: Special Technical Topics || 8: Applications and Challenges || 10: Recap
|-
| 1:30-2:15 || L2-1: NPP PRA and RIDM: early history || L4-1: Initiating events || L6-1: Dependent failures || L8-1: Risk-informed regulatory applications<br>L8-2: PRA and RIDM infrastructure || L10-1: Summary and closing remarks
|-
| 2:15-2:30 || Break || Break || Break || Break ||
|-
| 2:30-3:30 || L2-2: NPP PRA models and results || L4-2: Modeling plant and system response || L6-2: Spatial hazards and dependencies || L8-3: Risk-informed fire protection || Discussion: course feedback
|-
| 3:30-4:30 || L2-3: PRA and RIDM: point-counterpoint || W3: Plant systems modeling || L6-3: Other operational modes<br>L6-4: Level 2/3 PRA: beyond core damage || L8-4: Risk communication || Open Discussion
|-
| 4:30-4:45 || Break || Break || Break || Break ||
|-
| 4:45-5:30 || Open Discussion || W3: Plant systems modeling (cont.) || W5: External Hazards modeling || Open Discussion ||
|-
| 5:30-6:00 || || Open Discussion || Open Discussion || ||
|}
2
 
Overview Learning Objectives
* Range of evidence used in NPP PRA
* Sources of operational data
* Bayesian estimation
* Treatment of model predictions and expert judgment 3
 
Overview Resources
* J. Lane, U.S. NRC Operational Experience Data Collection Program, NEA Workshop on the Use of Operational Experience in PSA, Boulogne-Billancourt, France, April 26-27, 2018. (ADAMS ML18123A479)
* U.S. Nuclear Regulatory Commission, Reliability and Availability Data System (RADS) https://nrcoe.inl.gov/resultsdb/RADS/
* U.S. Nuclear Regulatory Commission, Industry Average Parameter Estimates, https://nrcoe.inl.gov/resultsdb/AvgPerf/
* N. Siu and D.L. Kelly, "Bayesian parameter estimation in probabilistic risk assessment," Reliability Engineering and System Safety, 62, 89-116, 1998.
* C.L. Atwood, et al., Handbook of Parameter Estimation for Probabilistic Risk Assessment, NUREG/CR-6823, September 2003.
* R. J. Budnitz, et al., Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, NUREG/CR-6372, 1997.
4
 
Overview Other References
* IEEE Guide to the Collection and Presentation of Electrical, Electronic, Sensing Component, and Mechanical Equipment Reliability for Nuclear-Power Generating Stations, IEEE Std 500-1984, Institute of Electrical and Electronics Engineers, New York, 1983.
* Center for Chemical Process Safety, Guidelines for Process Equipment Reliability Data with Data Tables, American Institute of Chemical Engineers, New York, 1989.
* G.E.P. Box and G.C. Tiao, Bayesian Inference in Statistical Analysis, Addison-Wesley, Reading, MA, 1973.
* D. Kahneman, P. Slovic, and A. Tversky (eds.), Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, Cambridge, MA, 1982.
* M. Granger Morgan, Use (and abuse) of expert elicitation in support of decision making for public policy, National Academy of Sciences Proceedings (NASP), 111, No. 20, 7176-7184, May 20, 2014.
* J. Xing and S. Morrow, White Paper: Practical Insights and Lessons Learned on Implementing Expert Elicitation, U.S. Nuclear Regulatory Commission, October 13, 2016. (ADAMS ML16287A734) 5
 
Introduction Basic event probabilities reflect state of knowledge P{X|C,H}
* P = Probability
* X = Proposition of concern (e.g., SI pump failure rate < 10^-3 per demand)
* C = Conditions of assessment (e.g., key assumptions)
* H = State of knowledge (dependent on assessor) 6
 
Introduction State of Knowledge (About X)
* Affected by evidence; common forms:
  -  Data (operational, tests, simulator exercises, experiments)
  -  Generic estimates
  -  Model predictions
  -  Expert judgment
* Changes in H lead to changes in the probability distributions for model parameters: π₀(θ) → π₁(θ)
* Bayes Theorem is the formal tool for updating 7
 
Data On Data
* Data (plural of datum)
  - Facts, information, statistics, or the like, either historical or derived by calculation or experimentation
  - Any facts assumed to be a matter of direct observation
* PRA community uses both views:
  - System analyst: input parameters for PRA model
* NPP-specific
* Generic (e.g., IEEE Std-500, CCPS)
  - Data analyst: empirical observations used to estimate input parameters 8
 
Data Operational Data
* See Lane (2018): history of NRC operational experience data collection and current programs.
* Key sources
  - Licensee Event Reports
* See https://lersearch.inl.gov
      * >54,000 records (1980-)
* Rich source, but level of detail and scope can vary
  - INPO Consolidated Events Database (ICES) - proprietary
* Required (MSPI program) + voluntary reporting
* Includes some demand and runtime data 9
 
Data NRC Data Summaries
* Multiple links at http://nrcoe.inl.gov/resultsdb/
* Industry average estimates, trends, and summary data for PRA model parameters: https://nrcoe.inl.gov/resultsdb/AvgPerf/
  - Initiating events
  - Component reliabilities
* Common cause failures: https://nrcoe.inl.gov/resultsdb/ParamEstSpar/
* Event reports often require interpretation
  - Plant-specific terminology
  - Severity - truly a failure?
10
 
Data Operating Experience Data J. Lane, U.S. NRC Operational Experience Data Collection Program, NEA Workshop on the Use of Operational Experience in PSA, Boulogne-Billancourt, France, April 26-27, 2018. (ADAMS ML18123A479) 11
 
Bayesian Estimation Bayesian Estimation - Principles

      π₁(θ|E) = (1/k) × L(E|θ) × π₀(θ)
      posterior distribution = (1/k) × likelihood function × prior distribution

* Bayes Theorem: an expression of conditional probability
* Prior distribution π₀(θ) quantifies belief before new evidence E
* Likelihood function L(E|θ) quantifies probability of seeing E, given θ
* Posterior distribution π₁(θ|E) quantifies belief given E
* k = normalization constant = ∫ L(E|θ) π₀(θ) dθ 12
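
As a numerical illustration of the update π₁(θ|E) = L(E|θ) π₀(θ) / k, the following minimal Python sketch applies Bayes Theorem to a Poisson event frequency by grid integration. It is not part of the lecture; the grid, the flat prior, and the data values are illustrative assumptions only.
<syntaxhighlight lang="python">
import numpy as np

# Illustrative grid over candidate frequencies (per year)
lam = np.linspace(1e-4, 1.0, 2000)
dlam = lam[1] - lam[0]

# Assumed flat prior pi_0(lambda) over the grid, normalized to integrate to 1
prior = np.ones_like(lam)
prior /= np.sum(prior) * dlam

# Evidence E: n events in time t (illustrative values)
n, t = 1, 10.0
likelihood = (lam * t) ** n * np.exp(-lam * t)   # Poisson likelihood; constant n! omitted (cancels in k)

# Bayes Theorem: posterior = likelihood * prior / k
k = np.sum(likelihood * prior) * dlam            # normalization constant
posterior = likelihood * prior / k

print("posterior mean frequency (/yr):", np.sum(lam * posterior) * dlam)
</syntaxhighlight>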
 
Bayesian Estimation Likelihood Functions - Examples
* General: L(E|θ) = probability of observing the evidence E, given parameter value θ
* Poisson process (frequency = λ)
  - Evidence: n events in time t
  - Likelihood function: L(n, t | λ) = (λt)^n e^(-λt) / n!
  - Evidence: occurrence times {t1, ..., tn}
  - Likelihood function: L(t1, ..., tn | λ) = ∏ (i = 1..n) λ e^(-λ ti)
* Bernoulli process (probability = φ)
  - Evidence: n events in m trials
  - Likelihood function: L(n, m | φ) = C(m,n) φ^n (1-φ)^(m-n)
13
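
For concreteness, the likelihood forms above can be written as small functions; this is only a sketch, and the evidence values passed in at the bottom are illustrative, not taken from the slides.
<syntaxhighlight lang="python">
from math import comb, exp, factorial

def poisson_count_likelihood(n, t, lam):
    """L(n events in time t | lambda) for a Poisson process."""
    return (lam * t) ** n * exp(-lam * t) / factorial(n)

def poisson_times_likelihood(times, lam):
    """L(t1,...,tn | lambda), treating the ti as successive inter-event times."""
    p = 1.0
    for ti in times:
        p *= lam * exp(-lam * ti)
    return p

def bernoulli_likelihood(n, m, phi):
    """L(n events in m trials | phi): binomial form."""
    return comb(m, n) * phi ** n * (1.0 - phi) ** (m - n)

print(poisson_count_likelihood(n=2, t=20.0, lam=0.1))       # ~0.27
print(poisson_times_likelihood([3.0, 7.0, 10.0], lam=0.1))  # joint density of three inter-event times
print(bernoulli_likelihood(n=1, m=50, phi=0.02))            # ~0.37
</syntaxhighlight>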
 
Bayesian Estimation Likelihood Functions - Another Example
* Expert judgment
    - Evidence = estimate λ* for failure frequency λ
    - A possible likelihood function: lognormal
          L(λ* | λ) = 1/(√(2π) σ λ*) × exp{ -[ln(λ*) - (ln(λ) + B)]^2 / (2σ^2) }
    - B = bias
    - σ = measure of confidence in expert
          σ = 0, B = 0 => L(λ*|λ) is a delta function about λ => perfect expert
          σ = ∞ => L(λ*|λ) is flat => completely non-expert
* Likelihood Function = model of the evidence-generating process 14
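
A minimal sketch of this lognormal likelihood, assuming the standard form in which ln(λ*) is normal with mean ln(λ) + B and standard deviation σ; the numerical values used below are illustrative.
<syntaxhighlight lang="python">
from math import exp, log, pi, sqrt

def expert_likelihood(lam_star, lam, B=0.0, sigma=1.0):
    """Lognormal L(lambda* | lambda) with bias B and confidence measure sigma."""
    z = (log(lam_star) - (log(lam) + B)) / sigma
    return exp(-0.5 * z * z) / (sqrt(2.0 * pi) * sigma * lam_star)

# Small sigma: likelihood is sharply peaked (expert taken nearly at face value);
# large sigma: likelihood is nearly flat in lambda (expert carries little weight).
for sigma in (0.1, 1.0, 5.0):
    print(sigma, expert_likelihood(lam_star=1e-3, lam=2e-3, sigma=sigma))
</syntaxhighlight>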
 
Bayesian Estimation Prior Distributions
* General: characterizes state of knowledge regarding uncertain parameter(s)
* Informative
  - Preferred in principle
  - Takes effort to develop
* Non-informative
  - Objective
  - Low effort
  - Conservative but can be good enough 15
 
Bayesian Estimation Informative Prior Distribution Construction Methods
* Direct quantification
  - Percentiles
  - Parametric form + select characteristics (e.g., moments, percentiles)
* Hierarchical Bayes (notably two-stage Bayes)
  - Model plant-to-plant (population) variability
  - Use population variability result as prior for plant of interest
* Reverse engineering: make judgments about generated samples (vs. model parameters)
* Be cognizant of biases from common heuristics (Lecture 2-3)
  - Representativeness
  - Availability
  - Anchoring and adjustment 16
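
One way to carry out the "parametric form + select characteristics" construction listed above is to fit a lognormal prior to judgmental 5th and 95th percentiles. The sketch below is illustrative only; the percentile values are assumptions, not numbers from the lecture.
<syntaxhighlight lang="python">
from math import exp, log

Z95 = 1.645                      # standard normal 95th percentile
lam05, lam95 = 1e-4, 1e-2        # assumed judgmental 5th/95th percentiles (per demand)

mu = 0.5 * (log(lam05) + log(lam95))             # mean of ln(lambda)
sigma = (log(lam95) - log(lam05)) / (2.0 * Z95)  # logarithmic standard deviation

print("median:", exp(mu))                        # 1.0e-3
print("mean  :", exp(mu + 0.5 * sigma ** 2))     # ~2.7e-3
</syntaxhighlight>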
 
Bayesian Estimation Non-Informative Prior Distributions
* Based upon mathematical definitions of relative ignorance (relative to data) - not generally flat/uniform
* Examples (Jeffreys Rule priors)
  - Bernoulli process: π₀(φ) ∝ φ^(-1/2) (1-φ)^(-1/2)
  - Poisson process: π₀(λ) ∝ λ^(-1/2)
* Other non-informative distributions: maximum entropy, constrained non-informative
* See Siu and Kelly (1998) and Atwood et al. (NUREG/CR-6823) for further discussion of forms used in NPP PRAs
* Computational caution: non-informative prior distributions are often unbounded at one or both extremes; need to be careful if using numerical integration 17
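
The computational caution can be checked directly: the Poisson Jeffreys prior is unbounded as λ → 0, yet the posterior it produces is a proper gamma distribution. The sketch below (illustrative data, not from the lecture) compares a numerically integrated posterior mean against the analytical gamma result (n + 1/2)/t.
<syntaxhighlight lang="python">
import numpy as np

n, t = 2, 50.0                                    # illustrative evidence: n events in time t
lam = np.linspace(1e-6, 0.5, 50000)
dlam = lam[1] - lam[0]

prior = lam ** -0.5                               # Jeffreys prior, unbounded near lambda = 0
post = prior * (lam * t) ** n * np.exp(-lam * t)  # proportional to the posterior
post /= np.sum(post) * dlam                       # normalize numerically

print("numerical posterior mean :", np.sum(lam * post) * dlam)
print("analytical gamma mean    :", (n + 0.5) / t)   # gamma(n + 1/2, t) mean = 0.05
</syntaxhighlight>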
 
Knowledge Check If λ is distributed according to a Jeffreys prior, what is the mean value for λ?
18
 
Bayesian Estimation Conjugate Likelihood-Prior Pairs
* Result in analytical solution of Bayes Theorem => often used for computational convenience
* Can be informative or non-informative
* Examples:*
    - Binomial likelihood and beta prior => beta posterior
* Assume n failures in m trials, and a beta prior with parameters a and b
* Posterior distribution is beta with parameters a' = a + n and b' = b + (m - n)
    - Poisson likelihood and gamma prior => gamma posterior
* Assume n failures in time t and a gamma prior with parameters α and β
* Posterior distribution is gamma with parameters α' = α + n and β' = β + t
  *See Probability Math background slides for more information on the beta and gamma distributions 19
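
A numerical illustration of the two conjugate updates above; the prior parameters and data are illustrative assumptions (the beta(0.5, 0.5) and improper gamma(0.5, 0) priors happen to be the Jeffreys forms).
<syntaxhighlight lang="python">
def beta_update(a, b, n_fail, m_trials):
    """Beta(a, b) prior + n failures in m trials -> Beta(a + n, b + m - n) posterior."""
    return a + n_fail, b + (m_trials - n_fail)

def gamma_update(alpha, beta, n_fail, t):
    """Gamma(alpha, beta) prior + n failures in time t -> Gamma(alpha + n, beta + t) posterior."""
    return alpha + n_fail, beta + t

a1, b1 = beta_update(a=0.5, b=0.5, n_fail=1, m_trials=100)         # Jeffreys beta prior
print("beta posterior mean (per demand):", a1 / (a1 + b1))          # ~1.5e-2

alpha1, beta1 = gamma_update(alpha=0.5, beta=0.0, n_fail=10, t=100.0)  # (improper) Jeffreys gamma prior
print("gamma posterior mean (per year) :", alpha1 / beta1)           # 0.105
</syntaxhighlight>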
 
Bayesian Estimation Sample Results
[Two plots of probability density function vs. frequency (/yr), each comparing a non-informative prior, an informed prior (λ05 = 0.05, λ95 = 0.5), and an overconfident prior (λ05 = 0.2, λ95 = 0.4), together with their posteriors and the maximum likelihood estimate:]
* Actual Data (10 events in 100 years)
* Weak Data (1 event in 10 years) 20
 
Bayesian Estimation Comments
* When data are plentiful (strong), posterior is relatively insensitive to reasonable priors
* When data are weak, prior is important => important to construct carefully
  - Overly broad (undervaluing state of knowledge) => overly conservative results
  - Overly narrow (too confident in state of knowledge) => insensitive to data when obtained
  - Need to be wary of biases from heuristics 21
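
The sensitivity observations above can be reproduced with a few conjugate updates. The gamma prior parameters below are assumed only to give one broad, one moderately informed, and one overconfident prior; they are not the distributions shown in the sample-results figure.
<syntaxhighlight lang="python">
# Gamma(shape a, rate b) priors; posterior mean after n events in t years is (a + n)/(b + t).
priors = {
    "broad":         (0.5, 2.0),     # diffuse, prior mean 0.25 /yr
    "informed":      (2.0, 10.0),    # moderate weight, prior mean 0.20 /yr
    "overconfident": (30.0, 100.0),  # very narrow, prior mean 0.30 /yr
}
data = {"strong (10 events / 100 yr)": (10, 100.0),
        "weak (1 event / 10 yr)":      (1, 10.0)}

for dname, (n, t) in data.items():
    for pname, (a, b) in priors.items():
        print(f"{dname:26s} {pname:14s} posterior mean = {(a + n) / (b + t):.3f} /yr")

# Strong data: broad and informed priors give similar posteriors (~0.10-0.11 /yr);
# weak data: they diverge; the overconfident prior barely moves toward the data in either case.
</syntaxhighlight>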
 
Evidence from Models Model Predictions
* Common aleatory model: stress vs. strength
      P{failure} = P{strength < stress | scenario}
  - Time-reliability
  - Fire-induced damage
  - Seismically-induced damage
* Uncertainties in parameters can be quantified and propagated through models; uncertainties in model outputs can also be quantified (Lecture 3-2) 22
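
A minimal Monte Carlo sketch of the stress-vs-strength form above; the choice of lognormal distributions, their parameters, and the sample size are assumptions made purely for illustration.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2019)
N = 1_000_000
stress = rng.lognormal(mean=np.log(100.0), sigma=0.30, size=N)     # imposed load / demand
strength = rng.lognormal(mean=np.log(200.0), sigma=0.25, size=N)   # component capacity

p_fail = np.mean(strength < stress)   # P{failure} = P{strength < stress}
print(f"estimated failure probability: {p_fail:.2e}")               # ~4e-2 for these inputs
</syntaxhighlight>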
 
Evidence from Models Mechanistic Modeling - Comments
* Results and even models used in NPP PRAs
  - Component behavior (e.g., reactor coolant pump seals)
  - Success criteria
  - Internal and external hazards
* Physics of Failure models proposed for various issues (e.g., aging, CCF)
* Appealing approach to better account for current, relevant state of knowledge (what we know)
* Need to account for parameter and model uncertainties
* Increases vulnerability to completeness uncertainty 23
 
Evidence from Models Fire Model Uncertainty Example 24
 
Evidence from Models Fire Model Uncertainty - Results 25
 
Evidence from Models Cautionary Examples
* Multi-unit trip due to loss of communication
* Capacitor failure from operation at below-design voltage (non-nuclear)
* Increased accident consequences of a stronger pressure vessel (non-nuclear)
* Reactivity accident from rocking 26
 
Evidence from Experts Expert Judgment
* Fundamental component of PRA
  - Modeling
  - Data selection and analysis
  - Direct elicitation (qualitative and quantitative)
* Justification: decision support
  - P{X|C,H}: X = proposition/event of concern; C = conditions of probability statement; H = what we know / what we believe
  - For RIDM, "we" = informed technical community (not just analyst/analysis team) 27
 
Evidence from Experts Direct Elicitation
* Aim
  - Take advantage of human ability to consider/integrate complex information
  - Engage a wide variety of expert viewpoints
* Key PRA applications
  - Problem formulation, planning of experiments and analyses (Phenomena Identification Ranking Tables)
  - Scenario development (logic trees)
  - Estimation of model parameters
* Key guidance documents
  - Processes designed to address known sources of biases
  - NUREG/CR-6372 and subsequent documents (Senior Seismic Hazard Analysis Committee: SSHAC) 28
 
Evidence from Experts SSHAC Overview
* Designed with recognition of potential individual and social biases, e.g.,
  - Underestimation of uncertainties from heuristics (availability, anchoring and adjustment, representativeness, etc.)
  - Social influences on individual judgment (e.g., group dynamics, organizational influences)
* Emphasizes characterizing full community point of view (center, body, and range); documentation (and ownership) of bases for judgments
* Different levels for different needs
  - Level 1: TI only (literature review, personal experience)
  - Level 2: TI interacts with proponents and resource experts
  - Level 3: TI brings together proponents and resource experts
  - Level 4: TFI organizes expert panel to develop estimates
  - TI = Technical Integrator; TFI = Technical Facilitator/Integrator 29
 
Evidence from Experts Level 4 SSHAC - General Process
1) Preparation
2) Piloting/Training
3) Interactions (Workshops)
   a) Evaluate evidence
   b) Develop, defend, and revise judgments
   c) Integrate judgments
4) Participatory Peer Review
Adapted from: R. J. Budnitz, et al., Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, NUREG/CR-6372, 1997 30
 
Evidence from Experts Direct Elicitation - Cautions
* Experts
  - Key experts might not be available during project
  - Difficult to get undivided attention for extended periods of time
  - Different perspectives/frameworks => considerable effort to develop common understanding of problem
  - Inclination to provide the right answer based on individual point of view
  - Subject to usual human biases
* Results can be viewed as the final solution
A serious, important activity, but not the Easy Button 31


Lecture 5-1 Evidence 2019-01-18
ML19011A430
Person / Time
Issue date: 01/16/2019
From:
Office of Nuclear Regulatory Research
To:
Nathan Siu 415-0744
Shared Package: ML19011A416
Download: ML19011A430 (31)

