ML19011A430

Lecture 5-1: Evidence (2019-01-18)
ML19011A430
Issue date: 01/16/2019
From: Office of Nuclear Regulatory Research
To: Nathan Siu, 415-0744
Shared package: ML19011A416
Download: ML19011A430 (31)


Text

Evidence and Estimation (Lecture 5-1)

Slide 2: Schedule / Course Overview

Wednesday 1/16 (Module 1: Introduction; Module 2: PRA Overview)
  9:00-9:45    L1-1: What is RIDM?
  10:00-11:00  L1-2: RIDM in the nuclear industry
  11:00-12:00  W1: Risk-informed thinking
  1:30-2:15    L2-1: NPP PRA and RIDM: early history
  2:30-3:30    L2-2: NPP PRA models and results
  3:30-4:30    L2-3: PRA and RIDM: point-counterpoint
  4:45-5:30    Open discussion

Thursday 1/17 (Module 3: Characterizing Uncertainty; Module 4: Accident Sequence Modeling)
  9:00-9:45    L3-1: Probabilistic modeling for NPP PRA
  10:00-11:00  L3-2: Uncertainty and uncertainties
  11:00-12:00  W2: Characterizing uncertainties
  1:30-2:15    L4-1: Initiating events
  2:30-3:30    L4-2: Modeling plant and system response
  3:30-4:30    W3: Plant systems modeling
  4:45-5:30    W3: Plant systems modeling (cont.)
  5:30-6:00    Open discussion

Friday 1/18 (Module 5: Basic Events; Module 6: Special Technical Topics)
  9:00-9:45    L5-1: Evidence and estimation
  10:00-11:00  L5-2: Human Reliability Analysis (HRA)
  11:00-12:00  W4: Bayesian estimation
  1:30-2:15    L6-1: Dependent failures
  2:30-3:30    L6-2: Spatial hazards and dependencies
  3:30-4:30    L6-3: Other operational modes; L6-4: Level 2/3 PRA: beyond core damage
  4:45-5:30    W5: External hazards modeling
  5:30-6:00    Open discussion

Tuesday 1/22 (Module 7: Learning from Operational Events; Module 8: Applications and Challenges)
  9:00-9:45    L7-1: Retrospective PRA
  10:00-11:00  L7-2: Notable events and lessons for PRA
  11:00-12:00  W6: Retrospective analysis
  1:30-2:15    L8-1: Risk-informed regulatory applications; L8-2: PRA and RIDM infrastructure
  2:30-3:30    L8-3: Risk-informed fire protection
  3:30-4:30    L8-4: Risk communication
  4:45-5:30    Open discussion

Wednesday 1/23 (Module 9: The PRA Frontier; Module 10: Recap)
  9:00-9:45    L9-1: Challenges for NPP PRA
  10:00-11:00  L9-2: Improved PRA using existing technology
  11:00-12:00  L9-3: The frontier: grand challenges and advanced methods
  1:30-2:15    L10-1: Summary and closing remarks
  2:30-3:30    Discussion: course feedback
  3:30-4:30    Open discussion

Daily breaks at 9:45-10:00, 2:15-2:30, and 4:30-4:45; lunch 12:00-1:30.

Slide 3: Learning Objectives
- Range of evidence used in NPP PRA
- Sources of operational data
- Bayesian estimation
- Treatment of model predictions and expert judgment

Slide 4: Overview - Resources
- NEA Workshop on the Use of Operational Experience in PSA, Boulogne-Billancourt, France, April 26-27, 2018.
  (ADAMS ML18123A479)
- https://nrcoe.inl.gov/resultsdb/RADS/
- https://nrcoe.inl.gov/resultsdb/AvgPerf/
- N. Siu and D.L. Kelly, "Bayesian parameter estimation in probabilistic risk assessment," Reliability Engineering and System Safety, 62, 89-116, 1998.
- NUREG/CR-6823, September 2003.
- R.J. Budnitz et al., NUREG/CR-6372, 1997.

Slide 5: Overview - Other References
- IEEE Std 500-1984 (component and mechanical equipment reliability for nuclear-power generating stations), Institute of Electrical and Electronics Engineers, New York, 1983.
- Center for Chemical Process Safety, Guidelines for Process Equipment Reliability Data with Data Tables, American Institute of Chemical Engineers, New York, 1989.
- G.E.P. Box and G.C. Tiao, Bayesian Inference in Statistical Analysis, Addison-Wesley, Reading, MA, 1973.
- D. Kahneman, P. Slovic, and A. Tversky (eds.), Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, Cambridge, 1982.
- Proceedings of the National Academy of Sciences, 111, No. 20, 7176-7184, May 20, 2014.
- 2016. (ADAMS ML16287A734)

Slide 6: Introduction - Basic event probabilities reflect the state of knowledge

P{X|C,H}, where:
- P = probability
- X = proposition of concern (e.g., SI pump failure rate < 10^-3 per demand)
- C = conditions of assessment (e.g., key assumptions)
- H = state of knowledge (dependent on the assessor)

Slide 7: Introduction - State of Knowledge (About X)
- Affected by evidence; common forms:
  - Data (operational, tests, simulator exercises, experiments)
  - Generic estimates
  - Model predictions
  - Expert judgment
- Changes in H lead to changes in the probability distributions for model parameters

Slide 8: Data - Definitions
- "Facts, information, statistics, or the like, either historical or derived by calculation or experimentation"
- "Any facts assumed to be a matter of direct observation"
- The PRA community uses both views:
  - System analyst: input parameters for the PRA model; NPP-specific or generic (e.g., IEEE Std-500, CCPS)
  - Data analyst: empirical observations used to estimate input parameters

Slide 9: Data - Operational Data
- See Lane (2018) for the history of NRC operational experience data collection and current programs.
- Key sources:
  - Licensee Event Reports (see https://lersearch.inl.gov): >54,000 records (1980-); a rich source, but level of detail and scope can vary
  - INPO Consolidated Events Database (ICES): proprietary; required (MSPI program) plus voluntary reporting; includes some demand and run-time data

Slide 10: Data - NRC Data Summaries
- Multiple links at http://nrcoe.inl.gov/resultsdb/
- Industry-average estimates, trends, and summary data for PRA model parameters: https://nrcoe.inl.gov/resultsdb/AvgPerf/
  - Initiating events
  - Component reliabilities
- Common cause failures: https://nrcoe.inl.gov/resultsdb/ParamEstSpar/

Slide 11: Data - Operating Experience Data
- Event reports often require interpretation:
  - Plant-specific terminology
  - Severity
- (See the NEA Workshop, Boulogne-Billancourt, France, April 26-27, 2018; ADAMS ML18123A479.)

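Counts and exposure times from sources like LERs are typically turned into PRA parameter estimates by Bayesian updating, which the next slides develop. As a minimal sketch (the event counts are illustrative, and the Jeffreys-type prior is an assumption, not a value from the course), a conjugate gamma-Poisson update looks like:

```python
# Sketch: Bayesian update of a Poisson event frequency from operational
# data, using the conjugate gamma prior.  The data (10 events in 100
# reactor-years) and the Jeffreys-type prior (alpha = 0.5, beta = 0)
# are illustrative assumptions.
from scipy.stats import gamma

def update_gamma_poisson(alpha, beta, n, t):
    """Observing n events in exposure time t updates gamma(alpha, beta)
    to gamma(alpha + n, beta + t)."""
    return alpha + n, beta + t

a_post, b_post = update_gamma_poisson(0.5, 0.0, n=10, t=100.0)
posterior = gamma(a_post, scale=1.0 / b_post)

print(posterior.mean())             # (0.5 + 10) / (0 + 100) = 0.105 per year
print(posterior.ppf([0.05, 0.95]))  # 5th and 95th percentiles (per year)
```

With this strong-data case the posterior mean (0.105/yr) sits close to the maximum-likelihood value (0.1/yr), anticipating the later comment that strong data make the result insensitive to any reasonable prior.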
Slide 12: Bayesian Estimation - Principles
- Prior distribution π₀(θ): quantifies belief before new evidence E
- Likelihood function L(E|θ): quantifies the probability of seeing E, given θ
- Posterior distribution π(θ|E): quantifies belief given E:

    π(θ|E) = L(E|θ) π₀(θ) / k

- k = normalization constant = ∫ L(E|θ) π₀(θ) dθ

Slide 13: Bayesian Estimation - Likelihood Function Examples
- Poisson process (frequency = λ):
  - Evidence: n events in time t. Likelihood: L(E|λ) = (λt)^n e^(-λt) / n!
  - Evidence: occurrence times {t₁, ..., tₙ}. Likelihood: L(E|λ) = λ^n e^(-λtₙ)
- Bernoulli process (probability = p):
  - Evidence: n events in m trials. Likelihood: L(E|p) = C(m,n) p^n (1-p)^(m-n)

Slide 14: Bayesian Estimation - Likelihood Functions, Another Example
- Expert judgment: evidence E = the expert's estimate λ* for the failure frequency λ
- A possible likelihood function: lognormal, with ln λ* distributed Normal(ln λ + B, σ²)
  - B = bias; σ = measure of confidence in the expert
  - σ = 0, B = 0: the likelihood is a delta function about λ (a perfect expert)
  - σ → ∞: the likelihood is flat (a completely non-expert source)
- The likelihood function is a model of the evidence-generating process

Slide 15: Bayesian Estimation - Prior Distributions
- General: characterizes the state of knowledge regarding the uncertain parameter(s)
- Informative: preferred in principle; takes effort to develop
- Non-informative: "objective"; low effort

Slide 16: Bayesian Estimation - Informative Prior Construction Methods
- Direct quantification:
  - Percentiles
  - Parametric form plus selected characteristics (e.g., moments, percentiles)
- Model plant-to-plant (population) variability; use the population variability result as the prior for the plant of interest (vs. model parameters)
- Be cognizant of biases from common heuristics (Lecture 2-3):
  - Representativeness
  - Availability
  - Anchoring and adjustment

Slide 17: Bayesian Estimation - Non-Informative Prior Distributions
- Based on mathematical definitions of relative ignorance (relative to data); not generally flat/uniform
  - Bernoulli process: π₀(p) ∝ p^(-1/2) (1-p)^(-1/2)
  - Poisson process: π₀(λ) ∝ λ^(-1/2)
- Constrained non-informative priors
- See Siu and Kelly (1998) and Atwood et al.
  (NUREG/CR-6823) for further discussion of the forms used in NPP PRAs.
- Computational caution: non-informative prior distributions are often unbounded at one or both extremes; be careful when using numerical integration.

Slide 18: Bayesian Estimation - Knowledge Check
- If ..., what is the mean value for ...? (Expressions not recoverable from the extracted text.)

Slide 19: Bayesian Estimation - Conjugate Likelihood-Prior Pairs
- Used for computational convenience; can be informative or non-informative
- Examples:*
  - Binomial likelihood and beta prior => beta posterior. Given n failures in m trials and a beta prior with parameters a and b, the posterior is beta with parameters a + n and b + (m - n).
  - Poisson likelihood and gamma prior => gamma posterior. Given n failures in time t and a gamma prior with parameters α and β, the posterior is gamma with parameters α + n and β + t.
*See the Probability Math background slides for more information on the beta and gamma distributions.

Slide 20: Bayesian Estimation - Sample Results
[Figures: prior and posterior probability density functions for the event frequency (/yr) under three priors (non-informative, informed, overconfident), for actual data (10 events in 100 years) and weak data (1 event in 10 years); maximum-likelihood values marked, with 5th/95th percentile labels for each case.]

Slide 21: Bayesian Estimation - Comments
- When data are strong, results are insensitive to reasonable priors.
- When data are weak, the prior is important, so it must be constructed carefully:
  - An overly broad prior (undervaluing the state of knowledge) yields overly conservative results.
  - An overly narrow prior (too much confidence in the state of knowledge) is insensitive to data when they are obtained.
- Be wary of biases from heuristics.

Slide 22: Evidence from Models - Model Predictions
- Common aleatory models:
  - Stress vs. strength
  - Time-reliability
  - Fire-induced damage
  - Seismically-induced damage
- Uncertainties in parameters can be quantified and propagated through models; uncertainties in model outputs can also be quantified (Lecture 3-2).

Slide 23: Evidence from Models - Mechanistic Modeling, Comments
- Results, and even models, are used in NPP PRAs:
  - Component behavior (e.g., reactor coolant pump seals)
  - Success criteria
  - Internal and external hazards (aging, CCF)
- An appealing approach to better account for the current, relevant state of knowledge
- Need to account for parameter and model uncertainties
- Increases vulnerability to completeness uncertainty

Slides 24-25: Evidence from Models - Fire Model Uncertainty Example and Results
[Figure slides; graphical content not recoverable from the extracted text.]

Slide 26: Evidence from Models - Cautionary Examples
- Multi-unit trip due to loss of communication
- Capacitor failure from operation at below-design voltage (non-nuclear)
- Increased accident consequences of a stronger pressure vessel (non-nuclear)
- Reactivity accident from rocking

Slide 27: Evidence from Experts - Expert Judgment
- A fundamental component of PRA:
  - Modeling
  - Data selection and analysis
  - Direct elicitation (qualitative and quantitative)
- Justification: decision support
- P{X|C,H}: X = proposition/event of concern; C = conditions of the probability statement; H = state of knowledge (not just the analyst's/analysis team's)

Slide 28: Evidence from Experts - Direct Elicitation
- Aim:
  - Take advantage of the human ability to consider and integrate complex information
  - Engage a wide variety of expert viewpoints
- Key PRA applications:
  - Problem formulation; planning of experiments and analyses (Phenomena Identification and Ranking Tables)
  - Scenario development (logic trees)
  - Estimation of model parameters
- Key guidance documents:
  - Processes designed to address known sources of biases
  - NUREG/CR-6372 and subsequent documents (Senior Seismic Hazard Analysis Committee, SSHAC)

Slide 29: Evidence from Experts - SSHAC Overview
- Designed with recognition of potential individual and social biases, e.g.:
  - Underestimation of uncertainties arising from heuristics (availability, anchoring and adjustment, representativeness, etc.)
  - Social influences on individual judgment (e.g., group dynamics, organizational influences)
- Emphasizes characterizing the full community point of view (center, body, and range), with documentation (and ownership) of the bases for judgments

  Level  Characteristics
  1      TI only (literature review, personal experience)
  2      TI interacts with proponents and resource experts
  3      TI brings together proponents and resource experts
  4      TFI organizes an expert panel to develop estimates

  TI = Technical Integrator; TFI = Technical Facilitator/Integrator

Slide 30: Evidence from Experts - Level 4 SSHAC
General process:
1) Preparation
2) Piloting/training
3) Interactions (workshops):
   a) Evaluate evidence
   b) Develop, defend, and revise judgments
   c) Integrate judgments
4) Participatory peer review
Adapted from R.J. Budnitz et al., Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, NUREG/CR-6372, 1997.

Slide 31: Evidence from Experts - Direct Elicitation Cautions
- Key experts might not be available during the project
- Difficult to get experts' undivided attention for extended periods of time
- Different perspectives/frameworks require considerable effort to develop a common understanding of the problem
- Results may not capture the full community point of view
- Subject to the usual human biases
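The lognormal expert-evidence likelihood from slide 14 can be exercised numerically. The sketch below combines a single hypothetical expert estimate with a non-informative 1/λ prior on a bounded grid, illustrating both the update and the computational caution about unbounded non-informative priors; the expert's estimate, σ, and grid bounds are all illustrative assumptions, not values from the course.

```python
# Sketch: updating a failure frequency with expert evidence, using the
# slide-14 lognormal likelihood (ln(lam_star) ~ Normal(ln(lam) + B, sigma^2)).
# The expert estimate, sigma, and grid bounds are illustrative assumptions.
import numpy as np

lam_star = 1.0e-3   # expert's estimate of the frequency (/yr)
sigma    = 0.7      # log standard deviation: confidence in the expert
B        = 0.0      # assumed unbiased expert

lam   = np.logspace(-5, -1, 4000)  # bounded grid over the frequency
dlam  = np.gradient(lam)           # local grid spacing for integration
prior = 1.0 / lam                  # non-informative prior; improper, but
                                   # the bounded grid keeps integrals finite

# Lognormal likelihood of observing the expert's estimate, given lam
z = (np.log(lam_star) - (np.log(lam) + B)) / sigma
likelihood = np.exp(-0.5 * z**2)

posterior = prior * likelihood
posterior /= np.sum(posterior * dlam)   # normalize numerically

post_mean = np.sum(lam * posterior * dlam)
print(post_mean)
```

With a 1/λ prior (flat in ln λ), the posterior of ln λ is Normal(ln λ*, σ²), so the numerical mean should land near λ*·exp(σ²/2), roughly 1.3e-3/yr here; shrinking σ collapses the posterior toward the expert's estimate, matching the "perfect expert" limit on slide 14.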