ML19210D836
Issue date: 08/04/2019
From: Nathan Siu, NRC/RES/DRA (415-0744)
Model Uncertainty and Adequacy:
A PRA Reviewer's Perspective*
N. Siu, U.S. Nuclear Regulatory Commission, Office of Nuclear Regulatory Research
Workshop on Flooding Risk Assessment: Validation, Application, and Experimental Studies
25th International Conference on Structural Mechanics in Reactor Technology (SMiRT 25)
Charlotte, NC, August 4-9, 2019
* The views expressed in this presentation are not necessarily those of the U.S. Nuclear Regulatory Commission.
Outline
- Introductory remarks
- Model uncertainty
- Model adequacy
- Flood model considerations
- Concluding remarks
Introductory Remarks: High-Fidelity Flood Models
S. Prescott, et al., "3D Simulation of External Flooding Events for the RISMC Pathway," INL/EXT-15-36773, Idaho National Laboratory, 2015.
Introductory Remarks: High-Fidelity Modeling in Decision Support
- A reasonable expectation
- Personal views
- Continued pursuit: a good thing
- Transition is inevitable (in tune with trends in technology, society)
- Finite economic and attentional resources => What's important? Is the analysis good enough?
Introductory Remarks: Reminder: Risk-Informed Decisionmaking (RIDM) => Multiple Inputs to Decisions (NUREG-2150)
Other considerations:
- Current regulations
- Safety margins
- Defense-in-depth
Introductory Remarks: Uncertainties in Engineering Models for Decision Support: Two Perspectives
Model* (mod·el, n.): a representation of reality created with a specific objective in mind.
Two perspectives on uncertainty: approximations and lack of knowledge.
A. Mosleh, N. Siu, C. Smidts, and C. Lui, "Model Uncertainty: Its Characterization and Quantification," Center for Reliability Engineering, University of Maryland, College Park, MD, 1995. (Also NUREG/CP-0138, 1994.)
* In practice, often an assemblage of sub-models, implemented as a software code (tool, meta-model) used to develop application-specific models (e.g., for a particular plant).
Model uncertainty (How confident?) vs. model adequacy (Good enough?)
Model Uncertainty: Quantification Not Required for Risk-Informed Applications
- PRA Standard (ASME-ANS-RA-Sb-2013)
- Overall: identify, characterize, and document sources of model uncertainty, understand potential impact on results
- Seismic hazards: logic trees are acceptable for propagating epistemic uncertainties => extend to other hazards?
- RIDM Guidance (NUREG-1855, Rev. 1, 2017)
- Consensus models
- Sensitivity analysis (considering needs of application)
- Recognition of formal quantification approaches (e.g., logic trees)
Model Uncertainty: Quantification is Possible
Question for PRA: P{X | S, E} = ?
- X: uncertain proposition
- S: PRA scenario (often a bundle of similar progressions)
- E: evidence (from and about the model)
- Different frameworks => different propositions
- Model selection. Proposition: Mi is the right/best model (logic trees)
- Model error. Proposition: X ∈ [x, x + dx] (uncertainty factor)
- Methods
- Expert elicitation
- Bayesian estimation
- Frameworks can address
- Aleatory uncertainty (process is stochastic rather than deterministic)
- Multiple models and dependencies
- Uncertain/fuzzy data (empirical data, model parameters)
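As a hedged illustration of the "model selection" framing above, the sketch below averages candidate models' estimates of P{X|S} using Bayes-updated model weights. All numbers (priors, evidence likelihoods, per-model probabilities) are invented for illustration and are not from the presentation.

```python
# Sketch: Bayesian model averaging over two hypothetical candidate
# models M1 and M2.  Evidence E (model-vs-data comparison) updates the
# model weights; P{X|S,E} is the weight-averaged mixture.

def posterior_model_weights(priors, likelihoods):
    """Bayes' rule: posterior weight of each model given evidence."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

def p_x_given_evidence(weights, p_x_given_model):
    """P{X|S,E} as a mixture over candidate models."""
    return sum(w * p for w, p in zip(weights, p_x_given_model))

priors = [0.6, 0.4]       # prior credibility of M1, M2 (illustrative)
likelihoods = [0.2, 0.5]  # P{evidence | Mi} (illustrative)
p_x = [1e-3, 5e-3]        # each model's estimate of P{X|S} (illustrative)

weights = posterior_model_weights(priors, likelihoods)
answer = p_x_given_evidence(weights, p_x)
```

Here the evidence shifts weight toward M2, pulling the mixture estimate toward M2's value; a logic tree does the same thing with fixed expert-assigned weights instead of Bayes-updated ones.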
Model Adequacy: Also Relevant for PRA. Evaluation question: Good enough?
- Answer can depend on
- scope of evaluation (e.g., tool/meta-model vs. application model)
- intended application (e.g., scoping analysis, PRA support, licensing)
- Need to consider a variety of factors (e.g., treatment of key phenomena, strength of supporting evidence, explainable inconsistencies with evidence)
For PRA support applications
- Often concerned with behavior in the tails
- Objective integral data typically not available => conventional validation typically not achievable => accreditation more practical concept
- Important to identify and characterize uncertainties
Model Adequacy Model Validation: PRA Requirements, Guidance, and Common Practices
- PRA Standards
- ASME-ANS-RA-Sb-2013
- No explicit requirements for model validation
- Relies on peer review to evaluate adequacy
- NFPA 805 (2001): fire models must be verified and validated (comparison with test results, other acceptable models)
- PRA Guidance
- NUREG-1855, Rev. 1 (2017): no explicit guidance
- NUREG/CR-2300 (1983): cautions to users given lack of validation
- Common practices
- Software QA (e.g., NUREG/BR-0167)
- Best modeling practices (modeling guides)
- Independent review and user feedback
- Comparison against benchmarks (analytical/known solutions, separate effects tests, integral tests, event data)
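Benchmark comparison in practice often reduces to screening prediction/observation discrepancies. The sketch below classifies each pair as within tolerance, conservative, or non-conservative; the data points, the 15% tolerance, and the convention that over-prediction of a peak quantity is "conservative" are all illustrative assumptions.

```python
# Sketch: screening code predictions against benchmark observations
# (e.g., separate effects tests or analytical solutions).

def compare_to_benchmark(pairs, rel_tol=0.15):
    """Classify each (observed, predicted) pair by signed relative error."""
    report = []
    for obs, pred in pairs:
        rel_err = (pred - obs) / obs
        if abs(rel_err) <= rel_tol:
            verdict = "within tolerance"
        elif rel_err > 0:
            verdict = "conservative"      # over-predicts the peak quantity
        else:
            verdict = "non-conservative"  # under-predicts: flag for review
        report.append((obs, pred, round(rel_err, 3), verdict))
    return report

# e.g., hypothetical peak temperatures (K): test value vs. code value
results = compare_to_benchmark([(400, 450), (530, 560), (500, 400)])
```

Non-conservative discrepancies are the ones a reviewer would chase first, since they can mask risk.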
Flood Model Considerations: High-Fidelity Flood Models
S. Prescott, et al., "3D Simulation of External Flooding Events for the RISMC Pathway," INL/EXT-15-36773, Idaho National Laboratory, 2015.
Flood Model Considerations: Model and Modeling Considerations, a PRA Reviewer's Perspective
- Tool/meta-model scope and structure
- Model element failure modes and associated hazards (immersion, wetting, dynamic forces, debris, drawdown, ...)
- Included phenomena (wind, flood-induced changes to topography/geomorphology and artificial barriers)
- Underlying stochasticity ("Model of the World")
- Heterogeneity (e.g., differing resolution for different phenomena), potential biases in integrated results
- Modeling for application
- Establishment of initial and boundary conditions
- Input samples (coverage, correlation, model range of applicability)
- Other user choices (e.g., convergence criteria, workarounds)
[Figure: levee failure modes (overtopping, slope instability, piping, erosion). Adapted from T. Schweckendiek, "Dutch approach to levee reliability and flood risk," Workshop on Probabilistic Flood Hazard Assessment, Rockville, MD, January 29-31, 2013.]
Flood Model Considerations: Example Input Conditions for a Site Flooding Model
- Hazard
- Uncertainty
- Source
- Propagation
- Multiple shocks (timing, magnitude)
- Dynamics
- Buildup
- Persistence
- Decay
- Site conditions
- Barriers (installation, condition)
- Potential debris (vehicles, temporary buildings, etc.)
- Subsidence/elevation
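The input-sampling concern flagged earlier (coverage, correlation) can be sketched as a small Monte Carlo driver for the hazard inputs listed above. The distributions, parameter values, and the correlation between surge magnitude and persistence are illustrative assumptions, not values from the presentation.

```python
import math
import random

# Sketch: Monte Carlo sampling of correlated input conditions for a
# site flooding model: hazard magnitude (surge height) and dynamics
# (persistence), with larger surges tending to persist longer.

def correlated_normals(rho, n, seed=0):
    """Pairs of standard normals with correlation rho (2-D Cholesky)."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        pairs.append((z1, z2))
    return pairs

def flood_inputs(n, rho=0.7):
    """Map correlated normals to a lognormal surge height (m) and a
    lognormal flood duration (h); all parameters are illustrative."""
    return [(math.exp(0.5 + 0.4 * z1), math.exp(2.0 + 0.3 * z2))
            for z1, z2 in correlated_normals(rho, n)]
```

A reviewer would ask whether such samples cover the tails adequately and whether the assumed correlation structure is defensible, since both drive the integrated risk result.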
Concluding Remarks: ModSim* is here to stay, but...
- Model uncertainties are important and need to be understood
- PRA considers the entire scenario (hazard, fragility, response); how to identify and realistically address potentially important
- lightly treated or untreated phenomena
- concurrent hazards
- consequential hazards
- hazards affecting plant personnel
- Is a high-fidelity solution to part of the problem better than none?
- Another use of high-fidelity models: reduce completeness uncertainty ("fear of the dark")
* Modeling and Simulation
ADDITIONAL SLIDES
Backup: Treatment of Uncertainties (NUREG-1855, Rev. 1)
Backup: Model Uncertainty (Logic Trees)
[Figure: example seismic hazard logic tree with branches for sources (Source 1, Source 2, Source 3, background), fault segmentation (S-Model 1/2 with weights pS1, pS2), attenuation relationship (A-Model 1/2/3 with weights pA1, pA2, pA3), and recurrence model (F-Model 1/2 with weights pF1, pF2). Adapted from V.M. Andersen, "Seismic Probabilistic Risk Assessment Implementation Guide," EPRI 3002000709, Electric Power Research Institute, Palo Alto, CA, December 2013.]
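The logic-tree idea can be sketched by enumerating end branches and their epistemic weights. The branch names echo the figure, but the weights are illustrative, and a real seismic logic tree is generally source-dependent rather than a full cross product as assumed here.

```python
from itertools import product

# Sketch: enumerating end branches of a simplified hazard logic tree.
# Each level's weights sum to 1, so the end-branch weights form a
# discrete epistemic probability mass function.

tree = {
    "segmentation": [("S-Model 1", 0.6), ("S-Model 2", 0.4)],
    "attenuation":  [("A-Model 1", 0.5), ("A-Model 2", 0.3), ("A-Model 3", 0.2)],
    "recurrence":   [("F-Model 1", 0.7), ("F-Model 2", 0.3)],
}

def end_branches(tree):
    """Yield (branch names, product of branch weights) for each path."""
    for combo in product(*tree.values()):
        weight = 1.0
        for _, w in combo:
            weight *= w
        yield tuple(name for name, _ in combo), weight

branches = list(end_branches(tree))  # 2 * 3 * 2 = 12 end branches
```

Propagating epistemic uncertainty then means running the hazard calculation once per end branch and weighting the results by these branch probabilities.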
Backup: Example Quantification of Model Uncertainty
- Estimating an uncertainty factor in Deterministic Reference Model (DRM) estimates based on comparisons with experiments
- DRM for cable fire propagation velocity
- Uncertainty factor (a random variable); estimated percentiles of its distribution: 5th: 3.8, 50th: 14.8, 95th: 56.9, mean: 18.6
- Bias in results indicates need for a better DRM
[Figure: probability density function of the model uncertainty factor.]
N. Siu, D. Karydas, and J. Temple, "Bayesian Assessment of Modeling Uncertainty: Application to Fire Risk Assessment," in Analysis and Management of Uncertainty: Theory and Application, B.M. Ayyub, M.M. Gupta, and L.N. Kanal, eds., North-Holland, 1992, pp. 351-361.
Backup: Example Quantification of Model Uncertainty (continued)

Time (s) | Experiment (K) | DRM (K)
---|---|---
180 | 400 | 450
360 | 465 | 510
720 | 530 | 560
840 | 550 | 565

Notes:
- 1) The Bayesian methodology accounts for the possibility of inhomogeneous data.
- 2) Very large uncertainty bands might be unrealistic.
E. Droguett and A. Mosleh, "Bayesian methodology for model uncertainty using model performance data," Risk Analysis, 28, No. 5, 1457-1476, 2008.
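To make the ratio-type uncertainty factor concrete, the sketch below computes simple point estimates of F = (observed)/(DRM prediction) from the experiment-vs-DRM temperature comparisons tabulated above. This is a deliberate simplification: the cited Bayesian methodology also handles priors and inhomogeneous data, which this sketch does not.

```python
import math

# Sketch: point estimates of a multiplicative model uncertainty factor
# F = observed / predicted, summarized with a lognormal fit.

data = [  # (time_s, experiment_K, drm_K), from the table above
    (180, 400, 450),
    (360, 465, 510),
    (720, 530, 560),
    (840, 550, 565),
]

factors = [obs / pred for _, obs, pred in data]
log_f = [math.log(f) for f in factors]
n = len(log_f)
mu = sum(log_f) / n                                    # lognormal location
sigma = math.sqrt(sum((x - mu) ** 2 for x in log_f) / (n - 1))  # scale
geo_mean = math.exp(mu)  # < 1 here: the DRM over-predicts temperature
```

The geometric mean below 1 is the kind of systematic bias the slide flags as indicating the need for a better DRM.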
Backup: Model Uncertainty Quantification: Some Considerations
- Perspective and definition
- Model developer: ability of the model to address phenomena
- PRA analyst: influence of user effects (even given best practices in modeling)
- Uncertainties (magnitude, correlation) in unmeasured parameters
- Sub-model limits of applicability
M.H. Salley and A. Lindeman, "Verification and Validation of Selected Fire Models for Nuclear Power Plant Applications," NUREG-1824 Supplement 1 / EPRI 3002002182, November 2016.
Backup: Terminology: V&V
- Verification: Does the tool do what it's required to do?
- Life cycle reviews and audits
- Peer inspections
- Informal tests
- Validation: Are the results accurate?
- Tests
- Acceptable models
- Technology = {methods, models, tools, data}, e.g., codes
[Figure: diagram mapping NRC and industry activities along probabilistic vs. deterministic and licensing-basis vs. informational evaluation axes.]
Backup: On Model Validation: NUREG/CR-2300 (1983)
- Accident sequence definition and system modeling: "Processes both external and internal to the PRA team should be established to ensure that the study is conducted in a controlled manner and that all study activities can be validated."
- Severe accident analysis: "Because research into core-melt processes is limited and the computer codes have not been validated in some areas, the computer codes cannot be used routinely without understanding the limitations of the models and thoroughly understanding the physical processes involved..."
- Consequence analysis: "...there is a sense in which no consequence-modeling code has been properly validated... The code user has to assume that the models incorporated into the various modules of the code - the meteorological model, for example - represent good practice. He should read the code manual and associated literature with a critical eye, noting the sources cited for the various models and the justifications given for their use. He should not hesitate to query the code developer if there is any reason for doubt."
Backup: Code Development/Assessment
- Are discrepancies
- acceptably small?
- conservative?
- unimportant?
- SER headings (T-H):
- Phenomena Identification and Ranking
- Analytical Models
- Component Models
- Code Qualification (benchmark calculations, separate effects tests, integral effects tests)
- Regulatory Compliance
- ACRS Review
- Resolution of Open and Confirmatory Items
S. Bajorek, et al., "Development, Validation and Assessment of the TRACE Thermal-Hydraulics Systems Code," Proc. 16th Intl. Topical Mtg. Nuclear Reactor Thermal Hydraulics (NURETH-16), August 30-September 4, 2015.
Backup: SPAR Model QA Process
United States Nuclear Regulatory Commission, Standardized Plant Analysis Risk (SPAR) Model Quality Assurance Plan, Rev. 1, June 19, 2013. (ADAMS ML13141A333)
Backup: COMPBRN: An Example of "Good Enough"?
- Fire model for use in PRA, analogous to current visions for flood modeling
- Deterministic physical (vs. state-transition or statistics-based) model
- Non-linear, cliff-edge phenomena
- Reasonable state-of-knowledge
- Major criticism: temperature profile dip in plume
- Lost in the furor:
- Importance of weakness
- Phenomena/issues treated (fire growth, transient heat conduction, quantification of parameter and model uncertainties)
- Opportunities for using available data
- Fundamentally deterministic formulation
Backup: External Hazards: A Causality-Based Classification