ML19156A447

PFHA2019-Final_Abstracts
Issue date: 04/30/2019
From: Office of Nuclear Regulatory Research
To: M. Carr, 415-6322
Shared Package: ML19156A446
Download: ML19156A447 (16)



4th Annual Probabilistic Flood Hazard Assessment Workshop
April 30 - May 2, 2019, U.S. NRC Headquarters, Rockville, Maryland
Abstracts Summary

SESSION 1B: COASTAL FLOODING

1B-1 KEYNOTE: NWS Storm Surge Ensemble Guidance
Arthur Taylor*, National Weather Service/Office of Science and Technology Integration/Meteorological Development Laboratory, 1325 East West Hwy, Silver Spring, MD 20910

The National Weather Service (NWS) Meteorological Development Lab (MDL) is tasked with developing storm surge guidance to help protect life and property from disastrous storms. After developing the Sea Lake and Overland Surges from Hurricanes (SLOSH) storm surge model in the 1980s, we recognized that storm surge is highly sensitive to the location of the winds relative to the underlying bathymetry, so one of the largest sources of error in storm surge guidance was the quality of the wind forecasts. Thus, to save lives, we had to account for wind uncertainty in a timely manner, even if that meant erring on the side of caution in the storm surge guidance.

Initially, our approach was to develop Maximum Envelopes of Water (MEOWs) and Maximums of MEOWs (MOMs). MEOWs and MOMs are the maximum storm surge attained in each grid cell from a set of hypothetical hurricanes. As such, they approximate the potential inundation for an area from a specific type of hurricane. MEOWs and MOMs form the basis of the U.S. hurricane evacuation plans (Shaffer et al., 1989). The National Storm Surge Hazard map, developed by the National Hurricane Center (NHC), was created by merging the MEOWs and MOMs from each computational domain onto a uniform grid.
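
The maximum-envelope construction lends itself to a compact illustration. The following Python sketch is illustrative only: the surge fields are synthetic random placeholders rather than SLOSH output, and the category scaling is invented for the example.

```python
# Illustrative MEOW/MOM construction; surge fields are synthetic, not SLOSH runs.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical surge results, shape (n_storms, ny, nx): one field per
# hypothetical hurricane of a given category/track/forward-speed combination.
simulated_surges = rng.gamma(2.0, 1.5, size=(500, 40, 60))  # meters

# MEOW: per-grid-cell maximum over all storms in the set.
meow = simulated_surges.max(axis=0)

# MOM: per-grid-cell maximum over the MEOWs of several categories
# (here faked by scaling the single MEOW).
meows_by_category = [meow, 1.2 * meow, 1.4 * meow]
mom = np.maximum.reduce(meows_by_category)
print(mom.shape, round(float(mom.max()), 2))
```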

In the 2000s, we addressed the issue via the Probabilistic Tropical Cyclone Storm Surge model (P-Surge) (Taylor et al., 2008). P-Surge is a real-time ensemble based on parameterizing an active storm and permuting it via the NHC's 5-year average forecasting errors. MEOWs and MOMs are based on an undefined error space, hypothetical storms, and an unknown time and tide, whereas P-Surge is based on a defined error space, an active storm, and the current time and tide. The NWS's storm surge watch and warning is primarily based on P-Surge.
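
A minimal sketch of the permutation idea, assuming Gaussian perturbations of a parameterized storm; the parameter names and error magnitudes below are placeholders, not NHC's published 5-year average errors.

```python
# Hedged sketch: permute one active storm through a forecast-error space.
import numpy as np

rng = np.random.default_rng(1)
forecast = {"cross_track_nm": 0.0, "along_track_nm": 0.0,
            "intensity_kt": 100.0, "size_nm": 30.0}
sigma = {"cross_track_nm": 80.0, "along_track_nm": 60.0,
         "intensity_kt": 12.0, "size_nm": 8.0}  # hypothetical 48-h errors

def make_ensemble(n_members):
    """Draw storm variants; each would drive one surge-model run."""
    return [{k: forecast[k] + rng.normal(0.0, sigma[k]) for k in forecast}
            for _ in range(n_members)]

members = make_ensemble(7)
# A probabilistic product (e.g., chance of surge > 1 m at a point) is then the
# weighted fraction of members whose simulated surge exceeds the threshold.
```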

More recently, in the 2010s, we treated wind uncertainty for extra-tropical and/or post-tropical storms via the Probabilistic Extra-Tropical Storm Surge model (P-ETSS). Extra-tropical storms don't lend themselves to parameterization, so instead of permuting through an error space, P-ETSS is based on running a storm surge model with each of the 21 members of the Global Ensemble Forecast System (GEFS). P-ETSS is intended for storms that aren't well represented by a parametric hurricane wind model, such as broader extra-tropical storms, weaker tropical storms, or post-tropical depressions.

This talk will briefly describe the SLOSH model, and then focus on the details of P-Surge and P-ETSS along with future plans for development. The audience should come away knowing why NWS uses P-Surge and P-ETSS for the forecasting (as opposed to design) problem.

  * indicates speaker, ^ indicates remote speaker

1B-2 Advancements in Probabilistic Storm Surge Models and Uncertainty Quantification Using Gaussian Process Metamodeling
Norberto C. Nadal-Caraballo*, PhD; Victor M. Gonzalez, PE; Alexandros Taflanidis, PhD; U.S. Army Corps of Engineers (USACE) Engineer Research and Development Center, Coastal and Hydraulics Laboratory

The application of probabilistic storm surge models for the probabilistic flood hazard assessment (PFHA) of critical infrastructure in coastal zones requires a comprehensive uncertainty quantification framework.

The new approach for treating uncertainty in probabilistic storm surge studies consists of quantifying the aleatory and epistemic uncertainties associated with the application of data, methods, and models in each step of the analysis. The U.S. Army Engineer Research and Development Center, Coastal and Hydraulics Laboratory (ERDC-CHL) is performing a comprehensive assessment of uncertainties in probabilistic storm surge models in support of the U.S. Nuclear Regulatory Commission's (USNRC's) efforts to develop a framework for probabilistic storm surge hazard assessment for nuclear power plants.

In the case of the joint probability method (JPM), which is the standard probabilistic model used to assess coastal storm hazard in hurricane-prone coastal regions of the United States, the error is incorporated in the integration of the hazard curve. Recent advancements by ERDC-CHL include the computation of spatially varying hydrodynamic modeling errors and the development of Gaussian process metamodels (GPMs) based on existing JPM storm suites. The GPMs emulate the response of hydrodynamic numerical models, such as ADCIRC, and enable the development of augmented storm suites consisting of tens of thousands to millions of tropical cyclones without introducing significant error. This, in turn, facilitates the evaluation of JPM and Monte Carlo methods and other probabilistic models and approaches for the integration of uncertainty that would otherwise be infeasible due to computational burden constraints.
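
A hedged sketch of the metamodeling step, using scikit-learn's GaussianProcessRegressor as a stand-in for the GPMs described above; the JPM-style storm parameters and the closed-form surrogate response below are synthetic, not ADCIRC output.

```python
# Train a GP emulator on a small "storm suite", then score an augmented suite.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(2)
# Columns: central pressure deficit (hPa), radius of max winds (km),
# forward speed (m/s) -- a typical JPM-style parameterization.
X_train = rng.uniform([20, 20, 2], [100, 80, 12], size=(150, 3))
y_train = (0.04 * X_train[:, 0] + 0.01 * X_train[:, 1]
           - 0.05 * X_train[:, 2] + rng.normal(0.0, 0.1, 150))  # surge (m)

gpm = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[10.0, 10.0, 2.0]),
    normalize_y=True)
gpm.fit(X_train, y_train)

# Augmented suite: cheap to evaluate, with a per-storm error estimate.
X_augmented = rng.uniform([20, 20, 2], [100, 80, 12], size=(50_000, 3))
surge_mean, surge_std = gpm.predict(X_augmented, return_std=True)
```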

The treatment of epistemic uncertainty in the present study expands upon the traditional JPM approach, which considers this uncertainty as an error term in the JPM integral, by estimating the epistemic uncertainty that arises from the selection and application of alternative technically defensible data, methods, and models at each step of the probabilistic storm surge modeling. The approach followed is based on USNRC guidance on probabilistic seismic hazard assessment (PSHA), where the uncertainty is propagated through the use of logic trees. The epistemic uncertainty is then quantified through the development of a family of hazard curves, with individual curves corresponding to the evaluated data sources and methods associated with the different applications of probabilistic storm surge models. The range of the epistemic uncertainty is conveyed through the fractiles of the family of storm hazard curves, which are equivalent to non-exceedance confidence limits (e.g., 0.05, 0.16, 0.5 (median), 0.84, and 0.95).
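
The fractile computation itself is straightforward once a weighted family of curves exists. A sketch, with a synthetic curve family standing in for the real PSSHA output:

```python
# Weighted fractiles of a family of hazard curves (synthetic family).
import numpy as np

rng = np.random.default_rng(3)
surge_m = np.linspace(0.0, 10.0, 51)
scales = rng.lognormal(mean=0.0, sigma=0.3, size=200)
curves = np.array([1e-2 * np.exp(-surge_m / (1.5 * s)) for s in scales])
weights = np.full(len(curves), 1.0 / len(curves))  # equal weights here

def weighted_fractile(curves, weights, q):
    """Per-surge-level weighted quantile across the curve family."""
    out = np.empty(curves.shape[1])
    for j in range(curves.shape[1]):
        order = np.argsort(curves[:, j])
        cum_w = np.cumsum(weights[order])
        out[j] = curves[order, j][np.searchsorted(cum_w, q)]
    return out

family_fractiles = {q: weighted_fractile(curves, weights, q)
                    for q in (0.05, 0.16, 0.50, 0.84, 0.95)}
```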

1B-3 Probabilistic Flood Hazard Assessment Using the Joint Probability Method for Hurricane Storm Surge
Michael B. Salisbury^, Atkins North America, Inc., for Electric Power Research Institute (EPRI)

Hurricane-induced storm surge can be significant, depending on the circumstances. For nuclear plants to adequately assess the risk posed by storm surge, we need to better understand the expected frequencies of storms that would produce a storm surge that could affect a site. A probabilistic flood hazard assessment can be used to determine the frequency of a storm surge that would be expected to exceed a particular flood height. This report explains how to use the Joint Probability Method and numerical simulations to obtain an estimate of annual exceedance probabilities as a function of surge height. This method can be used to determine the appropriate design basis and develop the flood hazard curve (the relationship between surge level and frequency) for a probabilistic risk assessment.

The report includes discussions on developing and validating storm surge models for project sites, determining storms to be simulated, and identifying storm surge parameters and their associated probabilistic uncertainties. The methodology presented in the report is applicable to any coastal nuclear power plant that could be impacted by tropical cyclones (hurricanes). A practical example is presented to help demonstrate the method.

1B-4 Assessment of Epistemic Uncertainty for Probabilistic Storm Surge Hazard Assessment Using a Logic Tree Approach
Bin Wang*, GZA

Probabilistic storm surge hazard assessment (PSSHA) requires the characterization of the mean storm surge frequency, inclusive of consideration of epistemic uncertainty. PSSHA inevitably involves significant uncertainty, especially at the low frequency range that is often applicable for hazard evaluations at critical infrastructures and facilities. Epistemic uncertainty arises from the use of models to characterize the hazard inputs, such as the data source, probability distribution, storm rate, and storm surge modeling. A logic tree approach uses alternatives at various input nodes and generates a family of hazard curves, from which a mean hazard curve can be derived. Branch weights are assigned based on the analysts' confidence that the selected alternatives are the best representation of the hazard input. The logic tree provides a transparent, structured framework for systematic characterization and quantification of potential epistemic uncertainties. The final product is a weighted mean flood hazard curve with confidence intervals based on the family of flood hazard curves from the logic tree.
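
One way to picture the mechanics: enumerate every branch combination, compute a hazard curve for each, and weight it by the product of its branch weights. The node names, alternatives, and weights below are hypothetical illustrations, not GZA's actual tree.

```python
# Hypothetical logic tree -> family of hazard curves -> weighted mean curve.
import itertools
import numpy as np

nodes = {
    "data_source": [("historical_tracks", 0.6), ("synthetic_tracks", 0.4)],
    "storm_rate":  [("poisson", 0.7), ("negative_binomial", 0.3)],
    "surge_model": [("JPM-OS-RS", 0.8), ("empirical_track", 0.2)],
}
surge_m = np.linspace(0.0, 10.0, 51)

family, weights = [], []
for i, combo in enumerate(itertools.product(*nodes.values())):
    branch_weight = np.prod([w for _, w in combo])   # product over nodes
    scale = 1.0 + 0.1 * i     # stand-in: each branch yields a different curve
    family.append(1e-2 * np.exp(-surge_m / (1.5 * scale)))
    weights.append(branch_weight)

mean_curve = np.average(family, axis=0, weights=weights)  # weighted mean hazard
assert abs(sum(weights) - 1.0) < 1e-9  # branch weights sum to one at each node
```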

This presentation provides an overview of the logic tree methodology and key engineering considerations, including the role of engineering judgement. In the presented examples, each hazard curve was developed using the Joint Probability Method with Optimal Sampling and Response Surface method (JPM-OS-RS), which allows full coverage of the hurricane parameter space with minimal modeling effort and surge interpolation with reasonable accuracy. However, alternative methods (e.g., the empirical track method) can also be used for the logic tree. Multiple data sources are discussed, including historical and synthetic hurricane tracks. The presentation also discusses the sensitivity of various nodes and engineering decisions.

SESSION 1C: PRECIPITATION

1C-1 KEYNOTE: Satellite Precipitation Estimates, GPM, and Extremes
George J. Huffman*, NASA/GSFC, Greenbelt, MD, USA

The satellite precipitation retrievals considered high quality come from passive microwave sensors, and they have been available in sufficient quantity to construct fairly high-resolution time sequences of maps for about 20 years. The resulting publicly available, quasi-global, long-term datasets are listed in the tables at http://www.isac.cnr.it/~ipwg/data/datasets.html. In particular, the Global Precipitation Measurement (GPM) mission, a joint project of NASA and the Japan Aerospace Exploration Agency, is working to create a long-term (1998-present) data record that is relatively homogeneous across the many individual precipitation-related satellites that have flown during that time. From the U.S. GPM Science Team, the Integrated Multi-satellitE Retrievals for GPM (IMERG) algorithm intercompares and merges these satellite data together with other inputs to create a best map of global precipitation every half hour at a resolution of 0.1°x0.1° latitude/longitude. IMERG processing back to June 2000 should take place in the first 4 months of 2019. The ongoing IMERG processing occurs three times to serve different needs: an Early Run 4 hours after observation time (rapid analysis of flood, landslide, and other high-impact events), a Late Run 14 hours after (a better estimate for crop, drought, and water resource analysis), and a Final Run 3.5 months later (the research-grade product that incorporates the most data, including monthly precipitation gauge information).

One key issue in defining extreme precipitation events is the strong interdecadal variability in extremes for any reasonable index of the term (Fu et al. 2010). The intermittent occurrence and non-Gaussian statistics that characterize precipitation make the analysis much more challenging than for, say, temperature. Nonetheless, a recent study (Demirdjian et al. 2018) demonstrates reasonable skill at computing Average Recurrence Interval (ARI) maps from 15 years of a predecessor of IMERG (the Tropical Rainfall Measuring Mission [TRMM] Multi-satellite Precipitation Analysis, or TMPA) that are comparable to ARIs (up to about 20 years) computed with 65 years of gauge data over the continental U.S. The advantage provided by the satellite datasets is that they cover much of the globe, providing information that is not otherwise obtainable from surface data in remote, developing, and oceanic regions.
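
In outline, an ARI estimate of that kind reduces to fitting an extreme-value distribution to each grid cell's annual maxima and inverting it. A toy version for one cell, with synthetic data rather than TMPA retrievals (the published method differs in detail):

```python
# Toy ARI estimate from ~15 years of annual maxima at one grid cell.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
annual_max_mm = stats.genextreme.rvs(c=-0.1, loc=60.0, scale=15.0,
                                     size=15, random_state=rng)

c, loc, scale = stats.genextreme.fit(annual_max_mm)   # GEV fit
for T in (2, 5, 10, 20):                              # ARI in years
    depth = stats.genextreme.isf(1.0 / T, c, loc, scale)
    print(f"{T:>3}-yr ARI daily maximum: {depth:.1f} mm")
```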

Demirdjian, L., Y. Zhou, and G.J. Huffman, 2018: Statistical Modeling of Extreme Precipitation with TRMM Data. J. Appl. Meteor. Climatol., 57(1), 15-30. doi:10.1175/JAMC-D-17-0023.1
Fu, G., N.R. Viney, S.P. Charles, and J. Liu, 2010: Long-Term Temporal Variation of Extreme Rainfall Events in Australia: 1910-2006. J. Hydrometeor., 11, 950-965. doi:10.1175/2010JHM1204.1

1C-2 Hurricane Harvey Highlights: Need to Assess the Adequacy of Probable Maximum Precipitation Estimation Methods
Shih-Chieh Kao*, Scott T. DeNeale, and David B. Watson, Environmental Sciences Division, Oak Ridge National Laboratory (ORNL)

Probable maximum precipitation (PMP) has been the primary criterion used to design flood protection measures for critical infrastructure such as dams and nuclear power plants. Based on our analysis using the Stage IV (ST4) quantitative precipitation estimates, precipitation associated with Hurricane Harvey near Houston, Texas, represents a PMP-scale storm and partially exceeds the Hydrometeorological Report No. 51 (HMR51) 72-h PMP estimates at 5,000 mi^2 (ST4 = 805 mm; HMR51 = 780 mm) and 10,000 mi^2 (ST4 = 686 mm; HMR51 = 673 mm). We also find statistically significant increasing trends since 1949 in the annual maximum total precipitable water and dew point temperature observations along the U.S. Gulf Coast region, suggesting that, if the trend continues, the theoretical upper bound of PMP could become even larger. Our analysis of Hurricane Harvey rainfall data demonstrates that an extremely large PMP-scale storm is physically possible and that PMP estimates should not be considered overly conservative.

This case study highlights the need for improved PMP estimation methodologies to account for long-term trends and to ensure the safety of our critical infrastructures.
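
For readers who want to reproduce the flavor of the trend screening, a rank-based test such as Kendall's tau (the core of the Mann-Kendall trend test) is one common choice; the series below is synthetic, not the Gulf Coast observations analyzed by the authors.

```python
# Monotonic-trend screening with Kendall's tau on a synthetic dew point series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
years = np.arange(1949, 2018)
dew_point_c = 24.0 + 0.02 * (years - 1949) + rng.normal(0.0, 0.6, years.size)

tau, p_value = stats.kendalltau(years, dew_point_c)
print(f"Kendall tau = {tau:.2f}, p = {p_value:.4f}")  # small p -> significant trend
```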


1C-3 Current Use of Reanalysis in Flood Hazard Assessments
Jason Caldwell*, USACE, Galveston District

As the spatiotemporal resolution of reanalysis products and the quality of numerical weather prediction models continue to improve, these data may increasingly serve to supplement historical storm analyses.

There is great potential to enrich the sample set of temporal and spatial distributions of precipitation for stochastic modeling approaches (e.g., those rooted in extreme value theory/frequency analyses and probable maximum precipitation (PMP)).

In situ measurements of precipitation are now assimilated into some numerical weather models for initialization; therefore, the future may hold opportunity to use model-generated data as a surrogate in low- and mid-level hydrologic hazard analysis (HHA) studies. Comparisons of model forecast and reanalysis representations of Hurricane Harvey to quantitative precipitation estimates (QPE) are provided, focusing on the perspective of dam safety HHA, including: (i) evaluation of depth-area statistics; (ii) spatial/temporal evolution; and (iii) ensemble-based probabilistic measures of confidence.

Additional potential applications of these products will be summarized.

1C-4 Current Capabilities for Developing Watershed Precipitation-Frequency Relationships and Storm-Related Inputs for Stochastic Flood Modeling for Use in Risk-Informed Decision-Making
M.G. Schaefer*, MGS Engineering Consultants, Inc.

Several advancements in watershed precipitation-frequency analysis over the past five years have increased the practicality, and reduced the uncertainties, of using stochastic flood modeling for estimating the hydrologic characteristics of extreme floods with Annual Exceedance Probabilities (AEPs) of 10^-5 and rarer. Stochastic flood modeling is now the preferred method for Probabilistic Flood Hazard Analysis (PFHA) for high-consequence dams owned by Federal agencies (TVA, USACE, USBR) where information and decisions are required about the hydrologic performance of large capital projects under extreme flood loading conditions.

The introduction of storm typing in 2014 for assembling precipitation maxima datasets for a given storm type for use in regional precipitation-frequency (PF) analysis was a major advancement that provides increased data homogeneity and reduced uncertainties for estimation of extreme precipitation. This approach allows separate point and watershed PF relationships to be developed for each storm type. This is important because different storm types have different spatial and temporal patterns and storm seasonality, which must be matched with the appropriate watershed PF relationship for proper hydrologic modeling. For watersheds subjected to snowmelt and rain-on-snow floods, compatibility of storm-related inputs must also extend to air temperature and freezing-level time series and associated temperature lapse rates. The storm typing approach allows development of separate stochastic flood models and separate Hydrologic Hazard Curves for each storm/flood type and addresses the problem of mixed populations of flood types common to many watersheds and climatic environments.
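
The mixed-population point has a simple probabilistic core: if the storm types occur independently, the per-type hazard curves combine as AEP_total(q) = 1 - prod_i(1 - AEP_i(q)). A sketch with illustrative placeholder curves:

```python
# Combine per-storm-type hazard curves under an independence assumption.
import numpy as np

flow_cms = np.linspace(500.0, 5000.0, 10)
aep_mlc = 1e-1 * np.exp(-flow_cms / 900.0)    # Mid-Latitude Cyclone floods
aep_tsr = 3e-2 * np.exp(-flow_cms / 1500.0)   # Tropical Storm Remnant floods
aep_mec = 5e-2 * np.exp(-flow_cms / 700.0)    # Mesoscale w/ Embedded Convection

aep_total = 1.0 - (1.0 - aep_mlc) * (1.0 - aep_tsr) * (1.0 - aep_mec)
```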

Stochastic storm generation methods have been developed for the synoptic-scale Mid-Latitude Cyclone (MLC) and Tropical Storm Remnant (TSR) storm types. Stochastic storm transposition methods with resampling of historical convective spatial patterns have been developed for the Mesoscale Storm with Embedded Convection (MEC) storm type. These stochastic storm generation methods, in combination with the Schaefer-Wallis-Taylor (SWT) method of regional PF analysis, provide for development of watershed PF relationships for a given storm type.

The current capabilities and practices for development of watershed PF relationships will be presented along with the associated topics: storm typing; the SWT method of regional PF analysis; storm seasonality; storm spatial and temporal patterns; stochastic storm generation methods; and characterization of uncertainties.

1C-5 Factors Affecting the Development of Precipitation Areal Reduction Factors
Shih-Chieh Kao*, Scott DeNeale, Environmental Sciences Division, Oak Ridge National Laboratory

Probabilistic precipitation estimates (e.g., T-year rainfall) are widely used to inform hydrologic and hydraulic simulation, urban planning, critical infrastructure protection, and flood risk mitigation. Such estimates are quantified through precipitation frequency analysis, such as that provided in the National Oceanic and Atmospheric Administration (NOAA) Atlas 14. Nevertheless, many precipitation frequency products (such as Atlas 14) only provide point precipitation estimates. For watershed-scale applications, further adjustment through areal reduction factors (ARFs) is needed to establish spatially representative probabilistic precipitation estimates at a large area size.

Compared to the modern precipitation frequency products, the progress of ARF development in the U.S. is lagging. The Technical Paper No. 29 (TP-29) ARFs published in the 1950s are still widely used in practice. In addition to being based on limited data, TP-29 ARFs do not vary with geographic location, seasonality, or return period, and are provided only for area sizes up to 400 square miles. To support the development and implementation of probabilistic flood hazard assessment (PFHA) for U.S. Nuclear Power Plants (NPPs), the values of ARFs clearly should be updated based on the most recent data, methods, and suitable models.

To help improve our understanding of ARFs, we conducted a comprehensive study to explore factors that should be considered during ARF development. We started by identifying multiple gauge- and radar-based precipitation data products that can be used for ARF development. We then conducted a literature review to summarize recent ARF methods that may address the deficiencies in the conventional approach.

Using a watershed-based annual maximum precipitation searching approach, we demonstrate how these factors may quantitatively affect the values of ARFs for three hydrologic regions in the U.S. Our results suggest that ARFs can vary widely by return period, data source, method, and geographic region, and hence there is a need to determine site-specific ARFs for unbiased flood estimates. This study will provide a technical basis to help develop ARF guidance for NPP-PFHA applications. The study also demonstrates the value of modern, gridded precipitation products for computing ARFs and offers a framework for ARF evaluation with different datasets and methods.
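
One common ARF definition helps fix ideas: the ratio of the T-year areal-average quantile to the T-year point quantile over the same area. The sketch below uses synthetic gridded annual maxima; the study's watershed-based searching approach differs in important details.

```python
# Simplified empirical ARF: areal T-year quantile / mean point T-year quantile.
import numpy as np

rng = np.random.default_rng(6)
n_years, ny, nx = 40, 20, 20
grid_annual_max = rng.gamma(4.0, 25.0, size=(n_years, ny, nx))  # mm, synthetic

T = 10                              # return period (years)
q = 1.0 - 1.0 / T                   # nonexceedance probability

point_quantile = np.quantile(grid_annual_max, q, axis=0).mean()
areal_quantile = np.quantile(grid_annual_max.mean(axis=(1, 2)), q)

arf = areal_quantile / point_quantile
print(f"{T}-yr ARF over the {ny * nx}-cell area: {arf:.2f}")
```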


SESSION 2A: RIVERINE FLOODING

2A-1 Using HEC-WAT to Evaluate Watershed-Level, Systems-Based Risk
Will Lehmann, Lea Adams, Chris Dunn; USACE, Institute for Water Resources, Hydrologic Engineering Center (HEC)

2A-2 Global Sensitivity Analyses Applied to Riverine Flood Modeling
Vincent Rebour and Claire-Marie Duluc*, Institut de radioprotection et de sûreté nucléaire (IRSN; Radioprotection and Nuclear Safety Institute)

2A-3 Detection and Attribution of Flood Change Across the United States
Stacey A. Archfield*, Water Mission Area, U.S. Geological Survey

In the United States, there has been an increasing number of studies quantifying trends in the annual maximum flood; yet few studies examine trends in floods that may occur more than once in a given year, and even fewer assess trends in floods on rivers that have undergone substantial changes due to urbanization, land-cover change, and agricultural drainage practices. Previous research has shown that, for streamgages having minimal direct human intervention, trends in the peak magnitude, frequency, duration, and volume of frequent floods (floods occurring at an average of two events per year relative to a base period) across the United States show large changes; however, few trends are found to be statistically significant. Current research is extending previous work to provide a comprehensive assessment of flood change across the United States that includes rivers that have experienced confounding alterations to streamflow, in addition to the more commonly used datasets of rivers that have experienced only limited catchment alteration.

2A-4 Bulletin 17C Flood Frequency and Extrapolations for Dams and Nuclear Facilities
John England*, Haden Smith, USACE Risk Management Center; Brian Skahill, USACE ERDC-CHL

Flood frequency guidelines have been published in the United States since 1967 for nationwide flood risk management and flood damage abatement programs. These Guidelines for Determining Flood Flow Frequency have undergone periodic revisions. Bulletin 17C, the latest revision to these flood frequency guidelines, was published in March 2018 and is available at https://doi.org/10.3133/tm4B5.

Bulletin 17C contains statistical procedures using the Expected Moments Algorithm (EMA) and the log-Pearson Type III (LP-III) distribution that efficiently and effectively utilize extreme flood data, including historical, paleoflood, and botanical information. Interval flood data are now included in flood frequency analysis, such as large floods that exceeded some level, floods within a range, and/or floods that were less than some value. The Guidelines include procedures to properly estimate the extreme right-hand tail of the flood frequency distribution by carefully eliminating the influence of small floods or zero values, called Potentially Influential Low Floods. An enhanced focus is on collecting at-site and regional data on extreme floods, including expanding the record in time with data from historical and paleoflood sources. Extraordinary floods, those that are the largest in magnitude at a site or region and substantially exceed other observations, are of critical importance and are included by judicious use of flow perception thresholds and longer time frames than the number of years at a gaging station. The Guidelines include a new section on Frequency Curve Extrapolation that provides guidance on estimating flood probabilities less than 0.01 Annual Exceedance Probability (AEP) that are relevant for dams, levees, and nuclear facilities. In these situations, such as estimating AEPs in the range of 10^-3 to 10^-6, a flexible, three-step approach using multiple lines of evidence is recommended. These three elements are: (1) utilize EMA and LP-III with expanded historical, paleoflood, and extraordinary flood data at the site (temporal information expansion); (2) expand and improve regional skew models (spatial information expansion); and (3) expand and include regional independent information, such as from extreme flood rainfall-runoff models within the watershed or other physical and causal information.

Information can be combined and weighted using quantile variances or other procedures. An overview of this approach is presented with example flood hazard estimates for dam safety.
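
For orientation, the LP-III backbone of these procedures can be sketched with a simple method-of-moments fit in log space; Bulletin 17C's EMA additionally handles interval data, perception thresholds, and regional skew weighting, none of which appear in this toy version.

```python
# Toy log-Pearson Type III quantiles by method of moments on log10 flows.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
peaks_cms = rng.lognormal(mean=6.0, sigma=0.5, size=80)  # synthetic gage peaks

log_q = np.log10(peaks_cms)
m, s = log_q.mean(), log_q.std(ddof=1)
g = stats.skew(log_q, bias=False)    # station skew only (no regional weighting)

for aep in (1e-2, 1e-3, 1e-4):
    z = stats.pearson3.ppf(1.0 - aep, g, loc=m, scale=s)
    print(f"AEP {aep:.0e}: {10 ** z:,.0f} cms")
```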

2A-5 Riverine Paleoflood Analyses in Risk-Informed Decision Making: Improving Hydrologic Loading Input for USACE Dam Safety Evaluations
Keith Kelson*, USACE Sacramento Dam Safety Production Center; Justin Pearce, USACE Risk Management Center; Brian Hall, USACE Dam Safety Modification Mandatory Center of Expertise

USACE dam safety assessments consider potential hazards (loading), system responses (fragility), and associated consequences as part of the risk assessment of its national dam portfolio. Where possible, paleoflood data are incorporated in dam safety evaluations to reduce uncertainties in the hydrologic loading component of the risk assessment. To date, the dam safety assessments have focused primarily on riverine paleostage indicators (PSI) and non-exceedance bounds (NEB), rather than established characterization methods using slackwater deposits, cave-deposit stratigraphy, dendrochronology, or others. For sites deemed justified within the risk-informed decision-making framework, the geomorphic and/or hydrologic conditions commonly are not ideal, whether because of sparsely preserved PSI or NEB, channel geometric non-stationarity, temporal hydrologic non-stationarity, processes that shift stage-discharge relationships (e.g., ice jams, debris blockage), or a host of other possible problems. These challenges are addressed during site selection, in order to eliminate or minimize uncertainties that could diminish applicability of the results to the risk assessment.

Data collection in riverine settings demands an integrated approach among the geomorphic, hydrologic, and hydraulic disciplines. Following initial geomorphic identification of feasible riverine reaches, the peak flood of record (FOR) is defined or estimated using systematic gage data, historical observations, and/or reservoir-inflow calculations. The existing Probable Maximum Flood (PMF) for the site is used or refined, as needed, for the risk assessment. Existing hydraulic models are utilized to develop information on the extents and down-valley elevations of both the FOR and PMF, which are then used to constrain geologic and geomorphic field reconnaissance. Geologic/geomorphic field efforts focus on characterizing flood-related PSI (e.g., high-discharge fluvial terraces, stranded slackwater deposits, biologic features) or non-inundation NEB (e.g., undisturbed alluvial fans, well-developed relict soils), and obtaining numerical or reasonable relative age estimates for the PSI and NEB. For the riverine reaches, the down-valley longitudinal profiles of the PSI and NEB are compared to flood water surface elevations generated from either 1D or 2D HEC-RAS hydraulic modeling of various discharges between the FOR and the PMF. Discharges associated with the PSI and NEB are estimated by comparing the geomorphic field observations, historic hydrologic data, and HEC-RAS hydraulic modeling.

Uncertainties are captured throughout the analysis, and are explicitly calculated or subjectively estimated for inclusion in flow-frequency calculations using Bulletin 17C methodology. For analyses using riverine PSI or NEB, the primary sources of uncertainty are: (1) ranges in estimated ages of PSI and NEB; (2) ranges in roughness and energy gradient for various discharges; (3) water depth and velocity required for sediment deposition (for PSI) or erosion (for NEB); and (4) inherent variability of the water surface and the PSI or NEB geomorphic datum in the down-valley longitudinal profile. These potential uncertainties are captured via sensitivity analyses and/or acknowledged within ranges of paleoflood discharges or timing. The best estimates and ranges in discharge and age for each PSI and NEB are incorporated into flow-frequency statistics through the use of perception thresholds and flow intervals (ranges in discharge).

This approach has assisted multiple recent and ongoing USACE analyses for dam safety. Flow-frequency curves incorporating paleoflood information can be compared to those based solely on systematic and historic information. Often, the paleoflood analysis prompts re-evaluation and improvement of the historical record, which extends the effective record length (ERL) regardless of pre-historic data. The analyses have allowed interpretations that shift flow-frequency curves both up and down, or have provided a basis for narrowing confidence bands on existing curves. At this time, there appears to be no systematic shift toward greater or lesser frequencies of given extreme discharges when comparing flow-frequency curves using paleoflood data and those using only systematic and historic data. While uncertainties can be significant in paleoflood analyses, conscientious site selection, data collection, and analytical efforts allow uncertainties to be minimized and captured for appropriate inclusion in risk-informed hydrologic loading estimates.

2A-6 Improving Flood Frequency Analysis with a Multi-Millennial Record of Extreme Floods on the Tennessee River near Chattanooga, TN
Tess Harden*, Jim O'Connor & Mackenzie Keith, U.S. Geological Survey


SESSION 2B: MODELING FRAMEWORKS

2B-1 Structured Hazard Assessment Committee Process for Flooding (SHAC-F) for Probabilistic Flood Hazard Assessment
Rajiv Prasad^ & Phillip Meyer, Pacific Northwest National Laboratory; Kevin Coppersmith, Coppersmith Consulting

This research project is part of the U.S. Nuclear Regulatory Commission's (NRC's) Probabilistic Flood Hazard Assessment (PFHA) research plan in support of development of a risk-informed analytical approach for flood hazards. Risk-informed approaches require full expression of flood hazards probabilistically. The structured hazard assessment committee process for flooding (SHAC-F) is expected to support reviews of license applications, license amendment requests, and reactor oversight activities. Pacific Northwest National Laboratory is leading the development of SHAC-F. In previous years, we described virtual studies following a Senior Seismic Hazard Analysis Committee (SSHAC) Level 3 process for local intense precipitation (LIP)-generated floods and riverine floods. These studies indicated a need to define the basic aleatory model, adopt explicit characterization of epistemic uncertainties, document all aspects of the hazard assessment, and describe SHAC-F studies progressively from the simplest to the most complex. Lessons learned from these studies contributed to the development of SHAC-F.

SHAC-F is structured at three levels, with the levels defined in terms of the purpose of the assessment. Level 1 SHAC-F studies support screening assessments for structures, systems, and components (SSCs), or binning the flood hazard into significant or non-significant categories. Level 2 studies are appropriate to (1) refine a screening analysis (e.g., where a Level 1 study could not adequately support binning of flood hazards) and (2) update an existing Level 3 assessment. Level 3 assessments support licensing reviews, design reviews, and probabilistic risk assessment (PRA) for new and existing power reactors.

For all three SHAC-F levels, the expected outcome of the study is generation of a family of flood hazard curves appropriate for the purpose of the assessment. Project structures and the roles and responsibilities of team members are clearly defined in SHAC-F. The composition of SHAC-F analysis teams is specific to the needs of flooding analyses and depends on the complexity of the study. At all three SHAC-F levels, an appropriately sized participatory peer review panel oversees the hazard assessment.

Data and methods used for SHAC-F are also defined to be commensurate with the purpose of the study. A SHAC-F Level 1 study uses existing data, possibly for an at-site flood-frequency analysis and/or simplified hydraulics simulation. A Level 1 study may use alternative conceptual models (ACMs) to represent epistemic uncertainty (e.g., various parametric or non-parametric distributions in the case of flood-frequency studies), and may include regionalization and accounting for nonstationarities. A SHAC-F Level 2 study might combine flood-frequency analyses with existing simulation model studies to refine the flood hazard estimation. ACMs in this case could include alternative simulation models that can reasonably represent the flood behavior at the site. A SHAC-F Level 3 study needs to account for spatiotemporal resolution of flood hazard predictions that can support licensing and PRA needs. Existing data can be used in a Level 3 study, but a site-specific, detailed analysis would be needed, with explicit accounting of nonstationarities, to support licensing timeframes. At all levels of SHAC-F, explicit characterization of uncertainties, both aleatory and epistemic, is required.
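
A Level 1-style ACM treatment can be pictured as fitting several defensible distributions to the same record and carrying the spread forward as epistemic uncertainty. The distributions, weights, and data below are illustrative assumptions only:

```python
# Alternative conceptual models (ACMs) for at-site flood frequency (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
peaks = rng.lognormal(mean=5.5, sigma=0.6, size=60)  # synthetic annual peaks

acms = {                       # distribution, hypothetical analyst weight
    "lognormal": (stats.lognorm, 0.4),
    "GEV":       (stats.genextreme, 0.4),
    "Gumbel":    (stats.gumbel_r, 0.2),
}

aep = 1e-3
quantiles = {}
for name, (dist, weight) in acms.items():
    params = dist.fit(peaks)
    quantiles[name] = dist.isf(aep, *params)

weighted_mean = sum(w * quantiles[n] for n, (_, w) in acms.items())
print(quantiles, f"weighted-mean 1E-3 quantile: {weighted_mean:,.0f}")
```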


2B-2 Overview of the TVA PFHA Calculation System
Shaun Carney*, RTI International, Water Resource Management Division, 2950 E. Harmony Road, Suite 390, Fort Collins, CO 80528

Around 2015, the Tennessee Valley Authority (TVA) began a large-scale program to develop inputs to support its Risk-Informed Decision Making process, including estimation of economic and life-loss consequences along with development of hydrologic hazard curves based on Probabilistic Flood Hazard Analysis (PFHA). Through this process, RTI, MGS Engineering, and MetStat helped build a computational framework for performing the PFHA for reservoirs and other critical locations throughout the TVA system.

The framework uses computational modules from the Stochastic Event Flood Model (SEFM) in combination with a suite of hydrologic models and RiverWare operations models. The use of SEFM modules provides flexibility to implement different models within the computational system. MetStat developed a customized storm transposition interface to generate representative storm patterns needed for the stochastic modeling. Other unique aspects of the computational framework include sampling for different storm types, the use of synthetic long-term time series, intelligent sampling and convergence evaluation, both natural and regulated hazard curve development, and advanced data-mining techniques to explore output from tens of thousands of simulations. This presentation will review the unique aspects of the computational framework and will discuss plans for ongoing development.

2B-3 Development of a Risk-Informed Safety Margin Characterization Framework for Flooding of Nuclear Power Plants
M.A. Andre (1), E. Ryan (2,3), S. Prescott (3)*, L. Lin (4), N. Montanari (5), R. Sampath (5), A. Gupta (4), N. Dinh (4), P.M. Bardet (1); (1) George Washington University, Mechanical and Aerospace Engineering; (2) Idaho State University, Nuclear Engineering; (3) Idaho National Laboratory; (4) North Carolina State University, Nuclear Engineering; (5) Centroic Lab

There are six categories of external flooding listed by the NRC's Flood Digest work. Because of site specificities, nuclear power plants have different concerns for flooding. Some flood mitigation methods, such as temporary or permanent flood walls to guard against storm surge or stream and river flooding, have high uncertainties on their effectiveness due to conditions such as debris, wave overtopping, etc. Thus, flood mitigation systems would benefit from validated Risk-Informed Safety Margin Characterization (RISMC) methodologies. Hence, a computationally efficient and flexible 3D fluid-structure interaction (FSI) Smoothed-Particle Hydrodynamics (SPH) code, NEUTRINO, is being validated in this context with existing (published) datasets as well as with a dedicated experimental facility. The experimental facility is a large sloshing tank (6 m x 2.4 m x 1.2 m) where a variety of scenarios can be tested. Large impact forces can be generated in a controllable and repeatable manner, and a broad range of diagnostics can be deployed. The code and facility will be presented, as well as the organization and collaboration of the numerical and experimental teams, who are working closely to define the experimental and numerical campaign to validate NEUTRINO and its use in RISMC. Additionally, this work is being used to demonstrate an uncertainty propagation methodology when using multiple sources of data.

SESSION 2C: POSTER SESSION


SESSION 3A: CLIMATE AND NON-STATIONARITY

3A-1 KEYNOTE: Hydroclimatic Extremes Trends and Projections: A View from the Fourth National Climate Assessment
Kenneth Kunkel*, North Carolina State University

This presentation will summarize the findings of the Fourth National Climate Assessment as well as recent work by the presenter. Extreme precipitation has increased overall in the U.S. There has been regional variability in the trends, with large increases in the eastern half of the U.S. and small changes in the far western U.S. Future global warming is highly likely to cause further increases in extreme precipitation because atmospheric water vapor concentration will increase as surface ocean waters warm. Analysis of historical precipitation extremes provides a basis for this projection; the magnitude of extreme precipitation events is positively correlated with atmospheric water vapor content. Future changes in floods are much less certain because floods depend on a number of factors in addition to the magnitude of extreme precipitation, such as antecedent soil moisture and flood type.

3A-2 Regional Climate Change Projections: Potential Impacts to Nuclear Facilities
L. Ruby Leung* & Rajiv Prasad, Pacific Northwest National Laboratory

This project is part of the U.S. Nuclear Regulatory Commission's (NRC's) Probabilistic Flood Hazard Assessment (PFHA) research plan that aims to build upon recent advances in deterministic, probabilistic, and statistical modeling to develop PFHA regulatory tools and guidance. To provide improved understanding of large-scale climate patterns and associated changes in the context of PFHA, this project provides a literature review focusing on recent studies of the mechanisms of and changes in climate parameters in the future, including discussions of the robust and uncertain aspects of the changes and future directions for reducing uncertainty in projecting those changes. During the first year, the project reviewed various aspects of climatic changes nationwide; the second year focused on more detailed changes in the southeastern U.S.; and the third year focused on changes in the Midwestern U.S.

The current focus, during the fourth year, is on the northeast region and on updating the first three years' reports. The Midwest region consists of eight states (Minnesota, Wisconsin, Michigan, Iowa, Missouri, Illinois, Indiana, and Ohio). The third-year report discusses observed historical changes and projected future changes, drawing on major reports from the National Climate Assessment and peer-reviewed journal papers. Overall, mean and annual 5-day maximum temperatures are projected to increase. With increasing moisture accompanying the warmer temperatures, precipitation is projected to increase in the cool season, but the changes in warm-season precipitation are not statistically significant. Despite inconsistency in mean precipitation changes across the seasons, extreme precipitation (99th percentile) is projected to increase by more than 10% and 30% by the end of the 21st century under the RCP4.5 and RCP8.5 emissions scenarios, respectively. Consistent with observational evidence, a regional climate modeling study at 4-km resolution projected more than a tripling in the frequency of intense summertime mesoscale convective systems. Lake-effect snow storms are projected to increase, as reduction of the surface area of lake ice with warming increases evaporation from the surface; larger warming farther into the future may shift snowfall into rain. The Great Lakes water levels have exhibited large variability historically. Models projected small variations in the lakes' levels with a large range of uncertainty, possibly indicating incomplete understanding of the lakes' water budget.

Observational records show strong evidence of increasing flood frequency but limited evidence of increasing flood magnitudes. With the projected increase in extreme precipitation and storm events, flooding is projected to increase notably in the future. Both land use and climate changes affect streamflow, with greater effects from the latter. The fourth-year report on the northeastern U.S. is being developed to include a literature review on historical and future changes in mean temperature, heat waves and wet-bulb temperature, mean and extreme precipitation, extratropical storms, heavy snowfall, severe storms and strong winds (tropical cyclones, severe convective storms, lake-effect snow storms), sea level rise, storm surge, nuisance flooding, Lake Ontario water level, flood and drought, and stream temperature.

3A-3 Role of Climate Change/Variability in the 2017 Atlantic Hurricane Season
Young-Kwon Lim* (1,2), Siegfried Schubert (1,3), Robin Kovach (1,3), Andrea Molod (1), Steven Pawson (1); (1) NASA Goddard Space Flight Center, Global Modeling and Assimilation Office; (2) Goddard Earth Sciences, Technology, and Research / I. M. Systems Group; (3) Science Systems and Applications, Inc.

The 2017 Atlantic hurricane season was extremely active, with six major hurricanes, the third most on record. The sea-surface temperatures (SSTs) over the eastern Main Development Region (EMDR), where many tropical cyclones (TCs) developed during the active months of August/September, were ~0.96°C above the 1901-2017 average (warmest on record): about ~0.42°C from a long-term upward trend, with the rest (~80%) attributed to the Atlantic Meridional Mode (AMM). The contribution to the SST from the North Atlantic Oscillation (NAO) over the EMDR was a weak warming, while that from the El Niño-Southern Oscillation (ENSO) was negligible. Nevertheless, ENSO, the NAO, and the AMM all contributed to favorable wind shear conditions, while the AMM also produced enhanced atmospheric instability.

Compared with the strong hurricane years of 2005/2010, the ocean heat content (OHC) during 2017 was larger across the tropics, with higher SST anomalies over the EMDR and Caribbean Sea. On the other hand, the dynamical/thermodynamical atmospheric conditions, while favorable for enhanced TC activity, were less prominent than in 2005/2010 across the tropics. The results suggest that unusually warm SST in the EMDR together with the long fetch of the resulting storms in the presence of record-breaking OHC may be key factors in driving the strong TC activity in 2017.

SESSION 3B: FLOOD PROTECTION AND PLANT RESPONSE

3B-1 Qualitative Risk Ranking Process of External Flood Penetration Seals
Marko Randelovic*, EPRI

Preventing water from entering areas of NPPs that contain significant safety components is the function that various flood-protection components serve across the industry. Several types of flood barriers, both permanent and temporary, are used at NPPs. These barriers include external walls, flood doors, and flood barrier penetration seals (FBPSs) that allow cables, conduits, cable trays, pipes, ducts, and other items to pass between different areas in the plant. Comprehensive guidance on the design, inspection, and maintenance of flood-protection components has been assembled in EPRI's technical report, the Flood Protection System Guide. This document includes information related to these topics for a variety of flood-protection components, while focusing specifically on FBPSs. The NRC-RES has initiated a project to develop testing standards and protocols to evaluate the effectiveness and performance of seals for penetrations in flood-rated barriers at nuclear power plants. EPRI is currently developing a qualitative risk ranking process for plants to categorize, or risk-rank, installed penetration seals according to the likelihood and consequence of seal failure(s), considering various metrics regarding seal condition, design, and location. In addition to identifying potentially risk-significant FBPSs for prioritization of surveillance and/or replacement, plants performing an external flood probabilistic risk assessment (PRA) may use this process to identify which penetrations may need to be explicitly modeled in the PRA. The intent of this guidance is to provide a process to categorize and rank penetration seals with regard to the likelihood of failure and the significance of a loss of the penetration sealing capability.
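
In spirit, such a process can be reduced to a likelihood-consequence matrix; the bins, scores, and seal records below are hypothetical stand-ins, not EPRI's actual metrics.

```python
# Hypothetical qualitative risk ranking of flood barrier penetration seals.
LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}   # from condition/design/age
CONSEQUENCE = {"low": 1, "medium": 2, "high": 3}  # from flooded-area impact

def risk_rank(seals):
    """Sort seals by a simple likelihood x consequence score, highest first."""
    scored = [(s["id"], LIKELIHOOD[s["likelihood"]] * CONSEQUENCE[s["consequence"]])
              for s in seals]
    return sorted(scored, key=lambda item: item[1], reverse=True)

seals = [
    {"id": "FBPS-101", "likelihood": "high",   "consequence": "medium"},
    {"id": "FBPS-214", "likelihood": "low",    "consequence": "high"},
    {"id": "FBPS-307", "likelihood": "medium", "consequence": "low"},
]
for seal_id, score in risk_rank(seals):
    print(seal_id, score)  # top scores -> candidates for explicit PRA modeling
```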

3B-2 Results of Performance of Flood-Rated Penetration Seals Tests
William (Mark) Cummings*, Fisher Engineering, Inc. (formerly Fire Risk Management, Inc.)

Overall risk analyses of nuclear power plants (NPPs) include the need for protection against potential flooding events, both internal and external. Typically, a primary means of mitigating the effects of a flooding event is to construct flood-rated barriers that isolate areas of the plant to prevent the intrusion or spread of flood waters. Any penetrations through flood-rated barriers to accommodate piping, cabling, etc., must be properly protected to maintain the flood resistance of the barrier. Numerous types and configurations of seal assemblies and materials are being used at NPPs to protect penetrations in flood-rated barriers. However, no standardized methods or testing protocols exist to evaluate, verify, or quantify the performance of these, or any newly installed, flood seal assemblies. In FY2016, the NRC implemented a research program to develop a set of standard testing procedures that will be used to evaluate and quantify the performance of any penetration seal assembly that is, or will be, installed in flood-rated barriers. Although this presentation summarizes the project, which was completed in September 2018, its primary focus is the results of testing of candidate seal assemblies using the draft Test Protocol, performed in August 2018. The test results were used to evaluate whether changes/updates to the Test Protocol were needed.

3B-3 Modeling Overtopping Erosion Tests of Zoned Rockfill Embankments
Tony Wahl^, U.S. Bureau of Reclamation

A 3-ft-high physical model of a zoned rockfill embankment dam was tested in the Bureau of Reclamation hydraulics laboratory to gain a better understanding of erosion and dam breach processes associated with overtopping flow. Erosion rates of the model embankments were evaluated from visual records, and erodibility parameters of the soils were compared to small-scale submerged jet erosion tests performed on the test embankments and other compacted soil samples.

This presentation discusses attempts to simulate the test using two computational dam breach models, WinDAM C and DL Breach. The idealized erosion process frameworks of each model are presented and compared to one another and to the erosion processes observed in the physical model test.


SESSION 3C: TOWARDS EXTERNAL FLOODING PRA

3C-1 External Flooding PRA Walkdown Guidance
Marko Randelovic*, EPRI; Andrew Miller*, Jensen Hughes

As a result of the accident at Fukushima Dai-ichi, the need to understand and account for external hazards (both natural and man-made) has become more important to the industry. A major cause of the loss of AC power at Fukushima Dai-ichi was a seismically induced tsunami that inundated the plant's safety-related systems, structures, and components (SSCs) with flood water. As a result, many nuclear power plants (NPPs) have reevaluated their external flooding (XF) hazards to be consistent with current regulations and methodologies. As with all new information obtained from updating previous assumptions, inputs, and methods (AIMs), the desire exists to understand the changes in the characterization of the XF hazard and the potential impact to the plant's overall risk profile. This has led to an increased need to develop comprehensive External Flooding Probabilistic Risk Assessments (XFPRAs) for more NPPs. One of the steps in developing an XFPRA is the plant walkdown, which is the central focus of this research. This research provides guidance on preparing for and conducting XF walkdowns to gather the necessary information to better inform the XFPRA process. Major topics that will be addressed include defining key flood characteristics, pre-walkdown preparation, performing the initial walkdown, identifying the need for refined assessments or walkdowns, and documenting the findings in a notebook. This guidance also addresses walkdown team composition, useful plant drawings, and utilizing previous walkdowns or PRAs to inform the XF walkdown process.

3C-2 Updates on the Revision and Expansion of the External Flooding PRA Standard
Michelle (Shelby) Bensi*, University of Maryland

Recently, a significant effort has been undertaken to revise the ANS/ASME external flooding PRA standard. The draft standard is currently being reviewed and revised through the standards development process. The proposed revision is significantly more detailed than the existing standard, reflects lessons learned since the last revision, and recognizes the current state of practice. This presentation will share insights from the development of the proposed revision as well as other recent experience with various aspects of external flooding PRA.

3C-3 Update on ANS-2.8: Probabilistic Evaluation of External Flood Hazards for Nuclear Facilities
Ray Schneider*, Westinghouse


3C-4 Qualitative PRA Insights from Operational Events of External Floods and Other Storm-Related Hazards
Nathan Siu, Ian Gifford*, Zeechung (Gary) Wang, Meredith Carr and Joseph, NRC/RES
