ML20080M170

PFHA-2020-Abstracts
Issue date: 02/19/2020
From: Office of Nuclear Regulatory Research
To: Office of Nuclear Regulatory Research
T. Aird
Shared Package: ML20080M135
Download: ML20080M170 (18)



5th Annual Probabilistic Flood Hazard Assessment Research Workshop
NRC Headquarters, Rockville, MD, February 19-21, 2020

ABSTRACTS

1B-1 An Overview of Past and Future Hydrometeorological Changes in the Northeastern United States

L. Ruby Leung and Rajiv Prasad, Pacific Northwest National Laboratory

As part of the U.S. Nuclear Regulatory Commission's (NRC) Probabilistic Flood Hazard Assessment (PFHA) research plan to develop regulatory tools and guidance that support and enhance the NRC's capacity to perform thorough and efficient reviews of license applications and license amendment requests, this study summarizes the current state of climate research and its results regarding hydrometeorological phenomena of interest in safety assessments and environmental impact assessments for commercial nuclear power plants. This presentation will focus on region-specific scientific findings about climate change for the northeast region. Drawing primarily from the National Climate Assessment (NCA) reports and peer-reviewed literature, we will briefly review the observed climate, its past changes, and its projected changes, as well as 21st-century hydrologic impacts in the northeast region. The northeast region exhibited long-term warming trends in all seasons in the 20th century. Warming is projected to continue, with greater warming in winter and summer than in spring and fall. Annual mean precipitation and extreme precipitation showed long-term increasing trends in the 20th century.

Precipitation is projected to increase particularly in winter and spring while changes in summer are not significant. North Atlantic hurricanes are projected to increase in intensity, rainfall, and storm size.

Projections of extratropical cyclone activity changes remain uncertain, but theory suggests that convection associated with extratropical cyclones will become more vigorous even if overall extratropical cyclone activity decreases. With warmer temperatures and more moisture, an increase in mesoscale convective system track density and intensity is projected for the mid-Atlantic/northeast region. The northeast region has been a hotspot of accelerated sea-level rise in recent decades. Sea-level rise in the region is projected to be among the highest for cities worldwide due to weakening of the Atlantic Meridional Overturning Circulation. Combining increases in tropical cyclone intensity with sea-level rise, storm surge is projected to increase in the future, but a shift of cyclone tracks offshore may cancel the effect of increased storm intensity, resulting in little change in storm surge. As warming increases, the ratio of snow to total precipitation is declining, and the center-of-volume date for winter-spring streamflow is shifting earlier in the year. Together these changes are affecting the seasonality of streamflow in the tributaries of Lake Ontario and show a marked dependence on the latitude of the tributary drainage area. Recent efforts point to promising approaches using more spatially explicit models over the entire Lake Ontario drainage basin for streamflow simulations.

1B-2 Modeling of Climate Change Induced Flood Risk in the Conasauga River Basin

Tigstu T. Dullo,1 Sudershan Gangrade,2 Md Bulbul Sharif,1 Mario Morales-Hernandez,2 Alfred J. Kalyanapu,1 Sheikh K. Ghafoor,1 Shih-Chieh Kao,2 and Katherine J. Evans2; 1 Tennessee Tech University; 2 Oak Ridge National Laboratory

The goal of this study is to evaluate the potential impacts of climate change on flood regimes and infrastructure at high spatial resolution through coupled hydrologic-hydraulic models. The hydrologic simulations are conducted using the high-resolution Distributed Hydrology Soil Vegetation Model (DHSVM) driven by (1) 1981-2012 Daymet meteorological observations, and (2) 11 sets of downscaled Coupled Model Intercomparison Project phase 5 (CMIP5) global climate model projections covering 40 years in the historical period (1966-2005) and 40 years in the future (2011-2050). Flood simulations are performed using a graphics processing unit (GPU)-accelerated hydraulics model (TRITON) that solves the full 2D shallow water equations using a new finite-volume numerical scheme. The TRITON model is first evaluated for its sensitivity to several model inputs, namely the digital elevation model, Manning's roughness, and initial conditions. Then, TRITON's performance is assessed by comparison with existing Federal Emergency Management Agency flood inundation maps. Finally, the verified flood model is used to simulate 912 annual maximum streamflow events at 10 m spatial resolution for an ensemble-based flood risk evaluation. The flood simulation results are used to evaluate changes in flood regimes and to assess the vulnerability of infrastructure in a changing climate.
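The shallow water equations that models like TRITON solve can be illustrated in one dimension. The sketch below is a minimal first-order Lax-Friedrichs finite-volume solver applied to an idealized dam break; the grid, depths, and scheme are illustrative assumptions, not TRITON's actual numerics.

```python
import math

def lax_friedrichs_swe(h, hu, dx, t_end, g=9.81, cfl=0.4):
    """March the 1-D shallow water equations with a first-order
    Lax-Friedrichs finite-volume scheme.  Boundary cells are held
    fixed, which is fine while the waves stay inside the domain."""
    h, hu = h[:], hu[:]
    n, t = len(h), 0.0
    while t < t_end:
        # CFL-limited time step from the fastest wave speed |u| + sqrt(g h)
        smax = max(abs(q / d) + math.sqrt(g * d) for d, q in zip(h, hu))
        dt = min(cfl * dx / smax, t_end - t)
        if dt <= 1e-12:
            break
        # physical fluxes F(U) = (hu, hu^2/h + g h^2 / 2)
        f1 = hu[:]
        f2 = [q * q / d + 0.5 * g * d * d for d, q in zip(h, hu)]
        hn, hun = h[:], hu[:]
        for i in range(1, n - 1):
            # Lax-Friedrichs interface fluxes on either side of cell i
            fr1 = 0.5 * (f1[i] + f1[i + 1]) - 0.5 * dx / dt * (h[i + 1] - h[i])
            fl1 = 0.5 * (f1[i - 1] + f1[i]) - 0.5 * dx / dt * (h[i] - h[i - 1])
            fr2 = 0.5 * (f2[i] + f2[i + 1]) - 0.5 * dx / dt * (hu[i + 1] - hu[i])
            fl2 = 0.5 * (f2[i - 1] + f2[i]) - 0.5 * dx / dt * (hu[i] - hu[i - 1])
            hn[i] = h[i] - dt / dx * (fr1 - fl1)
            hun[i] = hu[i] - dt / dx * (fr2 - fl2)
        h, hu, t = hn, hun, t + dt
    return h, hu

# idealized dam break: 2 m of still water upstream, 1 m downstream
N = 200
h0 = [2.0 if i < N // 2 else 1.0 for i in range(N)]
h, hu = lax_friedrichs_swe(h0, [0.0] * N, dx=1.0, t_end=5.0)
```

A production model adds bed slope and Manning friction source terms, wetting/drying treatment, and a less diffusive flux scheme, but the update structure is the same.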

1B-3 KEYNOTE: Causality and extreme event attribution. Or, was my house flooded because of climate change?

Michael F. Wehner, Lawrence Berkeley National Laboratory

Extreme event attribution is an exercise in causality. Rather than a deep philosophical statement, an attribution statement is a probabilistic one. However, it is also a conditional statement and is incomplete if the conditions and uncertainties are not clearly specified. We will review a hierarchy of extreme event attribution statement types and their uncertainties, ranging from those with very few conditions to those that are highly constrained. Real-world examples will include interpretations of recent attribution statements about Hurricanes Harvey, Maria, and Irma. In particular, we will explore the confidence in the human-induced portion of Hurricane Harvey's record rainfall and flooding in the greater Houston area by examining five independent analyses.

1B-4 Attribution of Flood Nonstationarity across the United States: Climate-Related Analyses

Karen R. Ryberg, Stacey A. Archfield, William H. Asquith, Nancy A. Barth, Katherine J. Chase, Jesse E. Dickinson, Robert W. Dudley, Angela E. Gregory, Tessa M. Harden, Glenn A. Hodgkins, David Holtschlag, Delbert Humberson, Christopher P. Konrad, Sara B. Levin, Daniel E. Restivo, Roy Sando, Steven K. Sando, Eric D. Swain, Anne C. Tillery, Benjamin C. York, and Julie E. Kiang, U.S. Geological Survey

As a statistical method, flood-frequency analysis has fundamental underlying assumptions, including the assumption that floods are generated by stationary processes (a constant mean and variance over time). Observed changes in precipitation and temperature patterns, along with continued human modification of the natural landscape, such as dams, agricultural drain tiles, and the expansion of irrigation, can affect flood-frequency analysis and make its estimates increasingly questionable for some sites or time periods. Yet flood-frequency analysis remains critical for the

• indicates speaker, ^ indicates remote speaker

appropriate sizing and construction of flood-control infrastructure and for informing decisions related to risk reduction. As part of a multi-year project funded by the U.S. Federal Highway Administration, trends and change points (nonstationarities that violate the assumptions of flood-frequency analysis) in annual peak streamflow data across the conterminous United States were identified. Then, a team of regional experts attributed these changes, where possible, to anthropogenic and environmental factors for which there are long-term data. Once the anthropogenic or environmental changes causing these nonstationarities are better understood, analysts can begin to make choices about the best approaches for adjusting flood-frequency analyses. This presentation focuses on the climate-related analyses undertaken by the regional experts.
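A common screen for the kind of trend described above is the nonparametric Mann-Kendall test on the annual-peak series. The sketch below (normal approximation, ties ignored, synthetic example data) illustrates the idea; it is not the USGS team's actual toolchain.

```python
import math

def mann_kendall(series):
    """Two-sided Mann-Kendall trend test.  Returns the S statistic and
    an approximate p-value from the normal approximation (ties ignored)."""
    n = len(series)
    # S counts concordant minus discordant pairs over all i < j
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    # two-sided p-value from the standard normal CDF
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return s, p

# hypothetical annual peak flows with an upward drift plus oscillation
peaks = [1000 + 8 * yr + 150 * math.sin(1.7 * yr) for yr in range(60)]
s, p = mann_kendall(peaks)
```

A significant p-value flags nonstationarity but says nothing about its cause; the attribution step described in the abstract is the harder part.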

1C-1 KEYNOTE: Planned Improvements for NOAA Atlas 14 Process and Products

Sanja Perica1, Sandra Pavlovic1,2, Michael St. Laurent1,2, Carl Trypaluk1,2, Dale Unruh1,2; 1 Hydrometeorological Design Studies Center, Office of Water Prediction, NWS, NOAA, Silver Spring, MD; 2 University Corporation for Atmospheric Research, Boulder, CO

Since 2004, the Hydrometeorological Design Studies Center (HDSC) of the National Oceanic and Atmospheric Administration's (NOAA) Office of Water Prediction has been updating outdated precipitation frequency estimates from the 1950s, 1960s, and 1970s for U.S. states and affiliated territories in NOAA Atlas 14. NOAA Atlas 14 estimates are used for a variety of infrastructure design and planning activities under federal, state, and local regulations and are available for download from the Precipitation Frequency Data Server.

Funding for NOAA Atlas 14 work dictates that updates are done in volumes based on state boundaries.

Volumes are typically produced in a serial workflow stretching over many years. This approach raises concerns about data continuity and currency across volumes. Ideally, a well-defined, consistent, and reliable funding approach would be established so that estimates can be updated simultaneously for the whole country on a 10-15 year cycle.

For future updates, HDSC proposes to develop an enhanced suite of products that will, in addition to current products, also have design storms, areal precipitation frequency estimates and confidence intervals of variable widths. All products will be produced using a newly developed non-stationary NOAA Atlas 14 frequency analysis modeling approach that can characterize the uncertainty due to non-stationary climate. As part of that effort, HDSC has been investigating the feasibility of incorporating future climate projections into the process and assessing the added value of new estimates with respect to traditional NOAA Atlas 14 estimates. Initial results indicate several issues that require further investigation; they will be discussed in the presentation.

1C-2 Application of Point Precipitation Frequency Estimates to Watersheds

Shih-Chieh Kao and Scott T. DeNeale, Oak Ridge National Laboratory

To support probabilistic flood hazard assessment (PFHA), probabilistic estimates of areal extreme rainfall depth across various watershed sizes are required. Areal reduction factors (ARFs) have been widely used in hydrologic and hydraulic modeling applications to convert point-based precipitation frequency estimates (such as those in NOAA Atlas 14) to watershed-scale precipitation frequency estimates. In turn, these watershed-scale estimates are used to simulate extreme flood stage and discharge for infrastructure design and engineered-system operation. The use of ARFs is necessary because high spatiotemporal resolution precipitation observations with long period
of record, which are needed for accurate areal rainfall frequency estimation, are generally lacking and do not allow an appropriate characterization of the associated spatial rainfall patterns.

However, compared to modern precipitation frequency analysis products (e.g., NOAA Atlas 14), progress on ARF development in the U.S. has been relatively slow, and the TP-29 ARFs published in the 1950s are still used in practice today. To improve understanding of ARF variability across different precipitation products, ARF models, return periods, geographic locations, and seasons, this study conducts a comprehensive review of recent ARF research. The report summarizes potential precipitation products for ARF applications and provides case studies demonstrating the derivation of ARFs in several selected hydrologic regions of the U.S. Based on the results, ARF characteristics and PFHA application challenges are also summarized.

Our overall findings are in line with the available literature, which suggests that ARFs decrease with increasing area, increase with increasing duration, and decrease with increasing return period. The results also show the importance of the precipitation data source and the ARF fitting method, both of which contribute to ARF uncertainty. In particular, we find that data length plays an important role in ARF estimation, especially for longer return periods (e.g., greater than 100 years). The study demonstrates the need to improve ARFs with new data and methods to obtain more reliable areal extreme precipitation estimates for PFHA applications.
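The fixed-area ARF concept can be illustrated with a toy simulation: generate spatially correlated daily rainfall over a grid of cells, then compare the annual maxima of the areal-average rainfall with the annual maxima at individual points. The grid size and rainfall distributions below are invented for illustration; operational ARFs are derived from observed or modeled precipitation with more careful frequency matching.

```python
import random, statistics

random.seed(1)
NCELL, NYEARS, NDAYS = 25, 100, 365

point_ann_max, areal_ann_max = [], []
for _ in range(NYEARS):
    cell_max = [0.0] * NCELL          # running annual max at each point
    areal_max = 0.0                   # running annual max of the areal mean
    for _ in range(NDAYS):
        common = random.expovariate(1 / 10.0)   # shared regional storm depth (mm)
        cells = [max(0.0, common + random.gauss(0.0, 6.0)) for _ in range(NCELL)]
        cell_max = [max(a, b) for a, b in zip(cell_max, cells)]
        areal_max = max(areal_max, statistics.mean(cells))
    point_ann_max.append(statistics.mean(cell_max))  # average of the point maxima
    areal_ann_max.append(areal_max)

# annual-maxima ARF: areal extreme relative to the typical point extreme
arf = statistics.mean(areal_ann_max) / statistics.mean(point_ann_max)
```

The ratio comes out below 1 because areal averaging smooths the localized peaks that dominate point maxima, which is exactly why applying point estimates directly to a watershed overstates areal rainfall.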

One objective of this study is to assist the NRC in assessing different classes of fixed-area ARF methods, in conjunction with available rainfall datasets, to support the development of guidance for PFHA applications. The results of this study are for demonstration purposes only and are not intended for ARF application. Additional research and development, with thorough quality assurance and control, should be performed to develop a reliable national ARF product suitable for PFHA application.

1C-3 How well can Kilometer-Scale Models Capture Recent Intense Precipitation Events?

Andreas F. Prein, David Ahijevych, Jordan Powers, Ryan Sobash, Craig Schwartz, National Center for Atmospheric Research (NCAR)

Planning for floods that are associated with very rare and intense precipitation events is challenging due to short observational records and changing climate conditions. Furthermore, traditionally used estimators of extreme precipitation, such as Probable Maximum Precipitation (PMP), do not allow quantification of uncertainties in hazard estimates in either a physical or a risk sense.

Recent advances in atmospheric modeling and computational science offer a promising way forward, since they allow the simulation of intense precipitation events in unprecedented detail. These so-called convection-permitting models (CPMs) can explicitly simulate thunderstorms and accurately represent orography on fine scales, and thus are powerful tools for investigating extreme precipitation events. Here we will assess how well recent flood-producing rainfall events can be captured by CPMs in different climate regions east of the U.S. continental divide. We use multi-sensor high-resolution precipitation datasets for model evaluation and will assess the impact of model horizontal grid spacing (3 km and 1 km), initial conditions (up to 10 ensemble members), and observational uncertainties. Comparing results for different types of storms (tropical cyclones, mesoscale convective systems, and orographic precipitation) provides a broad overview of the skill and deficiencies of state-of-the-art CPMs and allows insights into how applicable they are for flood risk assessment.


1C-4 Probabilistic Flood Hazard Assessment of an NPP Site Considering Extreme Precipitation in Korea

Kun-Yeun Han, Beomjin Kim, Kyungpook National University, Daegu, Korea; Minkyu Kim, Korea Atomic Energy Research Institute, Daejeon, Korea

The Probable Maximum Precipitation (PMP) considering the climate change scenarios RCP4.5 and RCP8.5 is computed and compared with the probability rainfall to estimate the Local Intense Precipitation (LIP) at a nuclear power plant (NPP) site in Korea. Detailed topographic data with a high-resolution DEM are constructed, and the effects of buildings, roads, and curbs at the NPP site are analyzed through several walkdowns.

To evaluate the external flooding risk at an NPP, hydrologic and hydraulic analyses are performed. For the external flood hazard analysis, a 2D hydrodynamic simulation is carried out considering the LIP and tidal level conditions. Based on the 2D simulation results, flood hazard curves are developed relating inundation depth to frequency and rainfall duration. Internal flooding of structures, systems, and components (SSCs) caused by external flooding of the major facilities is also evaluated. The results of this study are expected to provide a basis for waterproofing design and for planning various flood prevention measures at NPP sites.
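A flood hazard curve of the kind described pairs inundation depths with annual exceedance frequencies. A minimal sketch, assuming hypothetical simulated (frequency, depth) pairs and log-linear interpolation between them:

```python
import math

# hypothetical (annual exceedance frequency, inundation depth in m) pairs
# taken from 2-D simulations of design events at a site
hazard_pts = [(1e-2, 0.15), (1e-3, 0.60), (1e-4, 1.10), (1e-5, 1.45), (1e-6, 1.70)]

def depth_exceedance_freq(depth, curve):
    """Annual exceedance frequency of a given depth, by log-linear
    interpolation between the tabulated hazard-curve points."""
    pts = sorted(curve, key=lambda fd: fd[1])           # ascending depth
    for (f1, d1), (f2, d2) in zip(pts, pts[1:]):
        if d1 <= depth <= d2:
            w = (depth - d1) / (d2 - d1)
            return math.exp((1 - w) * math.log(f1) + w * math.log(f2))
    raise ValueError("depth outside the tabulated hazard curve")

aef_1m = depth_exceedance_freq(1.0, hazard_pts)   # between 1e-3 and 1e-4 per year
```

In practice a family of such curves, one per rainfall duration, is produced, and the fragility of individual SSCs is then convolved with the hazard curve to get flooding risk.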

Keywords: External Inundation; LIP; Hazard Curve; PFHA of NPP Sites

Acknowledgement: This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. 2017M2A8A4015290).

1C-5 An Analysis of Heavy Multi-day Precipitation Events in CMIP6 Model Simulations in Support of the Fifth National Climate Assessment

Kenneth Kunkel, North Carolina Institute for Climate Studies, North Carolina State University; David R. Easterling, NOAA National Centers for Environmental Information

The Third U.S. National Climate Assessment (NCA3) mainly used climate scenarios generated from the CMIP3 suite of climate model simulations, with some generated using CMIP5. The Fourth U.S. National Climate Assessment used the CMIP5 suite of simulations and also used the Localized Constructed Analogs (LOCA) statistical downscaling dataset to generate scenarios of extremes, such as changes in heavy precipitation events. With the CMIP6 suite of simulations becoming available, the intent of the U.S. Global Change Research Program is to utilize these simulations as much as possible for the Fifth U.S. National Climate Assessment. One major question raised during the NCA process is how extremes, such as hydrological extremes, have changed and will change in the future. Here we examine heavy precipitation events by finding the largest multi-day events for various-sized areas (e.g., 50,000 km2) in the observed record for the eastern United States, then examine the simulations directly from CMIP5, two statistical downscaling methods driven by CMIP5 simulations, and two simulations from CMIP6 for their ability to produce similar precipitation events. Second, we examine the ability of both the models and the downscaling methods to reproduce the observed spatial coherence of point precipitation amounts across the simulated precipitation events.


2A-1 KEYNOTE: An Overview of NOAA's National Water Model

Brian A. Cosgrove1, David Gochis2, Thomas Graziano1, Ed Clark3, and Trey Flowers3; 1 Office of Water Prediction, National Weather Service, NOAA, Silver Spring, MD; 2 Research Applications Laboratory, National Center for Atmospheric Research, Boulder, CO; 3 National Water Center, Office of Water Prediction, National Weather Service, NOAA, Tuscaloosa, AL, USA

The National Water Model (NWM) has been running in National Weather Service (NWS) operations since August 2016. Producing 24x7 guidance on streamflow, soil moisture, snowpack, and other hydrologic components, the NWM supports the operational activities of NWS River Forecast Centers, the Federal Emergency Management Agency, and other government entities, along with the research and commercial sectors. Based on the community WRF-Hydro software architecture, it has been rapidly upgraded via a partnership among the NWS Office of Water Prediction (OWP), the National Center for Atmospheric Research, and the National Centers for Environmental Prediction. Like prior versions, V2.0, implemented in June 2019, is underpinned by a network of 2.7 million vector-type river reaches for river routing, a 1-km grid for land surface modeling, and a 250-m grid for surface and subsurface routing of runoff. This latest operational version builds on prior capabilities to provide improved accuracy and first-time ensemble forecast guidance. Additionally, the NWM's expansion to Hawaii marks the first provision of operational streamflow guidance for this island domain.

Following V2.0, V2.1 will be implemented into operations in early 2021. This significant upgrade will include the assimilation of reservoir outflow data, which will greatly improve the accuracy of downstream forecasts. Domain expansion will continue with the inclusion of the Great Lakes drainage area, along with Puerto Rico and the U.S. Virgin Islands. Additionally, calibration will be improved through the use of forcings from the new Analysis of Record for Calibration. Improving upon the existing 25-year NWM retrospective, this same forcing dataset will be used to underpin a new 40-year retrospective simulation.

Looking further into the future, subsequent versions will contain upgrades needed to support a variety of additional activities within the NWS and the broader hydrologic community. These include a model extension to simulate the combined impact of freshwater and coastal flooding, a shallow groundwater model, and an improved NextGen collaborative development infrastructure.

This presentation will provide a brief history of the NWM, give a general overview of the current system, and cover current and emerging NWM products. It will highlight recent and planned NWM upgrades, along with updates on community development and other hydrologic activity areas.

2A-2 Moving Beyond Streamflow: Quantifying Flood Risk and Impacts through Detailed Physical Process and Geospatial Representation Using the WRF-Hydro Modeling System

David Gochis, Aubrey Dugger, Laura Read, National Center for Atmospheric Research

Operational flood and flash flood prediction models, such as the NOAA National Water Model, offer a stable, reliable forecasting service providing complete continental coverage on a 24/7/365 basis. However, various requirements and constraints associated with operational systems can limit their tailoring to specific types of water-related risks, particularly when it comes to understanding changes in future flooding risk. Dynamics associated with changing weather and climate patterns, changing sea levels, and changing land cover/land use conditions can drive dramatic changes in flood risk and often need to be characterized using a risk-based approach. In this presentation we will present a number of different configurations and applications of the community WRF-Hydro modeling system that
demonstrate the system's capability to provide meaningful information regarding water-related environmental risks.

2A-3 Extreme Flood Hazard Assessment: Overview of a Probabilistic Methodology and Its Implementation for a Swiss River System

V.N. Dang, C.A. Whealton, Paul Scherrer Institute

The project Extreme Floods in the Aare River (EXAR) has recently been completed in Switzerland [1]. The main objectives of this project were to: 1) provide a simulated hydrological dataset that can be used for flood hazard analysis throughout the Aare River Basin, 2) account for processes that are induced by or correlated with flooding events in the hazard analysis, and 3) implement the methodology for multiple sites to assess the flood hazard in the frequency range of 1E-3 to 1E-7/a.

The simulated hydrological dataset was taken as the basis for assigning frequencies to flows with exceedance frequencies below 1E-3/a. These data resulted from a modeling chain comprising a weather generator (GWEX), a runoff model (HBV), and a simplified routing model of the river system that routes the flows from each catchment into the main river (RS Minerve). The result was approximately 300,000 years of hourly flow values simulated with each of three parameter sets (~900,000 years of data).
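With a synthetic record of this length, exceedance frequencies down to roughly 1E-5/a can be read directly from the ranked annual maxima rather than extrapolated from a fitted distribution. The sketch below stands in a Gumbel random sample for the weather-generator/runoff-model output (the distribution and its parameters are illustrative assumptions) and uses the Weibull plotting position rank/(N+1):

```python
import math, random

random.seed(42)
N_YEARS = 300_000

# stand-in for the simulated record: Gumbel-distributed annual maximum
# flows (location 500, scale 120 -- arbitrary illustrative units)
ann_max = sorted((500.0 - 120.0 * math.log(-math.log(random.random()))
                  for _ in range(N_YEARS)), reverse=True)

def flow_at_aep(aep):
    """Flow at a given annual exceedance probability, read directly off
    the ranked record using the Weibull plotting position rank/(N+1)."""
    rank = max(1, round(aep * (N_YEARS + 1)))
    return ann_max[rank - 1]

q100 = flow_at_aep(1e-2)     # ~100-year flow
q1000 = flow_at_aep(1e-3)    # ~1,000-year flow
q10000 = flow_at_aep(1e-4)   # ~10,000-year flow
```

Sampling uncertainty grows quickly as the target frequency approaches 1/N, which is one reason EXAR ran multiple parameter sets and treated the rarest frequencies with additional care.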

In the hazard analysis, each structure was analyzed not only for possible impacts on the downstream sections of the river but also for local impacts at sites of interest. The main structures considered were levees (failure), bridges (clogging with driftwood), weirs (clogging and gate operation failure), and landslides (partial or full blockage of the channel). All relevant, non-negligible events were retained in an event tree analysis for the sites, and scenarios were simulated with a 2-D hydraulic model (BASEMENT).

The project characterized epistemic uncertainties from each of the models and analyses used; a number of these were quantified and propagated, while the significance of others was addressed with sensitivity analyses. Frequency uncertainty arose from the different parameterizations of the runoff model, which led to different exceedance curves, from uncertainty in the probability of landslides, and in some cases from the likelihood of driftwood clogging at structures. The water-level uncertainty included uncertainty from the hydrologic simulation and possible effects from landslides. The presentation will also discuss the key limitations of the methodology based on the comprehensive implementation experience gained in the project.

[1] Andres N., Badoux A., Hegg Ch. (Ed.) 2019: Grundlagen Extremhochwasser Aare. Hauptbericht Projekt EXAR. Methodik und Resultate. (Bases for the extreme flood hazard on the Aare River. Main report of project EXAR.) Swiss Federal Institute for Forest, Snow and Landscape Research WSL. (in German, forthcoming.)

2A-4 Practical Approaches to Probabilistic Flood Estimates: An Australian Perspective

Rory Nathan, Infrastructure Engineering, University of Melbourne

The influence of hydrologic variability on flood estimates has traditionally been accommodated using simple approaches based on averaged inputs and simplified assumptions about their joint interaction. Such simplifications can be configured to reproduce probabilistic estimates of flood risk, though without independent verification such estimates must be regarded as an act of faith. And even where the means to independently verify probabilistic estimates exist, their extrapolation to conditions
beyond those found in the observed record introduces additional uncertainty that is not easily defended.

The Australian national flood guidelines have just been revised after ten years of effort by a large team of specialist practitioners and academics. A key focus of the revised guidelines has been the development and promulgation of practical methods for explicit consideration of the joint probabilities involved in the transformation of rainfall to flood runoff. The objective of these methods is to achieve a probability-neutral transformation of rainfall probabilities to flood probabilities. At its simplest, the guidelines advocate the use of an ensemble of temporal patterns, which in many instances is, after rainfall itself, the dominant influence on the magnitude (and/or rarity) of the resulting flood. For more complex problems, Monte Carlo approaches are recommended for use with event-based methods; at their simplest, these approaches consider the joint occurrence of variable antecedent catchment wetness and temporal patterns, though the frameworks are easily extended to consider the joint interaction with variable spatial patterns of rainfall, antecedent snowpack, and other factors relevant to the site-specific nature of the problem.
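The event-based Monte Carlo idea can be sketched as follows: for a given design rainfall depth, repeatedly sample a temporal pattern from an ensemble and an antecedent (initial) loss, run an event model, and average the responses so that no single pattern or wetness state biases the rainfall-to-flood transformation. The patterns, loss range, and toy response model below are invented for illustration and are not taken from the Australian guidelines.

```python
import random, statistics

random.seed(7)

# hypothetical ensemble of 6-h temporal patterns (fraction of burst per hour)
patterns = [
    [0.05, 0.10, 0.40, 0.25, 0.15, 0.05],
    [0.10, 0.20, 0.30, 0.20, 0.10, 0.10],
    [0.30, 0.25, 0.20, 0.15, 0.05, 0.05],
    [0.05, 0.15, 0.20, 0.30, 0.20, 0.10],
]

def peak_hourly_excess(depth_mm, pattern, initial_loss_mm):
    """Toy event model: subtract an initial loss, then take the largest
    hourly rainfall excess as a proxy for the flood response."""
    remaining = depth_mm - initial_loss_mm
    if remaining <= 0.0:
        return 0.0
    return max(f * remaining for f in pattern)

def ensemble_response(depth_mm, n=10_000):
    """Probability-neutral estimate: average the response over random
    draws of temporal pattern and antecedent (initial) loss."""
    draws = []
    for _ in range(n):
        pat = random.choice(patterns)
        loss = random.uniform(0.0, 30.0)      # variable antecedent wetness
        draws.append(peak_hourly_excess(depth_mm, pat, loss))
    return statistics.mean(draws)

p100 = ensemble_response(100.0)   # response to a hypothetical 1-in-100 AEP depth
p140 = ensemble_response(140.0)   # response to a rarer, larger depth
```

Replacing the toy response with a calibrated runoff-routing model, and the uniform loss with its fitted distribution, gives the full joint-probability framework the abstract describes.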

The emphasis of these approaches is on the influence of aleatory rather than epistemic sources of uncertainty; that is, factors arising from natural hydrologic variability rather than from measurement errors and limitations in our understanding. To support these approaches, a national database has been developed that provides information on ensembles of point and areal temporal patterns, the probabilistic behaviour of antecedent and continuing losses, areal reduction factors, baseflows, and pre-burst rainfalls. The supporting suite also includes tools for joint probability modelling of estuarine regions, regional estimates of flood quantiles, and a multi-site rainfall simulator for the stochastic generation of daily rainfall at multiple locations.

While the Australian guidelines are supported by an extensive suite of design products, the underlying methods are generically applicable, and for many practical problems the information required to accommodate the primary sources of uncertainty is readily found in the observed records.

2A-5 Regional-Based Approaches for Developing Synthetic Hydrology and Stage Frequency Curves in the Columbia River Basin

Angela M. Duren, Northwest Division, U.S. Army Corps of Engineers, Portland, Oregon

A regional-based approach was used to develop the stage frequency curves for 7 of the 13 dams in the Willamette Basin, part of the larger Columbia River Basin, for use in an Issue Evaluation Study (IES)-level dam safety analysis. This saved both money and time and allowed the larger regional studies underlying the stage frequency curves to be developed for use in future studies. These include a regional volume skew study for flow frequency analysis, a regional precipitation frequency curve analysis, a regional site-specific Probable Maximum Precipitation (PMP) analysis, and basin-wide hydrologic and reservoir operations modeling via HEC-HMS and HEC-ResSim linkage in a holistic watershed HEC-WAT model. The Willamette work formed the basis of the design for the larger ongoing Columbia River Basin (CRB) hydrology studies, in which regional precipitation and snow water equivalent (SWE) frequency analyses have been completed, numerical modeling is being performed for an improved period-of-record meteorological dataset and PMP estimation, and regional modeling is being performed to develop synthetic hydrology and stage frequency curves that account for meteorological and hydrologic uncertainty.


2A-6 Reducing Uncertainty in Estimating Rare Flood Events Using Paleoflood Analyses: Insights from an Investigation near Stillhouse Hollow Dam, Texas

Justin Pearce, USACE Risk Management Center; Brian Hall, USACE; Alessandro Parola, USACE Fort Worth; Brendan Comport, USACE Seattle; Christina Leonard, Utah State University

A reconnaissance-level paleoflood investigation was completed to support characterization of, and reduce uncertainties in, large hydrologic loadings near Stillhouse Hollow Dam, Texas. Desktop analysis identified several remnant flights of Holocene terrace surfaces along the Lampasas River near Stillhouse Hollow Reservoir that were used as physical evidence of past large floods. Field geomorphic mapping, soil exposures, and analysis of aerial imagery demonstrated that the lower terraces (Qt1, Qt2) were inundated during historic and modern peak flows. A higher terrace (Qt3), formed about 3,600 years ago, had soil profile characteristics suggestive of a non-exceedance boundary for large fluvial discharges. Two-dimensional hydraulic modeling, coupled with interdisciplinary collaboration, estimated that a minimum discharge of about 300,000 cfs would be needed to just inundate the Qt3 terrace in an unregulated (non-dam) scenario. Historical large flood events (e.g., the 1873 and 1921 floods) were estimated and integrated with the systematic record using perception thresholds. This inclusion has the largest effect on the mean, standard deviation, and skew of the peak inflow frequency curve. Adding the paleodischarge non-exceedance bound estimate (Qt3 NEB) to the systematic-plus-historical record did not substantially change the mean and standard deviation of the peak inflow frequency curve compared to the systematic-plus-historical record alone. However, adding the paleoflood NEB to the systematic-plus-historical record reduces uncertainty (measured as 90% confidence intervals) in the peak discharge estimates at the 1/10,000 annual exceedance probability (AEP) by about a factor of 1.5.
Peak inflow volume-frequency analysis indicates that large discharges along the Lampasas River might occur more often than would be expected from analysis of systematic gage records alone; this information was used to support risk-informed assessments of the overtopping hazard.
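The value of a non-exceedance bound is easy to see from a simple binomial argument: if a discharge with annual exceedance probability p has not been exceeded in roughly 3,600 years, the likelihood of that observation is (1 - p)^3600, which strongly penalizes frequency curves that assign the bounding discharge too high a probability. A minimal worked example:

```python
def neb_likelihood(aep, years=3600):
    """Likelihood of observing no exceedance of the bounding discharge
    over `years` years if its annual exceedance probability were `aep`."""
    return (1.0 - aep) ** years

# a bounding discharge with AEP 1/1,000 is strongly disfavored by the
# 3,600-year non-exceedance record, while AEP 1/10,000 remains plausible
L_1e3 = neb_likelihood(1e-3)   # ~0.027
L_1e4 = neb_likelihood(1e-4)   # ~0.70
```

Flood-frequency software incorporates this kind of censored information formally, but the likelihood ratio above captures why the NEB narrows the fitted curve's confidence bounds at rare AEPs.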

2A-7 Improving Flood-Frequency Analyses with a 4,000-year Record of Flooding on the Tennessee River near Chattanooga, Tennessee

Tessa M. Harden1, Jim E. O'Connor1, Meredith L. Carr2; 1 U.S. Geological Survey; 2 Nuclear Regulatory Commission

The primary purpose of this comprehensive field study was to use paleoflood hydrology methods to characterize the recurrence of low-probability floods and to inform and improve estimates of flood risk for the Tennessee River near Chattanooga, Tennessee. The main source of information used to improve flood-frequency estimates was stratigraphic records of large, previously unrecorded floods, combined with modern streamflow records and historical flood accounts. The overall approach was to (1) develop a flood chronology for the Tennessee River near Chattanooga using stratigraphic analyses and geochronology from multiple sites at multiple elevations in the study area, (2) estimate peak flow magnitudes associated with elevations of flood evidence using a 1D hydraulic model, (3) combine the information from steps 1 and 2 into a history of the timing and magnitude of large floods in the study reach, and (4) use all available information, including paleoflood, gaged, and historical records of flooding, to estimate flood frequency with a standardized statistical approach.

The stratigraphy, geochronology, and hydraulic modelling results from all sites along the Tennessee River were distilled into an overall chronology of the number, timing, and magnitude of large unrecorded floods. In total, dozens of sites were identified and the stratigraphy of 17 of those sites was examined and described. Flood frequency analyses were performed using the USGS software program PeakFQ v7.2, which follows the Guidelines for Determining Flood Flow Frequency (Bulletin 17C).

Condensing all 17 sites into a single flood chronology for the Tennessee River near Chattanooga revealed eight unique floods in the last 3,500-4,000 years. Two of these floods had discharge magnitudes of about 470,000 ft3/s, slightly above the 1867 historical peak at the Chattanooga gage (459,000 ft3/s). One was 1,100,000 ft3/s, substantially larger than any other flood on the Tennessee River during the last ~4,000 years. This large flood likely occurred only a few hundred years ago, possibly in the mid-to-late 1600s. Two additional floods in the last 1,000 years had estimated magnitudes of about 420,000 ft3/s and 400,000 ft3/s. The remaining three unique floods identified in the stratigraphy were much smaller than the others, less than 240,000 ft3/s.

Flood frequency analyses for all flood scenarios performed in this study indicate that the addition of paleoflood information markedly improves estimates of low-probability floods, most clearly indicated by substantial narrowing of the 95-percent confidence limits. The 95-percent confidence interval for the 1,000-year quantile estimate derived from incorporating the four most recent paleofloods is about 480,000-620,000 ft3/s, compared to about 380,000-610,000 ft3/s from the gaged record alone (which includes the historical 1867 flood), a reduction of 38 percent. Similarly, uncertainty in all flood quantile estimates between the 100-year and 10,000-year return periods was reduced by 22-44 percent by adding the paleoflood record to the gaged and historical record in the flood frequency analyses. This reduction in uncertainty can lead to more reliable flood hazard assessments.

2A-8 Estimating Design Floods with Specified Annual Exceedance Probabilities Using the Bayesian Estimation and Fitting Software (RMC-BestFit)

C. Haden Smith, P.E., USACE Risk Management Center The U.S. Army Corps of Engineers (USACE) Risk Management Center (RMC), in collaboration with the Engineer Research and Development Center (ERDC) Coastal and Hydraulics Laboratory (CHL), developed the Bayesian estimation and fitting software RMC-BestFit to enhance and expedite flood hazard assessments within the Flood Risk Management, Planning, and Dam and Levee Safety communities of practice.

Design floods for most dams and levees typically have an annual exceedance probability (AEP) of 1:100 (1E-2) or less frequent. In the U.S., high-hazard dams are designed to pass the Probable Maximum Flood (PMF), which typically has an AEP of 1:10,000 (1E-4) or less frequent. To reduce epistemic uncertainties in the estimated AEP for extreme floods, such as the PMF, it is important to incorporate as much hydrologic information into the frequency analysis as reasonably possible. This presentation demonstrates a Bayesian analysis framework, originally profiled by Viglione et al. (2013), for combining limited at-site flood data with temporal information on historical and paleofloods, spatial information on precipitation-frequency, and causal information on the flood processes. This framework is implemented in the RMC-BestFit software, which is used to evaluate the flood hazard for Lookout Point Dam, a high-priority dam located in the Willamette River Basin, upstream of Portland, Oregon. Flood frequency results are compared with those from the Expected Moments Algorithm (EMA). Both analysis methods produce similar results for typical censored data, such as historical floods; however, unlike the Bayesian analysis framework, EMA is not capable of incorporating the causal rainfall-runoff information in a formal, probabilistic manner. Consequently, the Bayesian method considered herein provides higher confidence in the fitted flood frequency curves and resulting reservoir stage-frequency curves to be used in dam and levee safety risk assessments.
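The Bayesian treatment of censored historical data can be sketched with a toy Metropolis sampler. This is not RMC-BestFit's implementation: it assumes a Gumbel model with flat priors and hypothetical data, and it omits the precipitation-frequency and rainfall-runoff information the abstract describes.

```python
import math
import random

random.seed(1)

def log_post(loc, scale, peaks, hist_threshold, hist_years, hist_exceed):
    """Unnormalized log-posterior for a Gumbel flood-frequency model
    with flat priors. The historical period enters as censored data:
    in hist_years years, hist_exceed peaks exceeded hist_threshold."""
    if scale <= 0:
        return -math.inf
    ll = 0.0
    for q in peaks:  # systematic record: full density terms
        z = (q - loc) / scale
        ll += -math.log(scale) - z - math.exp(-z)
    u = math.exp(-(hist_threshold - loc) / scale)
    if u <= 0:  # guard against underflow in the censored term
        return -math.inf
    ll += (hist_years - hist_exceed) * (-u)        # log F(threshold)
    ll += hist_exceed * math.log(-math.expm1(-u))  # log (1 - F(threshold))
    return ll

def metropolis(peaks, n=4000):
    """Random-walk Metropolis over (location, scale)."""
    loc, scale = 100.0, 40.0
    cur = log_post(loc, scale, peaks, 250.0, 120, 1)
    samples = []
    for _ in range(n):
        p_loc, p_scale = loc + random.gauss(0, 10), scale + random.gauss(0, 5)
        prop = log_post(p_loc, p_scale, peaks, 250.0, 120, 1)
        if prop > -math.inf and math.log(random.random()) < prop - cur:
            loc, scale, cur = p_loc, p_scale, prop
        samples.append((loc, scale))
    return samples

# Hypothetical systematic record, plus one exceedance of a threshold
# of 250 (arbitrary units) during a 120-year historical window.
peaks = [80, 120, 95, 60, 150, 110, 70, 130, 90, 105]
samples = metropolis(peaks)
post_mean_loc = sum(s[0] for s in samples[2000:]) / len(samples[2000:])
```

The posterior samples can then be pushed through the fitted distribution to produce a family of frequency curves, which is how the Bayesian framework expresses epistemic uncertainty.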

2B-1 KEYNOTE: South Atlantic Coast Study: Coastal Hazards System Norberto C. Nadal-Caraballo, PhD; Chris Massey, PhD; Victor M. Gonzalez, PE, U.S. Army Corps of Engineers, Engineer Research and Development Center, Coastal and Hydraulics Laboratory (USACE/ERDC/CHL), Kelly Legault, PhD, USACE, Jacksonville District Seven of the ten costliest U.S. tropical cyclones (TCs) have made landfall within the boundaries of the USACE South Atlantic Division (SAD) region. The devastation caused by recent TCs such as Hurricane Michael and Hurricane Maria has underscored the need for accurate quantification of coastal storm hazards. The South Atlantic Coast Study (SACS) is an ongoing effort by SAD and the U.S. Army Engineer Research and Development Center's Coastal and Hydraulics Laboratory (ERDC-CHL) to expand the Coastal Hazards System (CHS) to cover the SAD domain. The CHS is a national program for the quantification of extreme coastal hazards that directly supports a wide range of coastal engineering and science activities within the federal government, private sector, and academia. CHS includes a database, web-based data mining, and tools for the visualization of Probabilistic Coastal Hazards Analysis (PCHA) results.

The goal of the SACS-CHS is to quantify storm hazards under existing and future sea-level-change (SLC) conditions, in order to aid decision-making and employ modern engineering methods focused on reducing flooding risk and increasing resiliency. It encompasses three U.S. coastal regions: Phase 1, Puerto Rico and the U.S. Virgin Islands; Phase 2, North Carolina to South Florida; and Phase 3, South Florida to Mississippi. Conducting PCHA within these regions requires the development and simulation of synthetic TCs covering the practical physical-parameter and probability spaces. For the SACS-CHS, approximately 2,500 synthetic TCs are being simulated considering present-day conditions and future SLC scenarios, requiring over 250 million CPU-hours in a high-performance computing (HPC) environment. Coastal hazards to be computed include storm surge, wave climate, wind, and currents.

SACS extends the coverage of the CHS to the entire U.S. hurricane-exposed coastline, with the exception of Southern California.

2B-2 South Atlantic Coast Study: Coastal Hazards System Norberto C. Nadal-Caraballo, PhD; Chris Massey, PhD; Victor M. Gonzalez, PE; Efrain Ramos-Santiago; Madison O. Campbell, U.S. Army Corps of Engineers, Engineer Research and Development Center, Coastal and Hydraulics Laboratory (USACE/ERDC/CHL)

Current approaches for probabilistic storm surge modeling rely on the joint probability analysis of tropical cyclone (TC) forcing and responses to overcome the temporal and spatial limitations of historical TC observations. Probabilistic coastal hazard analysis requires the quantification and propagation of uncertainties associated with the use of different data, models, and methods. This is of particular importance for critical infrastructure such as nuclear power plants, where the quantification of storm surge hazard is sought for very small annual exceedance probabilities. The U.S. Army Engineer Research and Development Center, Coastal and Hydraulics Laboratory (ERDC-CHL) has performed a comprehensive assessment of uncertainties in probabilistic storm surge models in support of the U.S. Nuclear Regulatory Commission's (USNRC) efforts to develop a framework for probabilistic storm surge hazard assessment for nuclear power plants.


The examination of aleatory variability and epistemic uncertainty associated with the consideration of alternate technically defensible data, models, and methods was based on the application of the joint probability method (JPM). The JPM has become the standard probabilistic model used to assess coastal storm hazards in hurricane-prone U.S. coastal regions. This assessment also considered methods not typically associated with the JPM, such as Monte Carlo simulation and surrogate modeling through the development of Gaussian process metamodels (GPMs). Specific topics that were examined include storm recurrence rate models, methods for defining the joint probability of storm parameters, methods for generating synthetic storm simulation sets, and the integration of error terms in the development of hazard curves. The last topic included evaluating methods for calculating the error of the numerical storm surge model, the distribution of that error, evaluation of Holland B as a JPM parameter, and characterization of the uncertainty in the JPM integral. The approach followed was informed by USNRC guidance on probabilistic seismic hazard assessment (PSHA), in which uncertainty is propagated through the use of logic trees and quantified through the development of a family of hazard curves.
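The core JPM computation underlying this assessment can be illustrated with a heavily simplified discrete version; the surge response function, parameter grid, and storm rate below are all invented for illustration, standing in for high-fidelity simulations and realistic joint parameter distributions.

```python
def toy_surge(cp_deficit, radius):
    """Hypothetical linear surge response (meters) in terms of central
    pressure deficit (mb) and storm size (n.mi.). Real studies replace
    this with high-fidelity surge simulations or metamodels."""
    return 0.04 * cp_deficit + 0.02 * radius

def jpm_hazard_curve(surge_levels, storm_rate=0.3):
    """Discrete joint probability method: the annual exceedance rate
    of each surge level is the storm arrival rate times the
    probability mass of the storm-parameter combinations whose
    simulated surge exceeds that level."""
    params = [(d, r) for d in (20, 40, 60, 80, 100) for r in (10, 20, 30, 40)]
    prob = 1.0 / len(params)  # uniform joint probabilities, illustration only
    return [(s, storm_rate * sum(prob for d, r in params if toy_surge(d, r) > s))
            for s in surge_levels]

curve = jpm_hazard_curve([1.0, 2.0, 3.0, 4.0])
```

The uncertainty treatments the abstract describes (error terms, logic trees) enter by perturbing the response values and by repeating this integration over alternative models, yielding a family of hazard curves rather than one.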

2B-3 Using Physical Insights in Spatial Decomposition Approaches to Surge Hazard Assessment Jennifer L. Irish1, Donald T. Resio2, Michelle Bensi3, Taylor G. Asher4, Yi Liu1,5, Jun-Whan Lee1, 1Virginia Tech, 2University of North Florida, 3University of Maryland, 4University of North Carolina, 5Environmental Science Associates The importance of reliable probabilistic hurricane surge hazard assessment continues to grow as hurricane-driven disasters become more prevalent. There have recently been a number of advances in hurricane surge hazard assessment that consider a very large number of synthetic storms in order to produce a more statistically robust probabilistic assessment. Yet application of these approaches remains constrained by the computational burden associated with high-fidelity storm surge simulation.

Herein, we present a rapid storm surge predictive model that leverages physical insights along with spatial decomposition to reduce dimensionality in the storm-parameter and geographic spaces, respectively. In developing this hybrid predictive model, ease of use (being intuitive, transparent, and reproducible) was favored over incremental improvements in surge prediction accuracy. Errors associated with this hybrid predictive model will also be presented.

2B-4 Investigation and Comparison of Surrogate Modeling Applications in Storm Surge Assessment Azin Al Kajbaf1, Michelle Bensi2, 1-Graduate Assistant, Department of Civil and Environmental Engineering, University of Maryland, College Park, MD, akajbaf@terpmail.umd.edu, 2-Assistant Professor, Department of Civil and Environmental Engineering, University of Maryland, College Park, MD, mbensi@umd.edu Major hurricane events in the past two decades have led to significant advancement in simulation models that can facilitate accurate and efficient storm surge estimation. The lack of numerical prediction models that can simultaneously provide high-fidelity results and real-time storm surge forecasts has motivated the use of surrogate modeling methods (e.g., ANN, GPR) as an alternative approach that can balance efficiency and accuracy in storm surge prediction. Given recent efforts to explore the operational application of surrogate modeling methods for surge prediction, there is a need for a comprehensive framework that thoroughly assesses and compares the performance of the methods most frequently used for storm surge prediction. These methods include Artificial Neural Networks (ANN), Support Vector Regression (SVR), and Gaussian Process Regression (GPR; also known as Kriging models).

One of the challenges with applying these methods in the current state of practice is that their performance is usually assessed through aggregated error/loss metrics (e.g., R, RMSE), which may give incomplete information regarding performance. Furthermore, no study is available that compares all of these models together. In this study, the performance of the ANN, GPR, and SVR surrogate models for storm surge prediction is explored through a comprehensive framework that examines the stability of performance across training sample sizes, identifies systematic trends in errors, assesses performance in predicting large (i.e., risk-significant) surges, and characterizes the distribution of errors.
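The point about aggregate metrics can be demonstrated with a toy experiment: a deliberately crude surrogate (a linear fit, standing in for ANN/SVR/GPR) can post a respectable overall RMSE while performing much worse on the large, risk-significant surges in the upper tail. Data and model here are synthetic.

```python
import math
import random

random.seed(0)

# Synthetic "truth": surge is a nonlinear function of a storm-intensity
# parameter plus noise.
xs = [random.uniform(0, 1) for _ in range(500)]
ys = [2.0 * x + 3.0 * x ** 4 + random.gauss(0, 0.1) for x in xs]

# Crude surrogate: ordinary least-squares line y = a*x + b (closed form).
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - a * mx

def rmse(pairs):
    return math.sqrt(sum((y - (a * x + b)) ** 2 for x, y in pairs) / len(pairs))

overall_rmse = rmse(list(zip(xs, ys)))
# "Risk-significant" cases: the largest surges in the data set.
tail = [(x, y) for x, y in zip(xs, ys) if y > 3.5]
tail_rmse = rmse(tail)
# The aggregate metric looks acceptable while tail errors are much larger,
# because the linear fit cannot track the curvature at high intensities.
```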

3A-1 Structured Hazard Assessment Committee Process for Flooding (SHAC-F) for Probabilistic Flood Hazard Assessment (PFHA)

Rajiv Prasad and Philip Meyer, Pacific Northwest National Laboratory, Kevin Coppersmith, Coppersmith Consulting, Norberto C. Nadal-Caraballo and Victor M. Gonzalez, USACE/ERDC/CHL The Pacific Northwest National Laboratory (PNNL) led the development of the structured hazard assessment committee process for flooding (SHAC-F). One of the main goals of SHAC-F is to bring consistency to probabilistic flood hazard assessments (PFHAs). SHAC-F studies can be carried out at three levels, increasing in complexity and level of effort from the lowest to the highest. Flood hazard assessments can support a variety of purposes for nuclear power plant (NPP) permitting, licensing, and oversight activities; therefore, SHAC-F study levels are structured to explicitly support these purposes. A Level 1 SHAC-F study is designed to support rapid decisions for screening and binning NPP structures, systems, and components (SSCs) into risk categories. A Level 2 SHAC-F study is designed to (1) refine a Level 1 SHAC-F study that did not adequately resolve screening and binning of the SSCs of interest or (2) update a previous Level 3 SHAC-F study considering the availability of additional data, models, and methods. A Level 3 SHAC-F study is the most complex and is used to support NPP permitting and licensing or probabilistic risk assessments involving plant-wide assessment of flood hazards and associated effects. Regardless of the level, SHAC-F studies must capture the range of aleatory variability and the range of epistemic uncertainty reflected in the knowledge of the larger, technically informed flood hazard assessment community.

In a Level 1 SHAC-F study, a flood-frequency analysis using readily accessible hydrometeorological data combined with relatively simple on-site hydraulic modeling may be performed by a small project technical team with expertise in statistical modeling, regional hydrometeorology, and site hydraulics.

The participatory peer review panel (PPRP) may be structured similarly to the project technical team. In a Level 2 SHAC-F study to refine a previous Level 1 SHAC-F study, additional data collection and model refinement in consultation with experts may be performed. The project team could consist of Technical Integration (TI) teams and spend more time consulting with data and model experts. In a Level 2 SHAC-F study to update a previous Level 3 SHAC-F study, the TI teams would evaluate and integrate additional data, models, and methods. Evaluation and integration may need consultation with data owners and model developers. In a Level 3 SHAC-F study, the project technical team is the largest, consisting of meteorological and hydrological/hydraulic TI teams. The TI teams, led by a Project Technical Integrator, may need support for database and geographical information system management and specialty contractors for data collection or model simulations.

PNNL is working with the U.S. Army Corps of Engineers Coastal and Hydraulics Laboratory (CHL) to adapt the SHAC-F approach to coastal flooding from tropical cyclones. The Coastal SHAC-F levels are defined similarly to the three SHAC-F levels described above. In a Level 1 Coastal SHAC-F study, flood hazards may be estimated based on analyses of observed extreme tide levels using statistical modeling and relatively simple site-scale hydraulic models. A Level 2 Coastal SHAC-F study could leverage existing probabilistic storm surge studies combined with site-scale hydraulic models. A Level 3 Coastal SHAC-F study would perform a full probabilistic storm surge analysis following the joint probability method and detailed site-scale modeling to estimate site-wide flood hazards.

3A-2 Using HEC-WAT to Conduct a PFHA on a Medium Watershed William Lehman, Brennan Beam, Matthew Fleming, Leila Ostadrahimi, U.S. Army Corps of Engineers, Hydrologic Engineering Center, Joseph Kanney, Meredith Carr, U.S. Nuclear Regulatory Commission The Hydrologic Engineering Center's Watershed Analysis Tool (HEC-WAT) supports the evaluation of hydraulic hazards at sites throughout a floodplain. The example shows how HEC-HMS (hydrometeorological processes), HEC-ResSim (reservoir operations), and HEC-RAS (river hydraulics and dam breaks) work in concert to evaluate the impact of aleatory and epistemic uncertainties on the frequency of loading at a site in the floodplain. Within HEC-HMS, Markov chain Monte Carlo was used to evaluate parameter sets, a strategy to improve model behavior during simulation. In HEC-ResSim, uncertainty distributions based on historic data were used for starting pool (treated as aleatory for the independent, identically distributed events) to show the impact of that parameter on the hazard frequency curve. Epistemic uncertainties in precipitation frequency and dam failure (including comparisons with and without dam failure) affect the uncertainty bands around the hazard frequency curves and show how our limited knowledge impacts our ability to describe extreme events. The application required distributed computing and stratified sampling to report the epistemic uncertainty associated with loading at the location out to an AEP of 1E-6. This presentation will show how these complex software systems work together to describe a complex natural system and to inform and improve our ability to make decisions in light of uncertainty.
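The role of stratified sampling in resolving very small AEPs can be sketched as follows; the loading distribution and strata below are invented, and a real HEC-WAT study samples hydrologic model outputs rather than a closed-form distribution.

```python
import math
import random

random.seed(2)

def inv_cdf(u):
    """Inverse CDF of a stand-in annual-loading distribution
    (exponential). The max() guards against u rounding up to 1.0."""
    return -math.log(max(1.0 - u, 1e-300))

def stratified_exceedance(threshold, n_per_stratum=2000):
    """Estimate P(load > threshold) by stratifying the uniform input,
    packing strata into the upper tail so that exceedance
    probabilities near 1E-6 are resolved with modest sample counts.
    Crude Monte Carlo would need on the order of 1E8 samples."""
    edges = [0.0, 0.9, 0.99, 0.999, 0.9999, 0.99999, 0.999999, 1.0]
    estimate = 0.0
    for a, b in zip(edges, edges[1:]):
        hits = sum(inv_cdf(random.uniform(a, b)) > threshold
                   for _ in range(n_per_stratum))
        estimate += (b - a) * hits / n_per_stratum  # stratum weight
    return estimate

threshold = -math.log(1e-6)  # true exceedance probability is 1E-6
p_hat = stratified_exceedance(threshold)
```

Each stratum contributes its probability weight times the fraction of its samples that exceed the threshold, so the tail strata do the work that a uniform sampler would almost never reach.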

3A-3 Paleoflood Analyses for Probabilistic Flood Hazard Assessments: Approaches and Review Guidelines Tessa M. Harden, Karen R. Ryberg, Jim E. O'Connor, Jonathan M. Friedman, Julie E. Kiang, U.S. Geological Survey Paleoflood studies are an effective means of providing specific information on the recurrence and magnitude of rare and large floods, which can be combined with systematic flood measurements to improve the ability to accurately assess hydrologic risk to critical infrastructure. Paleoflood data also provide valuable information about the linkages among climate, land use, channel morphology, and flood frequency. Standards of practice for conducting and reviewing such studies, however, are lacking, inhibiting their effective use in regulatory decision making. This presentation summarizes methods and techniques for the preparation, collection, evaluation, and interpretation of paleoflood information, including uncertainties, especially with respect to new statistical approaches available to efficiently use such data in flood frequency analyses. Also presented will be guidance on the levels of study appropriate for specific questions or issues, as well as appropriate corresponding levels of technical review.


3A-4 Probabilistic Assessment of Flood Hazards Due to Combinations of Flooding Mechanisms: Study Progress and Next Steps Michelle (Shelby) Bensi,1 Somayeh Mohammadi,1 Shih-Chieh Kao,2 and Scott T. DeNeale2, 1University of Maryland; 2 Oak Ridge National Laboratory Flooding of nuclear power plants (NPPs) and other infrastructure can occur as a result of events involving one or multiple coincident or correlated flood mechanisms. Existing approaches for probabilistic flood hazard assessment (PFHA) focus primarily on the occurrence of a single flood hazard mechanism. However, multi-mechanism flood (MMF) events may result in flooding with severity, duration, characteristics, and extent of impacts that differ from the effects of floods involving a single mechanism. Moreover, the estimated frequency of occurrence of flood severity metrics (e.g., flood elevation or depth) may change (increase) when considering the enhanced impacts of MMF events.

Thus, to obtain a comprehensive estimate of flood hazards for critical infrastructure, it is important to consider events involving both single and multiple flood mechanisms.

To extend the state-of-practice of multi-mechanism flood analysis, this study focuses on the identification of existing research and development of new methods to probabilistically assess hazards associated with MMF events. This research project is funded by the U.S. Nuclear Regulatory Commission PFHA Research Program with an intent to support the development of future guidance on PFHA. This presentation provides an overview of project research activities focusing on identification of existing approaches for probabilistically assessing MMF events and provides a critique and gap assessment of the current state of practice. It further discusses options for leveraging and extending approaches that show promise (with or without modifications) to support probabilistic assessment of MMF hazards associated with the range of return periods of relevance to NPPs and other critical infrastructure.
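For context, the single-mechanism baseline that MMF analysis extends can be written down simply: when mechanisms are independent and individually rare, marginal hazard curves approximately add. The curves below are hypothetical, and the point of MMF research is precisely that coincident or correlated mechanisms violate this approximation.

```python
import math

def combined_rate(level, curves):
    """Rare-event approximation: for independent, individually rare
    mechanisms, the annual rate of exceeding a flood level from any
    mechanism is approximately the sum of the marginal rates.
    Coincident or correlated mechanisms generally require joint
    modeling instead."""
    return sum(curve(level) for curve in curves)

# Hypothetical marginal hazard curves: annual exceedance rate vs.
# flood elevation (m) for riverine flooding and storm surge.
def riverine(s):
    return 1e-2 * math.exp(-1.5 * max(s - 2.0, 0.0))

def surge(s):
    return 5e-3 * math.exp(-1.0 * max(s - 2.5, 0.0))

total_at_4m = combined_rate(4.0, [riverine, surge])
```

Dependence between mechanisms (e.g., surge coinciding with heavy rainfall in the same storm) adds joint-event terms that this sum omits, which is why the combined frequency of a given flood severity can increase under MMF analysis.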

3B-1 Risk and Operational Insights of the St. Lucie Flooding Event John David Hanna, Senior Reactor Analyst, Division of Reactor Projects, NRC Region III - Chicago, IL While working in Region II, Mr. Hanna analyzed the risk impact of the St. Lucie findings/violations associated with degraded flood barriers. These impaired barriers revealed themselves during a localized intense precipitation event in January 2014 that deposited 50,000 gallons of water in the Reactor Auxiliary Building. The presenter will discuss the (sometimes counterintuitive) risk and operational insights associated with these findings, with an eye toward providing recommendations to nuclear plant operators, risk analysts, and maintenance personnel.

3B-2 Reflections on Fort Calhoun Flooding Yellow Finding and 2011 Flooding Event Response Gerond A. George, Senior Reactor Inspector, Division of Reactor Safety, NRC Region IV - Arlington, TX Senior Inspector Gerond George will briefly share stories and pictorial evidence that led to the NRC's identification of risk-significant flood protection issues at Fort Calhoun in 2009 and 2010. Mr. George will discuss the influence of the Yellow Flood Finding on OPPD's readiness to protect the plant during the 2011 Missouri River floods. Mr. George will provide pictures of the flood protection equipment installed by OPPD during the 2011 event and the results of those activities. Using this operating knowledge, Mr. George will provide his insights into how this experience can enhance risk analysis.


3B-3 2019 Cooper and Fort Calhoun Flooding Event Response Patricia Vossmar, Mike Stafford, NRC Region IV The NRC Resident Inspector and former Senior Resident Inspector at Cooper Nuclear Station will present information on the 2019 Missouri River flooding event that affected both Cooper and the permanently shut down Fort Calhoun Station. The presenters will discuss the impact of the flooding event on both nuclear plants, NRC and utility flood response activities, key lessons learned during the flood, and the unanticipated aftereffects of the flood. The presenters will share pictures from their experiences onsite at both nuclear plants, as well as pictures of regional damage from the 2019 flood.

3D-1 EPRI External Flooding PRA Activities Marko Randelovic, Electric Power Research Institute (EPRI)

Qualitative Risk Ranking Process of External Flood Penetration Seals Preventing water from entering areas of NPPs that contain significant safety components is the function that various flood-protection components serve across the industry. Several types of flood barriers, both permanent and temporary, are used at NPPs. These barriers include external walls, flood doors, and flood barrier penetration seals (FBPSs) that allow cables, conduits, cable trays, pipes, ducts, and other items to pass between different areas in the plant. Comprehensive guidance on the design, inspection, and maintenance of flood-protection components has been assembled in EPRI's technical report Flood Protection System Guide. This document includes information related to these topics for a variety of flood-protection components, while focusing specifically on FBPSs. NRC-RES has initiated a project to develop testing standards and protocols to evaluate the effectiveness and performance of seals for penetrations in flood-rated barriers at nuclear power plants. EPRI is currently developing a qualitative risk ranking process that plants can use to categorize, or risk-rank, installed penetration seals according to the likelihood and consequence of seal failure(s), considering various metrics regarding seal condition, design, and location. In addition to identifying potentially risk-significant FBPSs for prioritization of surveillance and/or replacement, plants performing an external flood probabilistic risk assessment (PRA) may use this process to identify which penetrations may need to be explicitly modeled in the PRA. The intent of this guidance is to provide a process to categorize and rank penetration seals with regard to the likelihood of failure and the significance of a loss of penetration sealing capability.
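A qualitative likelihood-consequence ranking of the kind described can be sketched as below. The categories, scores, and thresholds are invented for illustration and do not represent EPRI's actual process or metrics.

```python
# Hypothetical likelihood and consequence categories; everything here
# is illustrative only.
LIKELIHOOD = {"good": 1, "degraded": 2, "failed-test": 3}   # seal condition
CONSEQUENCE = {"low": 1, "moderate": 2, "high": 3}          # flooded-area significance

def risk_rank(condition, significance):
    """Coarse qualitative rank from the product of category scores."""
    score = LIKELIHOOD[condition] * CONSEQUENCE[significance]
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# Example inventory of penetration seals (IDs are made up).
seals = [("S-101", "good", "low"),
         ("S-204", "degraded", "high"),
         ("S-330", "failed-test", "moderate")]
ranked = [(sid, risk_rank(cond, sig)) for sid, cond, sig in seals]
```

Seals ranked "high" would be candidates for prioritized surveillance or explicit modeling in an external flooding PRA.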

External Flooding PRA Walkdown Guidance As a result of the accident at Fukushima Dai-ichi, the need to understand and account for external hazards (both natural and man-made) has become more important to the industry. A major cause of the loss of AC power at Fukushima Dai-ichi was a seismically induced tsunami that inundated the plant's safety-related systems, structures, and components (SSCs) with flood water. As a result, many nuclear power plants (NPPs) have reevaluated their external flooding (XF) hazards to be consistent with current regulations and methodologies. As with all new information obtained from updating previous assumptions, inputs, and methods (AIMs), the desire exists to understand the changes in the characterization of the XF hazard and the potential impact on the plant's overall risk profile. This has led to an increased need to develop comprehensive External Flooding Probabilistic Risk Assessments (XFPRAs) for more NPPs. One of the steps in developing an XFPRA is the plant walkdown, which is the central focus of this research. This research provides guidance on preparing for and conducting XF walkdowns to gather the necessary information to better inform the XFPRA process. Major topics that will be addressed include defining key flood characteristics, pre-walkdown preparation, performing the initial walkdown, identifying the need for refined assessments or walkdowns, and documenting the findings in a notebook. This guidance also addresses walkdown team composition, useful plant drawings, and utilizing previous walkdowns or PRAs to inform the XF walkdown process.

External Flooding PRA Guidance The objective of this report is to develop a generic roadmap and associated guidance to support the development of an External Flood PRA consistent with the Category II requirements of Part 8 of ASME/ANS RA-Sb-2013, Standard for Level 1/Large Early Release Frequency Probabilistic Risk Assessment for Nuclear Power Plant Applications, including: (1) identification of external flood hazards applicable to the site; (2) definition and characterization of the external flood hazard considering event- and plant-specific issues; (3) estimation of associated external flood hazard frequencies; (4) development of external flood fragility models for flood-significant systems, structures, and components (SSCs); and (5) development of external flood hazard scenarios.

As warning time and ad hoc system operation may be unique features of some external flood scenarios, this report also addresses the role of preparatory/preventive (pre-event) and post-event human actions in mitigating and responding to the external flood hazard, with considerations for modeling non-safety-grade equipment for long-term operation. The guidance includes example applications for representative external flooding scenarios, including those resulting from local intense precipitation, riverine flooding, and storm surge.

3D-2 KEYNOTE: Computational Methods for External Flooding PRA Curtis L. Smith, Idaho National Laboratory The Idaho National Laboratory is demonstrating next-generation risk-assessment methods and tools that support decision-making by combining physics-based models with probabilistic quantification approaches. Integrating the two worlds of physics and probability in a simulation framework leads to predictions based on an approach called computational risk assessment. During our external flooding research and development, we have identified four factors that are key to enhanced analysis: temporal (timing issues), spatial (location issues), mechanistic (physics issues), and topological (complexity issues). We will discuss the computational approach for external flooding risk assessment, focusing on these four factors. While these newer methods and tools can provide increased realism in our risk approaches, their greater benefit is to provide a risk-informed engineering framework for design and operation.


3D-3 External Flooding PSA at IRSN - Developments and Insights Maud Kervella, Gabriel Georgescu, Claire-Marie Duluc, IRSN External flooding PSA is a relatively new subject in France. The first studies were carried out by EDF (the French NPP operator) beginning in 2018. These studies take into account riverine or coastal flooding, depending on the site. They have been reviewed by IRSN in the frame of the Periodic Safety Review VD4 900 (review completed in July 2019). IRSN is also developing its own external flooding PSA study, which addresses coastal flooding for the Gravelines site in France. The methodology followed is similar to EDF's, supplemented by several aspects that were also discussed with EDF during the PSA review (e.g., the reliability of flood-protection components and the treatment of uncertainties in the phenomena studied). Aspects linked to human reliability analysis (HRA) are also important in such studies; a working group has been created at IRSN to discuss possible approaches for quantifying human reliability appropriately. The presentation will give an overview of the IRSN and EDF flooding PSA methodologies and of the main insights gained from these studies.

• indicates speaker, ^ indicates remote speaker