ML15237A323

Attachment 2 - A.15-02-023 Prepared Testimony of Sam Blakeslee
Person / Time
Site: Diablo Canyon (Pacific Gas & Electric)
Issue date: 07/14/2015
From: Alliance for Nuclear Responsibility
To: Japan Lessons-Learned Division; DiFrancesco N, NRR/JLD, 415-1115
Shared Package: ML15237A311
References: A.15-02-023
Download: ML15237A323 (56)


Text

Case No: A.15-02-023
Witness: Sam Blakeslee, Ph.D.

Application of Pacific Gas and Electric          )
Company for Compliance Review of Utility         )
Owned Generation Operations, Electric Energy     )
Resource Recovery Account Entries, Contract      )   Application 15-02-023
Administration, Economic Dispatch of Electric    )   (Filed February 27, 2015)
Resources, Utility Retained Generation Fuel      )
Procurement, and Other Activities for the Period )
January 1 through December 31, 2014.             )
(U 39 E)                                         )
_________________________________________________)

PREPARED TESTIMONY OF SAM BLAKESLEE
BEFORE THE PUBLIC UTILITIES COMMISSION OF THE STATE OF CALIFORNIA
JULY 14, 2015

Sam Blakeslee, Ph.D.

1101 Marsh Street
San Luis Obispo, CA 93401
Tel: (805) 543-4366
Fax: (805) 543-4728
samuel@blakeslee-blakeslee.com

July 14, 2015 4:38:00 PM 1

TESTIMONY OF DR. SAM BLAKESLEE
Application 15-02-023
July 14, 2015

Over a decade ago, the 2004 Integrated Energy Policy Report of the California Energy Commission (CEC) stated that it "strongly supports the following nuclear recommendations: The Legislature should develop a suitable state framework to review the costs and benefits of nuclear power plant license extensions."

With that backdrop, as a freshman Assemblyman I authored, the Legislature passed, and the Governor signed AB 1632 (Blakeslee), one of the very few attempts to address this IEPR recommendation to the Legislature. Touching on the future of nuclear power, the bill included the following language:

In the absence of a long-term nuclear waste storage facility, the commission shall assess the potential state and local costs and impacts associated with accumulating waste at California's nuclear power plants. The commission shall further assess other key policy and planning issues that will affect the future role of nuclear powerplants in the state.

New CEC activities required under AB 1632 included:

Compilation and assessment of existing scientific studies that have been performed by persons or entities with expertise and qualifications in the subject of the studies, to determine the potential vulnerability, to a major disruption due to aging or a major seismic event, of large base-load generation facilities, of 1,700 megawatts or greater. 1

Given the considerable legislative debate and legislation passed on scores of environmental and energy issues, from plastic bags to fracking, it's noteworthy that AB 1632 may be the only instance in the past decade in which the California Legislature directly addressed the future of nuclear power. What's particularly noteworthy is that the issue was addressed with respect to the need to understand the potential vulnerability to major disruption due to a seismic event.

1 AB 1632, Blakeslee. Energy: planning and forecasting. As chaptered September 29, 2006.

Notwithstanding the subsequent seismically induced shutdown of the Kashiwazaki-Kariwa nuclear plant, the Fukushima nuclear disaster, the forced shutdown of San Onofre, the discovery of new earthquake faults, and the effort to relicense Diablo, no additional legislation has been passed regarding the future of nuclear power in California. Therefore I would argue that the existing legislative and regulatory body of policy surrounding implementation of AB 1632 should strongly inform how the state's agencies and regulatory bodies deal with seismic issues at Diablo Canyon Nuclear Power Plant.

Pursuant to this legislation the CEC recommended, and the California Public Utilities Commission (CPUC) directed, 2 that certain state-of-the-art geophysical data studies be performed to acquire better insight into these issues. It's important to note that these studies were directed by the state, not the Nuclear Regulatory Commission, and as such needed to be responsive to state interests. Among the specific recommendations of the CEC were:

  • The California Energy Commission recommends that PG&E should use three-dimensional geophysical seismic reflection mapping and other advanced techniques to explore fault zones near Diablo Canyon. PG&E should report on their progress and their most recent seismic vulnerability assessment for Diablo Canyon in the 2009 IEPR. This action will supplement PG&E's Long Term Seismic Program and help resolve uncertainties surrounding the seismic hazard at Diablo Canyon. Given the potential for an extended plant shutdown following a major seismic event, the Energy Commission, in consultation with appropriate state agencies, should evaluate whether these studies should be required as part of the Diablo Canyon license renewal feasibility study for the CPUC.
  • PG&E should assess the implications of a San Simeon-type earthquake beneath Diablo Canyon. This assessment should include expected ground motions and vulnerability assessments for safety-related and non-safety-related plant systems and components that might be sensitive to long-period motions in the near field of an earthquake rupture. 3

2 California Public Utilities Commission Decision 10-08-003, August 12, 2010.

To ensure objectivity and that the state's issues were properly addressed, the CPUC established the Independent Peer Review Panel (IPRP), consisting of the following state agencies: the California Geological Survey, California Coastal Commission, California Emergency Management Agency, California Energy Commission, California Public Utilities Commission, and the County of San Luis Obispo. 4 The IPRP was tasked with responsibility for addressing issues identified by the California Public Utilities Commission, the California Energy Commission, and the recommendations resulting from the enactment of AB 1632.

On September 11, 2014, Pacific Gas & Electric publicly released its Central Coastal California Seismic Imaging Project (CCCSIP) studies, resulting from the recommendations of the California Energy Commission's AB 1632 report.

At the request of Senator Boxer I reviewed the findings of the CCCSIP, and on December 3, 2014 testified before the U.S. Senate Environment and Public Works Committee. In that oral and written testimony I noted the relevance of the newly discovered and re-interpreted earthquake faults near the Diablo Canyon Nuclear Power Plant (DCNPP), which evidenced a marked increase in size, faulting styles, connectivity, and proximity to the facility. I also detailed the significance of PG&E's use of new analytical methods to calculate ground motion caused by these earthquake faults. As has been widely reported and acknowledged by the utility, these new methods significantly lowered the levels of predicted shaking at the plant relative to prior analysis, notwithstanding the marked increase in the size and proximity of newly discovered earthquake threats.

3 California Energy Commission 2008, An Assessment of California's Operating Nuclear Power Plants: AB 1632 Committee Report, CEC-100-2008-108-CTF, November 2008, pp. 6-7.

4 Decision 10-08-003, pp.14-16.


I was not alone in identifying, and questioning, the conclusions PG&E presented in its CCCSIP. The IPRP raised similar concerns. Subsequent to the issuance of the CCCSIP, the IPRP held three public meetings and, based on its own review of the CCCSIP, issued IPRP Reports 7, 8 and 9 in the period spanning November 21, 2014 through March 6, 2015. PG&E did not respond to the IPRP's requests for responses to their individual reports as they were issued, but waited to reply with one omnibus response issued on April 22, 2015. A forensic examination of various early drafts of PG&E's final IPRP response of April 22, now available through the discovery process of this Proceeding, makes clear how PG&E vetted certain responses to the IPRP that might have produced a more useful and credible work product in the CCCSIP report. Unfortunately those decisions were made unilaterally and without interaction or concurrence of the IPRP, the entity tasked with providing technical quality control feedback to the utility and the CPUC. This issue will be explored at greater depth later in this Testimony.

PG&E has been largely non-responsive to questions, issues and potential report deficiencies raised by the IPRP in Reports 7, 8, and 9. When asked by a reporter when the utility would provide its response to the IPRP, PG&E spokesman Blair Jones said "the IPRP's comments will be addressed, perhaps through the long-term seismic program, or as an update to the SSHAC analysis with the NRC." 5 It's important to note that this response essentially means that, if PG&E does choose to work on the issues raised by the IPRP, it will do so through a process managed by the NRC, not California agencies or regulators. If the state accepts this position it, by default, loses its seat at the table in the discussion of seismic hazards, because the state has no ability to compel or direct the NRC to take up issues of importance to the state.

5 New Times, March 18, 2015. "Redacted: What remains unknown from the NRC and PG&E about Diablo Canyon," by Colin Rigby. Volume 29, Issue 34.


Fortunately there is still time and opportunity to redress these problems, so that the many tens of millions of ratepayer dollars spent to acquire and analyze new geophysical data can be brought to bear on the risk and reliability issues so important to the state.

The Commission must require the IPRP to stay engaged in this process until such time as the state's concerns are adequately resolved. Therefore, the CCCSIP as submitted on September 11, 2014 should be considered a draft report, and should not be considered complete until the state's priorities, especially those identified by the IPRP, are properly addressed and incorporated into a final report.

The remainder of this Testimony will address three key areas:

ISSUE 1. Vital concerns of AB 1632 omitted from consideration in the CCCSIP
ISSUE 2. Technical deficiencies in key chapters of the CCCSIP
ISSUE 3. PG&E's procedural deficiencies with regard to IPRP input and critique

ISSUE #1: CCCSIP ignored concerns regarding reliability and non-safety SSCs

Significance: Examination of the reliability of Diablo requires assessment of possible shaking of all relevant Structures, Systems and Components (SSCs), including the safety components, such as the power block and turbine building, AND the non-safety components. However, the CCCSIP reviewed shaking only within the power block and turbine buildings and ignored other components of the facility related to reliability. The CEC's interest in understanding the impact of earthquakes on non-safety components can be found in its own report, which states:

The non-safety related systems, structures and components (SSCs) of the plant are most vulnerable to damage from earthquakes. Damage to non-safety related SSCs is the greatest source of seismic-related plant reliability risk [emphasis added].

And again, included in the CEC's AB 1632 report is the following recommendation:


Evaluate the seismic vulnerability and reliability implications for the nuclear plants' non-safety related SSCs from changes to seismic design standards that have occurred since the plants were designed and built. 6 [emphasis added].

PG&E commissioned work on this issue by Enercon Services, Inc., which issued a March 2010 report on the subject. 7 But that report fails in two key ways. First, the Enercon analysis was performed many years before the newly discovered and reinterpreted seismic hazards were identified in 2014. The study therefore failed to use the relevant seismic threat information that was appropriate to examine the reliability of non-safety SSCs at Diablo. Second, the 2010 analysis was based on an Electric Power Research Institute database, which included instances of maximum recordings of earthquake shaking of between 0.2 and 0.8 g at a variety of electrical generation facilities. Most of the recorded accelerations in this database are lower than the maximum accelerations PG&E predicts in its 2014 CCCSIP for earthquakes on the Hosgri, Shoreline, San Simeon, and Los Osos faults for locations with rock velocities of 760 m/s. Some of Diablo's important non-safety SSCs are located in areas of lower-velocity rock than those assumed for the foundations of the power block and turbine building.

An internal PG&E document entitled "DCPP Seismic Hazard Update Summary" contains results from a 12/6/13 CNO Seismic Update Meeting. 8 In subsection 4, "Presentation of AB 1632 Results," the following item is included:

e. J. Conway's M8 deterministic request:

ii. It was decided to only evaluate the critical SSCs in any frequency of exceedance ranges to show they can perform their safety function. 9

6 An Assessment of California's Operating Nuclear Power Plants: AB 1632 Report, November 2008.

7 Pacific Gas and Electric Company Supplemental Reports Recommended by the California Energy Commission. Attachment 2.3, Seismic Assessment of Diablo Canyon Power Plant Non-Safety Related Structures, Systems, and Components.

8 CNO is an abbreviation for PG&E's Chief Nuclear Officer.

9 DCPP Seismic Hazard Update Summary, results from 12/6/13 CNO Seismic Update Meeting, p. 3. J. Conway refers to PG&E's previous Chief Nuclear Officer.


Thus, nine months before the release of the CCCSIP, PG&E had pre-determined not to evaluate the non-safety SSC areas of concern to state energy regulators, although the AB 1632 recommendations specifically call for this.

Recommendation: It is imperative that free-field accelerations be calculated for all relevant locations within the Diablo plant footprint, including those where non-safety SSCs are located. The predicted accelerations, some of which will occur at locations of low-velocity fill such as the switchyards, can then be used, together with the EPRI database, to properly examine whether or not non-safety SSCs could withstand the shaking anticipated from these new seismic hazards. If the EPRI database is statistically inadequate to test how non-safety SSCs would perform given the large Diablo accelerations, then forward modeling or other approaches could be employed to address the issue. Finally, if these updated calculations are not performed, the state's confidence regarding the reliability of the plant will rest solely upon outdated analysis of data that failed to include the seismic threats identified by the AB 1632 research. It is difficult to understand why the study of the non-safety SSCs was commissioned by the utility before the seismic threat was properly understood and quantified, and why PG&E's December 2013 AB 1632 plan excludes them from analysis.

ISSUE #2: Technical deficiencies in key chapters of the CCCSIP

A) Significance: The description of the critically important ground motion conclusions in Chapters 11 and 13 is very brief, consisting of only two dozen spreadsheet-heavy pages within a larger report of well over a thousand pages. The CCCSIP's presentation of shaking predictions is poorly organized and opaque, making it difficult to isolate, quantify, and test how different methodological approaches, underlying assumptions, parametric values, and ad-hoc adjustments combine to influence the final shaking predictions. It is vitally important to assess whether or not the conclusion of the plant's reliability is largely dependent upon the details of the technical approach selected by the utility, and whether or not different conclusions would result from alternative methodologies, parameters, or assumptions. The material in the CCCSIP makes it impossible to perform this assessment.

Recommendation: I recommend that the computation and discussion be organized in a manner that is commonly used in a signal processing approach, which separates out sequential linear processes at each stage of signal modification. Where possible, write the equation that governs ground motion predictions as a function of the earthquake source spectrum and a multiplicative series of various transfer functions that affect that source signal. This formalism will allow analysts to examine the relative significance of each of the terms and apply greater focus on those parameters that contribute to large amounts of amplification or de-amplification.

For example, where A(f) is the 84th-percentile predicted acceleration, the equation is something like:

A(f) = b(f) * c(f) * d(f) * e(f) * f(f) * g(f)

where:

b(f) = source term, which allows analysis of the effects of different fault rupture areas, fault geometries, and stress drops;
c(f) = transmission term, which allows analysis of the effects of different propagation distances and kappa effects;
d(f) = near-surface term, which allows analysis of different near-surface attenuation and scattering effects assuming different 2D and 3D impedance and attenuation properties;
e(f) = single-station sigma term, which allows analysis of adding progressively more earthquakes to constrain uncertainty due to site effects;
f(f) = downward-projection term to adjust shaking from the reference free-field station to power-block and turbine-building foundation depths, which allows analysis of how rock types and fill affect shaking at the foundations of structures;
g(f) = upward-projection term to adjust shaking from power-block and turbine-building foundation depths to the locations of equipment in the buildings, which allows analysis of how shaking is transmitted through the structure assuming different damping factors.

This transfer-function representation allows the reader to see the size and scale of the relative effects, see how different parameters affect the transfer function (some of these effects may be simple scalars, not vectors), and assess whether or not the results are consistent with our understanding of the basic underlying physics of the problem.
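The bookkeeping behind this multiplicative formalism can be sketched in a few lines of code. Every functional form below is an invented placeholder, not a real ground-motion model; the point is only to show how separable terms multiply together and how the projection terms can be "backed out" to recover free-field shaking:

```python
import math

# Sketch of the transfer-function formalism above:
# A(f) = b(f) * c(f) * d(f) * e(f) * f(f) * g(f).
# All functional forms are hypothetical placeholders.

def b_source(f):        return 1.0 + 0.5 * math.exp(-f / 10.0)  # source spectrum shape
def c_transmission(f):  return math.exp(-0.04 * f)              # kappa-like path decay
def d_near_surface(f):  return 1.2 - 0.01 * f                   # near-surface effects
def e_sigma(f):         return 1.1                              # single-station adjustment
def f_downward(f):      return 0.9                              # free field -> foundation
def g_upward(f):        return 1.0 + 0.02 * f                   # foundation -> equipment

def a_equipment(f):
    """Predicted shaking at equipment locations: the product of all terms."""
    return (b_source(f) * c_transmission(f) * d_near_surface(f)
            * e_sigma(f) * f_downward(f) * g_upward(f))

def a_free_field(f):
    """Because the terms are separable, backing out the downward- and
    upward-projection terms is a simple division, recovering free-field
    shaking wherever the first four terms apply."""
    return a_equipment(f) / (f_downward(f) * g_upward(f))

# Tabulate each term's factor at 2 Hz to see which dominates.
for name, term in [("source", b_source), ("transmission", c_transmission),
                   ("near-surface", d_near_surface), ("sigma", e_sigma),
                   ("downward", f_downward), ("upward", g_upward)]:
    print(f"{name:12s} factor at 2 Hz: {term(2.0):.3f}")
```

This separability is precisely what makes the representation useful: each term can be varied or removed independently, and the relative size of each factor is visible at a glance.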

As a simple example of the value of such an approach, NRC staff provided a graphic [see FIGURE A, page 29] of a site-effect transfer function for Diablo using a number of different assumptions. 10 As can be seen in the figure, in each of the cases the NRC results were more conservative than PG&E's. The de-amplification effect at 2 Hz averaged about 10% using the NRC's three methods, whereas the PG&E approach tripled the de-amplification effect. This graphic helps show the significance of choosing one method or parametric value over another. The CCCSIP should present information in a similar manner so that it is more accessible to the average technical reader.

As an additional benefit, by breaking the analysis into these various transfer functions it will be much simpler to back out terms f and g, for example, to calculate free-field shaking at a range of locations throughout the footprint of the facility. This would make the results of the CCCSIP much more flexible, relevant, and useful for future applications to new questions, methodologies, adjustments, or data.

B) Significance: The new predictions of Diablo ground motion based on the new methodologies, ad-hoc adjustment factors, and parameter selection have markedly lowered estimates of ground motion, relative to the approach used when the plant was licensed.

As I argued before the U.S. Senate Environment & Public Works Committee, 11 an NRC license amendment process should be required if these methods are to be adopted, due to the NRC's own rules regarding the adoption of new analytical methods that are less conservative than those employed previously. The NRC's own Inspector General raised this very issue when criticizing the NRC in 2014 for allowing staff to make key technical decisions without a license amendment process, decisions which ultimately led to the closure of San Onofre. 12 Although the state does not determine which methodologies are used for NRC licensing purposes, the state does have an interest in staying apprised of these issues when making its own policy decisions regarding the future of Diablo. Why? Because those technical failings by NRC staff had a multi-billion-dollar impact on California ratepayers and, obviously, had an enormous effect on the reliability of the San Onofre facility. Although the state has no recourse against the federal government for its mistakes at San Onofre, it can require the utility to use more conservative methodologies or parameters when performing state-directed studies that analyze the potential for future disruption.

10 Research Information Letter 12-01. Confirmatory Analysis of Seismic Hazard at the Diablo Canyon Power Plant from the Shoreline Fault Zone, September 2012. Figure 5-13: Comparison of corrective factors developed by NRC.

11 Testimony of Sam Blakeslee, Ph.D., US Senate E&PW Committee, December 3, 2014. Attached as Appendix.

I want to respectfully acknowledge that a significant amount of research has gone into developing these new methods, which were designed to provide an accurate statistical modeling of the thousands of recorded earthquakes in the PEER database 13.

Given that frank statement, one might wonder why I'm concerned that these new statistical methodologies may not be particularly successful at predicting ground motion at Diablo.

The fundamental problem is that only a tiny amount of the data used to create these new methodologies, perhaps less than 1%, includes recordings of large earthquakes (>M7), in the extreme near field (<2 km), on hard-rock sites (>1 km/s), which is exactly the circumstance at Diablo (see Figure 1.2 and note the small number of data points of large earthquakes recorded in the box delineated between the distances 0-2 km and between M7-8). [see FIGURE B, Page 29].

12 October 2, 2014. NRC Office of the Inspector General's San Onofre Report.

13 Figure from PEER NGA-West2 Database Presentation; Ancheta et al., May 2013.


This is not just a sampling problem that can be fixed through the application of a simple estimate of uncertainty derived from the variability in the data. First, there just isn't enough data to build a robust statistical estimate of uncertainty for this circumstance. And, second, the near-field physics of fault ruptures includes a number of earthquake phenomena that cannot be modeled using far-field datasets. In the extreme near field, starting and stopping phases, stress-drop heterogeneity, directivity, and other effects can be quite significant. Those effects are not well represented in models that are constructed using the larger dataset of earthquakes that fall outside the 0-2 km and M7-8 recordings.

To make up for the limitations in the data, a number of simplifying assumptions and ad-hoc corrections were made in the CCCSIP which control the model predictions for large earthquakes, in the extreme near field, on hard-rock sites. For example, in the CCCSIP, PG&E's models assume that shaking does not increase much with increasing proximity once within a few kilometers of a fault. In other words, the shaking predicted for locations less than 1 kilometer from a fault is only modestly larger than that predicted for locations 5 kilometers from a fault. Similarly, their models also assume that, above a certain level, increased earthquake magnitudes produce relatively small increases in shaking for locations close to the fault. In other words, for these models, shaking from a magnitude 7.5 in close proximity to the fault may be only modestly larger than shaking from a magnitude 7.0. A paper by PG&E's lead engineering seismologist describes this underlying assumption in their models as follows:

The main features of the functional forms of the five NGA models are summarized in Table 3. Saturation at short distances is a feature of ground motion models that leads to weaker magnitude scaling at short distances than compared with the magnitude scaling at larger distances. Saturation causes a pinching of the ground motion for different magnitudes at short distance. ... In ground motion studies, a model is said to have full saturation if there is no magnitude scaling of the median ground motion at zero distance. 14

So in the very range where the data is least available, and where Diablo is most vulnerable, the functional form of the models results in shaking predictions that minimize the effect of increasing proximity and earthquake magnitude. The importance of this assumption, which is built into the model predictions, is difficult to overstate. Are these good assumptions? It is impossible to know given the paucity of strong motion data in the ranges and magnitudes relevant to Diablo. My point is not to make a broad general criticism of the new methodologies in those ranges where the statistics are robust; rather, it is to question the accuracy, reliability, and, just as importantly, the error bars ascribed to those predictions when the methodology is applied to the large faults rupturing in very close proximity to Diablo.
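The saturation behavior described above can be made concrete with a toy functional form. The equation and every coefficient below are invented for illustration; this is not any actual NGA model, only a sketch of how an "effective distance" term flattens predictions near the fault:

```python
import math

# Toy illustration of distance and magnitude "saturation" in ground-motion
# models. The functional form and coefficients are hypothetical.

def ln_pga(magnitude, dist_km, c=1.5, h=6.0, b_mag=0.5, b_dist=-1.2):
    """ln(ground motion) with a fictitious effective distance sqrt(R^2 + h^2).

    Once R << h, further decreases in distance barely change the
    prediction (distance saturation); the modest magnitude coefficient
    likewise produces weak magnitude scaling close to the fault.
    """
    r_eff = math.sqrt(dist_km**2 + h**2)
    return c + b_mag * magnitude + b_dist * math.log(r_eff)

# Distance saturation: predictions at 1 km vs 5 km from the fault differ
# little, because both are dominated by the fictitious depth term h.
near = ln_pga(7.0, 1.0)
far = ln_pga(7.0, 5.0)
print(f"ratio of shaking, 1 km vs 5 km: {math.exp(near - far):.2f}x")

# Magnitude scaling close to the fault: M7.5 vs M7.0 at 1 km.
m75 = ln_pga(7.5, 1.0)
m70 = ln_pga(7.0, 1.0)
print(f"ratio of shaking, M7.5 vs M7.0 at 1 km: {math.exp(m75 - m70):.2f}x")
```

In this sketch, both the 1 km vs 5 km ratio and the M7.5 vs M7.0 ratio come out well under 1.5x, which is exactly the "pinching" the quoted passage describes: in the range where Diablo sits, the functional form itself, not the data, limits how much closer and bigger earthquakes increase predicted shaking.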

Beyond the near-field and large-earthquake saturation assumptions, which are used to cope with one set of data deficiencies, still other assumptions and corrections are employed to deal with another deficiency. The NGA-West2 database also does not include an adequate sampling of rocks that reflect the velocities at certain key locations at Diablo; so instead, the Report uses a scaling factor, one that the NRC itself has found to be less conservative than alternative scaling factors that could have been adopted. Not only is the velocity-scaling factor somewhat ad hoc, but the underlying velocity estimates themselves are highly debated, because the two wells that best measure the velocities near the foundation yield significantly different velocity values.

Yet another effort to address this deficiency in velocity information has led to the use of another technique: tomographic inversions of travel-time delays. However, tomographic velocities do not resolve the uncertainty, as these statistical inversions are highly dependent on ray coverage and are notoriously difficult to tie to known well data in the absence of detailed and extensive calibration efforts. Tomographic velocity analysis is commonly relied upon to make statistical estimates of the locations of relative velocity variations, not precise measurements of specific velocities (Squires, Blakeslee, Stoffa 1992 15; Eastwood, Lebel, Dilay, Blakeslee 1994 16). It would be a serious mistake to believe that uncalibrated tomographic velocities are an acceptable substitute for the precise and accurate velocities needed to accurately compute amplification factors and site effects. Given the relative ease and low cost, it's not clear why more wells aren't simply drilled so that the velocity question, which is so important to the final shaking predictions, can be resolved once and for all through direct measurements.

14 Comparisons of the NGA Ground-Motion Relations; Abrahamson et al. Earthquake Spectra, Volume 24, No. 1, pages 45-66, February 2008.

Beyond this array of problems is the effort in the report to perform a statistical analysis called "single-station sigma" in an attempt to constrain the site effect. This site effect is due, in part, to the reverberations, attenuation, and scattering produced by the unique configuration of heterogeneous near-surface impedance contrasts at the site. But yet again, this single-station sigma calculation is based on a tiny dataset. In this case the Diablo Canyon site has recorded only two large earthquakes on strong motion instruments suitable for use in making the single-station sigma correction. Furthermore, these earthquakes were recorded from azimuths that don't reflect some of the most critical potential sources of nearby earthquake threats, which lie to the west rather than the north or east.
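The sample-size half of this objection can be illustrated numerically. In broad terms, a single-station site term is estimated as the mean residual (observed minus model-predicted log ground motion) over the earthquakes recorded at the station; the residual values below are invented purely to show how weakly two recordings constrain that mean (the azimuth problem is a separate issue this sketch does not address):

```python
import statistics

# Sketch of the sample-size problem in a single-station site-term
# estimate. Residual values are hypothetical.

def site_term(ln_residuals):
    """Mean residual and its standard error for n recordings."""
    n = len(ln_residuals)
    mean = statistics.fmean(ln_residuals)
    stderr = statistics.stdev(ln_residuals) / n**0.5 if n > 1 else float("inf")
    return mean, stderr

# With only two usable large-earthquake recordings (the situation at
# Diablo Canyon), the site term is poorly constrained:
two_events = [-0.35, -0.15]
mean2, se2 = site_term(two_events)

# With, say, ten recordings of comparable scatter, the constraint tightens:
ten_events = [-0.35, -0.15, -0.30, -0.20, -0.25,
              -0.28, -0.22, -0.33, -0.18, -0.26]
mean10, se10 = site_term(ten_events)

print(f" 2 events: site term {mean2:+.3f} +/- {se2:.3f}")
print(f"10 events: site term {mean10:+.3f} +/- {se10:.3f}")
```

The standard error shrinks roughly with the square root of the number of recordings, which is why a correction built on two events carries so much more uncertainty than the same correction built on a larger sample.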

Finally, when one evaluates the new methodology's simplifying assumptions for large earthquakes in the extreme near field; when one reviews the lack of data for recordings on hard-rock sites; when one examines the well-log and tomographic data used in the high-velocity de-amplification factors; when one reviews the statistics of the single-station site-effect adjustments based on just two large earthquakes; and when one then notes that ALL of these adjustments serve to LOWER shaking predictions relative to the earlier methodologies and parameters used during the licensing process, it becomes eminently reasonable to seek objective third-party review of how the results might differ if alternative methodologies or parameters were selected. This had been the role of the CPUC's IPRP, but the utility has been largely non-responsive to most of its technical feedback regarding concerns about the factors that affect shaking estimates, choosing instead to shift the analysis of these issues away from the CPUC-controlled IPRP to the NRC-controlled venue.

15 Squires, L.J., S.N. Blakeslee, and P.L. Stoffa, 1992, The effects of statics on tomographic velocity reconstructions: Geophysics, 57, 353-362.

16 Eastwood, J., Lebel, P., Dilay, A., and Blakeslee, S., 1994, Seismic monitoring of steam-based recovery of bitumen: The Leading Edge, 13, 242-251.

Recommendation: Because of the issues and problems inherent in applying these new methodologies for large earthquakes, in the near field, on hard rocks, it is important to rigorously test the range of possible predictions using a reasonable and realistic range of methodologies and parameters. The CCCSIP approach is to pick its preferred methods and parameters to compute a median, and then to estimate uncertainty under the assumption that the basic methods and parameters are accurate. A more useful and appropriate way to compute uncertainty, i.e., the range of shaking possibilities, is to vary the methods and parameters used in the analysis and then explore the space of shaking solutions that are computed. Specifically, by utilizing the most conservative set of methods and parameters that is supportable by the data, one can estimate an upper bound to likely shaking. Similarly, by using the least conservative set of methods and parameters that is supportable by the data, one can estimate a lower bound of likely shaking. A forward modeling approach that varies the methods and parameters, based on the likelihood that each is known with some level of certainty, will create a density function of shaking estimates between these two end members. This density function would be a more useful estimate of the median and the 84th percentile of potential shaking outcomes than could otherwise be expected, given the uncertainty in the available data and in the methodologies employed to make those estimates. This approach is very different from the approach used in the CCCSIP, which relies upon creating error bounds from the large PEER database, which has little bearing on the Diablo situation, or from a single-station sigma correction that even the NRC declined to use due to a lack of supporting data.
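The forward-modeling approach recommended above can be sketched as a simple Monte Carlo exercise: instead of fixing one preferred method and parameter set, draw many combinations weighted by judged plausibility, compute a shaking estimate for each draw, and read the median and 84th percentile off the resulting density. Every method, weight, and number below is hypothetical:

```python
import math
import random

# Sketch of the recommended forward-modeling uncertainty approach.
# All variants, weights, and values are hypothetical.

random.seed(1)

# Three hypothetical analysis variants spanning least- to most-conservative,
# each a multiplier on a base shaking estimate, with credibility weights.
method_factors = [0.8, 1.0, 1.3]
method_weights = [0.3, 0.5, 0.2]

def draw_shaking(base_g=0.6, param_sigma=0.15):
    """One Monte Carlo draw: pick an analysis variant by weight, then
    perturb uncertain parameters (e.g. a debated site velocity) as a
    lognormal multiplicative factor around the base estimate."""
    factor = random.choices(method_factors, weights=method_weights)[0]
    param = math.exp(random.gauss(0.0, param_sigma))
    return base_g * factor * param

samples = sorted(draw_shaking() for _ in range(20000))
median = samples[len(samples) // 2]
p84 = samples[int(0.84 * len(samples)) - 1]
print(f"median: {median:.3f} g, 84th percentile: {p84:.3f} g")
print(f"explored range: {samples[0]:.3f} g to {samples[-1]:.3f} g")
```

The spread of the samples is driven by the disagreement among methods and parameters themselves, not by scatter in a distant dataset, which is the distinction the recommendation draws against the CCCSIP's approach.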

ISSUE #3: PG&E's procedural deficiencies with regard to IPRP input and review

Significance: The IPRP, in its three-part review of the CCCSIP, has raised many concerns similar to those I have outlined, as well as requesting additional information and answers from PG&E. PG&E's reply to all three reviews, in one omnibus document, has been largely non-responsive: a platitudinous recitation of the quality of their analysis process, or an assertion that issues will be explored at some unspecified point in the distant future.

As designed and empaneled by the Commission, the IPRP was to have an important role in ensuring that the state's perspectives and needs were addressed by the CCCSIP:

In addition to PG&E's proposal to employ outside consultants and subject its seismic studies to peer review, this commission will convene its own Independent Peer Review Panel (IPRP). The commission will invite the CEC, the California Geologic Survey, the California Coastal Commission, and the California Seismic Safety Commission to participate on the panel. Under the auspices of the California Public Utilities Commission (CPUC), the panel will conduct a peer review of the seismic studies including independently reviewing and commenting on the study plan and completed study findings. Our order in this application will require PG&E to submit its study plans and completed study findings to the IPRP for review prior to implementation. Should a dispute arise it should be resolved informally, but if that is not attainable the Commission has authority to halt the associated rate recovery. 17

Chronologically, it should be noted that PG&E held no public meetings with the IPRP from the time the IPRP issued their Report #6 (which first expressed critical concern regarding PG&E's assumptions for ground motion calculation) on August 12, 2013 until October 23, 2014, more than a month after the final CCCSIP was released simultaneously to both the public and the IPRP. This embargo of non-communication encompassed the principal period during which the data from the field investigations were collated, analyzed, and prepared for the CCCSIP by PG&E.

17 California Public Utilities Commission, D.10-08-003.

Subsequent to the release of the CCCSIP, the IPRP did convene three public meetings to discuss the study: October 23, 2014 (meeting #7), November 17, 2014 (meeting #8), and January 8, 2015 (meeting #9). Subsequent to each IPRP meeting, the group issued a report with a corresponding report number. Although the IPRP expressed at its public meetings a desire to have PG&E respond to each of their reports individually, PG&E decided unilaterally to respond to all three of the IPRP reports in one document, at the conclusion of this initial period of IPRP review.

As indicators of PG&E's failure to address the valid concerns and questions of the IPRP (as raised in the IPRP's numbered reports), I will provide one documented example from each meeting/report for IPRP Reports Nos. 7, 8, and 9. I will then contrast PG&E's final response to the IPRP of April 22, 2015 with earlier drafts of PG&E's responses obtained through the discovery process in this proceeding. What becomes apparent is that at an earlier draft stage, PG&E admits to gaps in their data and interpretation and a willingness to explore additional field analysis and interpretation to answer the concerns of the IPRP. What emerges in the much later final response to the IPRP is PG&E's attempt to substitute the gathering of empirical data with which to understand any increase in seismic hazard with probabilistic estimates reliant on unknown uncertainties: uncertainties that could have been empirically reduced had PG&E actually followed through on its earlier intentions. This change in PG&E's approach happened behind the scenes, and without the discussion and responses that should have followed each of the IPRP's sequential reviews of the CCCSIP.


EXAMPLE 1

IPRP Report No. 7 notes with concern the methods for determining the slip rate on the Hosgri and Shoreline faults:

An unstated limitation in all the LESS studies is that they were designed to image sediments at shallow to intermediate depth to complement the proposed HESS studies. Because of this, the technologies used in the 2D and 3D LESS studies all have significant limitations when imaging the shallowest sediments. Since the shallowest sediments are the youngest, they record evidence for most recent fault displacement. Uncertainty in slip rates because of uncertainty in the ages of offset channels could also be reduced with a well-executed shallow coring program. Very high resolution profiling of the youngest sediments with sampling to determine ages of those sediments offers the best chance to constrain slip rates representative of seismic hazard at the DCNPP, particularly along the relatively fast-slipping Hosgri fault. If older deposits are targeted, such as those recorded by the LESS studies, then significant improvements in age control are needed in order to reduce the uncertainties associated with the slip rates. The acquisition of more absolute dating control may decrease the slip rate uncertainties both in terms of age control, and possibly feature correlation. 18 [emphasis added]

PG&E's final response to the IPRP of April 22, 2015 replies to this concern in the following manner:

The relative merits of other offshore investigations of the Hosgri fault slip rate that were not part of the CCCSIP are subject for the SSC SSHAC study; the SSC SSHAC report documents the findings of the TI team regarding the Hosgri fault slip rate and its uncertainty that is used in the updated PSHA.

IPRP Report No. 7 suggested areas for additional research, particularly to develop better constraints on age estimates of offset features mapped by the LESS, with an emphasis on the offsets of younger (<20 ka) rather than older offset features. Collecting data to improve the age dates and/or identify additional younger offset features in the offshore region is a major task. As shown by the tornado plot on Figure 1 below (which is equivalent to Figure 4 of IPRP Report No. 9), the current uncertainties in the Hosgri and Shoreline slip rates, which incorporate large age uncertainties, lead to less uncertainty in the total seismic hazard than current uncertainties in the rock ground motion models and site amplification models.

18 IPRP Report No. 7, p. 11.

Therefore, collection of data to improve the age dates and/or resolution of offset features along the Hosgri or Shoreline fault will likely be a lower priority. The final prioritization of any additional studies to reduce uncertainty in the hazard will be developed as part of PG&E's ongoing Long Term Seismic Program (LTSP). 19

In summary, PG&E dismisses addressing the IPRP Report No. 7 concerns as a "major task"; not as significant as reducing the current uncertainties in the rock ground motion models; and thus a lower priority to be dealt with in a future program.

Contrast this with a draft of PG&E's earlier answer to the IPRP, dated December 2014:

PG&E also agrees that the ages of the offset channels and paleo-strandlines imaged in both the Hosgri and Shoreline fault zone LESS surveys are not well constrained. The use of sequence stratigraphy and sea-level curves enabled PG&E to bracket the possible ages of these features; however, absolute age dating is necessary to reduce the uncertainty associated with the slip rates.

  • PG&E is examining options for conducting a shallow-depth coring program (e.g. Vibrocoring) to sample younger sediments offset by active faulting as well as confirm the ages of regional unconformities identified in the 2013 seismic stratigraphy report and the 2014 sea level low-stand models.

The coring program would be conducted in conjunction with a high-resolution CHIRP seismic survey to precisely locate coring targets and to provide acoustic stratigraphic controls at shallow depth. Both programs would be conducted as part of the PG&E Long-Term Seismic Program. 20 [emphasis added]

Recommendation: Had PG&E provided the above earlier response to the IPRP in a timely fashion, the alternative methods for exploration and analysis could have been openly discussed and their merits considered. Instead, PG&E withheld these comments (in which they agree with the deficiency and propose a practical remedy) only to later dismiss them by statistically lowering their importance, regarding their execution as a "major task," and punting their completion to a future date. It should also be added that while the preparation of the NRC's required SSHAC study had a fixed date, the AB 1632 study did not; and that there was no hard budget cap on the AB 1632 studies that would have allowed costs to constrain the work proposed in the December draft response. The IPRP's request for an examination, and if warranted, execution of a well-executed shallow coring program should be implemented before the CCCSIP can be considered final.

19 PG&E IPRP response of April 22, 2015, p. 4.
20 PG&E draft response to IPRP Report No. 7, December XX, 2014, p. 3.

EXAMPLE 2

IPRP Report No. 8 contains a subsection reviewing the CCCSIP's Onshore Seismic Interpretation Program (ONSIP) and notes:

The IPRP is not convinced that the interpretations of the down-dip extensions of faults are well constrained, even in the case of well-documented surface faults. Similarly, faults interpreted from the seismic sections, but not corroborated by surface mapping (e.g. faults interpreted between the San Miguelito and Edna faults), are possible, but are by no means unique interpretations of the data. Overall, the IPRP is not convinced that projections of faults beyond the very shallow subsurface represented unique interpretations of the data. 21

In PG&E's draft March 25, 2015 response to the IPRP, PG&E provides a conciliatory acknowledgment of the potential weaknesses in their study raised by the IPRP:

Often it is difficult to discern the downdip expression of faults exposed at the surface in the seismic reflection data, particularly faults that were at some point likely acting as normal to normal-oblique faults during the first interval of Miocene extension. In part, multiple stages of contraction and extension since the initial Miocene extension are likely to subsequently deform initially near-planar faults.

Consequently, it may be difficult to map some older faults to depth due to polystage deformation. [emphasis added]

[I]n contrast to 3D mapping in a volume, mapping of steep-to-vertical structures associated with lateral reflector truncations will not be as compelling in 2D seismic reflection data alone because the existence or lack of existence of along-strike continuity cannot be assessed. Unfortunately, only 2D seismic imaging is available in most of the coastal areas containing the Los Osos and Edna faults due to permitting limitations. Most of the best 3D imaging is in areas of the inland Irish Hills are [sic] dominantly composed of Franciscan assemblage rocks. The permitted 2011 3D acquisition geometry was not able to consistently image shallow Tertiary 3D structure due to the low velocities of shallow Tertiary rocks and the primarily 2D source acquisition geometry in the Irish Hills areas within the surficial Tertiary rocks. 22 [emphasis added]

21 IPRP Report No. 8, December 17, 2014, p. 5.

Contrast this earlier analysis/response from PG&E with the final version in their IPRP omnibus reply a month later:

The CCCSIP Report presents a preferred model of fault location and geometry in the Irish Hills that is consistent with the best available geographic, geophysical, and seismic-reflection data. The preferred model was the result of a rigorous process that included extensive analysis of map and well data, state-of-the-art seismic-reflection processing, and thoughtful consideration of alternative interpretations. 23

Somehow, in the space of one month, the deficiencies and weaknesses PG&E expressed in draft form to answer the IPRP's concerns have been transformed into a single paragraph of confident and bold assertion of fact in PG&E's final response to the IPRP. Notwithstanding PG&E's confidence, the IPRP was, and remains, correct in a conclusion from their Report No. 8:

CONCLUSIONS

IPRP review of the tectonic model is based on the CCCSIP report and presentation. The IPRP has not had time to review the seismic data processing in detail. In addition, a full auditing of the seismic data acquisition and processing sequence would require the IPRP to retain outside consulting services. 24

22 Pacific Gas and Electric Company's Response to Independent Peer Review Panel Reports 7, 8, and 9, March 20, 2015, pp. 6-7.

23 PG&E IPRP response of April 22, 2015, p.6.

24 IPRP Report No. 8, December 17, 2014, p. 7.


The thorough review of the seismic data processing, as referenced by the IPRP, is a vital part of assuring the validity of PG&E's findings. Processing and interpretation can introduce artifacts and other biases into the results.

Nearly two years earlier, PG&E seemed quite aware that a request by the IPRP for additional reprocessing and review of the data could become a problem that would require mitigation.

As part of a ten-page March 28, 2013 submittal to PG&E's Executive Project Committee by Senior Vice President and Chief Nuclear Officer Ed Halpin, et al., the document identifies this particular risk from IPRP review, suggests a seemingly reasonable mitigation strategy, and offers a contingency plan just in case. These are found in a table with headings of "Risk Description"; "Mitigation Strategy and Contingency Plan"; and "Impact on Cost". The potential risk attributed to IPRP Review is described as follows:

IPRP recommends additional processing of data or interpretations after their review of project results. The project results and conclusions are to be provided to the Independent Peer Review Panel (IPRP) as a condition of authorized CPUC funding for this project.

They could recommend additional processing methods be applied or other interpretation techniques be utilized. The IPRP make-up does not have members who are experienced in processing and interpretation, but they could seek an independent review by others.

Mitigation: When presenting the results to the IPRP PG&E will stress that advanced processing methods and interpretation techniques recommended by industry and academia experts were used. Make processed data available to IPRP before the technical reports are provided for their review.

Contingency: The IPRP's charter is to review results. PG&E could challenge any recommendation for additional processing or interpretation to be out of their scope. 25

25 Opening Brief, A.14-02-008, exhibit 6, p. 2, A4NR.


Contrary to PG&E's Mitigation and Contingency plans, there is absolutely no prohibition on the IPRP 1) taking the time necessary to complete a thorough review of the data; 2) completing an ex post facto audit of all data processing; and 3) retaining outside consulting services to complete the work. In fact, the Commission's authorizing decision forming the IPRP explicitly considered this outcome:

3.4. Whether Outside Experts should be Retained

[T]his need was recognized by us in D.10-08-003, which declared, The IPRP may employ consultants and experts. Costs incurred by the PRP shall be reimbursed by PG&E and recovered in the DCSSBA.

[footnote omitted] We have no in-house scientific or technical expertise to review seismic studies or perform analyses. Outside help is needed to ensure that the enhanced seismic studies are scoped out properly at the front end and reviewed properly during the course of the studies pursuant to the recommendations in AB 1632. 26

Recommendation: It is critically important that the IPRP be given the ability to revisit their earlier concerns regarding audit and re-analysis of seismic data and its interpretation. As D.10-08-003 makes clear, the IPRP should be able to engage the outside consulting necessary to complete this task. Absent an informed re-evaluation of the data processing, the CCCSIP cannot be considered final.

EXAMPLE 3:

IPRP Report No. 9 concludes:

While the estimated VS values in the tomographic model correspond to expected relationships of VS with depth, with removal of low-VS material by grading, and general range of VS for different geologic units, these values do not correspond well to values previously measured in boreholes. PG&E has not reconciled these differences, nor have they provided estimates of uncertainty in the velocity values in the tomographic model. 27

26 Decision 12-09-008, September 13, 2012, p. 23.
27 IPRP Report No. 9, March 6, 2015, p. 14.


In PG&E's March 20, 2015 draft response to the IPRP Report No. 9 critique of Vs profiles and soil velocity, PG&E proposes:

  • PG&E response
    o Respond to questions provided after the Jan workshop that address many of the issues raised above
      • List questions w/ responses
    o Conduct new set of downhole measurements to calibrate the tomography models as part of LTSP 28 [emphasis added]

In contrast, PG&E's formal response to the IPRP does not appear interested in acquiring additional empirical evidence, but rather subsumes this action into refinement of their computer modeling, as indicated in their final response to the IPRP:

[T]he 3D velocity model is being revised using additional constraints such as dispersion curves at selected locations, Vs profiles from the borings, and site amplification measured across the DCPP site area, consistent with the IPRP comments on the calibrations of the 3D model. 29

Recommendation: To provide resolution to the uncertainties noted by the IPRP, it would be worthwhile to require PG&E to follow through on their earlier draft reply intentions and drill a number of relatively shallow wells around the site so that rock velocity data can be acquired with a high degree of confidence. Such wells are not very expensive, are easily logged, and can resolve one of the most critical sources of uncertainty in this analysis. It is very common to instrument such wells so that measurements of surface and downhole wavefields can be acquired, thereby allowing highly accurate, empirically derived near-surface transfer functions to be calculated (Blakeslee & Malin 1991). 30 As PG&E's earlier response indicates, PG&E was going to place this task in their LTSP, or Long Term Seismic Program. The questions raised by the IPRP are a direct outgrowth of the AB 1632 investigation, and as such, there is no need to delay their implementation to a future PG&E seismic program. This concern must be addressed before the CCCSIP can be considered final.

28 Pacific Gas and Electric Company's Response to Independent Peer Review Panel Reports 7, 8, and 9, March 20, 2015, p. 12.
29 PG&E IPRP response of April 22, 2015, p. 7.
30 Sam Blakeslee and Peter Malin (1991), High-frequency site effects at two Parkfield downhole and surface stations, Bull. Seism. Soc. Am. 81, pp. 332-345.
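The surface/downhole transfer-function measurement described in the recommendation above amounts to a spectral ratio between co-located surface and downhole recordings. The sketch below illustrates the idea on a synthetic signal; the signals, sampling rate, and amplification factor are invented for illustration and are not data or methods from Blakeslee & Malin (1991):

```python
import numpy as np

def transfer_function(surface, downhole, dt):
    """Empirical near-surface transfer function: the spectral ratio of
    a surface recording to a co-located downhole recording."""
    freqs = np.fft.rfftfreq(len(surface), d=dt)
    s = np.fft.rfft(surface)
    d = np.fft.rfft(downhole)
    eps = 1e-12 * np.max(np.abs(d))      # guard against division by zero
    return freqs, np.abs(s) / (np.abs(d) + eps)

# Synthetic example: the "site" amplifies a 5 Hz input by a factor of 3.
dt = 0.01                                # 100 samples per second (assumed)
t = np.arange(0, 10, dt)
downhole = np.sin(2 * np.pi * 5 * t)     # motion recorded at depth
surface = 3.0 * downhole                 # amplified motion at the surface

freqs, tf = transfer_function(surface, downhole, dt)
peak = tf[np.argmin(np.abs(freqs - 5.0))]
print(f"amplification at 5 Hz ~ {peak:.2f}")
```

With instrumented wells, the same ratio computed from recorded earthquake wavefields gives an empirically derived site response, rather than one inferred from a velocity model.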

The greatest area of ongoing concern can be found in PG&E's response to the IPRP's critique of PG&E's use of single-station sigma as an attempt to constrain path effects at Diablo Canyon. This was discussed in greater technical detail in Issue 2, Section B of this testimony. The IPRP concludes their Report No. 9:

  • The single-station sigma approach has significant effects on calculated earthquake shaking.

o Calculated ground motions using the 3D tomographic model should reflect uncertainties in that model, which have not been described.

o The site term based on two recorded earthquakes may represent other factors, rather than site conditions. IPRP is not convinced that this factor is adequately constrained for use in ground motion calculations. 31

PG&E's tornado chart of uncertainties puts the greatest uncertainties on this very model by which mathematical reductions in hazard are achieved, and it was presented at the IPRP's January 8, 2015 public meeting. PG&E's response to the IPRP's concern voiced at that public meeting can be gleaned from a PG&E email approximately one month later, between Stuart Nishenko, PG&E Geosciences Department, and Erik Jacobsen of Regulatory Relations:

5) Gibson and others (e.g Blakeslee) were posturing that they want to spend more time delving into ground motion issues - this is beyond the original AB 1632 scope and outside the expertise of the IPRP (once again).
6) Regardless, we'll still need to have a meeting to discuss how to wrap the IPRP process up (or at least this iteration) 32

31 IPRP Report No. 9, March 6, 2015, p. 15.

32 PG&E email, February 8, 2015.


When dealing with the single greatest (and most uncertain) variable in the seismic hazard question, I would not regard the inquiries of the IPRP members, nor myself, as "posturing," particularly when PG&E's final response to the IPRP glosses over the problem with an admission of the flaw and a punting of the solution to a future long-term project, which PG&E's Dr. Norman Abrahamson told the IPRP at their January meeting would not be complete until 2025. That, of course, is the end of the plant's current license:

The currently available broadband data from small earthquakes does not provide improved constraints on the DCPP site term because the station coverage is too sparse. Improved source, path, and site response terms are part of a separate longer term study being planned with the Southern California Earthquake Center. This project plan will be developed as part of the overall effort to reduce the uncertainty in the seismic hazard. 33

Dr. Nishenko's suggestion within PG&E that this issue is "beyond the scope" and "outside the expertise" of AB 1632 and the IPRP, respectively, is baseless and unsubstantiated. Indeed, the issue of obtaining accurate data with which to calculate the ground motion and hazards is clearly under the purview of AB 1632 and the IPRP:

As ground motion models are refined to account for a greater understanding of the motion near an earthquake rupture, it will be important for PG&E to consider whether the models indicate larger than expected seismic hazards at Diablo Canyon and, if so, whether the plant was built with sufficient design margins to continue operating reliably after experiencing these larger ground motions. 34

As to the IPRP's expertise, this Testimony has already established that the IPRP has the ability to contract with outside consultants as needed to help further identify and constrain the seismic hazard at Diablo Canyon.

33 PG&E IPRP response of April 22, 2015, p. 7.
34 California Energy Commission 2008, An Assessment of California's Operating Nuclear Power Plants: AB 1632 Committee Report, CEC-100-2008-108-CTF, November 2008, p. 6.


Recommendation: The IPRP is well within its scope of duties to insist that PG&E's final CCCSIP include estimates of anticipated shaking levels should these new assumptions prove unfounded.

Conclusion

In my review of the CCCSIP I have been troubled by how consistently the more mundane but critically important issue of reliability (Diablo's potential vulnerability to disruption, so important to California) is largely unaddressed. The Report reads more like a document created to deal with relicensing for a federal audience rather than reliability for its intended state audience. Furthermore, most of the CCCSIP's analysis consistently argues for use of the less-conservative licensing basis, less-conservative ground motion prediction methodologies, less-conservative velocities, and less-conservative single-station site effects, with little effort to quantify the uncertainty inherent in each of those assumptions. To top it off, all of these less-conservative approaches are then used to analyze only the power-block and turbine buildings, which represent only the portion of the plant that is regulated by the NRC for safety purposes, even though this is a ratepayer-funded study driven by state legislative bill AB 1632, which focused on state reliability issues.

It is my hope that the IPRP, which consists of representatives of state agencies, will ensure that these state issues are properly included in the final AB 1632 report before the CCCSIP can be considered, as PG&E has labeled it, "final." A response by the utility that is less than responsive to the IPRP's concerns, or states that concerns will be a subject of further research in the coming years, or states that the issues will be addressed through proceedings with the NRC rather than the state, should not be considered acceptable by the CPUC, given the significance of seismic hazards to state ratepayers. It may be a very long time before the state has another such opportunity to assert its role, independent of the NRC, to obtain the information it has paid for and that state policymakers should have in their possession as critical issues of earthquake hazards and the future of Diablo are considered.


Finally, although this testimony is principally designed to help inform the California Public Utilities Commission in its current deliberations, I sincerely hope it will also be useful to others who seek to base their decisions on good peer-reviewed science, including senior management within PG&E, state agencies that participate on the Independent Peer Review Panel (IPRP), and the state legislature, which is considering legislation regarding the longer-term future of the IPRP.

Respectfully submitted.


FIGURE A

FIGURE B


Statement of professional qualifications:

My name is Sam Blakeslee, and I was elected to represent the 33rd California Assembly District from 2004 through 2010. From 2010 through 2012 I represented the 15th Senate District. I attended U.C. Berkeley where I earned bachelor's and master's degrees in geophysics, and then earned a Ph.D. from the University of California, Santa Barbara for research in seismic scattering, micro-earthquake studies, and fault-zone attenuation. I previously worked as a research scientist at Exxon's research lab in Texas, where I received a patent for inventing an innovative technique that used medical cat-scan mathematics to create detailed images of geologic formations. Later, I moved into management and became a Strategic Planner, where I was responsible for creating and managing Exxon budgets.


WRITTEN STATEMENT

BY SAM BLAKESLEE, PH.D.

CALIFORNIA STATE SENATOR, FORMER

CALIFORNIA SEISMIC SAFETY COMMISSIONER, FORMER

TO THE

SENATE COMMITTEE ON ENVIRONMENT AND PUBLIC WORKS

DECEMBER 3, 2014

Senator Boxer, Ranking Member Vitter, and Members of the Committee,

Thank you for the invitation to testify at today's hearing titled "NRC's implementation of the

Fukushima near-term task force recommendations and other actions to ensure and

maintain nuclear safety." The Fukushima meltdowns raised important concerns about

nuclear reactors and one of those concerns relates to seismic safety. As a geophysicist and

former California State Senator, I authored AB 1632, a bill that required PG&E to conduct

seismic hazard research of the faults near the Diablo Canyon Nuclear Power Plant (Diablo)

located in the community where I reside and which I represented for 8 years as a state legislator.

Just two months ago, PG&E published the Central Coastal California Seismic Imaging Project

(CCCSIP) Report and the results were astonishing. The Report documents the presence of a

number of earthquake faults discovered after the design and construction of the plant that

have been found to be larger and more dangerous than previously understood. In a post-Fukushima regulatory environment, it is important that policymakers and regulators

understand the ramifications of these findings.

EXECUTIVE SUMMARY

PG&E has a long history of grappling with Californias earthquake faults when trying to site

its nuclear plants. It had previously proposed a nuclear power plant on the California coast

at Bodega Bay but abandoned the plan when it was discovered that the site was to be built

overtop the Shaft Fault and within 1000 feet of the San Andreas Fault. Later, PG&E built a

small nuclear power plant on the California coast at Humboldt Bay, but the plant was shut

down after the discovery of three faults within a few thousand meters of the plant. PG&E

selected the location for the Diablo plant, representing that the seismic activity in the area

was minimal.

In the late 1970s, when Diablo was still under construction, data surfaced on the presence

of a large active fault (named the Hosgri) located just three miles offshore from the plant.

PG&E first denied its existence. When that assertion was disproved, it argued the fault was

likely inactive. When PG&E had to concede it was active, it argued it was not capable of

producing particularly large earthquakes. It turned out that it was capable of generating

very large earthquakes.


In a recent replay of these events concerning a newly discovered fault system, the Shoreline

fault was discovered in 2008 and analyzed with state-of-the-art methods and found to be

capable of generating an M7.3 earthquake within a mere 600 meters of the plant.

There is no getting around the fact that PG&E has consistently downplayed seismic hazards

on the coast near its nuclear plants. Especially disturbing is that during these past decades

the NRC has repeatedly relaxed its seismic standards to accommodate the operation of

Diablo Canyon.

Now that the data about the faults near Diablo is indisputable, PG&E has changed tactics

and declared the plant safe on the basis of a new set of equations it has developed. PG&E

has undertaken major revisions to the complex ground motion equations that have been

used to estimate how much shaking can be produced by earthquakes. Unsurprisingly,

PG&E's changes to its methodologies have dramatically reduced estimated shaking at the

plant from all hypothetical earthquakes. So far, NRC has largely gone along with these

changes.

With PG&E's history of playing down seismic concerns, these recent developments are

cause for deep concern. So is PG&E's documented history of co-opting the very regulatory

bodies tasked with overseeing it. Just this year:

  • PG&E was found to be inappropriately, and possibly illegally, lobbying California

Public Utilities Commissioners and staff to successfully judge shop in a case before

the CPUC. The revelation resulted in the firings of three senior PG&E executives, the

reassignment of the CPUC's chief of staff, and the decision by the President of the

CPUC to recuse himself from future PG&E decisions and to not seek re-appointment.

PG&E was just fined $1.05 million for this back-channel lobbying.

  • PG&E was indicted on 12 criminal charges related to safety violations in its gas

distribution, including an accusation that PG&E officials obstructed a federal

investigation and that the utility knowingly relied on erroneous and incomplete

information to avoid inspections that would have exposed risks that ultimately

killed 8 people in a 2010 gas pipeline explosion.

  • PG&E was discovered, through email disclosures, to be exploring how and when the

Diablo Canyon Independent Peer Review Panel could be disbanded. This is the

state-mandated panel tasked with providing third-party quality control of seismic

risk analysis at Diablo that is quantified by the Report, which is my subject here.

In 2013, because of steam generator failures, San Onofre, California's only other nuclear

power plant was permanently shut down at great cost to ratepayers, shareholders, and grid

operations. Last month, the Office of the Inspector General at the NRC issued a report

criticizing the NRC's failure to call for a license amendment process, which might have

identified the shortcomings of the utility's technical analysis that ultimately led to those

leaks. The safety ramifications of steam generator leaks at San Onofre, as serious as they

were, are dwarfed by the risks to the public should PG&Es Diablo seismic analysis prove to

be incomplete or inaccurate. You would think that after Fukushima the NRC would go

beyond a "check the box" review process when confronted, as it is at Diablo, with the


possibility of a 7.3 magnitude earthquake within a half-mile of the plant. So far we have

been disappointed.

Remarkably, in all the years of its operation, the facility has never gone through a formal

license amendment process to deal even with the Hosgri Fault, discovered in the 1970s.

Instead, its possible ramifications were more or less explained away in a separate

document. More significant faults have been discovered since, which speaks poorly of

PG&E's original examination of the area, and of the NRC's supervision of that process. One

should not be discovering such faults after building a plant. The potential earthquakes

affecting the plant have increased with each major study. But what's equally striking is that

the shaking predicted by PG&E for these increasing threats has systematically decreased as

PG&E adopted less and less conservative analytical methodologies, and they did so with

NRC approval.

It is time to end this hodge-podge of licensing rationalizations. We know a great deal more

about seismic issues than we did when Diablo Canyon was licensed. It's time for the NRC to

reassess the seismic standards for the plant and submit them to a formal license

amendment process. The thing that both PG&E and NRC fear most is a public hearing in

which they would have to justify what they have done. It is also what we need most to

assure seismic safety, and it is what the public deserves.

INTRODUCTION

In 2005, as the elected State Assemblyman representing the Central Coast and as a

geophysicist, I became concerned that PG&E's prior seismic hazard analysis in the vicinity

of the Diablo Canyon Nuclear Power Plant had failed to utilize modern state-of-the-art

geophysical techniques that have proven highly effective at mapping seismic faults. In

2006, I authored, the state legislature passed, and Governor Schwarzenegger signed

AB1632, which directed the California Energy Commission to assess existing scientific

studies to determine the potential threat of earthquakes to the future reliable operation of

Diablo. After extensive review the California Energy Commission concluded that significant

seismic uncertainty existed and charged PG&E with the task of acquiring new state-of-the-art geophysical data to reassess the seismic threats to Diablo. In furtherance of AB1632

the California Public Utilities Commission provided $64M of California ratepayer funds to

compensate PG&E for the Coastal California Seismic Imaging Project that resulted in the

Report.

At the time of the bill's passage, few appreciated the potential threat that large earthquake

faults posed to operating nuclear facilities. Since then the public's awareness of the

importance of the issue has increased significantly:

  • In 2007 the Kashiwazaki-Kariwa Nuclear Power Plant, the largest in the world, was

severely damaged and shuttered due to an M6.6 earthquake 19 kilometers offshore

from the facility.

  • In 2008 the USGS discovered a previously unknown Shoreline Fault only 600 meters

from the Diablo Nuclear Power Plant and only 300 meters from the intake.


  • In 2011 the Fukushima Daiichi nuclear disaster resulted in the meltdown of three of

the plants six reactors, triggering an emergency review by the NRC of US nuclear

reactors and their ability to withstand shaking from earthquakes. This tragedy was

caused by an earthquake and tsunami far larger than the utility believed possible,

which produced greater shaking than the plant was designed to withstand.

Two months ago, eight years after the passage of AB1632, PG&E issued its Report, which

will likely be relied upon by state and federal regulators in the course of their immediately

upcoming deliberations regarding PG&E's request to extend the operating license of

Diablo through 2044-2045. My review of this Report addresses important historic,

technical, and regulatory issues that are central to the final conclusion of the Report;

specifically, that the facility has been shown to be safe from seismic threats.

PG&E's Report makes a number of key findings regarding earthquake threats. In virtually

every instance, the faults surrounding Diablo are now understood to be larger and more

connected than previously believed as recently as 2011. Of course the plant was initially

licensed assuming these seismic threats were non-existent. Whereas the Hosgri Fault had

previously been believed to be the most dangerous fault near Diablo, newly released

research shows that the prior Hosgri maximum earthquake assumption is eclipsed by five

other fault-rupture threats:

1. SHORELINE FAULT: The newly discovered Shoreline Fault, located within 600

meters of the plant, is now twice as long as thought in 2011 and almost three times

as long as the lower bound proposed in 2009. Now understood to be 45 km long, it is

capable of generating an M6.7 strike-slip earthquake, which is larger than estimated

in PG&E's previous 2009 and 2011 reports.

2. SAN LUIS BAY FAULT: The newly reinterpreted 16 km San Luis Bay Fault, located

within 1,900 meters of the plant, is capable of generating an M6.4 reverse

earthquake, which is larger than previously estimated in PG&E's 2011 report.

3. LOS OSOS FAULT: The newly reinterpreted 36 km Los Osos Fault, located within 8.1

km of the plant, is capable of generating an M6.7 reverse earthquake, which is smaller

than the M6.8 estimate in PG&E's 2011 report, but still estimated to produce more

ground motion than the Double Design Earthquake (DDE), also known as the Safe

Shutdown Earthquake in the license.

4. JOINT SHORELINE/HOSGRI FAULT SYSTEM: The newly reinterpreted 145 km joint

Shoreline/Hosgri Fault system now assumes that the Hosgri Fault and Shoreline

Fault connect, whereas previously the two were considered to be wholly separate

and incapable of failing in a larger single rupture. A joint Shoreline/Hosgri strike-slip rupture within 600 meters of the plant could theoretically generate

approximately an M7.3 earthquake, according to the Report.

5. JOINT HOSGRI/SAN SIMEON FAULT: The newly re-interpreted 171 km joint

Hosgri/San Simeon Fault system now assumes that the Hosgri Fault and San Simeon

Fault connect, whereas previously the two were considered to be wholly separate


and incapable of failing in a larger single rupture. A joint Hosgri/San Simeon

rupture within 4.5 km of the plant is capable of generating an M7.3 strike-slip

earthquake, which is larger than the previously estimated M7.1 utilized in numerous

prior reports. The newly defined Hosgri Fault is considerably longer than

previously presumed by PG&E and NRC.

The predicted ground motions generated by these earthquake scenarios are all greater

than the current ground motion estimates for an M7.3 Hosgri Fault earthquake

located 4.7 kilometers from the facility. This result is remarkable as the enormous

Hosgri Fault, which can be seen easily on oil company seismic lines and passes the plant at

a distance of only three miles, had been argued for many years to be the greatest threat to

the facility. (Note: from a regulatory perspective the Hosgri Fault had previously been

treated as the controlling fault, which is to say the fault posing the greatest possible

seismic threat to Diablo.)

However, in a seeming contradiction, rather than finding that larger or closer faults

produce greater shaking and therefore a greater threat, PG&E argues in the Report that

ground motion will be lower than the levels previously estimated. In other words, these

newly discovered and re-interpreted faults are capable of producing shaking that exceeds

the shaking from the Hosgri, yet that shaking threat would be much reduced from prior

estimates.

Though discussed only in passing in the Report, the reason for this seeming contradiction is

quite important when assessing whether or not the plant is safe or whether it is operating

within its license conditions. The reason the earthquake threat purportedly went down

when new faults were discovered is because the utility adopted significant changes to the

methodology utilized for converting earthquakes (which occur at the fault) into ground

motion (which occurs at the facility). This new methodology, which is less-conservative

than the prior methodology, essentially de-amplifies the shaking estimated from any

given earthquake relative to the prior methodology used during the licensing process.

DIABLO LICENSING BACKGROUND

The Diablo Canyon Nuclear Power Plant was licensed through a strictly adjudicated

process that defined the Safe Shutdown Earthquake as the maximum earthquake potential

for which certain structures, systems, and components, important to safety, are designed to

sustain and remain functional. In the unique parlance of the Diablo Canyon Nuclear Power

Plant, this Safe Shutdown Earthquake was defined as the Double Design Earthquake. The

NRC licensing process ensures that the detailed operability requirements of the American

Society of Mechanical Engineers Boiler and Pressure Vessel Code are met at the higher

ground motions.i The Design Earthquake (DE) for Diablo was defined during the

construction permit process as the largest of four possible earthquake scenarios. The DE

was assumed capable of generating a peak ground acceleration of 0.2 g. The Safe Shutdown

Earthquake was then defined for Diablo as 0.4g, which is to say the plant must be able to

shut down safely if a hypothetical earthquake generates double the 0.2g of shaking that

was estimated to be possible from known surrounding threats. This hypothetical Safe

Shutdown Earthquake is known as the Double Design Earthquake (DDE) and is a key

element in establishing safety standards during the licensing process.


This formal NRC licensing process, which defined the DDE as the Safe Shutdown

Earthquake for enforceable regulatory purposes, occurred prior to the discovery of the

Hosgri Fault. Upon its discovery the USGS analyzed the Hosgri Fault and determined that

it could generate an M7.5 earthquake at a distance of 4.5 km. The NRC negotiated with PG&E

to create the 1977 Hosgri Evaluation (HE) exception under the theory that the plant could

withstand shaking from this newly discovered fault under a narrow and specific set of

assumptions. The HE used considerably less-conservative assumptions than those used

for the DDE, which was applied to all other earthquake threats. The reduction of safety

margins by the use of these special assumptions for the Hosgri Fault was quite

controversial, and was strongly criticized by NRC Commissioners Gilinsky and Bradford in

an opinion they issued on the Diablo seismic matters in 1981.ii The DDE is the Safe

Shutdown Earthquake for Diablo and applies in the Current Licensing Basis to all faults that

can affect Diablo, with the exception of the Hosgri Fault, to which the 1977 HE exception

and its methodology and assumptions uniquely apply. Because of the differing

assumptions the HE exception did not and was never intended at the time to eliminate or

supersede the DDE standard.

To operate within its license the utility has been required to show that the plant will not be

exposed to shaking beyond either the DDE basis or the less-conservative HE exception for a

potential Hosgri earthquake. Later, the 1977 HE exception was modified to assume a

slightly smaller M7.2 earthquake but with a slightly more dangerous reverse component of

slip. The combination of the two changes produced a modified spectrum that changed only

modestly with small enhancement at higher frequencies. That modification became known

as the 1991 LTSP spectrum;iii however, it never became part of the Current Licensing Basis.

(For the rest of this letter the Hosgri shaking estimates will be described as the HE/LTSP

spectrum due to the fact that the HE and LTSP are used somewhat interchangeably and

differ only slightly, even though the differences are important from a historic and

regulatory perspective).

In 2008 history repeated itself and, as in the case of the Hosgri Fault, another offshore fault

was discovered, but this time even closer to the plant. USGS found the Shoreline Fault

within 600 meters of the reactors and within 300 meters of the intakes. When considering

that the fault runs to a depth of 16 km, spatially the nuclear power plant lies virtually

on top of the new fault. In the immediate aftermath of the discovery, PG&E's data

demonstrated that the nearby faults could produce ground motions significantly higher

than the 0.4g peak acceleration permissible under the DDE standard (see table below - note

this analysis occurred prior to the seismic studies described in the Report which found that

the faults were larger than assumed in the table).

Table: Comparison of Reanalysis to Diablo Canyon SSEiv

Local Fault       Earthquake Peak Ground Acceleration

DDE               0.40g
Shoreline         0.62g
Los Osos          0.60g
San Luis Bay      0.70g
Hosgri            0.75g


In the face of this conflict with the license, PG&E began to compare the new seismic threats

not to the DDE in the license, but rather to the HE/LTSP spectrum. If PG&E could ignore

the DDE Safe Shutdown Earthquake standard in the license, PG&E could simply seek to

prove that the newly discovered seismic threats were bounded by the HE/LTSP spectrum,

with their less conservative assumptions - ergo, notwithstanding the newly discovered and

re-interpreted faults, the plant could be said to be operating consistent with its license.

Dr. Michael Peck, the Senior Resident NRC Inspector at the Diablo Canyon Nuclear Power

Plant, was concerned that the newly discovered and re-interpreted faults (Los Osos,

Shoreline, San Luis Bay) had been shown by PG&E to produce greater shaking than the 0.4g

peak acceleration DDE design basis. He stated that the only approved exception to the DDE

was the 1977 HE exception, which applied only to the Hosgri Fault, and that the exception

was not transferable to these other nearby faults - ergo a license amendment was required

to correct the inconsistency between the existing license and the new seismic threats.

Buttressing Peck's argument that the less strict spectrum was not to supersede or replace

the DDE, on October 12, 2012 the NRC wrote to PG&E:v "The DCPP Final Safety Analysis

Report Update states in Section 2.5,

'the LTSP material does not alter the design bases for DCPP.' In SSER 34 the NRC

states, 'The Staff notes that the seismic qualification basis for Diablo Canyon will

continue to be the original design basis plus the Hosgri evaluation basis'"

(emphasis added).

Earthquake license requirement, PG&E proposed revising its license to eliminate the DDE

requirement and have the HE/LTSP spectrum, with its considerably less protective

methodological assumptions, apply not just to the Hosgri Fault as an exception to the DDE,

but to all faults. The NRC declined to accept the request for review because it failed to meet

certain required standards.

CRITICAL ISSUE EXPLORED

I do not seek to engage on Peck's important regulatory issue of whether the utility can now

legally disregard the DDE standard and instead meet only the less-conservative HE

exception. That is a matter for the NRC to determine based on its safety and regulatory

standards and, hopefully, informed by the post-Fukushima understanding of the dangers of

lax regulatory oversight. In the aftermath of this disagreement between the Senior

Resident NRC Inspector at Diablo Canyon Nuclear Power Plant and NRC staff, deliberation

on this regulatory issue is now the subject of a lawsuit filed before the US Court of Appeals

for the District of Columbia.

Instead, this analysis seeks to explore a different issue; specifically, is PG&E correct when it

asserts that the utility has shown that the new seismic threat is bounded by the 1977 HE

exception? (By exploring only this second issue I do not mean to minimize the importance

of the first issue, but this second issue is central to the critical conclusion of the Report). In

other words, the question is whether or not the new seismic threats have in fact been

shown to produce shaking that is smaller than the HE basis exception when the same

associated analytical methods used to create the HE basis exception are applied to the new

seismic threats.


Why is it important to add this caveat about the same "associated analytical methods"?

Because the rest of the NRC statement cited above under SSER 34 goes on to say,

The Staff notes that the seismic qualification basis for Diablo continues to be the

original design basis plus the Hosgri evaluation basis, along with associated

analytical methods, initial conditions, etc. (emphasis added).

If the utility seeks to argue that the 1977 HE exception can be used as an alternative

standard to avoid the stricter DDE standard, which is controversial in itself, then the

methods which were used to compute the HE exception become of paramount importance.

This analysis seeks to document that the associated analytical methods used by the utility

to analyze the new seismic threats in the Report are markedly less-conservative than those

used for the 1977 HE exception.

Why is this change in methodology important, particularly when the methodology is less

conservative? Under 10 CFR 50.59, a license amendment is required when the Final Safety

Analysis Report (FSAR) is inadequate to describe the circumstances at the plant and there

is a

"departure from a method of evaluation described in the FSAR (as updated) used in

establishing the design bases or in the safety analysis." NRC regulations define such

a departure as: "(i) Changing any of the elements of the method described in

the FSAR (as updated) unless the results of the analysis are conservative or

essentially the same; or (ii) Changing from a method described in the FSAR to

another method unless that method has been approved by NRC for the intended

application."

The NRC requires a license amendment when there is a departure from a method of

evaluation that established the design basis unless that departure is essentially the same or

more conservative. If the utility is allowed to employ less-conservative analytical methods

to obtain more optimistic results then prior safety standards could be lowered without the

full understanding or regulatory concurrence of the NRC.

It was this very problem that led to the shutdown of the San Onofre (SONGS) plant. The

failure of the NRC to recognize the need for a license amendment to replace San Onofres

steam generators was identified by the Office of the Inspector General at the Nuclear

Regulatory Commission as a missed opportunity to identify weakness in Edison's technical

analyses.vi There is a marked difference between NRC staff review of a utility's change in

methodology versus the rigor and process associated with a license amendment.

This analysis contends that a true apples-to-apples comparison was never made in

the Report between the Hosgri and the new seismic threats using analytical methods that

are conservative or essentially the same as those used for the Hosgri evaluation.

Therefore, it is inaccurate to assert that the new seismic threats have been shown to be bounded

by the Hosgri evaluation basis - insofar as that phrase has any bearing for regulatory purposes.

This contention is important because, if PG&E is allowed by the NRC to reject both the

stricter standard of the DDE and the conservative analytical methods used when the 1977

HE exception was authorized, then the NRC's prior seismic safety licensing standards will

have been, for all practical purposes, circumvented.


Making this particularly troubling is that this circumvention will have been achieved

without a license amendment process, which would ensure a more robust process for

including analysis of differing and minority findings and opinions - findings and opinions

which have over time proven to be right more often than not.

GROUND MOTION PREDICTION RETROSPECTIVE

Methodologies employed to assess potential shaking at the nuclear power plant can be

broken into three broad categories:

1) SOURCE: Estimated energy released by a specific earthquake on a given fault - based on

equations that involve factors such as fault mechanics, stress drop, radiation pattern,

directivity, rupture history, rupture length and width, etc.

2) PROPAGATION: Estimated attenuation and amplification factors that convert the

energy released during the fault rupture process to the actual observed free field

ground motion at a particular site, based on:

a. TRANSMISSION EFFECTS: Energy transmission involves absorption and scattering,

otherwise known as attenuation, incurred along the propagation path from the

earthquake to the vicinity of the particular site, and

b. SITE EFFECTS: Site amplification and de-amplification effects due to the stiffness of

the rocks and soils of the particular site and the impedance contrasts that give rise

to a variety of scattering and reverberation effects.

3) TRANSFERENCE: Estimated shaking adjustments from reference free-field station to

power-block, turbine-building foundation levels, and then to structures, systems, and

components throughout the facility - based on certain projection, coherence, and

damping factors.

This analysis seeks to examine #1, #2a, and #2b above.

A Ground Motion Prediction Equation (GMPE) is used to predict shaking at a particular

distance from an earthquake based on a variety of parameters. A GMPE represents the

statistical relationship that best fits the empirical distance-attenuation observations from

some database of earthquake recordings. Some of the parameters used to make the

estimate include: size of earthquake, fault mechanics, geometry of the fault to the recording

station, and the velocity of the rocks immediately below the recording station. GMPEs

incorporate a large range of phenomena and effects.
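To illustrate the general shape of such an equation, the following sketch uses a generic attenuation form; the function name, coefficients, and functional form are invented for demonstration only and do not correspond to GMPE-1 through GMPE-4 or to any published model:

```python
import math

def toy_gmpe(magnitude, distance_km, vs30=760.0):
    """Toy ground-motion prediction equation (illustrative only).

    Returns a median peak ground acceleration (g) from a generic
    attenuation form: ln(PGA) = a + b*M - c*ln(R + h) + d*ln(vs30/760).
    All coefficients are invented for demonstration.
    """
    a, b, c, d, h = -3.5, 0.9, 1.1, -0.4, 6.0
    ln_pga = (a + b * magnitude
              - c * math.log(distance_km + h)
              + d * math.log(vs30 / 760.0))
    return math.exp(ln_pga)

# Larger magnitude or shorter distance -> more predicted shaking;
# stiffer (faster) site rock -> less predicted shaking, which is why
# the assumed rock velocity matters so much in the text above.
near = toy_gmpe(7.3, 0.6)
far = toy_gmpe(7.3, 4.7)
stiff = toy_gmpe(7.3, 0.6, vs30=1200.0)
assert near > far     # closer rupture gives higher median PGA
assert stiff < near   # higher rock velocity de-amplifies shaking
```

The sketch shows why changing any ingredient of a GMPE (coefficients, site-velocity term, or the averaging of several models) directly changes the predicted shaking at the plant.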

Since discovery of the Shoreline Fault, PG&E has significantly changed the GMPE equations

used to analyze potential shaking at Diablo. The following summarizes the changes and

their net effect on seismic hazard estimates. To help track the evolution of GMPEs they are

informally numbered in the following retrospective. (GMPE-1, in the nomenclature of

this letter, is the methodology used for the DDE and the HE exception

from the construction permit.)

In 1991, PG&E constructed the LTSP spectrum, which assumed a M7.2 earthquake at a

distance of 4.5 km and used a GMPE (GMPE-2) derived from their own distance-


attenuation relationship based on a database of strong-motion recordings of earthquakes

at a range of distances, along with regression analysis.

In 2008 the Shoreline Fault was discovered which triggered a requirement that PG&E

assess whether or not shaking caused by the newly discovered fault was bounded by the

DDE and the HE exception, as required by its current operating license. Rather than use the

same GMPE to perform that analysis, PG&E began introducing new methodologies, making it

difficult to perform historical comparisons with earlier standards approved through the

NRCs regulatory process.

PG&E, in an initial sensitivity reportvii to the NRC, assumed that the Shoreline

Fault was as much as 24 km long with a depth of 12 km, capable of generating an M6.5

earthquake. It then used an assortment of different recently developed GMPEs, known as

the Next Generation Attenuation models, to create a new averaged GMPE (GMPE-3) to

compute shaking estimates at the plant caused by a Shoreline earthquake. GMPE-3

resulted in a de-amplification effect of median estimated shaking, relative to the prior

methodology, i.e., a decrease in shaking relative to GMPE-1 or GMPE-2. This new GMPE

was justified based on the use of the Pacific Earthquake Engineering Research Center

(PEER) database of some 3,600 earthquake recordings. Using GMPE-3 PG&E reported that

the shaking was substantially lower than, or bounded by, the LTSP/HE spectrum1.

In 2009, NRC staff used PG&Es proposed GMPE-3 equations but then analyzed the

Shoreline Fault assuming it was 24 km long with a depth of 16 km, which was more

conservative than PG&E's depth of 12 km. Using these parameters, and including one

standard deviation on the magnitude estimate, the largest possible earthquake was computed

to be M6.85 rather than M6.5. Assuming the somewhat larger earthquake their analysis

found,

"The motions are very close to the LTSP/HE in the high-frequency range but fall

below the LTSP/HE in the long-period range," and "seismic loading levels

predicted for a maximum magnitude earthquake on the Shoreline Fault are slightly

below those levels for which the plant was previously analyzed in the Diablo Canyon

Long Term Seismic Program" (emphasis added).

Using GMPE-3, shaking from an assumed 24 km Shoreline Fault was found to be "very close"

to and only "slightly below" the LTSP/HE spectrum (emphasis added).

The five NGA GMPEs which, when averaged, produce GMPE-3 are each shown in Figure 10

from the NRC report. The NRC staff analysis also tested the significance of using the lower-bound estimate of rock velocity rather than the best estimate (lower velocity corresponds

to higher shaking). Using a rock velocity of 800 m/s instead of 1,100 m/s resulted in a

spectrum that "exceeds the LTSP spectrum by a small amount" over some frequencies.

(Footnote 1: The LTSP and HE spectra are very similar and are used almost synonymously in some reports cited herein. To avoid confusion caused by switching back and forth, the single term LTSP/HE is used in some instances even though the two differ from a regulatory perspective.)

In summary, by using reasonable but somewhat more conservative approaches to the three

available variables (the NGA model selection, earthquake magnitude estimate, or rock

velocity), the spectrum was found to be "very close" to, or to "exceed" the LTSP spectrum "by a small amount." This

result was quite significant because it showed that, even in the early days when the

Shoreline Fault was still believed to be relatively small, shaking could exceed the LTSP

Spectrum assuming certain models and certain rock parameters. The Chiou & Youngs (08)

GMPE (dotted blue line) exceeds the LTSP Spectrum (solid black line) at about 7 Hz and

above; the others are just a little below, hence the characterization that they are "very close"

(emphasis added).

This result naturally raises important questions about the effect of the new GMPE applied

to the Shoreline. For example: would estimation of shaking on a 24 km rupture of the

Shoreline Fault have exceeded the LTSP if GMPE-1 had been used rather than GMPE-3? Given

what is shown in Figure 10 it appears that the answer would likely be yes if the difference

between GMPE-3 vs GMPE-1 was anything other than de minimis, but that analysis was not

performed in the 2009 Shoreline report.

The effect of which GMPE methodology is employed is highlighted in an NRC staff remark

noting "epistemic uncertainty in the GMPEs, which tends to be higher in the

magnitude-distance ranges with sparse available seismological data (such as large

magnitudes at short distances). Generally the GMPEs are the largest source of

uncertainty in the ground motion values produced in seismic hazard analysis" (emphasis

added). Here the NRC staff acknowledges that the new GMPEs are the source of the greatest

uncertainty, and, that uncertainty is greatest for large earthquakes at short distances,

which is exactly the situation for Diablo.

In 2011, PG&E issued its Report on the Analysis of the Shoreline Fault Zone, Central

Coastal California assuming the same maximum M6.5 earthquake along a 23 km fault, but

introducing a number of new factors creating yet another new GMPE, named here as


GMPE-4. The utility started with its 2009 GMPE-3 equation but then added a new hard-rock effect. The rationale for this equation was inferred from work by Silva (2008). The

result adjusted estimated shaking downward still further from GMPE-3. Silva's work,

which was specific to a particular range of rock hardness along with other factors, did not

include the actual rocks at Diablo. Therefore PG&E extrapolated the findings of the

published paper so they could be applied to Diablo where a faster rock velocity of 1,200

m/s was assumed (faster rocks equate to lower shaking).

Additionally, PG&E created new equations to reduce the standard deviation of the

estimated shaking. Because 84th percentile shaking estimates are the sum of the median

shaking plus one standard deviation, the total spectrum can be lowered by reducing

the median, by reducing the standard deviation, or by lowering both.
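The arithmetic behind this point can be sketched numerically. Ground-motion residuals are conventionally treated as lognormal, so the 84th percentile is the median multiplied by e raised to one standard deviation in natural-log units; the specific numbers below are illustrative only and are not taken from PG&E's or the NRC's reports:

```python
import math

def percentile_84(median_g, sigma_ln):
    """84th percentile ground motion for a lognormal distribution:
    the median shifted up by one standard deviation in log space."""
    return median_g * math.exp(sigma_ln)

# Illustrative numbers only (not from any report cited herein):
base = percentile_84(median_g=0.50, sigma_ln=0.60)   # ~0.91 g
lower_median = percentile_84(0.40, 0.60)             # reduce the median
lower_sigma = percentile_84(0.50, 0.45)              # reduce sigma
both = percentile_84(0.40, 0.45)                     # reduce both

# Either change lowers the 84th percentile; lowering both lowers it most.
assert lower_median < base
assert lower_sigma < base
assert both < min(lower_median, lower_sigma)
```

This is why a de-amplifying GMPE (which lowers the median) and a single-station-sigma approach (which lowers the standard deviation) compound: together they pull the entire 84th percentile spectrum down.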

With the issuance of the 2011 report PG&E reduced both the median and the standard

deviation used in the analysis of the seismic threats - the first through yet another new

GMPE with hard-rock de-amplification effects, and the second through a statistical approach

described as single-station sigma.

Using this new GMPE-4, the resulting spectrum was no longer "slightly below" and

"very close" to the LTSP/HE spectrum, per the NRC's prior findings of 2009 (emphasis

added). The new margin was significantly larger, thereby allowing PG&E to again assert

that the LTSP/HE spectrum was not at risk of being exceeded by shaking from an M6.5

earthquake on the Shoreline Fault. Note how PG&E's methodology to compute shaking

changed not once but twice in the short period of time since the discovery of the Shoreline

Fault in 2008. Both changes reduced the estimated shaking from the newly discovered Shoreline Fault.

In 2012, NRC staff issued its Confirmatory Analysis of Seismic Hazard at the Diablo Canyon

Power Plant from the Shoreline Fault Zone. The report details staffs review of PG&Es

report. NRC staff decided to lower their maximum possible earthquake from M6.85 to

M6.7, which was closer to PG&E's figure of M6.5 (smaller earthquakes correspond to lower

shaking). Similarly, staff decided to revise their estimate of rock velocity upward from

1,100 to 1,200 m/s which was the figure used by PG&E (faster velocities correspond to

lower shaking).

They also reviewed PG&E's new hard-rock de-amplification adjustment and pointed out a

number of problems with the approach, including uncertainty in the estimate of Kappa, a

factor that describes damping in basement rock. When NRC staff explored alternative

methodologies they found "the NRC results are conservative relative to the PG&E results at

virtually all frequencies."viii Nonetheless, NRC staff incorporated a new hard-rock effect and

added that factor to GMPE-3. Staff elected not to add the single-station sigma effect to

further lower the 84th percentile of shaking. They did, however, agree with PG&E's

conceptual approach, though they noted the statistical unreliability of its use at Diablo due to

the small amount of available data.

To issue this report, NRC staff acquiesced to PG&E's:

1) Use of the new NGA GMPEs,
2) Averaging of NGA GMPEs to eliminate outliers,
3) Smaller earthquake magnitude estimate,
4) New hard-rock scaling factor,
5) Increased site rock velocities, and
6) New statistical single-station sigma.

The net effect of adding these factors allowed the NRC to issue a confirmation in 2012 of

PG&Es assertion that the Shoreline Fault would produce shaking below the LTSP/HE

spectrum.

In 2014, after the offshore seismic studies were completed, PG&E issued its Coastal

California Seismic Imaging Project (CCCSIP) Report. The Report concluded that the

Shoreline Fault is 45 km long (a tripling of the utility's 2009 lower-bound figure) and that a

hypothetical joint Hosgri/Shoreline Fault rupture would be 145 km long, generating an M7.3

earthquake within 0.6 km of the plant (corresponding to a factor of 30 greater released

energy relative to the earlier lower-bound estimate). The Report also details the size and

location of the Los Osos and San Luis Bay Faults and the potential earthquakes they could

generate. Again, all of these threats produce shaking that is greater than their new

calculations of shaking from the Hosgri, which had previously been identified as the

controlling fault
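The "factor of 30" energy comparison is consistent with the standard Gutenberg-Richter energy-magnitude scaling, log10 E ≈ 1.5 M + const, so the energy ratio between two magnitudes is 10^(1.5 ΔM). A quick check, assuming for illustration a lower-bound magnitude near M6.3 (that baseline magnitude is an assumption chosen to show the scaling, not a figure taken from the Report):

```python
def energy_ratio(m_large, m_small):
    """Ratio of radiated seismic energy implied by the
    Gutenberg-Richter relation log10(E) = 1.5*M + const."""
    return 10 ** (1.5 * (m_large - m_small))

# M6.3 is an illustrative lower-bound magnitude assumed here,
# not a figure from the Report.
ratio = energy_ratio(7.3, 6.3)
print(round(ratio, 1))  # 31.6, i.e. roughly the factor of 30 cited
```

A one-unit magnitude increase thus corresponds to roughly 32 times the radiated energy, which is the order of magnitude behind the comparison in the text.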

In Chapters 11 and 13, PG&E analyzes the new seismic threats, which are markedly larger than those analyzed in 2011, using its new GMPE-4, the methodology that had already been used with the Shoreline Report to calculate lower levels of shaking than the earlier approach. Analyzing the new threats with GMPE-4, the Report finds that even a massive M7.3 earthquake linking the Hosgri and Shoreline Faults, with rupture occurring within 600 meters of the reactors, could not exceed the LTSP/HE spectrum. Demonstrating just how effective these less-conservative methodologies are in lowering estimates of shaking: without a single retrofit, Diablo becomes virtually invulnerable to any imaginable earthquake, regardless of size and proximity.

Evidence of the total cumulative effect of these new methodologies can be inferred by comparing the "before" and "after" calculations of shaking from a hypothetical Hosgri Fault earthquake. Such a comparison shows that the peak acceleration is reduced from 0.75g to 0.46g! The de-amplification effect is even larger than suggested by this 38% decrease in estimated shaking because the "before" Hosgri earthquake is smaller than the "after" Hosgri earthquake, which now assumes a joint rupture on the Hosgri/San Simeon Fault System. The importance of using a new methodology that reduces peak accelerations by at least 38% is never singled out for mention in the Report, nor is the prior, more-conservative methodology applied to the new seismic threats.
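The size of that reduction is simple arithmetic on the two peak accelerations quoted above:

```python
before_g = 0.75  # peak ground acceleration under the earlier methodology
after_g = 0.46   # peak ground acceleration under the new methodology

decrease = (before_g - after_g) / before_g
print(round(100 * decrease, 1))  # 38.7, consistent with "at least 38%"
```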

IMPORTANCE OF NEW GMPEs

These changes to GMPEs, documented in the prior section, are crucial to the fate and future

of Diablo and give rise to two important questions.

First, from a technical perspective: Are these rapidly evolving GMPEs appropriate

for application to Diablo given the statistics and science embedded in their

assumptions?

Second, from a regulatory perspective: Are these rapidly evolving GMPEs appropriate for application to Diablo when dealing with the safety margins and adjudicated rules that define how nuclear power plant licenses are enforced or amended?

In this retrospective of evolving GMPEs, I have made no arguments regarding the technical or scientific merit of the half-dozen changes to GMPEs that have occurred. This is a rapidly evolving field of research for which there is insufficient data to provide a simple yes-or-no answer. Instead, it is more appropriate to identify concerns and to point out alternative interpretations of the existing data. Therefore, I address the first question in an attached appendix, which can be read separately from this letter.

However, as a former policymaker I do believe there is a clear-cut answer to the second question, which I will address here. What makes this GMPE chronology troubling from a regulatory and safety perspective is that, as newly discovered or re-interpreted faults are progressively understood to be larger and more dangerous than previously believed, the newly derived methodologies adjust shaking downward just sufficiently to accommodate the new threat. In fact, the safety of the facility no longer depends on whether dangerous faults actually surround the nuclear power plant and are capable of generating earthquakes that exceed the shaking predicted from the previously defined controlling fault. That question has been answered unequivocally, and the PG&E Report acknowledges the presence of such earthquake faults. Instead, the safety of the facility depends upon the reliability of new, less-conservative equations, which are going through major revisions literally with each newly issued report.ix

These facts raise significant regulatory issues that need to be addressed at the highest levels of the NRC. In this instance we see a nuclear power plant that is found to be exposed to markedly greater seismic threats than ever envisioned during the licensing process. This increase has happened not once, with the discovery of the Hosgri Fault, but twice. With this year's report, an entire new class of earthquake threats has been identified that eclipses the prior Hosgri Fault threat. This fact alone should galvanize the NRC to act. But what makes the situation even more dire is that the methodologies used by the utility to analyze the new threats have changed as well. If the utility's associated analytical methods to compute ground motion were the same or more conservative, the debate would be solely on the scientific questions surrounding the earthquake potential introduced by the new faults. However, in this case the associated analytical methods to compute ground motion beneath the plant are markedly less conservative than any used before. These methods are less conservative than when the plant was licensed, and less conservative than even six years ago when the Shoreline Fault was first discovered. If the prior methodologies used during licensing were applied to these new faults it is possible, and perhaps likely, that shaking would exceed both the DDE and LTSP/HE spectra. If true, this means that the plant is currently operating beyond the tolerances established under its license. That is why this is a critical regulatory issue. Threats are going up at the same time the utility's preferred method for analyzing all such threats has become markedly less conservative. From a regulatory perspective, it is this simultaneous convergence of higher threats and less-conservative methodologies that requires the NRC to act immediately.


CONCLUSION

In summary: The geophysical methodologies used to locate faults and assess the size of potential earthquakes are well established and have been tested in innumerable instances over many decades. Similarly, the estimation of site effects when dealing with relatively simple geology is well understood. This history has allowed regulators to rely comfortably on the long record of published findings on these important elements of seismic hazard.

However, the geophysical methodologies for determining ground motion in the extreme

near-field are in a rudimentary state of development. Similarly, the estimation of

broadband site effects when dealing with highly complex and heterogeneous 3D geology is

a difficult technical problem and an active area of research (see appendix).

PG&E has progressively used methodologies that produce less-conservative results to

analyze the steadily increasing seismic threat. With each successive generation of new

information about the threat prior methodologies are modified and more sanguine results

are obtained.

Of course, from a research perspective, the fact that a whole series of new methodologies is being explored and new equations are being tested, albeit with limited data (see appendix), is a good thing. However, it is quite a perilous thing from a regulatory perspective, which requires high levels of scientific and statistical certainty based on large datasets and well-vetted methodologies. The regulatory determination of safety should not hang tenuously upon the results of an ongoing science experiment. When faced with such a situation, nuclear regulators must rely upon the existing, more conservative, and historically accepted methodologies to assess risk.

But beyond the imprudence of relying upon rapidly evolving methodologies to obtain lower risk estimates at a nuclear power plant, there is a regulatory reason why such an approach is not allowable. The NRC stated,x "The Staff notes that the seismic qualification basis for Diablo Canyon will continue to be the original design basis plus the Hosgri evaluation basis, along with associated analytical methods, initial conditions, etc." (emphasis added). Clearly the Hosgri evaluation basis, if it is to have any regulatory meaning, can only be applied to a new seismic threat if the same, or more conservative, analytical methods are employed to compare the two. This, however, is not how the utility is treating the Hosgri evaluation basis. Instead, the utility employs significantly less-conservative analytical methods and then states that the lower shaking produced by new seismic threats is bounded by the HE exception.

Finally, if altogether less-conservative methodologies are to be used to analyze altogether new and more dangerous faults, it is important that such analysis be performed at arm's length through a transparent, rigorous, and strict license amendment process, so that the public can have confidence that safety is the foremost consideration of the NRC.

My overarching concerns with the Report include:

  • Disregard of DDE basis: In a post-Fukushima setting the NRC must insist upon high and robust seismic safety standards at the nation's only nuclear power plant that is ringed by numerous nearby faults capable of earthquakes, each larger than the earthquakes envisioned from the previously assumed controlling fault. However, to date the NRC has ignored the cautions of experts and even its own resident inspector, who has declared that the plant is operating beyond its current operating license based on the DDE.

  • Weakening of HE basis: The 1977 HE basis was allowed as an exception that applied only to an earthquake on the Hosgri Fault. However, while the utility is ignoring the DDE standard and applying the HE exception to all faults, it is also simultaneously seeking to weaken the 1977 HE exception by creating new associated analytical methods that are markedly less conservative.

  • Lack of Transparency: Notably, the Report never makes an apples-to-apples comparison wherein the same associated analytical methods are used to analyze the new seismic threats and the HE exception. Nor are lower-velocity parameters input to the new analytical methods to assess their sensitivity to critical real-world parameters and uncertainties at the site. The public is never given the opportunity to see the cumulative effect of each generation of new GMPEs, or the range of effects due to rock velocity selection. This makes it impossible for PG&E to accurately assert in the Report that, from a regulatory perspective, the new seismic threats are shown to be bounded by the HE basis. From the data shown by the Independent Peer Review Panel, such a transparent, apples-to-apples analysis would likely prove the opposite.

  • Rapidly Evolving Analytical Methods: The utility is relying upon less-conservative

methodologies that are evolving and changing rapidly, which reduces reliability and

confidence from a regulatory perspective. The velocity parameters themselves, upon

which some of these new methodologies depend, are in serious dispute. Furthermore,

the methodology to compute extreme near-field ground motion in a setting ringed by

large strike-slip and reverse faults is nowhere near developed enough to ascribe

certainty to median or variance estimates of probable shaking.

  • More Seismic Threats to Come?: Two possible future seismic threats remain unknown due to data limitations. It is not clear that the poorly imaged faults under the Irish Hills have been properly identified in the geologic cross-sections, which could mean a whole new category of undiscovered threats may exist directly under the plant. The quality of the seismic data obtained onshore just under the Irish Hills is poor, and due to the virtual absence of relevant geologic information from deep wells it is difficult to differentiate between active and dormant faults in the seismic data. Whether another class of active thrust faults exists under the plant remains an open question. The current data cannot be used to rule out such a possibility, and the compressional nature of the topography argues that such faulting could be inferred. Additionally, the study area used by PG&E does not include the area that connects the more northerly San Simeon Fault with the San Gregorio Fault. The Report agrees that the Hosgri Fault is connected with the San Simeon Fault, which has caused the maximum possible earthquake to increase significantly. If the San Gregorio Fault to the north is similarly connected, then the Report has underestimated the maximum earthquake that Diablo might need to survive.

  • Troubling History: The utility has a long and remarkable history of producing sanguine technical reports that get the seismic hazard analysis at Diablo exactly wrong. Whenever new data has emerged identifying possible new seismic threats, the utility has mobilized its internal and external experts to argue, sequentially, that nearby faults simply didn't exist; that they did exist but were inactive; that they were active but not large; and then that they were large but segmented and unconnected. Now that the evidence about the size and location of the faults is indisputable, the argument has suddenly changed again. Now the utility declares that although the faults are quite large, nearby, and interconnected, the prior equations used during the licensing process to predict shaking should be abandoned and replaced with less-conservative methodologies, which allows the utility to claim that the plant is safe, even from an M7.3 within 600 meters of the facility. One must ask: if the utility has been proven wrong so many times in the past on so many similar issues, and given the high stakes of mishandling this critical issue, should the utility's new-found conclusions be relied upon without the direct regulatory oversight of the NRC's license amendment process? As a scientist and a policymaker I believe the responsible answer is No.

In conclusion, if the NRC were to decide to rely upon the utility's assertion that the facility is operating in conformance with its license, based on these new, evolving, less-conservative equations, the NRC would be allowing the HE exception to be markedly weakened by the utility without the third-party objectivity, regulatory safeguards, and technical rigor of the license amendment process. Such a decision, in the aftermath of the difficult lessons of Fukushima, could come back to haunt the NRC, the utility, and, more importantly, the public.


APPENDIX

TECHNICAL CONCERNS

The following are some of the reasons I believe that these less-conservative equations and evolving GMPEs are still very much a work in progress, making it premature to apply the methodology to the Diablo Canyon Nuclear Power Plant. If the only possible way to prove that the plant is operating below the LTSP/HE basis is through the use of these new equations, then a formal adjudicated license amendment process is warranted, especially if the LTSP/HE exception is to be relied upon in lieu of the DDE safety standard.

CONCERN #1 - Methodology limitations in applying PEER-derived GMPE distance-attenuation predictions in extreme near-field applications: The Next Generation Attenuation models, which are the basis for GMPE-3 and many of the subsequent GMPEs, are derived from the PEER database of some 3,600 recordings. The various peer-reviewed and published attenuation-distance equations are based on robust statistical best-fits to this very large PEER dataset. However, the plant lies only 0.6 km from the Shoreline Fault and 1.9 km from the San Luis Bay Fault. Out of the entire PEER dataset only a couple dozen recordings exist within 2 km of a fault, and of those only 8 recordings occur within 0.6 km. This number of recordings is insufficient to create a statistically significant estimate of ground motion in this extreme near-field setting. Any statistical estimate of an empirical distance-attenuation relationship in which over 99% of the data occur in a range outside of the distance where the relationship will be applied is unreliable for determining a mean or variance of shaking.
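The sampling imbalance described above can be made concrete with the record counts just quoted; here "a couple dozen" is read as 24 records, an assumption for illustration:

```python
total_records = 3600  # approximate size of the PEER database
within_2km = 24       # "a couple dozen" read as 24; an assumption
within_06km = 8       # recordings within 0.6 km, as quoted above

share_near = within_2km / total_records
print(f"{share_near:.2%} of records lie within 2 km")       # 0.67%
print(f"{1 - share_near:.1%} of the data lie farther out")  # 99.3%
```

This is the sense in which over 99% of the data fall outside the distance range where the equations are being applied at Diablo.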

The uncertainty in the extreme near-field estimates of ground motion using NGA GMPEs is not reduced through an averaging approach. All of the GMPEs constructed from various subsets of the PEER dataset include the same systematic undersampling of extreme near-field recordings and oversampling of far-field earthquakes. Because this error is systematic rather than random, the averaging process cannot be relied upon to improve confidence in extreme near-field shaking estimates.
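The point about averaging can be illustrated with a toy simulation: when every model shares the same systematic bias, averaging across models shrinks only the random scatter, never the bias. All numbers below are hypothetical:

```python
import random

random.seed(0)

TRUE_VALUE = 1.0  # hypothetical "true" near-field ground motion
BIAS = -0.3       # shared systematic bias (same undersampled data)

def model_estimate():
    """One model's estimate: truth + shared bias + its own random error."""
    return TRUE_VALUE + BIAS + random.gauss(0, 0.1)

estimates = [model_estimate() for _ in range(5)]  # five hypothetical GMPEs
average = sum(estimates) / len(estimates)

# Averaging reduces the random scatter but cannot remove the shared bias:
print(round(average, 2))  # stays near 0.7, not near the true 1.0
```

Averaging independent random errors helps; averaging copies of the same systematic error does not, which is the statistical objection raised above.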

The new Next Generation Attenuation models used for GMPE-3 and the even newer GMPE-4 both suffer from data limitations that make them problematic for reliable application to Diablo. Simply adding geologic, site-effect, and statistical correction factors to the underlying NGA equations does not overcome the statistical problem inherent in applying these equations in the extreme near-field.

CONCERN #2 - Methodology problems in PG&E's site-specific adjustments to shaking estimates at Diablo: As stated above, the Next Generation Attenuation models, which are the basis for GMPE-3 and GMPE-4, are derived from the PEER database of some 3,600 recordings. The vast majority of these recordings occurred in rock types that differ significantly from the rock types under the Diablo Canyon Nuclear Power Plant.

The NRC pointed out in September 2012 that there are,

"only 51 recordings with sites defined with Vs30>=900 m/s. This is less than 1.4% of the database. There are only 15 recordings with Vs30>=1,200 m/s (less than one-half of one percent). Hence, applying a Vs30 of 1,200 m/s directly in the GMPEs increases uncertainty, as this value is beyond the range well constrained by the observational data."xi
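Taking the roughly 3,600-record database cited earlier, the NRC's counts translate to percentages as follows (the database size is approximate, so the results land near, though not exactly at, the NRC's rounded figures):

```python
total = 3600       # approximate size of the PEER database
vs30_ge_900 = 51   # recordings with Vs30 >= 900 m/s
vs30_ge_1200 = 15  # recordings with Vs30 >= 1,200 m/s

print(f"{vs30_ge_900 / total:.2%}")   # 1.42%, near the NRC's 1.4% figure
print(f"{vs30_ge_1200 / total:.2%}")  # 0.42%, under one-half of one percent
```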

To deal with this deficiency, NRC staff and PG&E began constructing a variety of rock-type correction factors and single-site correction factors. These new adjustments were derived from the utility's own sparse database.

Such an effort could be justified if a proper dataset were available; however, the Diablo database is inadequate for this purpose. Over the past few years, only a handful of strong-motion instruments at Diablo have recorded just two relevant-sized earthquakes (i.e., >=M6.0). These two earthquakes are the M6.0 Parkfield earthquake at a distance of 85 km and the M6.5 San Simeon earthquake at a distance of 35 km. It is simply not possible to perform rigorous statistical analysis on a sample size of two.

What makes the small size of this dataset even more troubling is that neither of these two reference earthquakes occurred to the west or south of the plant, which is where the Hosgri, Shoreline, and San Luis Bay Faults are located. Any site-specific Green's function2 derived from the small amount of existing strong-motion data would not include information about how the site responds to energy from a large earthquake arriving from the west or south.

Wellbore velocity profiles obtained at the site prove that the underlying soft- and hard-rock environment is neither homogeneous nor layer-cake one-dimensional. Instead, a high degree of 3D complexity with significant impedance heterogeneity is evident in the geology underlying the plant. Therefore a single, azimuthally independent site response will likely fail to incorporate the 3D heterogeneity at the site. Any empirically calculated Green's function based on limited-azimuth data from the north and east will be unreliable in predicting strong ground motion from the Hosgri, Shoreline, and San Luis Bay Faults.

Finally, neither of these two reference earthquakes occurred in the near field. A near-field earthquake cannot be treated as a virtual point source at a fixed azimuth. Instead, a near-field earthquake must be treated as a distributed source whose azimuth varies as the rupture propagates up to, alongside, and then past the nuclear power plant. This areal source propagates signal to the recording site from a range of azimuths and inclinations, potentially with different Green's functions. Two relatively distant point-source signals, from the Parkfield and San Simeon earthquakes to the east and north, cannot be used to infer the shaking from a rupture on the Hosgri or Shoreline Faults that actively propagates in the near-field past the plant, and/or stops directly adjacent to the plant to the west.

Given the significant number of large active faults that surround the plant, a dangerous neighborhood to be sure, it is imprudent to base the safety of the plant and the community solely on site effects derived from this small dataset.
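The Green's-function idea invoked in this section, predicting site response by convolving a source signal with the site's impulse response, can be sketched in a few lines; every number below is a toy value, not a measurement:

```python
def convolve(source, impulse_response):
    """Discrete convolution: superpose scaled, time-shifted copies of
    the impulse response (the Green's function), one per source sample."""
    out = [0.0] * (len(source) + len(impulse_response) - 1)
    for i, s in enumerate(source):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

# Toy numbers only: a two-sample source and a decaying impulse response.
source = [1.0, 0.5]
greens = [1.0, 0.6, 0.2]
response = convolve(source, greens)
print(response)  # response is longer than the source: 2 + 3 - 1 = 4 samples
```

The testimony's objection is about the impulse response itself: if it is estimated only from energy arriving from the north and east, convolving it with a western near-field source has no empirical basis.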

2 Green's function: a mathematical term of art defining a system's response to an impulse signal, which can be used to describe, through convolution and superposition, the system's response to a more complex signal.

Future possible research designed to create a numerically simulated 3D site effect (which is reportedly underway and will become GMPE-5) to get around the deficiencies of both the empirical datasets identified above would face significant challenges. Accurate numerical

elastic wave-equation simulation of a site-specific Greens function would require a 3-D

velocity and impedance structure below and around the facility that extends to

considerable depth, includes surficial topographic features, and accounts for accurate P-S

and S-P and surface-wave conversions calculations, complex ray bending, critical

refracting, scattering and focusing effects. To construct such a simulation would require

higher-resolution and deeper data than is currently available from the wellbore or near-surface tomographic information.

If somehow such difficulties could be overcome, the numerically simulated site response would still need to be tested to determine how well it predicts the shaking generated by an actual earthquake >=M6.0 impinging on the site from the west and originating in the near field. A prediction without a test to assess its accuracy would be insufficient for regulatory purposes.

CONCERN #3 - Methodology problems in estimating shaking caused by an earthquake located in the extreme near field: This issue is different from the statistical issues regarding the paucity of near-field recordings and the lack of data for the rock types in question, which were covered under Concerns #1 and #2, respectively. At progressively greater distances from an earthquake, the particulars of the dynamic rupture process become less important relative to the larger effects of total energy release and energy attenuation during transmission. However, in the extreme near field, the location of a recording station relative to an earthquake's rupture history, asperity locations, heterogeneous stress drops, starting and stopping phases, directivity, and a host of other effects becomes very important, in some cases the largest effect under consideration. Due to the locations of the Shoreline, Los Osos, San Luis Bay, and Hosgri Faults, these effects would likely be significant. As more extreme near-field recordings have been obtained, although still relatively few in number, it has become clear that a simple estimate of an earthquake's magnitude and distance from a site may be insufficient to make precise estimates of shaking.

For example, in 2004, 48 strong-motion recordings within 10 km of the San Andreas Fault were made of the M6.0 Parkfield earthquake.xii This dataset was used to test three different attenuation-distance equations. These equations were shown to do a good job of making accurate predictions for distances beyond about 10 km, but the observed shaking becomes highly variable in close proximity to the fault. Rather than finding accurate predictions of mean shaking in the extreme near field, the paper notes,

"Peak ground acceleration in the near-fault region ranges from 0.13 g at Fault Zone 4, to 1.31 g at Fault Zone 14, ten times larger, to over 2.5 g at Fault Zone 16 (where the motion exceeded the instrument capacity and the actual maximum value is still being estimated)."

Figure 3: Shakal et al., 2004, showing remarkably high and low accelerations in the extreme near-field (rupture started where the star is shown and then propagated to the north-east and south-west, where they stopped)

The dense strong-motion Parkfield recordings are relevant to the conclusions of the Report

for a number of reasons.

  • First, these extreme near-field areas of high and low acceleration are not well predicted by a distance-dependent GMPE estimate of shaking. In this extreme near-field setting, the particulars of how ruptures start and stop, the direction the rupture propagates, the potential focusing effect of the velocity structure of the fault zone, and the locations of specific asperities become major factors that affect ground motion. These factors are not included in the current generation of GMPEs, which were never intended to describe these complex phenomena, significant principally in the extreme near-field.

  • Second, the Parkfield data show that the high degree of variability in the extreme near-field is not a spatially random phenomenon. Instead, the highest levels of acceleration are systematically found near the ends of the fault, where stopping phases radiated energy during the rupture process of this specific earthquake. If the nuclear power plant happens to be located in a zone of focused seismic energy, the 84th percentile estimate from the GMPE will likely underestimate the observed shaking.

  • Third, PG&E has argued in the Report that while an earthquake on 100 km of the Hosgri Fault could jump to the 43 km of the Shoreline Fault, creating a 143 km rupture, the likelihood of such an event is purportedly low. They contend that a north-to-south Hosgri rupture that jumped to the Shoreline would terminate, due to bending and segmentation, before rupturing the full length of the Shoreline Fault. If PG&E is right in this assertion, they would be correct to reduce the component of shaking that is derived from the size of the earthquake. But they would then need to account for the markedly higher accelerations produced by stopping phases that would radiate from the segments and asperities associated with terminating the rupture near the facility. Given the high accelerations observed in the Parkfield dataset, an earthquake that propagates the 100 km length of the Hosgri and only 20 km of the Shoreline but violently stops directly adjacent to the nuclear power plant could in fact be more dangerous than a scenario involving the full 145 km of propagation.

There are a few ways to demonstrate the significant influence of these new equations. One obvious demonstration is to review the reduction in estimated shaking from an earthquake on the Hosgri Fault relative to PG&E's earlier estimates when creating the HE/LTSP spectrum.

(Figure 7a from IPRP Report)

As seen in Figure 7a from the Independent Peer Review Panel (IPRP) report, and in a number of other related reports, the new less-conservative equations cause a major reduction in shaking across the entirety of the frequency spectrum for a hypothetical earthquake on the Hosgri Fault (compare the blue lines, which use the newly devised methods, with the black lines, which use the prior methods, in Figure 7a above). In the frequency range from 2-10 Hz, the less-conservative methodologies have cut the maximum estimated acceleration from 2 g down to about 1.3 g. At the peak-frequency range, from 30-100 Hz, the maximum estimated acceleration has been reduced by a third, from 0.75 g to under 0.50 g. In fact, the de-amplification effect is even larger than this comparison suggests because the blue lines, which represent the shaking on the re-interpreted Hosgri, assume a larger rupture on the Hosgri Fault than the earthquake that was used to initially create the 1977 HE basis exception.

More importantly, as can be seen in Figure 7a, the shaking from the Los Osos, Shoreline, and San Luis Bay Faults all exceeds the re-interpreted Hosgri (the red, yellow, and green lines are all above the blue line). One can reasonably conclude that, if the original analytical methods had been used to estimate ground motion, the new seismic threats would exceed the original HE and LTSP spectra.

This conclusion is supported by the sensitivity analysis shown in Figures 7b and 7c, which tests the importance of various parameters to the new GMPE and site effects. The same IPRP report cited previously states,

"These two figures also show that if the DCPP site had a Vs30 value of 760 m/s rather than 1,200 m/s, and if the site behaves more like an average site in ground motion amplification, some deterministic spectra would exceed the 1991 LTSP spectrum"xiii (Figure 7c below).

In fact, it is more than just "some." Under the scenario shown in Figure 7c, the IPRP shows that the LTSP/HE spectrum is exceeded by all of the newly discovered and re-interpreted seismic threats, including earthquakes on the Shoreline Fault, the Los Osos Fault, and the San Luis Bay Fault (note that the red, yellow, and green lines are all above the solid black line). The fourth and largest hypothetical earthquake scenario, an M7.3 rupture on a joint Hosgri/Shoreline Fault, is not shown on this figure but could reasonably be assumed to exceed the LTSP/HE as well.

(Figure 7c from IPRP Report)

This sensitivity analysis shows that the cumulative effect of less-conservative fast rock velocities along with less-conservative GMPEs is clearly not a small issue, nor is it only an academic issue. The IPRP reviewed the limited wellbore data (see IPRP Report 6, Figure 4) and concluded that the wellbore velocities appeared to be lower than those estimated by PG&E, which could mean that PG&E has underestimated shaking from the new seismic threats even if the new equations are allowed. The IPRP challenged PG&E's use of wellbore data at the ISFSI site to justify the higher 1,200 m/s velocity and instead focused on the velocities measured in the wellbore data closest to the facility.

Specifically, IPRP Report #6 says,

"Consider the three usable measured profiles, A-2, C, and D, the mean value at 10 m is approximately 800 m/s, considerably below PG&E's mean of 1200 m/s." and "If A-2 had the same velocity as C at a depth of 5 m, consistent with the relative weathering described in the borehole logs, the mean velocity at that depth would be about 650 m/s, also below PG&E's mean value of 1000 m/s."


This appendix does not seek to weigh in on the question of which velocities are appropriate to use when computing site effects at Diablo. Instead, these stated concerns are intended to demonstrate that:

First, the de-amplification effects of moving from GMPE-1 to GMPE-4 are very large and likely determinative of whether or not the new seismic threats would produce shaking above the HE exception; and

Second, even if one were to accept the use of GMPE-4, which is problematic for the reasons previously stated, the critically important rock velocities upon which the de-amplification factors are based are complex, in dispute, and arguably lower than those used by PG&E, which would mean that shaking would be significantly larger than stated in the Report. Indeed, a conservative approach toward this technical question would have used the lowest velocities found in the well data rather than the highest.


i Peck Differing Professional Opinion - Diablo Canyon Seismic Issues

ii Opinion of Commissioners Gilinsky and Bradford on Commission Review of ALAB-644 (Diablo Canyon Proceeding, Dockets 50-275 OL and 50-323 OL)

iii Safety Evaluation Report related to the operation of Diablo Canyon Nuclear Power Plant, Units 1 and 2, Docket Nos. 50-275 and 50-323; NUREG-0675, Supplement No. 34

iv PG&E, Report on the Analysis of the Shoreline Fault, Central Coast California, submitted to the NRC January 7, 2011, ML110140400

v Oct 12, 2012 NRC letter from Joseph M. Sebrosky, Senior Project Manager, to Mr. Edward D. Halpin, Senior Vice President and Chief Nuclear Officer, PG&E; Subject: Diablo Canyon Power Plant, Unit Nos. 1 and 2 - NRC Review of Shoreline Fault

vi Office of the Inspector General, US NRC: NRC Oversight of Licensee's Use of 10 CFR 50.59 Process to Replace SONGS Steam Generators, Case No. 13-006

vii Research Information Letter 09-001: Preliminary Deterministic Analysis of Seismic Hazard at Diablo Canyon Nuclear Power Plant from Newly Identified Shoreline Fault, pages 8-10

viii U.S. Nuclear Regulatory Commission, 2012. Confirmatory Analysis of Seismic Hazard at the Diablo Canyon Power Plant from the Shoreline Fault, Research Information Letter

x Oct 12, 2012 NRC letter from Joseph M. Sebrosky, Senior Project Manager, to Mr. Edward D. Halpin, Senior Vice President and Chief Nuclear Officer, PG&E; Subject: Diablo Canyon Power Plant, Unit Nos. 1 and 2 - NRC Review of Shoreline Fault

xi NRC Research Information Letter 12-01, page 56 (September 2012)

xii Shakal, A., Graizer, V., Huang, M., Borcherdt, R., Haddadi, H., Lin, K., Stephens, C., and Roffers, P.: Preliminary Analysis of Strong-Motion Recordings from the 28 September 2004 Parkfield, California Earthquake. Submitted to SRL November 5, 2004

xiii IPRP Report No. 6, August 12, 2013
