ML22145A317

Some Formal Considerations in the Application of Best-Estimate Plus Uncertainty Results in Licensing Basis Analysis to Risk-Informed Decision Making for Reactor Safety

Issue date: 06/14/2018
From: Yuri Orechwa, NRC/NRR/DSS/SNPB

I. Introduction

The following technical discussion is motivated by the current surge of interest in Risk-Informed Regulation. The operative word is "Informed," as opposed to "Based." The distinction appears to be, at best, murky among the staff, especially those associated with the review of results based on mechanistic computer codes and their use in Licensing Basis Amendment reviews. The purpose of the present work is not to argue in favor of either flavor, but rather to try to illuminate the basic decision-theoretic principles inherent in Probabilistic Risk Assessment that are associated with mechanistic licensing basis analysis, the bread and butter of DSS. The basic premise is that understanding the formulation of Risk-Based Analysis is the sine qua non for making Risk-Informed decisions.

Our reference regulatory context is touched on in the introduction to the recently issued "Guidance on the Treatment of Uncertainties Associated with PRAs in Risk-Informed Decision-making."

(Ref. 1), wherein the relevant elements of the Commission's policy statement (Ref. 2) are given as:

"the use of PRA technology should be increased ... in a manner that complements NRC's deterministic approach ... and supports the NRC's traditional defense-in-depth philosophy ... to reduce unnecessary conservatism associated with current regulatory requirements and guides." The Commission further notes, "Treatment of uncertainty is an important issue for regulatory decisions."

We address the following statement of the authors of Ref. 1: "Deterministic analyses that are performed in risk-informed licensing applications also contain uncertainties; however, they are addressed via defense-in-depth and safety margin. As such, a systematic process that is used to identify and understand deterministic uncertainties is not always needed." We disagree. (Refs. 3, 4)

Both the "depth" of the defense-in-depth and the "magnitude" of the safety margin must be estimated to be useful in risk-informed decision making. An example is the regulatory evolution of the acceptance criteria in Large Break LOCA analysis. (Ref. 5) This evolution was due to the continuous advances in computing power and experimental databases, and it has increased the amount of information that is in a best-estimate plus uncertainty analysis. This has, furthermore, allowed performance acceptance criteria to be stated probabilistically, as opposed to in terms of conservative bounds. This process is bound to continue and requires a formal structure to allow its routine and consistent application in reactor safety analyses. Therefore, given the complexity of a nuclear power plant, our discussion is at best formal and to a great extent heuristic.

II. Functional Framework: Information flow and the concept of risk.

The heart of nuclear reactor safety analysis can be characterized by paraphrasing a well-worn expression: "follow the radioactive isotopes." Basically it is the journey from the birth of radioactive fission products (FPs) in the confinement of the fuel of the nuclear reactor to the environment outside the reactor due to accident conditions, and their interaction with the public as consequent detrimental health effects. The journey is long and very complicated; it entails a sequence of very complex computer codes, experimental data and engineering judgment.

A. Heuristic Structure of PRA Reactor Analysis

Current risk assessments of light-water reactors are broken into three sequential levels of analysis. (Ref. 6)

Level 1. System analysis - An assessment of plant design and operation consisting of the definition and quantification of accident sequences, component data, and human reliability that could lead to core melt. (Ref. 7)

Level 2. Containment analysis - An analysis, conditional on L1, of the response of the reactor containment to core damage.

Level 3. System consequence analysis - An assessment, conditional on L1 and L2, of the transport of radionuclides through the environment and the public-health consequences.

Clearly in a full L3 PRA the initial conditions with regard to fission products released, and their subsequent health effects, are set at L1. Level 1 is also the domain of licensing basis analysis (LBA). There the acceptance metrics (for loss-of-coolant accidents, for example) stem from 10 CFR 50.46, where acceptance is in terms of peak clad temperature and clad oxidation so as to prevent fuel fission product release. In PRA the connection is through success or failure with regard to prevention of fuel fission product release, and therefore with severe core damage (SCD).

While temperature and oxidation are clearly measurable and quantifiable parameters, severe core damage has no universal quantitative definition. (Ref. 8) This reference points to the ASME PRA standard (RA-Sa-2009), where core damage involves "uncovery and heatup of the reactor core to the point at which prolonged oxidation and severe fuel damage are anticipated and involving enough of the core, if released, to result in off-site public health effects." To be useful for any quantitative analysis, this statement requires a quantifiable surrogate or surrogates that provide the linkage between the qualitative definition above and quantitative, experimentally measurable parameters and quantitative computer code outputs from best-estimate plus uncertainty LBAs. The linkage is made at the L1 level of PRA analysis. The key issue, we believe, is associated with the fact that LBA and PRA perform somewhat different analyses, both analytically and in intent. We shall focus on deconstructing this difference to allow consistent risk-informed conclusions.

B. Information-Linking Structure of a PRA

Let us assume a simple reactor system designed so that an initiating event (IE) activates sequentially engineered safety systems ESS1 and ESS2, as indicated in Fig. 1. These safety systems have finite probabilities of failure, and, thereby, there is a finite probability of adverse consequences. We measure the risk of unacceptable consequences through a quantitative combination of the frequency of the initiating event (f_IE), the probability of some combination of failures of the safety systems to activate, that is, event sequences (ESi), and the probability of exceeding acceptance metrics as defined by PRA or LBA.

Fig. 1 Generic Linked Structures of a PRA

[Fig. 1 residue: column labels Initiating Event (IE), Safety System 1 (SS1), Safety System 2 (SS2), Event Sequence i (ESi), and Consequence, with the PRA and LBA domains indicated.]

The unifying structure for the simple PRA of Fig. 1 is depicted for our discussion in Fig. 2 by an event tree that orders the safety functional responses (SS1 and SS2) that mitigate the initiating event. A function event tree is, in general, developed for each postulated initiating event, since each initiator may require a unique plant response. Similarly, LBAs have the same postulated initiating event and safety system structure. There are 4 end states in Fig. 2.

[Fig. 2 residue: the event tree branches from f_IE through SS1 (success 1-PSS1 / failure PSS1) and SS2 (success 1-PSS2 / failure PSS2), terminating in end states ES1-ES4, each with an exceedance probability P{X>L | ESi}.]

Fig. 2. Generic Simple Event Tree for PRA and LBA Analysis

Where we define:

f_IE - Initiating Event frequency
PSS1 - Conditional probability of SS1 failure
PSS2 - Conditional probability of SS2 failure
PES1 = (1-PSS1)*(1-PSS2) - Conditional probability of Event Sequence 1
PES2 = (1-PSS1)*PSS2 - Conditional probability of Event Sequence 2
PES3 = PSS1*(1-PSS2) - Conditional probability of Event Sequence 3
PES4 = PSS1*PSS2 - Conditional probability of Event Sequence 4
P{X>L | ES1} - Probability of a damage surrogate X exceeding limit L given ES1, etc.

Note: We shall not address the computation of the point estimates of the conditional branching probabilities of the event tree. They are assumed to have been derived for each safety system by a separate fault tree analysis.
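The branch algebra above can be sketched in a few lines of code; the failure probabilities and initiating-event frequency below are illustrative placeholders, not values from any plant-specific fault tree.

```python
# Illustrative arithmetic for the two-safety-system event tree of Fig. 2.
# PSS1, PSS2, and f_IE are hypothetical assumed values.
PSS1 = 1e-3   # conditional probability of SS1 failure (assumed)
PSS2 = 5e-3   # conditional probability of SS2 failure (assumed)
f_IE = 1e-2   # initiating-event frequency per year (assumed)

PES = {
    "ES1": (1 - PSS1) * (1 - PSS2),  # both systems succeed
    "ES2": (1 - PSS1) * PSS2,        # SS1 succeeds, SS2 fails
    "ES3": PSS1 * (1 - PSS2),        # SS1 fails, SS2 succeeds
    "ES4": PSS1 * PSS2,              # both systems fail
}

# The four end states are exhaustive and mutually exclusive,
# so their conditional probabilities sum to one.
assert abs(sum(PES.values()) - 1.0) < 1e-12

# Frequency per year of reaching each end state:
freq = {es: f_IE * p for es, p in PES.items()}
```

The assertion mirrors the fact that PES1 through PES4 partition the outcome space for a given initiating event.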

C. Representations of Risk

The basic expression for risk consists of three terms. (Ref. 10) For an L1 analysis as shown in Fig. 2, the three terms are:

f_IE - the frequency per year of an initiating event (for example, one from the set IE = {LBLOCA, SBLOCA, SBO, ATWS, etc.})

ES - an event scenario (i.e., a sequence of safety system actions (failure/non-failure))

X>L - an outcome, generally a surrogate for core damage (CD), where the surrogate value X has exceeded the acceptance limit L.

Based on the notation of the event tree in Fig. 2, we can formulate the risk Rij of an adverse event (Xij > L) due to an initiating event IEi resulting in an activation of a sequence of safety systems, resulting in an event sequence ESij that occurs with frequency f_IEi. This can be expressed as a joint probability distribution function

Rij = P{IEi, ESij, Xij > L}.    (1)

The total risk for the reactor, therefore, is given as

R = Σ_i Σ_j Rij,    (2)

where N is the number of postulated initiating events and M the number of safety systems.

The joint probability distribution function for the contribution to the total risk R by a specified initiating event IEi and event sequence ESij can be decomposed into the product of three probability density functions:

Rij = f_IEi * P{ESij | IEi} * P{Xij > L | IEi, ESij}.    (3)

This expression can be viewed in the context of Fig. 2 as having two components:

First, we can consider f_IE analogous to a multiplicative constant and drop its contribution to risk from further analysis without loss of the probabilistic content of our argument.

R_PRA = P{ESij | IEi} - probabilistic methods that focus on the probability of operation of the plant safety systems that define the event sequences.

R_LBA = P{Xij > L | IEi, ESij} - deterministic methods concerned with the probability of the transient behavior of the reactor core exceeding a specified limit for a particular sequence of events.

Thus, in the expression for the plant risk (Eq. 3), the link between the PRA analysis component and the LBA analysis component is through the event sequence ES for a given initiating event IE. That is, the probability of the event {X>L}, i.e. core damage, is dependent on the probability of the event {ES}, which is specified by a particular safety system failure sequence. Here an important dichotomy between L1 PRA risk analysis and a best-estimate plus uncertainty LBA is revealed.

Two distinct analyses are performed that exhibit a clear symmetry.

In a traditional PRA analysis we have:

P{ES | IE} = a probability density function of the failure of the safety systems in ES.

P{X>L | ES,IE} = 0/1 (indicating no-CD/CD, based on the events {X≤L} / {X>L}).

On the other hand, in a best-estimate plus uncertainty LBA analysis we have:

P{ES | IE} = 1 (failure of only the limiting safety system)

P{X>L | ES,IE} = a probability density function of core damage.

We, therefore, have four ways of expressing risk of CD dependent on the level of information content:

1. (P{ES | IE} = 1)*(P{X<L | ES,IE} = 1), which would imply no CD for a bounding ES and a bounding calculated mechanistic result; that is, Appendix K-like analysis.

2. (P{ES | IE} = 1)*P{X>L | ES,IE}, which is a best-estimate plus uncertainty result of CD for the limiting ES.
3. P{ES | IE}*(P{X>L | ES,IE} = 1), which is the traditional PRA result of CD frequency, where the uncertainty is based on the failure probabilities of the safety systems in ES.
4. P{ES | IE}*P{X>L | ES,IE}, which is a risk-based PRA, where both the probability of failure of the ES and the best-estimate plus uncertainty of CD form a quantified joint uncertainty probability function P{IE,ES,X>L}.

See Appendix I for graphic illustrations.
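The four expressions can be contrasted numerically. The probabilities below are illustrative assumptions chosen only to show how adding information lowers the stated risk; they are not taken from any analysis.

```python
# The four ways of expressing the risk of CD, evaluated with
# hypothetical numbers purely to contrast their information content.
P_ES = 5e-6           # P{ES|IE} for the limiting sequence (assumed)
P_CD_given_ES = 2e-3  # BEPU estimate of P{X>L|ES,IE} (assumed)

r1 = 1.0 * 1.0                 # 1: bounding ES, bounding result (Appendix K style)
r2 = 1.0 * P_CD_given_ES       # 2: BEPU result for the limiting ES
r3 = P_ES * 1.0                # 3: traditional PRA, CD assumed given the ES
r4 = P_ES * P_CD_given_ES      # 4: joint probabilistic statement

# Each added piece of information can only lower the stated risk:
assert r1 >= r2 >= r4 and r1 >= r3 >= r4
```

The ordering r1 ≥ r2 ≥ r4 and r1 ≥ r3 ≥ r4 is the sense in which expression 1 is bounding and expression 4 is the full probabilistic statement.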

D. Consistency of Information

Let us consider the information content of the above four expressions of risk at level L1 as follows: the first is a deterministic bounding statement of the risk; the fourth is the true probabilistic statement of the risk; two is an improvement over one, and three an approximation to four.

We previously indicated that the integrated overall risk analysis of a plant is based on three separate but coupled analyses, L1, L2 and L3, and thereby requires well-defined interfaces of the damage states between the three parts. Our focus is limited to the first interface, between L1 and L2. Clearly any decision with regard to a consistent analysis of the safety of the reactor that takes all available information (such as defense-in-depth) into account is critically dependent on the probability of the estimate of CD at L1.

E. A Hypothesis Test Framework for Best-Estimate Analysis Acceptance Criteria

Let us consider the statistical aspects of this issue through an analogy with the estimation of the material unaccounted for (MUF) in materials control and accountancy. (Ref. 11) A similar analogy has been applied to the determination of control of set-points for nuclear safety-related instrumentation, where the relevant statistical approach has been well developed. (Ref. 12) An analysis based on confidence intervals is presented in Ref. 3.

As a problem in decision making under uncertainty, the central issue in the assessment of risk with regard to nuclear power plants has been shown above to be associated with the information with regard to core damage initiated under accident conditions; in particular, the estimation of the component P{X>L | ES,IE} of the joint CD uncertainty probability function P{IE, ES, X>L}.

Table I. Generic Analogy

Material Accountancy Estimation    Core Damage (CD) Estimation
MUF = B - I                        W = X - Y

Where:
MUF = material unaccounted for     W = proxy proportional to CD
B = book inventory                 X = operating parameter proxy
I = physical inventory             Y = damage threshold proxy

Let us consider material accountancy. The right-hand side of the expression for MUF consists of random variables, since B and I are measured. We therefore have the expectation value of MUF as E(MUF) = E(B) - E(I). We can then formulate a simple decision rule:

E(MUF) ≤ 0: no material missing
E(MUF) > 0: material missing

This of course is too simple to be useful, for it matters "how much" is missing, especially in light of the fact that our decision is made "under uncertainty" due to measurement errors. We therefore need to take into account the "value" of the missing material, for example one critical mass, and the variance in the measured values.
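The MUF decision rule under measurement uncertainty can be sketched as follows; the inventory values and error standard deviations are invented for illustration, not drawn from any safeguards system.

```python
# Sketch of the MUF decision rule when B and I carry measurement error.
# All numerical values are illustrative assumptions.
import random
from statistics import NormalDist

random.seed(7)
true_B, true_I = 1000.0, 1000.0   # true inventories: nothing missing
sigma_B, sigma_I = 2.0, 3.0       # assumed measurement std devs

B = random.gauss(true_B, sigma_B)  # measured book inventory
I = random.gauss(true_I, sigma_I)  # measured physical inventory
MUF = B - I
sigma_MUF = (sigma_B**2 + sigma_I**2) ** 0.5  # std dev of the difference

# Test H0: E(MUF) = 0 at an assumed false-alarm probability alpha = 0.05.
alpha = 0.05
k_crit = NormalDist(0.0, sigma_MUF).inv_cdf(1 - alpha)
decision = "material missing" if MUF > k_crit else "no material missing"
```

The threshold k_crit plays the role of the critical value introduced for core damage in Fig. 3 below; the same structure carries over with W = X - Y in place of MUF = B - I.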

A similar argument can be made in our situation of core damage estimation. As pointed out in the introduction, core damage can be assessed in terms of proxy parameters, both computed and measured. In Table I the analog of the level of damage to MUF is the random variable W, which is the difference between proxies for damage, i.e., W = X - Y; for example, the peak clad temperature computed with a best-estimate plus uncertainty code (X) and the measured threshold temperature of material damage (Y). We therefore get an analogous decision rule:

E(W) ≤ 0: no core damage
E(W) > 0: core damage

A graphical representation of the information available for making a statistical decision with regard to CD estimation is given in Fig. 3 below. (The numerical values in the probability distribution functions are for illustrative purposes of the methodology only.)

[Fig. 3 residue: the PDF of W under H0 ~ N(0,40), plotted over W from -200 to 200, with W0 = 0 and the critical value kcrit marked.]

Fig. 3. Graphical Setting for a Hypothesis Test of Core Damage

The decision we want to make is based on the question of whether the predicted core damage level W is due only to random code error and measured damage threshold error. In statistical terms, we test the so-called null hypothesis H0 by choosing a "significance threshold" damage kcrit for the realized value W of our analysis (i.e., the level of computer code prediction of the experimentally measured damage threshold level). In light of Fig. 3, the critical value kcrit is fixed through a definition of the probability of error:

α = Prob{W > kcrit | H0}    (4)

That is, α is the probability that "H0 is not correct" is concluded from a result W > kcrit when in fact H0 is true.

Table II. Probabilities of Correct and Erroneous Decisions

Decision     True State H0    True State HA
Accept H0    1 - α            β
Reject H0    α                1 - β

The general decision-theoretic framework, in terms of the hypothesis tests in Table II and the graphical setting in Fig. 3, results in decisions and probabilistic interpretations as follows:

α = P{W > kcrit | H0} = P{Rejecting H0 | H0 is true} = Type I error
1 - α = P{W ≤ kcrit | H0} = P{Accepting H0 | H0 is true} = No error
β = P{W < kcrit | HA} = P{Accepting H0 | HA is true} = Type II error
1 - β = P{W ≥ kcrit | HA} = P{Rejecting H0 | HA is true} = No error

Key to making a decision by this hypothesis-testing decision rule is the choice of kcrit.

Since our information for making a decision with regard to core damage due to a reactor transient consists of two probability density functions, one of operating parameters X and the second of experimentally determined damage threshold parameters Y, the requisite damage event {X<Y} (i.e., the code underpredicts damage) is represented by the random variable W = X - Y, as shown in Fig. 3.

Thus, in those cases where predictive mechanistic code calculations are used as an "instrument" for the determination of core damage, rejection of the null hypothesis H0 is by definition the first indication of damage, and its "weight" is given by α. However, this alone is not a sufficiently strong indication; it may only be a "false alarm". Given the above hypothesis-testing framework, we can introduce more information to clarify the significance of the false alarm. To this end, we can formulate a second indicator, 1-β, as a "detection probability": the probability that damage is detected when there is damage.

In light of the formal relations between the quantities introduced in Fig. 3, if H0 is true, then the following relation holds:

H0: P{W ≤ kcrit | H0} + P{W > kcrit | H0} = 1, or (1-α) + α = 1.

Under the hypothesis H0 the observation kcrit is written as kcrit = W0 + Zα·σ, where Zα is the α fractile of the standard normal distribution.

Similarly, if HA is true, we have:

HA: P{W < kcrit | HA} + P{W ≥ kcrit | HA} = 1, or β + (1-β) = 1, and kcrit = WA - Z(1-β)·σ.

Equating these two expressions for kcrit, we obtain the key relationship between damage, instrument precision, false alarm probability and detection probability as:

Zα + Z(1-β) = (WA - W0)/σ = M/σ    (5)

This relationship between the four parameters α, β, M = WA - W0, and σ is displayed graphically in Fig. 4. We generally refer to M as the damage margin. We have defined E(X) as the analytic limit for the code-calculated values, and E(Y) as the operating limit reflecting a CD threshold.

In principle, given the value of three of the parameters, the value of the fourth can be determined.

That is, it establishes a relation between α, the probability of rejecting no damage when this is the true state (i.e., a "false alarm"), the computational and experimental variance σ², M, the assumed damage margin, and (1-β), the probability of rejecting the conclusion of no damage margin. We note that because a hypothesis test is based on sample data, it does not conclusively rule out either the null or the alternative hypothesis. In most applications, failing to reject a null hypothesis does not prove it is true, and rejecting a null hypothesis does not prove it is false. The key is defining and fixing α and β at suitably small values. (Ref. 13)
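Eq. 5 can be exercised directly with the standard normal distribution. The margin, standard deviation, and α below are illustrative assumptions (σ = 40 echoes the N(0,40) density used in Fig. 3), not values from any analysis.

```python
# Solving Eq. 5, Z_alpha + Z_(1-beta) = M/sigma, for the detection
# probability (1 - beta). M, sigma, and alpha are illustrative.
from statistics import NormalDist

std = NormalDist()  # standard normal distribution

def detection_probability(M, sigma, alpha):
    """1 - beta implied by Eq. 5 for margin M, standard deviation
    sigma, and false-alarm probability alpha."""
    z_alpha = std.inv_cdf(1 - alpha)     # critical point for alpha
    return std.cdf(M / sigma - z_alpha)  # 1 - beta

p_detect = detection_probability(100.0, 40.0, 0.05)

# Detection improves with larger margin, smaller sigma, larger alpha:
assert detection_probability(120.0, 40.0, 0.05) > p_detect
assert detection_probability(100.0, 30.0, 0.05) > p_detect
assert detection_probability(100.0, 40.0, 0.10) > p_detect
```

The three assertions are exactly the phenomenological properties of Eq. 5 that Fig. 4 displays graphically.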

We use Fig. 4, however, only for the phenomenological behavior of these parameters, and for their role in determining the initial state of core damage for the L2 PRA analysis.

[Fig. 4 residue: curves of detection probability 1-β (vertical axis, 0 to 0.8) versus 1-α (horizontal axis, 0 to 1.0) for M/σ ≥ 0, with the diagonal M/σ = 0 as the limiting case.]

Fig. 4. Graphical representation of the relationship between α, β, M = WA - W0, and σ.

Graphically displayed in Fig. 4 are properties that reflect the following relevant information in Eq. 5, to which we can give the following interpretation:

1. The margin to damage threshold: M.
2. The accuracy of the estimate of the damage margin threshold: σ.
3. The sensitivity of detecting that the damage margin has been exceeded: (1 - β).
4. The credibility that the damage margin has been exceeded: (1 - α).

Thus, the "instrument" (i.e. our mechanistic best-estimate plus uncertainty code) governed by Eq. 5 has the following physically correct properties:


1. The probability of "detection" of damage increases with the amount of damage. (Here we make the physically reasonable assumption that the level of damage is monotonic in the proxy variables. Thus, to first order, for phenomenological argument, we can assume it is linear.) This is a natural requirement for any detection system.

2. The probability of detection of damage increases with decreasing standard deviation σ, which reflects an increase in the precision of the "measurement instrument", and, therefore, a more valid statistical decision can be made.
3. The probability of detection increases with increasing false alarm probability. This is characteristic of detection systems; the more sensitive the system, the higher its false alarm rate.

III. Formulation of the Initial Damage for L2 and L3 PRA Analysis Based on L1 LBA Results

Our objective to this point has been to formulate an analysis framework that enhances the traditional PRA analysis at L1 by exploiting the results obtained routinely today in LBA of reactor transients.

With continued advances in computing power and modelling, the information content of LBA will also improve and potentially add information to PRA analysis. It was shown that the key element in this argument is that the probability P{X>L | ES,IE} contains the necessary information to formulate the initial condition for the subsequent L2 and L3 analyses. We have deconstructed the routine LBA results in a decision-making framework based on classical hypothesis testing to arrive at four interrelated parameters: M, σ, α and β.

Given that in best-estimate plus uncertainty analysis the allowable margin M and the level of a type-I error α (generally 0.05) are specified, and the uncertainty in the estimate of the proxy variable is computed via Monte Carlo calculations, we also have σ. The specific value of σ is, of course, dependent on the number of histories and converges via the Central Limit Theorem as the number of histories increases. We can, therefore, compute β, the level of a type-II error, by Eq. 5 in a consistent way for the specific analysis.

We can then quantify the level of core damage as (αM). We need to further take into account that this value is only known with a detection probability of (1 - β). We then get our sought-after result: since

P{X>L | ES,IE} = (α)*(1 - β),    (Eq. 6)

we have

M*P{X>L | ES,IE} = (αM)*(1 - β)    (Eq. 7)

as the initial condition: the 'source' term for L2 and L3 PRA analysis.
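The recipe of this section can be collected into a short calculation. All numerical values are illustrative assumptions, and σ stands in for a Monte-Carlo-converged uncertainty estimate rather than an actual code result.

```python
# Section III recipe: given a specified margin M and type-I level
# alpha, and a Monte Carlo sigma, obtain beta from Eq. 5 and the
# L2/L3 initial condition from Eqs. 6 and 7. Values are illustrative.
from statistics import NormalDist

std = NormalDist()

M = 100.0      # allowable damage margin (assumed, proxy units)
sigma = 40.0   # Monte-Carlo-estimated proxy std dev (assumed)
alpha = 0.05   # specified type-I error level

# Eq. 5: Z_alpha + Z_(1-beta) = M/sigma, solved for beta
beta = 1.0 - std.cdf(M / sigma - std.inv_cdf(1 - alpha))

# Eq. 6: probability component of the initial condition
p_cd = alpha * (1.0 - beta)

# Eq. 7: damage-weighted initial condition (the 'source' term)
source_term = (alpha * M) * (1.0 - beta)
```

As the number of Monte Carlo histories grows, sigma shrinks, beta falls, and the computed source term stabilizes for hand-off to the L2 analysis.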

IV. Conclusion

Clearly the estimated LBA uncertainty of the transient behavior of the reactor under accident conditions, coupled with a PRA analysis at Level 1, can contribute to the objective of "treating of uncertainty ... for regulatory decisions" to "reduce unnecessary conservatism associated with current regulatory requirements and guides." This can be achieved most fruitfully by "a systematic process that is used to identify and understand deterministic uncertainties," as demonstrated in this technical discussion and elaborated in the attached appendices.

Appendix I discusses the treatment of the level of information and its source with regard to core damage that is brought to bear in the analysis of the probability of core damage in Licensing Basis Analyses.

Appendix II demonstrates a categorization of an event sequence space that assigns equal risk to each category, where the relationship between R*, the risk, γ, the level of confidence, and a fixed sample size N is continuous and monotonic. However, the division of the level of confidence into categories is difficult, if not impossible in practice, to associate with physically meaningful values, such as, for example, one year in the case of the boundary between the AOO and DBE event frequencies in the LWR formulation. However, the inconsistency between current licensing basis accident analysis acceptance criteria and PRA risk goals is clear and, therefore, makes the selection of design-basis accidents (DBAs) into endogenously predefined categories questionable.

A formalism is not an algorithm. The introduction of the requisite calculations into current practice is likely to be evolutionary. At a minimum, efforts to move toward implementing extended PRAs will bring much of the calculational machinery of best-estimate with uncertainty into the analysis.

This would allow the approximation and bounding of the total risk of the system in the context of the formalism. This would put in place the necessary machinery for natural improvements in the analysis as greater computing power and multiscale physics modelling become more routine in the analysis of reactor systems.

Appendix III briefly touches on the PRA-based PBMR Design Certification Application submitted to the NRC in 2006 in light of the discussion in this White Paper.

References

1) U.S. Nuclear Regulatory Commission, "Guidance on the Treatment of Uncertainties Associated with PRAs in Risk-Informed Decisionmaking: Final Report," NUREG-1855, Revision 1 (March 2017).

2) U.S. Nuclear Regulatory Commission, "Staff Requirements Memorandum - SECY-98-144: White Paper on Risk-Informed and Performance-Based Regulation," SRM-SECY-98-144, Washington, D.C., March 1, 1999.
3) H.F. Martz, et al., "Combining mechanistic best-estimate analysis and Level 1 probabilistic risk assessment," Reliability Engineering and System Safety, Vol. 39 (1993).
4) Y. Orechwa, "Formal Considerations in Establishing Risk-Consistent Acceptance Criteria for Reactor Safety," Nuclear Technology, Vol. 170, June 2010.
5) Y. Orechwa, "Best-Estimate Analysis and Decision Making Under Uncertainty," Proc. Int. Mtg. Updates in Best-Estimate Methods in Nuclear Installations Safety Analysis (BE-2004), November 14-18, Washington, D.C., American Nuclear Society (2004).

6) U.S. Nuclear Regulatory Commission, "PRA Procedures Guide," NUREG/CR-2300, Vol. 1 (January 1983).
7) U.S. Nuclear Regulatory Commission, "Compendium of Analyses to Investigate Level 1 Probabilistic Risk Assessment End-State Definition and Success Criteria Modeling Issues," NUREG/CR-7177 (May 2014).


8) U.S. Nuclear Regulatory Commission, "Confirmatory Thermal-Hydraulic Analysis to Support Specific Success Criteria in the Standardized Plant Analysis Risk Models - Surry and Peach Bottom," NUREG/CR-1953 (September 2011).
9) U.S. Nuclear Regulatory Commission, "MELCOR Computer Code Manuals, Vol. 1: Primer and User's Guide, Version 1.8.6.2005," NUREG/CR-6119, Rev. 3, 2005.

10) U.S. Nuclear Regulatory Commission, "Evaluation of Severe Accident Risks: Methodology for the Containment, Source Term, Consequence, and Risk Integration Analyses," NUREG/CR-4551, SAND86-1309, Vol. 1, Rev. 1 (1993).

11) Rudolf Avenhaus, "Safeguards Systems Analysis: With Applications to Nuclear Material Safeguards and Other Inspection Problems," Plenum Press, New York (1986).
12) Y. Orechwa, "Statistical Considerations in the Determination and Control of Setpoints for Nuclear Safety-related Instrumentation," White Paper, NRR/DSS/SRXB (January 2005).
13) U.S. Nuclear Regulatory Commission, "Statistical Methods for Nuclear Materials Management," NUREG/CR-4604 (December 1988).
14) U.S. Nuclear Regulatory Commission, "Feasibility Study for a Risk-Informed and Performance-Based Regulatory Structure for Future Plant Licensing," NUREG-1860, Vols. 1 and 2 (December 2007).
15) U.S. Nuclear Regulatory Commission, "An Approach for Determining the Technical Adequacy of Probabilistic Risk Assessment Results for Risk-Informed Activities," Regulatory Guide 1.200, February 2004.
16) G. Saji, "A New Approach to Reactor Safety Goals in the Framework of INES," Reliab. Eng. Syst. Saf., 80, 143 (2003).

17) A. Guba, M. Makai, and L. Pal, "Statistical Aspects of Best-Estimate Methods - I," Reliab. Eng. Syst. Saf., 80, 217 (2003).

18) Letter from Edward G. Wallace, PBMR (Pty) Ltd, to N. Prasad Kadambi, USNRC, Subject: "PBMR White Paper: PRA Approach," June 13, 2006.


19) White Paper: "Methodological Issues in the Application of PRA in the Demonstration of Adequate Safety in the PBMR Design Certification Application," Yuri Orechwa, NRR/DSS/SNPB, January 19, 2007.
20) F.R. Farmer, "Reactor Safety and Siting: A Proposed Risk Criterion," Nucl. Saf., 8, 539-548 (1967).

Appendix I. Deconstruction of the level of "information" in CD estimation based on L1 LBA.

The basic question we ask is: having computed a damage proxy value X and experimentally determined a damage threshold Y, can we conclude that the reactor system will incur core damage under a specified transient? That is, what is Prob{X>Y | X,Y}?    (A.I.1)

The answer to this question, for application to the estimation of risk via Eq. 3, is critically dependent on the information that is available to us with regard to our estimates of X and Y, and how it is applied in computing Eq. A.I.1. In particular, and in the most general case for our illustrative purpose, let us consider two proxy variables that are both random variables and, therefore, represent "imperfect" information in the form:

X ~ N(X0, σx) and Y ~ N(Y0, σy),

where we have assumed normality for simplicity.

We can then define four different levels of information that we might bring to the problem via the probability density functions of the random variables X and Y: all perfect information (σx = 0 and σy = 0); all imperfect information (σx > 0 and σy > 0); and the two combinations, perfect/imperfect (σx = 0, σy > 0) and imperfect/perfect (σx > 0, σy = 0). These are illustrated in Figs. A.I.1 - A.I.4.
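The four cases can be sketched numerically under the normality assumption. The means and standard deviations below (X0 = 110, Y0 = 90, σ = 10) are invented to mirror the illustrative figures; the direct normal-difference probability computed here is a companion to the α·(1-β) composition used in the text, not a replacement for it.

```python
# P{X > Y} for X ~ N(X0, sx), Y ~ N(Y0, sy), with sigma = 0 denoting
# "perfect" information. All numerical values are illustrative.
from statistics import NormalDist

def prob_X_exceeds_Y(X0, sx, Y0, sy):
    """P{X > Y} via W = X - Y ~ N(X0 - Y0, sqrt(sx^2 + sy^2))."""
    s = (sx**2 + sy**2) ** 0.5
    if s == 0.0:                         # both perfect: deterministic
        return 1.0 if X0 > Y0 else 0.0
    return 1.0 - NormalDist(X0 - Y0, s).cdf(0.0)

p_pp = prob_X_exceeds_Y(110, 0, 90, 0)    # Fig. A.I.1: perfect/perfect
p_ip = prob_X_exceeds_Y(110, 10, 90, 0)   # Fig. A.I.2: X imperfect
p_pi = prob_X_exceeds_Y(110, 0, 90, 10)   # Fig. A.I.3: Y imperfect
p_ii = prob_X_exceeds_Y(110, 10, 90, 10)  # Fig. A.I.4: both imperfect

# More imperfect information widens W and pulls P{X > Y} toward 1/2:
assert p_pp == 1.0 and p_ip == p_pi and p_ii < p_ip
```

The perfect/perfect case returns the deterministic bound of Fig. A.I.1, while each added source of uncertainty moves the answer away from certainty, which is the point of the four figures.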

[Fig. A.I.1 residue: two spikes at Y0 and X0 on the proxy axis.]

Fig. A.I.1. Y - Constant (perfect) / X - Constant (perfect)

Therefore: Prob{X0 > Y0 | X0, Y0} = 1.

[Fig. A.I.2 residue: a spike at Y0 and a normal density about X0, with the tail area α marked.]

Fig. A.I.2. Y - Constant (perfect) / X - Random (imperfect)

Therefore: Prob{X > Y0 | X, Y0} = α.

[Fig. A.I.3 residue: a normal density about Y0 and a spike at X0, with the tail area β marked.]

Fig. A.I.3. Y - Random (imperfect) / X - Constant (perfect)

Therefore: Prob{X0 > Y | X0, Y} = β.

[Fig. A.I.4 residue: normal densities about both Y0 and X0.]

Fig. A.I.4. Y - Random (imperfect) / X - Random (imperfect)

Therefore: Prob{X > Y | X, Y} = (α)*(1-β), where Z(1-α) + Z(1-β) = (X0 - Y0)/σ, with σ² = σx² + σy².

Figs. A.I.1 and A.I.2 characterize the historic evolution of acceptance criteria from conservative bounds to best-estimate plus uncertainty evaluated limits with regard to X. (Ref. 5) The intent of the present analysis is to shed some further light on the imperfections in our information with regard to estimation of CD and its consequences for making safety decisions.

Some further "missing" information in the traditional approach is illustrated in Fig. A.I.3, in that our knowledge of damage is not a constant as given by acceptance limits, but rather a random variable Y. The current approach is to choose a conservative limit YL and compare the computed proxy value X as Prob{X0 > YL | X0, Y0} ~ CD.

This formulation (i.e., Fig. A.I.3) is not meaningful in practical analysis, since only one constant value X0 represents the proxy for the transient behavior of the system. This, however, does not diminish its value in pointing to the information missing from the two traditional approaches above.

We thus come to Fig. A.I.4, wherein both the best-estimate of the computed reactor performance proxy variable X and the proxy variable Y for the onset of damage are random variables. Thus they are associated with uncertainties given by σX and σY, respectively. This representation allows the combination of α in Fig. A.I.2 and β in Fig. A.I.3 via Eqs. 5 and 6 to give the desired probability as P{X > Y | X, Y} = (α)·(1 − β).
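The combination above can be cross-checked numerically. The following sketch (illustrative numbers only; the closed form shown is the standard stress-strength result for independent normals, not necessarily the document's Eqs. 5 and 6) estimates P{X > Y} by Monte Carlo and compares it with Φ((X0 − Y0)/√(σX² + σY²)):

```python
import math
import random

def prob_x_exceeds_y_mc(x0, sx, y0, sy, n=200_000, seed=1):
    """Monte Carlo estimate of P{X > Y} for independent normal proxies."""
    rng = random.Random(seed)
    hits = sum(rng.gauss(x0, sx) > rng.gauss(y0, sy) for _ in range(n))
    return hits / n

def prob_x_exceeds_y_exact(x0, sx, y0, sy):
    """Standard stress-strength closed form for independent normals:
    P{X > Y} = Phi((x0 - y0) / sqrt(sx**2 + sy**2))."""
    z = (x0 - y0) / math.sqrt(sx**2 + sy**2)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative numbers only: X = computed performance proxy, Y = damage onset.
mc = prob_x_exceeds_y_mc(100.0, 5.0, 110.0, 4.0)
exact = prob_x_exceeds_y_exact(100.0, 5.0, 110.0, 4.0)
```

The closed form makes explicit that P{X > Y} depends on both uncertainties only through the combined standard deviation of X − Y.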

The effect on risk of the "missing" information shown in Fig. A.I.3 can be illustrated as follows.

(Ref. 3) The definition of risk as applied to PRA in the context of nuclear reactor safety analyses was given by Eq. 3. Namely,

R = Σ_i Σ_j f_IEi · P{ES_ij | IE_i} · P{X_ij > L | IE_i, ES_ij},   (A.I.4)

where i = postulated initiating event, and j = a particular sequence of safety system failures.

For simplicity of illustration, let us focus on one initiating event and one particular sequence of safety system failures. This reduces Eq. 3 to

R = f_IE · P{ES | IE} · P{X > L | IE, ES},   (A.I.5)

or more simply (Ref. 4)

R = f · P{X > Y}.   (A.I.6)

That is,

f ~ System performance (event sequence frequency),

P{X>Y} ~ Mechanistic damage analysis (Prob{CD}).

Thus, in view of Eq. A.I.6, at constant risk R the probability of core damage P{X>Y} is inversely proportional to f, the frequency of occurrence of a particular safety system failure sequence ES for a specific initiating event IE.

We illustrate the impact of interpreting the probability of core damage P{X>Y} probabilistically, based on best-estimate plus uncertainty analysis, versus interpreting it as a binary result (0/1). In Fig. A.I.5 below, the red curve represents the traditional PRA P{X>L} = (0/1) and the black curve the best-estimate plus uncertainty PRA.

The impact on the overall CD frequency will be a shifting of CD frequency contributions from traditionally defined severe accident events to higher-frequency, non-severe accident events. The net impact is expected to increase the point estimate of the overall CD frequency and widen its distribution. When very conservative evaluation models are used that are bounding in the sense of zero probability of CD for all non-severe event sequences, the two approaches of course are identical. (Ref. 3)

Fig. A.I.5 Comparison of the interpretation of P{X>Y} between traditional PRA and one based on best-estimate plus uncertainty LBA results.
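The shift described above can be made concrete with a toy aggregation over event sequences (all frequencies and probabilities below are invented for illustration): the traditional PRA scores P{X>Y} per sequence as 0 or 1, while the extended PRA weights each sequence frequency by its best-estimate probability.

```python
# Hypothetical event sequences: (frequency per reactor-year, best-estimate P{X>Y}).
# The traditional treatment rounds P{X>Y} to 0 or 1 (binary CD / no CD).
sequences = [
    (1e-2, 0.001),   # high-frequency, AOO-like sequence
    (1e-4, 0.05),    # DBE-like sequence
    (1e-6, 0.60),    # severe, low-frequency sequence
    (1e-7, 0.99),    # BDBE-like sequence
]

cdf_binary = sum(f for f, p in sequences if p >= 0.5)   # traditional 0/1 scoring
cdf_bepu = sum(f * p for f, p in sequences)             # extended (BEPU) PRA
```

In this invented example the extended estimate exceeds the binary one, and the dominant contributions migrate to the high-frequency, non-severe sequences, as the text anticipates.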

Appendix II. Risk-Consistent Partitioning of Event Sequences into Risk Categories

The introduction of additional information to PRA analyses, in the form of a quantitative representation of the uncertainty with regard to core damage at level L1, as demonstrated, also has an impact on the partitioning of event sequences into risk categories. (Ref. 4) This, in effect, changes the grouping of event sequences in the traditional event classes and, thereby, the initial state for L2 analysis. The traditional partitioning of the event space {ES}, following an initiating event IE, is into three categories of licensing basis events (LBEs) used in regulatory analyses. These are labeled as: Anticipated Operational Occurrences (AOO) = {ES1}, Design Basis Events (DBE) = {ES2}, and Beyond Design Basis Events (BDBE) = {ES3}.

Given that we define risk for the i-th initiating event as R_ij = f_IEi · P{ES_ij, X_ij > L}, where j = 1, 2, 3.

We can decompose the total risk into three components, each associated with one of the LBEs. For simplicity, and without loss of generality, we define f_j and c_j as the event sequence frequency and the associated consequence for LBE category j. Thus, the total risk can be represented as R = Σ_{j=1}^{3} f_j · c_j. This expression formally represents a bivariate probability density function for risk, say φ(f,c).

Thus, the formalism is centered on the joint probability density function φ(f,c), where the random variable f is associated with event sequences that by definition include LBE sequences, and the random variable c is associated with an undesirable consequence due to the event sequence. Furthermore, we assume that all our knowledge with regard to the dynamic behavior of the reactor system is contained in a PRA. More importantly, the PRA is an extended PRA, as opposed to a traditional PRA. The distinction is centered on the probability of damage conditional on the event sequence.

As illustrated in Appendix I, the probability of an adverse consequence is binary in the traditional formulation, and is represented by a continuous probability density function in the extended formulation.

I. Traditional Partitioning of the Risk State Space (f,c) by Event Sequence Frequency Categories

A contour representation of φ(f,c) and the three categories of LBEs is shown in Fig. A.II.1.

Fig. A.II.1 Constant Risk Contour

We, therefore, define the probability of exceeding the risk limit R* as (Ref. 4)

P{R > R*} = ∫_0^{+∞} df ∫_{R*/f}^{+∞} dc φ(f,c) = αR*.   (A.II.1)

This formulation is consistent with the one illustrated in Fig. A.I.2, which is in terms of the damage proxies X and Y rather than the risk R. Furthermore, we can formally develop a relationship between the performance of the reactor as exemplified by φ(f,c), the acceptance criterion as specified by (R*, αR*), and the appropriate risk-consistent event sequence frequency boundaries f* and f** through the basic relationship given by Eq. A.II.1:

αR* = ∫_{f*}^{+∞} df ∫_{R*/f}^{+∞} dc φ(f,c)
    + ∫_{f**}^{f*} df ∫_{R*/f}^{+∞} dc φ(f,c)
    + ∫_{0}^{f**} df ∫_{R*/f}^{+∞} dc φ(f,c)
    = αR*^AOO + αR*^DBE + αR*^BDBE.

We have, therefore, partitioned the probability that the risk of an event exceeds the isorisk limit, i.e. P{R > R*}, into components associated with AOOs, DBEs, and BDBEs. The risk associated with any one of these three cannot exceed the risk of the others, if we are to be risk-indifferent in our choice of category. We assign a risk of αR*/3 to each category, and we get two integral equations for the variables of interest:

∫_{f*}^{+∞} df ∫_{R*/f}^{+∞} dc φ(f,c) = αR*/3

and

∫_{0}^{f**} df ∫_{R*/f}^{+∞} dc φ(f,c) = αR*/3.

We observe that the boundaries between the LBE categories, for a fixed level of risk, are, in principle, dependent on the specific reactor design through φ(f,c), and thus do not apply universally to all reactor systems. There are, therefore, no reactor-technology-neutral boundaries that result in equal risk with respect to each category of LBE; the boundaries are completely dependent on the reactor system design φ(f,c) and the reactor system acceptance criterion (R*, αR*).
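A minimal numerical sketch of this design dependence, assuming an arbitrary stand-in density φ(f,c) (independent lognormal frequency and consequence) and Monte Carlo sampling in place of the exact integrals: the boundaries f* and f** are the frequency terciles of the exceedance set {f·c > R*}, so that each frequency band carries the equal share αR*/3.

```python
import random

def equal_risk_frequency_boundaries(samples, r_star):
    """Frequency terciles of the exceedance set {f*c > R*}: the highest
    third of exceeding sequences lies above f_star (AOO-like band), the
    lowest third below f_2star (BDBE-like band)."""
    exceed = sorted(f for f, c in samples if f * c > r_star)
    n = len(exceed)
    return exceed[(2 * n) // 3], exceed[n // 3]   # (f_star, f_2star)

rng = random.Random(7)
# Arbitrary stand-in for a design-specific phi(f,c): independent lognormals.
samples = [(rng.lognormvariate(-6.0, 2.0), rng.lognormvariate(0.0, 1.0))
           for _ in range(100_000)]
f_star, f_2star = equal_risk_frequency_boundaries(samples, r_star=1e-2)
```

Changing the stand-in density (or R*) moves f* and f**, which is precisely the design dependence argued in the text.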

This qualitative categorization of LBEs predates PRA application in reactor safety analysis and has served as an engineering guide in the choice of deterministic analysis required to ensure the predicted performance of the reactor system. A key intuitive property of each category is the level of confidence we place on the results of the mechanistic calculations associated with the event sequences in each category. The level of confidence is based mainly on engineering judgement that is above all predicated on the ability to validate the results of the mechanistic calculations in light of experimental evidence.

II. Partitioning of the Event Sequence Space φ(f,c) into Risk Categories

In order to partition the event sequence space by risk categories rather than, as above, by sequence frequency categories, we suggest the following qualitative relationships between the categories and the validation of the mechanistic tools for computing the parameters applicable to acceptance criteria in the analysis of the event sequences:

1. The validation of AOOs is feasible on a plant basis, or very close to one. Furthermore, the parameters associated with the acceptance criteria are measurable during the tests.
2. The validation of DBEs is not feasible on a plant basis, and thus, the acceptance criteria are tested in light of scaled and special effects tests.
3. The validation of BDBEs is not feasible on the scale of a plant. Moreover, the acceptance criteria are such that, in general, scaled tests are at best problematic and are not likely to be germane for validation. The mechanistic analyses are based on a series of what are judged to be at least phenomenologically correct models.

The leitmotif in the characterization based on these definitions is that, for a given acceptable design, not only are the mechanistically computed unfavorable consequences of the event sequences allowed to be sequentially greater from AOOs to DBEs and BDBEs, but the confidence in our estimate of the consequence is also likely to be sequentially less. That is, a lower level of confidence, in terms of validation of the mechanistic analysis, is associated with the possibility of more severe adverse consequences. We compensate for this lower confidence by designing the safety system such that the system will have a sequentially lower event sequence frequency. A quantification of the level of confidence can, therefore, be the needed extra degree of freedom for a risk-consistent classification.

To this end, we propose a partitioning as shown schematically in Fig. A.II.2. (See Refs. 15 and 16 for an analogous description in a quantitative context.) In this scheme, event sequences with a higher probability of realization in category I have a greater opportunity to contribute to αR* than those in categories II and III.

Fig. A.II.2 Partitions of the Event Space into Categories I, II, and III based on Isorisk Boundaries R*, R_I, and R_II.

We would, therefore, require our estimate of the probability of an event sequence in category I to have a greater confidence level associated with the risk estimate than that associated with category II, and more so with category III. This, in effect, induces an ordering of the categories by confidence level in the analytic estimate of risk, in that we would require a higher confidence in the analysis of category I events, less so in category II events, and least in category III events. The question, as in Fig. A.II.1, is how we determine the category boundaries in a risk-consistent manner and, furthermore, whether they are amenable to implementation in practice.

A risk-consistent partitioning of the reactor system states can again be achieved via Eq. A.II.1. We can, once again, write this equation as the sum of three terms, this time taking into account the division of the probability density φ(f,c) into the three categories shown in Fig. A.II.2:

∫_0^{+∞} df ∫_0^{R_II/f} dc φ(f,c) + ∫_0^{+∞} df ∫_{R_II/f}^{R_I/f} dc φ(f,c) + ∫_0^{+∞} df ∫_{R_I/f}^{R*/f} dc φ(f,c).

If we assign three numerical values, say α_III, α_II, and α_I, such that α_III + α_II + α_I = (1 − αR*), to the above three integrals from left to right, respectively, and since we assume we know φ(f,c) exactly, we can, in principle, compute risk-consistent boundary values. Once again, it appears reasonable to seek three categories that represent equal fractions of the event sequence space with regard to those event sequences that exceed the risk acceptance criterion R*. Then, as before, we constrain each to the value (1 − αR*)/3. This allows us, in principle, to compute risk-consistent boundary values R_I and R_II.
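Under the same caveats as before (an invented stand-in for φ(f,c), Monte Carlo in place of quadrature), the boundaries R_I and R_II can be sketched as tercile points of the risk variable R = f·c restricted to the set below R*:

```python
import random

def risk_category_boundaries(samples, r_star):
    """Tercile points of R = f*c over the set {R < R*}, so that categories
    I, II, III each carry the equal fraction (1 - alpha_R*)/3."""
    below = sorted(f * c for f, c in samples if f * c < r_star)
    n = len(below)
    return below[(2 * n) // 3], below[n // 3]   # (R_I, R_II), with R_I > R_II

rng = random.Random(11)
# Arbitrary stand-in for a design-specific phi(f,c): independent lognormals.
samples = [(rng.lognormvariate(-6.0, 2.0), rng.lognormvariate(0.0, 1.0))
           for _ in range(100_000)]
r_i, r_ii = risk_category_boundaries(samples, r_star=1e-2)
```

By construction R_II < R_I < R*, and the three risk bands below the isorisk limit carry equal probability mass.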

III. Sample Information and Confidence Level

We have assumed to this point that we have perfect information with regard to the analytic form of the probability density function φ(f,c), and are therefore able, in principle, to compute αR* precisely. In the real world, of course, we do not have perfect information; we obtain the requisite, though limited, information from other sources, especially random sampling. All the information fundamental to our argument is assumed to be in an extended PRA, especially via sample information.

We can then formally extend the expression for risk given by Eq. A.II.1 for the case of perfect information to one that takes into account imperfect information, in the form of sample information, with regard to the analytic form of the probability density function φ(f,c). The paradigm we use to introduce sample information is the one-sided statistical tolerance interval in the context of order statistics. (Ref. 17) This results in the following expression, analogous to Eq. A.II.1:

Pr( ∫_0^{+∞} df ∫_0^{R(N)/f} dc φ(f,c) > 1 − αR* ) = γ,   (Eq. A.II.2)

where R(N) is the N-th order statistic of R = f × c, and γ is the probability that the integral expression is at least as large as 1 − αR*. In practice, f and c are independent random variables, and their realizations are computed via Monte Carlo calculations with a best-estimate computer code.

Thus, the acceptance criterion of an extended PRA, based on imperfect information with regard to φ(f,c) due to the sample information R(N), needs to include the level of confidence γ, which is a function of the order N of the statistic R(N) and the risk αR*. That is, dealing with imperfect information requires an extension of the acceptance criterion,

{R*, αR*} ---> {R*, αR*, γ, N},

wherein the confidence level γ is analytically related to the sample size N or, more generally, to the amount of imperfect information.

In the context of estimation based on order statistics, the relationship between the quantities on the right side of the above expression is given by the analytic relationship

γ = 1 − (1 − αR*)^N.   (Eq. A.II.3)

That is, irrespective of the analytic form of φ(f,c), the set of reactor states for which R ≤ R(N) covers the fraction (1 − αR*) of the population of reactor states where R ≤ R*, with confidence γ.
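Assuming Eq. A.II.3 takes the familiar first-order Wilks form for the N-th order statistic (the sample maximum), γ = 1 − (1 − αR*)^N, the required sample size follows directly; for the common 95/95 criterion, for example, the minimal N is 59. A minimal sketch:

```python
import math

def wilks_confidence(alpha, n):
    """Confidence that the largest of n samples bounds the (1 - alpha)
    quantile: gamma = 1 - (1 - alpha)**n (first-order, one-sided)."""
    return 1.0 - (1.0 - alpha) ** n

def wilks_sample_size(alpha, gamma):
    """Smallest n such that wilks_confidence(alpha, n) >= gamma."""
    return math.ceil(math.log(1.0 - gamma) / math.log(1.0 - alpha))

n_9595 = wilks_sample_size(0.05, 0.95)   # classic 95/95 criterion
```

With N = 59 runs, the largest computed R bounds the 95th percentile of the population with at least 95% confidence, irrespective of the analytic form of φ(f,c).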

IV. A Metric for a Risk-Consistent Partitioning of the Event Sequence Space into Risk Categories

The goal is to capture, on the basis of the information in an extended PRA, the notion that the further an event sequence is from the isorisk curve in the f-c plane, the less confidence needs to be associated with the estimate that the risk αR* is met. Our metric therefore needs to capture the distance ΔR in the figure below in terms of {R*, αR*, γ, N}.

Fig. A.II.3 Risk Metric in the f-c plane.

Let us define our metric through the following equivalence relationship. We then have Δα_I = α_I − αR*. The quantity Δα_I represents the increase in αR* due to the distance of the category boundary R_I from the limit R*. We can now compute the confidence level at the boundary R_I in the above figure using Eq. A.II.3.

If we once again take for each category Δα = (1 − αR*)/3, we can compute the confidence levels γ_I and γ_II at R_I and R_II in Fig. A.II.2. This allows us to define a classification rule for placing the event sequences in a PRA into risk-consistent categories.

The extended PRA, in principle, contains for each event sequence an analysis of Eq. A.II.3. Namely,

Pr( ∫_0^{+∞} df ∫_0^{R(N)/f} dc φ_ES(f,c) > 1 − αR* ) = γ_ES,   (Eq. A.II.4)

where φ_ES(f,c) is the probability density of the reactor states and γ_ES is the confidence level in the analysis, based on the amount of information N that was brought to bear in the calculations.

We can now formulate a classification rule for classifying all event sequences in the PRA into, for example, one of three categories based on Eq. A.II.4 as follows:

If γ_ES ∈ [γ_I, γ*], then ES ∈ Category I.
If γ_ES ∈ [γ_II, γ_I], then ES ∈ Category II.
If γ_ES ∈ [0, γ_II], then ES ∈ Category III.

Such a categorization associates the same predicted risk with the event sequences in each category by trading off the distance of an event sequence state point from the isorisk limit against the confidence level in the estimation of the risk of exceeding the acceptance criterion {R*, αR*}.
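The classification rule can be sketched as a simple lookup. In the sketch below, the thresholds γ_I and γ_II and the per-sequence sample sizes are hypothetical, and the first-order Wilks relation γ = 1 − (1 − α)^N is assumed for the confidence achieved with N samples:

```python
def gamma_from_n(alpha, n):
    """Confidence achieved with n samples (first-order Wilks form)."""
    return 1.0 - (1.0 - alpha) ** n

def classify(gamma_es, gamma_i, gamma_ii):
    """Place an event sequence into category I, II, or III by confidence."""
    if gamma_es >= gamma_i:
        return "I"
    if gamma_es >= gamma_ii:
        return "II"
    return "III"

ALPHA = 0.05            # risk acceptance probability alpha_R* (hypothetical)
G_I, G_II = 0.95, 0.80  # hypothetical category thresholds gamma_I, gamma_II
# Hypothetical per-sequence sample sizes from the extended PRA.
cats = {n: classify(gamma_from_n(ALPHA, n), G_I, G_II) for n in (10, 40, 100)}
```

Sequences analyzed with more information (larger N) earn higher confidence and therefore land in a higher category, which is the ordering the rule is meant to induce.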

Appendix III. Implications for Safety Methodologies of New Reactors with Unique Features

New reactors, by definition, introduce unique design features that improve economic and safety performance over the current LWRs licensed and operating in the U.S. fleet. By the same token, however, relatively little data have accumulated to estimate the probabilities required, in principle, for a full PRA analysis. This, it would appear, makes a comparative evaluation of the probabilities for demonstrating 'improved' performance difficult, particularly when the new reactors are confronted with a regulatory structure designed for, and based on, the operating experience of LWRs.

PBMR (Pty) Ltd submitted to the USNRC in 2006 a white paper entitled "Probabilistic Risk Assessment Approach for the Pebble Bed Modular Reactor," Revision 1. (Ref. 18) It describes their approach to the development and application of a full-scope PRA for the PBMR design, and provides material for understanding the risk-informed Licensing Basis Event Selection, SSC Classification, and Defense in Depth in support of a PBMR Design Certification Application.

We present here some figures from the submittal that reflect the more 'aggressive' use of PRA for demonstrating the adequacy of the design in meeting the objectives of a deterministic safety design philosophy. This is fertile ground for assessing, and gaining insight into, inherent features of new reactors; in particular, their impact on the potential roles of PRA, especially as compared to the L1 analysis discussed in the white paper at hand.

Fig. AIII.1 presents a comparison of the basic PRA characteristics between the PBMR and LWRs. The two features that stand out are:

1. There is nothing comparable to an L1 PRA for the PBMR. That is, due to the inherent design features of the PBMR, there are no plant states comparable to "core damage," and thus there is no CDF metric. One of the challenges of a greater emphasis on PRA is to provide a logical means for prioritizing the challenges to safety. (Refs. 4, 19)

2. While the PBMR does not calculate CDF metrics, it does calculate PBMR-specific plant state frequencies that correspond to LBEs for LWRs. Thus, the analytic machinery presented in the main text above should be applicable to a PBMR PRA that is comparable to a full-scope L3 LWR PRA.

Table 1: Comparison of PBMR PRA with LWR PRA Model Structure (PRA outputs for LWR Levels 1, 2, and 3 versus the PBMR):

- Core Damage Frequency (CDF): Level 1, yes; Level 2, yes; Level 3, yes; PBMR, no.
- Plant Damage State (PDS) Frequencies: Level 1, not necessary but sometimes included; Levels 2 and 3, yes, all PDSs are variations of the LWR core damage state; PBMR, yes, plant states defined as event sequence families for LBEs.
- Release Category Frequencies: Level 1, no; Levels 2 and 3, yes, all involve core damage and are LWR-specific; PBMR, yes, PBMR-specific release categories.
- Frequencies of site meteorological conditions and Emergency Planning (EP) responses: Levels 1 and 2, no; Level 3, yes; PBMR, a conservative deterministic bounding treatment may be sufficient to meet DCA requirements.
- Source Terms: Levels 2 and 3, yes, mechanistic source terms quantified for each LWR release category; PBMR, yes, mechanistic source terms quantified for each PBMR release category.
- Consequences to the Public: Levels 1 and 2, no; Level 3, yes, early and latent health effects and property damage impacts quantified; PBMR, like LWR Level 3, but a conservative bounding treatment may be sufficient to meet DCA requirements.

(Source: PBMR, "Probabilistic Risk Assessment Approach for the Pebble Bed Modular Reactor," Non-Proprietary Class 3, Revision 1, 2006/06/13.)

Fig. AIII.1 A Comparison of PBMR PRA Elements and the Current LWR Elements

While a PBMR PRA model may exhibit some similarities with the L1-L2-L3 LWR PRA structure, the elements of the PBMR PRA are combined into a single event-sequence model framework that starts with initiating events occurring in different plant operating states and ends in PBMR-specific event families and release categories. A given release category contains one or more specific event families for which frequencies, mechanistic source terms, and off-site consequences are calculated. The integral PBMR PRA thus encompasses the functions of a full-scope L1-L2-L3 LWR PRA. An example is shown in Fig. AIII.2.

Fig. AIII.2 Example Abbreviated Event Tree with a PBMR Initiating Event and its Event Sequence Families. The tree proceeds from the initiating event through the plant response and assigns each event sequence family a frequency per reactor-year and an LBE type (AOO, DBE, or BDBE), with the remaining low-frequency event sequences grouped separately. (Source: PBMR Pre-application Planning Meeting, September 21, 2005.)
36 I Draft_ Working_ Testing_A. nb The proposed PBMR Frequency-Consequence Chart for defining LBEs based on Top Level Regula-tory Criteria. This figure is analogous to Fig. All. I as discussed in Appendix II. and shown in Fig.

AIII.3. The appropriate measure of acceptance varies for each category of event as:

1. AOO (10 CFR Part 20)

Mean Consequence< 100 mrem Total Effective Dose Equivalent (TEDE) at the control area boundary (CAB).

2. DBA (10 CFR §50.34)

Upper Bound of Mean Consequence of each DBE < 25 rem TEDE at the exclusion area boundary (EAB).

3. BDBE (NUREG- 0880)

Latent Fatality Quantitative Health Objectives (QHO) limits are applied.

l"I

.Q C:

<(

o!

§

~"'

o!

< !Eitt1mple fFtrequencyaCOlf'i}S<qjll/Jf80'1JCe Clhatrl I

~I P B M R with Top /Level Regul atory Ctriteria

~

0 ANTICIPATED

! 1.E+OO OPERATIONAL OCCURRENCE (AOO) REGION 10CFR50 APP I (Measured In Whole Body Gamma Dose)

Unacceptable

§ 1.E-01 10'1. OF 10CFR50.34 (Measured In TEDE)


Q..------~

(MEAN FREQUENCY 2.5E-02 IL a:

w

!!:. 1.E-02 t DESIGN BASIS EVENT z

w

, 1.E-03 (DBE) REGION Acceptable ....

I.

...~

10CFR50.34 C:

(Measured in l'EOE) .c z 1.E-o4 -~~!~~<?.~~~~~~~~---*---- ----------------.-- ----------- ~-------" '\! u

~

~ CJ

IE w BEYOND DESIGN PROMPT MORTALITY

=

~

CJ ffi 1.E-05 BAS49 EVENT (BOOE)

REGION I

U SAFETY GOAL (Measured in Whole I\

=

c:r

~

a w

II.I PLUME PAO u Body Gamma Dose) \ I "'

=

t-z IU 1.E-06 (MEAN FREQUENCY 5.0E-07) __ __ _ (Measured ,n TEOE)

- ='--------------------------- --~ u 0

~

I I Nole: The Low Populelion Zone (LPZ) and lhe >,

1.E-07 CJ I

plume exposure Emergency Planning Zone (EPZ) are assumed at lhe same dlalance as the I

=

~

1.E-08 EAB.

=

c:r

~

1.E-04 1.E-OJ 1.E-02 1.E-01 1.E+00 1.E+01 I.

1.E+o2 1.E+ol St1Pt""'ber 21. 200!> DOSE (Ra::.!t!A f-;fffl~ARlto!llUIQUNDARY (EAB) ~

C Coovriaht 2005 by PBMR Ply Lid 8 ~

Ref: Exelon letter dated March 15, 2002, "Revlaloi\ of Exelon Generauon Company'e Propoeed Llconalng Approach for the Pebble Bed Modular Reactor In the Unltad State,* Section 4.

t:>i>

~

It was pointed out in Appendix II that, in principle, there exists a relationship between the risk assigned to each category and the boundaries associated with the event sequence families. Because of this, the boundaries are design-specific, which invalidates the application of a universal frequency-consequence chart such as that in Fig. AIII.4.

Fig. AIII.4 Disposition of DBE on the Frequency-Consequence Chart

For each LBE the event consequence evaluation is based on a deterministic DBA, where mechanistic source terms are used to evaluate the realistic response of the plant to the initiating event. Thus the

transport of the radionuclides from their source, through the PBMR barriers, to the public is mechanistically modeled, depending on the sequence, with uncertainty distributions on the expected values.

We note that the shape of the Frequency-Consequence chart in Fig. AIII.3 differs from that of Fig. A.II.1 in our heuristic argument for risk-consistent categories.

(Figure: the regulatory frequency-consequence boundary (dashed) compared with a constant-risk contour in the f-c plane.)

This reflects that regulatory considerations take into account factors other than those displayed in a strictly heuristic representation. (Risk-Informed Regulation in action!) In effect, on the regulatory level, a 'loss function' is implicitly introduced that takes into account the public's adverse reaction to very large accidents (i.e., low-frequency, high-release events). On the other hand, high-frequency but low-release events would result in economic loss from downtime, with negligible public health consequences. (Ref. 20)