ML20140H079

Informs that CNWRA Intermediate Milestone 5708-762-730, "Updated User's Guide for TPA Code--CNWRA Report," does not meet commitments made in CNWRA Operations Plans. Basis for non-acceptance and current understanding of code listed.
Person / Time
Issue date: 05/02/1997
From: Mcconnell K
NRC OFFICE OF NUCLEAR MATERIAL SAFETY & SAFEGUARDS (NMSS)
To: Meehan B
NRC OFFICE OF ADMINISTRATION (ADM)
References
REF-WM-11 HLWR, NUDOCS 9705120326


Text

UNITED STATES NUCLEAR REGULATORY COMMISSION
WASHINGTON, D.C. 20555-0001

May 2, 1997

MEMORANDUM TO:

Barbara D. Meehan, Contracting Officer
Technical Acquisitions Branch No. 1
Division of Contracts, ADM

THRU:

Shirley Fortuna, NRC/CNWRA Deputy Program Manager
Program Management, Policy Development and Analysis Staff, NMSS

FROM:

Keith I. McConnell, Program Element Manager
Total System Performance Assessment and Integration Key Technical Issue
Division of Waste Management, NMSS

SUBJECT:

NON-ACCEPTANCE OF INTERMEDIATE MILESTONE 5708-762-730, UPDATED USER'S GUIDE FOR TPA CODE--CNWRA REPORT

We have completed our review of CNWRA Intermediate Milestone 5708-762-730, "Updated User's Guide for TPA Code--CNWRA Report," and find that it is not acceptable because it does not meet commitments made in the CNWRA Operations Plans.

In the paragraphs below, we describe (1) the basis for our non-acceptance, (2) our current understanding of the capabilities of the code, and (3) specific improvements and testing strategies we believe are necessary to ensure that the next version submitted is acceptable.

Basis for Non-Acceptance:

With reference to the basis for our non-acceptance of the code, the Operations Plans specify that a new version of the Total Systems Performance Assessment (TPA) code will be developed in accordance with CNWRA Technical Operating Procedure (TOP)-018 and that the User's Guide and code will be provided for use in conducting sensitivity studies.

The updated User's Guide and TPA 3.0 code were delivered by the CNWRA on March 14, 1997.

Staff expectations about the functionality of the TPA 3.0 code were explicitly stated in our acceptance review of the Software Requirements Document (SRD) (Intermediate Milestone 5708-762-710) for the TPA 3.0 code, which was submitted on January 28, 1997.

Our acceptance of the SRD was given with the expectation that the code to be delivered on March 17, 1997, "must be sufficient to conduct sensitivity analyses for the KTIs" (KMcConnell to RBaca, dated February 18, 1997).

Although we believe that the TPA 3.0 code architecture represents a significant improvement in capability over the IPA Phase 2 code, we conclude that the TPA Version 3.0 code delivered on March 14, 1997, contained functionality problems that prohibit the conduct of valid sensitivity studies and is, therefore, not acceptable.

Our conclusion is based on a series of intense meetings between NRC and CNWRA staff and actual test runs of the code by NRC staff. Some of the results from those activities are described below.

CONTACT: Keith I. McConnell, DWM/PAHL, (301) 415-7289

In addition to the deficiencies in the code discussed below, which are the basis for its non-acceptance, a number of improvements were identified that NRC staff requested the CNWRA make to the code prior to the conduct of the sensitivity studies. We want to clarify that these improvements are not considered deficiencies in the code and were not part of the staff's decision on its non-acceptance.

Rather, our determination of the suitability of the code to conduct the planned sensitivity studies was based on the functionality of the code to perform that task.

Specifically, we used two criteria to evaluate the code:

(1) does the code meet the requirements of the SRD as accepted, and (2) does the code produce results that are realistic and verifiable. With respect to conformance with the SRD, the staff considers that the code does not meet the SRD as defined in our acceptance of the SRD. For example, we specified that the code should provide the time history of doses, as well as peak dose, as output for the annual individual dose. In the submission of the code, the CNWRA notes that the code does not include a peak dose algorithm.

Further, our expectations based on the review of the SRD were that the code would provide output of intermediate results to allow testing of individual modules in the code. Our testing suggests that this expectation is only partially fulfilled.

More importantly, the 3/15 version of the code does not produce valid results. Although the code runs and produces results, the results are not valid because of several major flaws in the code.

Two examples of these flaws are described below:

1) Radionuclide solubility limits are parameters that are input into the TPA code via the input file (TPA.INP). Although these inputs appear in the TPA.INP file, and module output files attest that the input values were properly read, the input values were not being used in the release calculation. Although this error has been corrected by Center staff, continued testing by NRC staff is still unable to determine whether radionuclide releases are consistent with the water flow and the solubility limits for test simulations selected to impose a solubility-controlled release.

2) The calculated dose due to volcanism was found to be extremely low given the selected input parameters.

Subsequent examination by Center staff discovered an error in a conversion factor that resulted in an under-prediction of dose by eight orders of magnitude.
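Checks of this kind can be automated as part of post-run verification. The sketch below illustrates the solubility-bound test described in item 1; all names, units, and values are hypothetical illustrations, not the actual TPA 3.0 inputs or module interfaces.

```python
# Sketch of a post-run consistency check for a solubility-controlled
# release. Hypothetical names and units; not the actual TPA 3.0 I/O.

def max_solubility_release(solubility_g_per_m3: float,
                           water_flow_m3_per_yr: float) -> float:
    """Upper bound on the release rate (g/yr) when dissolution is
    solubility-controlled: the dissolved concentration cannot exceed
    the solubility limit, so release <= solubility * water flow."""
    return solubility_g_per_m3 * water_flow_m3_per_yr

def check_release(release_g_per_yr: float,
                  solubility_g_per_m3: float,
                  water_flow_m3_per_yr: float,
                  rel_tol: float = 1e-6) -> bool:
    """True if the reported release respects the solubility bound."""
    bound = max_solubility_release(solubility_g_per_m3, water_flow_m3_per_yr)
    return release_g_per_yr <= bound * (1.0 + rel_tol)

# A test vector chosen to impose solubility control:
# solubility 1e-3 g/m^3 and flow 500 m^3/yr give a bound of 0.5 g/yr.
assert check_release(0.4, 1e-3, 500.0)

# A release far above the bound would flag the "input read but never
# used in the release calculation" flaw immediately.
assert not check_release(5.0, 1e-3, 500.0)
```

A handful of such bound checks, run against module output files for deterministic test vectors, would confirm whether the corrected code now honors the solubility limits.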

Some of the original flaws have been corrected by the CNWRA; however, the fact that flaws of this magnitude occurred undermines staff confidence that other, unidentified flaws that could significantly affect the results do not exist.

Further, the nature of these flaws calls into question the adequacy of the verification testing the CNWRA performed under TOP-018, as noted in the delivery letter (RBaca to KMcConnell, dated March 14, 1997).

We believe that comprehensive testing of the TPA 3.0 code would have readily picked up these flaws.


Current Understanding of TPA 3.0 Code:

As noted previously, we believe that the TPA 3.0 code contains many positive attributes that, when fully verified, will result in a code that provides a significant improvement over the capabilities of the IPA Phase 2 code.

Further, we believe that the strenuous efforts the CNWRA has put into code development should be recognized.

During our evaluation, a number of improvements were identified, and changes to the code have been implemented or are being implemented by the CNWRA.

Some of these improvements are:

1) a straightforward input structure allowing for a broader range of users of the code;

2) inclusion of two alternative models for simulating reflux of groundwater during the thermal period of repository performance (for use in TEF KTI sensitivity analysis);

3) more explicit consideration of the saturated zone in the context of determining concentration of radionuclides for the critical group;

4) inclusion of effects of faulting on fluid flow in the unsaturated zone (for use in SDS KTI sensitivity analysis);

5) inclusion of a natural analog release rate model into EBSREL as an alternative conceptual model for release;

6) inclusion of more explicit treatment for determining the quantity of water that flows on and into waste containers; and

7) inclusion of inhalation and direct exposure pathways for locations of the critical group which are less than 20 kilometers (for use in IA KTI sensitivity analysis).

Specific Improvements and Testing Needed:

The staff believes that the following improvements and testing must be done before the TPA 3.0 code can be considered as acceptable under the commitments made in the CNWRA Operations Plan:

1) The approach for modeling rockfall in the SEISMO module appears to give unusually high values for damaged waste containers (e.g., 60 percent of all containers failed due to rockfall). The approach for this failure mode needs to be evaluated to determine if it is overly conservative and modified as appropriate.

2) Confidence in the results of the performance assessment requires careful examination of intermediate results (e.g., groundwater flow into the repository, release of radionuclides from the engineered barrier, etc.). Intermediate results need to be written to output files to allow the appropriate evaluations to be performed to determine that the code is functioning as intended.


3) A determination that a computer program is performing as intended is not only difficult to do, but it is equally difficult to determine when that goal has been achieved. Although the staff does not expect that an error-free code must be produced in order to be acceptable, there are some minimum tests that must be performed prior to acceptance. First, a series of calculations should be performed for a few deterministic simulations where the outputs of various modules and the transfer of information between modules can be checked for accuracy (e.g., releases from the engineered barrier are consistent with the flow into the repository and the release rates and solubility limits; dose to the critical group is consistent with the releases from the engineered barrier and dilution due to the saturated zone flow and extraction of water from the aquifer). Second, the effects of scenarios need to be evaluated with a few deterministic simulations (e.g., evaluate if container failures are consistent with the scenario inputs; evaluate if release of radionuclides and subsequent dose due to volcanism is consistent with the inputs). Finally, a limited evaluation of a multi-vector simulation needs to be performed (e.g., the results of a few vectors in a multi-vector simulation could be checked in a manner consistent with the deterministic simulations).

4) The format of the output results must be checked to ensure that it is consistent with the needs of the sensitivity analyses.
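The deterministic checks described in item 3 amount to simple chain-consistency assertions across module boundaries. The sketch below shows one such check for a single deterministic run; the quantity names, units, and values are hypothetical illustrations and do not come from the TPA 3.0 code.

```python
# Sketch of an inter-module consistency check for one deterministic
# simulation. Hypothetical quantities; actual TPA module interfaces
# may differ.

def check_dose_chain(ebs_release_ci_per_yr: float,
                     sz_dilution_factor: float,
                     dose_conv_rem_per_ci: float,
                     reported_dose_rem_per_yr: float,
                     rel_tol: float = 0.01) -> bool:
    """Verify that the dose to the critical group is consistent with
    the engineered-barrier release, dilution in the saturated zone,
    and the dose conversion factor, within a relative tolerance."""
    expected = (ebs_release_ci_per_yr / sz_dilution_factor
                * dose_conv_rem_per_ci)
    return abs(reported_dose_rem_per_yr - expected) <= rel_tol * expected

# A consistent chain: 2.0 Ci/yr released, diluted by a factor of 1000,
# 0.05 rem/Ci conversion -> expected dose 1.0e-4 rem/yr.
assert check_dose_chain(2.0, 1000.0, 0.05, 1.0e-4)

# A conversion-factor error of eight orders of magnitude, like the one
# found in the volcanism dose calculation, fails the check immediately.
assert not check_dose_chain(2.0, 1000.0, 0.05, 1.0e-12)
```

Running a few such assertions per module boundary, for both nominal and scenario-perturbed deterministic vectors, would provide the minimum verification coverage the staff describes above.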

In summary, although we recognize the significant improvement in computational capability that the Phase 3 code represents and acknowledge the difficult time schedule placed on the Center for completing this effort, the 3/15 version of the code does not allow us to perform planned sensitivity studies; it is, therefore, not acceptable. However, the code has significant potential, and the development to date has required a high level of dedication and effort by Center staff; this should be recognized in considering this non-acceptance. Further, if the noted improvements in the code are made and it is adequately tested so that sensitivity studies can be completed, programmatic impacts will be minimal and the significance of this non-acceptance will be largely diminished. We believe the staff and CNWRA are capable of accomplishing the programmatic goals of the Division of Waste Management, and we will be working with the CNWRA to effect an orderly approach and schedule to accomplish those goals.

If you have any questions, please do not hesitate to contact me.

CNWRA TICKET #: 970024

DISTRIBUTION: Central File, JAustin, NMSS r/f, DWM t/f, DWM r/f, PAHL rf/tf, BStiltenpole, ACNW, PUBLIC, PSobel, LSS, JLinehan, JGreeves

DOCUMENT NAME: S:\DWM\PAHL\KIM\CENTER3A.KIM

* SEE PREVIOUS CONCURRENCE

OFC:  PAHL*     PAHL*          PAHL*    PMDA*     DWM
NAME: TMcCartin KMcConnell/wd  JAustin  SFortuna  MFederline
DATE: 04/25/97  04/25/97       04/29/97 04/29/97

Official Record Copy
