ML20203C427

Forwards Draft ANSI Std for Review & Comment. Comments Requested to Be Provided by 980302
Person / Time
Issue date: 02/20/1998
From: Cool D
NRC OFFICE OF NUCLEAR MATERIAL SAFETY & SAFEGUARDS (NMSS)
To: Greeves J, Haughney C, Ten Eyck E
NRC OFFICE OF NUCLEAR MATERIAL SAFETY & SAFEGUARDS (NMSS)
References
NUDOCS 9802250186


Text


UNITED STATES
NUCLEAR REGULATORY COMMISSION
WASHINGTON, D.C. 20555-0001

February 20, 1998

MEMORANDUM TO: Those on Attached List

FROM: Donald A. Cool, Director (Original signed by)
Division of Industrial and Medical Nuclear Safety, NMSS

SUBJECT: REVIEW AND CONCURRENCE ON DRAFT ANSI STANDARDS

The following draft ANSI Standard is provided for your review and comment:

ANSI/HPS N13.33, Guide to Preparation of Environmental Radiation Surveillance and Monitoring Reports

This draft is provided to NRC for initial technical review and is not at this time being officially balloted. Comments on this standard should be provided by March 2, 1998, to Charleen Raddatz of my staff at 415-6947 (e-mail CTR). The documents have been provided to NMSS to give NRC and its experts the opportunity to review and evaluate them. For each comment provided, please identify the paragraph and line numbers of the corresponding text, the suggested revised language, and the reason for the recommendation.

cc: C. Jones, NMSS
    J. Piccone, NMSS
    L. Camper, NMSS

DISTRIBUTION: IMNS 6048, NMSS R/F, IMNS C/F, NRC File Center, CRaddatz R/F, PDR (YES)

DOCUMENT NAME: J:\CRADDATZ\HPS\ANSI_13.33 (*See previous concurrence)

OFFICE: IMNS        IMNS:DIR
NAME:   CRaddatz    DCool
DATE:   2/18/98     2/20/98




Attached List for Memorandum dated: February 20, 1998

SUBJECT: REVIEW AND CONCURRENCE ON DRAFT ANSI STANDARDS

John Greeves, Director
Division of Waste Management, NMSS

Elizabeth Q. Ten Eyck, Director
Division of Fuel Cycle Safety and Safeguards, NMSS

Charles Haughney, Director
Spent Fuel Project Office, NMSS

Joseph Murphy, Director
Division of Regulatory Applications, RES

Charles Miller, Chief
Emergency Preparedness and Radiation Protection Branch
Division of Reactor Program Management, NRR

Stuart A. Treby
Assistant General Counsel for Rulemaking and Fuel Cycle, OGC


Foreword

(This Foreword is not part of the American National Standard HPS N13.33.)

A review of current reporting practices reveals no consistent format or adoption of the recommendations made by Health Physics Society Committee Report HPSR-1 (1980), Upgrading Environmental Radiation Data. This American National Standard provides the minimum requirements for reporting the results of a radiological environmental monitoring program (REMP) with respect to content and data presentation. It is not meant to set requirements for research reports. The recommendations of this standard are based on the precept that the report is intended for the competent reader. By competent reader we mean an individual possessing the basic technical knowledge necessary to understand the sampling and radioanalytical techniques employed in the REMP and the mathematical background to understand the basic statistical techniques used to analyze the data generated by the analyses. Nevertheless, the standard should provide an adequate framework for reports to groups and individuals who may not have the same degree of technical knowledge. It is the responsibility of the writer to know the audience for whom the report is intended and generate the report accordingly.

Suggestions for improvement of this Standard will be welcome. Suggestions should be sent to the Health Physics Society, 1313 Dolley Madison Boulevard, Suite 402, McLean, Virginia, 22101.

The Health Physics Society Standards Committee Working Group responsible for this standard comprises the following:

Kjell A. Johansen, Chairperson (Wisconsin Electric Power Company)
Keith Earnshaw (Global Tek)
William A. Hill, Jr. (Pennsylvania Power & Light)
Susan T. Masih (University of Missouri-Kansas City, Chemical, Biological, & Radiation Safety)
Steven L. Simon (National Research Council, National Academy of Sciences)
Douglas Wahl (Philadelphia Electric Company)
Ralph G. Wallace (TVA)

DRAFT HPS N13.33 WORKING GROUP

AMERICAN NATIONAL STANDARD HPS N13.33

Guide to Preparation of Environmental Radiation Surveillance and Monitoring Reports

1 Introduction

The purpose of the radiological environmental monitoring program (REMP) report is to convey the scope and results of the monitoring program with sufficient clarity and completeness so that the competent reviewer can determine whether the program objectives have been accomplished. Three elements are required. First, the purpose or objective of the REMP must be stated clearly. Second, all of the relevant data need to be included. Finally, the data must be presented in a way such that the reader can evaluate the results with respect to the stated objectives.

The key factor in producing confidence in the reported results is the description of the treatment of the data generated by the monitoring program. Therefore, this standard stresses the need for adequate data treatment in the reporting of results. The recommendations for treating data and reporting results found in Chapters 6 and 7 of Upgrading Environmental Radiation Data, Health Physics Society Committee Report HPSR-1 (1980), are endorsed by this standard. Guidance for accomplishing the recommendations of HPSR-1 is provided in the Appendix to this standard.

The recommendations of this standard are based on the precept that the report is intended for the competent reader. As stated earlier, a competent reader is an individual possessing the basic technical knowledge necessary to understand the sampling and radioanalytical techniques employed in the REMP and the mathematical background to understand the basic statistical techniques used to analyze the data generated by the analyses. Nevertheless, this standard should provide an adequate framework for reports to other individuals or groups who may not be so knowledgeable. It is the responsibility of the writer to know the audience for whom the report is intended and generate the report accordingly.

In setting forth the expectations of this standard, the word "shall" denotes a requirement, the word "should" denotes a recommendation, and the word "may" denotes permission, neither a requirement nor a recommendation.

2 Purpose

2.1 This standard defines the minimum requirements for reporting the results of a radiological environmental monitoring program in order to produce greater uniformity among institutions in the methods used for reporting routine radiological surveillance data. It is not meant to set requirements for research reports.

2.2 Some facilities may issue a combined effluent monitoring and REMP report. Although this standard addresses only the REMP portion of such a report, issuers of a combined report may wish to adopt the recommendations of this standard for the effluent monitoring portion of the report.

2.3 Facility, as used in this standard, means any building or site at which the current or past use, storage, or generation of radionuclides has, or had, the potential to release radionuclides to the environment.

3 Report Structure

3.1 Contents

The main components of the Contents section should consist of the following:

(1) Table of Contents,
(2) List of Tables, and
(3) List of Figures.

The Table of Contents shall be divided, as appropriate, into the following sections:

(1) Executive Summary,
(2) Introduction,
(3) Purpose,
(4) Definitions,
(5) Facility Description,
(6) Sampling Requirements,
(7) Analytical Requirements,
(8) Results and Discussion,
(9) Conclusion,
(10) Bibliography, and
(11) Appendices.

This standard makes no recommendations with respect to pagination.

3.2 Executive Summary

The first text the reader encounters in the report shall be the Executive Summary. The Executive Summary should include an overview of the sampling program, a general summary of the results, and the conclusions drawn from the results. The conclusions shall state whether or not results are determined to be attributable to the operations of the facility being monitored. The Executive Summary should be approximately one page in length, but no more than two pages.

3.3 Introduction

The Introduction shall provide a brief description of the facility. The facility description should be no more than a short paragraph; a more detailed description should be presented in the Facility Description section of the REMP report (see Section 3.6 of this Standard). The Introduction also shall reference any licenses or permits governing the operation of the facility and the regulations and standards which apply to its monitoring program.

3.4 Purpose

The Purpose section shall state the reasons for conducting the monitoring program and for issuing the report. It is necessary that each reporting entity provide a proper statement of (1) the objectives of the monitoring program, and (2) the objectives of the report presenting the monitoring data. The reporting institution should attempt to specify the objectives precisely enough that a competent, independent reviewer could determine whether those objectives have been met.

Much of the disagreement among scientists as to the proper level of statistical rigor required in reporting data could be avoided by first clarifying the objectives of the sampling program and of the report. The report shall discuss the reasons for conducting the program, including specific objectives and limitations of the program.

The statement of objectives should not be limited to stating that the monitoring program is required by regulatory agencies. It should avoid ambiguous phrases such as "to characterize the environment," "to obtain representative samples," etc., unless the phrase is adequately supported by a precise definition. Also, sufficient detail should be provided so that the successful completion of the objectives can be demonstrated. Factors being quantified shall be stated in this section, e.g., "...to determine the distribution of concentrations of xxx radionuclides in the 10 km zone around the site in zzz type of biota...", or "...to establish the mean concentration in animal forage products to x% certainty within AA county...", etc.

The Purpose section also should present any regulations or commitments (Technical Specifications, Offsite Dose Calculation Manual, license, etc.) addressing the conduct of the program or the preparation and issuance of the report. It also may present the rationale for the selection of the specific media sampled and the locations from which the samples are taken.

3.5 Definitions

3.5.1 All terms used in the REMP report that may not be familiar to the reader shall be defined prior to use or shall be included in a Definitions section with the terms presented in alphabetical order. The following examples of entries in a "Definitions" section are not meant to be all-inclusive:

(1) Blank - Laboratory media such as water, filters, or chemicals which are subjected to the same radioanalytical procedures as the indicator samples in order to determine the degree to which, if any, laboratory procedures introduce a positive or negative bias in the analyses of environmental samples;
(2) Composite - The combining of several similar discrete items collected over a period of time or over a large area to form one sample;
(3) Control - A monitoring site located at a distance and/or direction away from the monitored source of radioactive materials such that the occurrence of radionuclides attributable to the source would not be expected;
(4) Indicator - A monitoring site where the effects from the production or use of radioactive materials would most likely be detected;
(5) Joint Frequency Distribution (or wind rose) - A method of presenting the percent of time that the wind is from each of the sixteen 22-1/2 degree sectors;
(6) LLD - Lower limit of detection (describe by a mathematical formula);
(7) MDA - Minimum detectable activity (describe by a mathematical formula); and
(8) Radiological Environmental Monitoring Program (REMP) - A program designed to assess the radiological conditions in the environs of a facility which possibly emits radionuclides, and the subsequent documentation for the public record.
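As an illustration of the kind of mathematical formula the LLD definition calls for (this example follows the form commonly used in NRC radiological effluent guidance, e.g., NUREG-1301; it is not part of this draft standard, and the symbols below are assumptions):

```latex
% Illustrative a priori LLD formula (common NRC/RETS form; not part of this draft)
\[
  \mathrm{LLD} \;=\; \frac{4.66\, s_b}{E \cdot V \cdot 2.22 \cdot Y \cdot e^{-\lambda\,\Delta t}}
\]
% s_b      : standard deviation of the background counting rate (cpm)
% E        : counting efficiency (counts per disintegration)
% V        : sample size (mass or volume)
% 2.22     : disintegrations per minute per picocurie
% Y        : fractional radiochemical yield, when applicable
% \lambda  : decay constant of the radionuclide
% \Delta t : elapsed time between sample collection and counting
\]
```

A REMP report adopting a different formulation (e.g., an MDA per HPS N13.30-1996) would substitute its own expression and define each symbol the same way.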

162 3.6 Facility Description The Facility Description section should:

164 3.6.1 Provide information where the faclity is located (township, county, state and country), as applicable. (Map, graph, andillustration guide!ines appearin Section 4 of 166 this standard.)

5

3.6.2 Describe the facility area in total hectares (acres), topography, and location of 168 streams,' rivers, estuaries, tidal rivers, lakes, bays, oceans or other major water sources, as applicable.

170 3.6.3 Descri'oe the geological formations and ground water movements of the site area, as applicable to the facility.

172 3.6.4 Describe the populations centers potentially affected by the facility.

3.6.5 Report the joint frequency distributions of the historical meteorological data, as 174 applicable to the facility. Wind direction should always be stated as the direction from which the wind is coming.
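The joint frequency distribution called for in 3.6.5 can be tabulated with a short script. This sketch uses the sector convention of Section 4.2 (sixteen 22-1/2 degree sectors, sector 1 centered on north, numbered clockwise); the speed classes and sample observations are illustrative assumptions, not part of the standard:

```python
# Illustrative sketch: tabulate a joint frequency distribution (wind rose data)
# from hourly wind observations. Sector 1 is centered on north; sectors are
# numbered clockwise in 22.5-degree steps, matching Section 4.2.
from collections import Counter

def sector(direction_deg: float) -> int:
    """Map a 'wind from' direction in degrees to a sector number 1..16."""
    return int(((direction_deg + 11.25) % 360) // 22.5) + 1

def joint_frequency(observations):
    """observations: iterable of (direction_deg, speed_m_per_s) pairs.
    Returns {(sector, speed_class): percent_of_time}."""
    def speed_class(v):
        # Hypothetical speed classes; a real REMP report would state its own.
        return ("calm" if v < 0.5 else
                "low" if v < 3.0 else
                "moderate" if v < 7.0 else "high")
    counts = Counter((sector(d), speed_class(v)) for d, v in observations)
    total = sum(counts.values())
    return {key: 100.0 * n / total for key, n in counts.items()}

if __name__ == "__main__":
    obs = [(350.0, 2.0), (10.0, 4.0), (95.0, 1.0), (180.0, 8.0)]  # hypothetical
    for (sec, cls), pct in sorted(joint_frequency(obs).items()):
        print(f"sector {sec:2d}  {cls:8s} {pct:5.1f}%")
```

Note that both 350° and 10° fall in sector 1, since the north sector spans 348.75° through 11.25°.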

3.7 Sampling Requirements

The Sampling Requirements section shall describe the REMP by providing the following:

3.7.1 All monitoring locations shall be identified on a map or series of maps, as required. (Map specifications are presented in Section 4.0.)

3.7.2 Maps of sampling locations should be supplemented by tables providing descriptions of the sampling locations, distance and direction from the site, sample type, and frequency of collection, as shown by example in Table 1.

3.7.3 Any samples that are composites shall be identified.

3.7.4 Any sampling irregularities that resulted from conditions such as weather, mechanical failure, or personnel error shall be documented. Depending upon the magnitude of the irregularity, a discussion of the impact upon the REMP may be warranted in the Discussion section of the report.

Table 1. Sample Collection and Analysis for the Radiological Environmental Monitoring Program for 19__

A. Surface Water

Location | Description                       | Distance & Direction | Collection Method & Frequency                                      | Analysis & Frequency
10F2     | Pumping Station (control), Map 3  | 7.25 miles E         | Two-gallon sample collected from a continuous water sampler, monthly | G. Beta (S&I) - monthly; Gamma Spec - monthly; Tritium - quarterly comp.
13B1     | Outlet Dam (indicator), Map 2     | 1.75 miles SE        | Same as 10F2                                                       | Same as 10F2
24S1     | Plant Intake (control), Map 1     | 0.20 miles SW        | Same as 10F2                                                       | Same as 10F2

B. Drinking (Potable) Water

Location | Description                            | Distance & Direction | Collection Method & Frequency                                                | Analysis & Frequency
13H2     | Small Town Water Works (indicator), Map 3 | 24.91 miles SE    | Two-gallon composite sample collected from a continuous water sampler, monthly | G. Beta (S&I) - monthly; Gamma Spec - monthly; Tritium - quarterly comp.
15F4     | Suburban Water Company (indicator), Map 3 | 8.62 miles SE     | Same as 13H2                                                                 | Same as 13H2

S&I = soluble and insoluble

3.7.5 The sampling methodology should be described in sufficient detail to distinguish between routine and non-routine sampling methods and among continuous, systematic, and grab sampling.

3.7.6 The exposure pathways that are being monitored shall be described. Within each pathway, the parameters important to establishing the monitoring locations should be described. The description may include parameters such as meteorology, population distribution, hydrology, land use characteristics, etc.

3.7.7 All sampling frequency terms should be clarified. Special sampling practices that are relied on in the event that normal sampling cannot be performed may have to be described. Figures may be used to describe complicated or unusual sampling techniques.

3.8 Analytical Requirements

3.8.1 The types of analyses (or measurement techniques), the detection limits (LLD, MDA, MDC), and the analysis frequency shall be stated in the Analytical Requirements section of the REMP report. (See HPSR-1, ANSI N42.23-1996, or HPS N13.30-1996 for definitions.) A table is useful for conveying this information. Differences between analysis requirements established by regulatory agencies and the analysis regimen adhered to by the program should be stated.

3.8.2 Terms used to express detection limits, and formulas including the LLD, etc., as applicable to the REMP, shall be defined in the report.

3.8.3 Differentiation shall be provided between routine and non-routine analyses performed under the program.

3.8.4 Information concerning radioanalytical methods and procedures, including separations, purifications, enrichments, sample quantities required, and counting techniques necessary to achieve the desired lower limit of detection should be described, as applicable, in the report. If analyses are conducted by a contracted laboratory which uses proprietary procedures, the use of such procedures should be noted in the description, i.e., "...iodine extracted from milk using procedure developed by Laboratory XYZ...". This analytical information may be provided in an appendix to the REMP report instead of the body, if so desired.

3.9 Results and Discussion

3.9.1 Results

If the data are not too extensive, the data values should appear in the Results section. Extensive data may appear in an appendix or in a separate, referenced document. Although this standard concurs with previous recommendations (NUREG-0475, HPSR-1, ANSI N42.23-1996, HPS N13.30-1996) that all results be reported as measured, whether positive, negative, or zero, even if below the a priori LLD, it is recognized that results also may be reported as less than a certain predetermined value under the conditions described in Section A.5.3 of the Appendix to this standard. However, only the "as measured" values should be used to calculate any descriptive statistical parameters such as means and standard deviations.

3.9.2 Discussion

The Discussion shall include the interpretation of the results, with the presentation of selected results to support the conclusion. The presentation and interpretation of environmental monitoring data may be accomplished by many methods that depend on facility operating status and the type and extent of historical monitoring records. Comparison of monitoring results with existing standards is of primary importance, but several other comparisons are also valuable. Selected data (such as a summary table of means and the highest and lowest results) should be included if the following comparisons and analysis techniques are appropriate for the given monitoring situation:

(1) Comparison of results with applicable regulatory standards or limits, investigation and action levels, or other compliance standards required by facility or site license conditions;
(2) Comparison of results with preoperational monitoring baseline information or monitoring information for previous reporting periods;
(3) Comparison of results with information obtained and reported earlier and evaluation of any short- or long-term trends;
(4) Presentation of summaries, interpretations, and analyses of results and trends for the reporting period of interest;
(5) Comparison of results between upwind/downwind, upstream/downstream, and upgradient/downgradient monitoring locations; and
(6) Discussion of any radionuclide movement or release in terms of the amount, pathway, and significance of impact to the public or environment.
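The "as measured" convention of 3.9.1 (descriptive statistics computed from every net result, including negative and zero values, even when individual table entries are censored as "< LLD") can be sketched as follows; the data values and units are hypothetical:

```python
# Illustrative sketch of 3.9.1: descriptive statistics are computed from
# "as measured" net results (including negative and zero values), even when
# individual results are *reported* as "< LLD" in the data tables.
import statistics

def summarize(as_measured, lld):
    """as_measured: net results (hypothetical units, e.g., pCi/L).
    Returns (mean, sample standard deviation, censored display values)."""
    mean = statistics.mean(as_measured)    # every value used as measured
    stdev = statistics.stdev(as_measured)
    # Table entries may be censored per Section A.5.3 of the Appendix,
    # but the censored values are never used for the statistics above.
    display = [f"< {lld}" if x < lld else f"{x}" for x in as_measured]
    return mean, stdev, display

if __name__ == "__main__":
    results = [-0.4, 0.0, 1.2, 2.7, 0.3]   # hypothetical net results
    mean, stdev, display = summarize(results, lld=1.0)
    print(f"mean = {mean:.2f}, s = {stdev:.2f}")
    print("table entries:", display)
```

Substituting the LLD for each censored value before averaging would bias the mean upward, which is exactly what the as-measured rule prevents.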

3.9.3 Additional Items

Besides items included in the above discussion, additional items, such as those mentioned below, should be documented:

(1) Description of relevant modifications to the monitoring program since the last reporting period (e.g., changes in sampling locations or sampling/analysis methods);
(2) Description of any significant results from audits, surveillances, and laboratory intercomparison studies during the reporting period;
(3) Description of any notable events and their impact (e.g., floods or fires);
(4) Status and evaluation of any investigations, corrective actions, or program modifications initiated and identified in previous reporting periods; and
(5) Identification of any scheduled sampling or monitoring not performed, with a description of the reasons for not performing the activity and steps taken to correct any problems and prevent a recurrence.

I (4) Status and evaluation of any investigations, corrective actions, or program 270 modifications initiated and identified in previous reporting periods; and (5) Identification of any scheduled sampling or monitoring not performed with a 272 description of the reasons for not performing the activity and steps taken to correct any problems and prevent a recurrence.

3.9.4 Units

The units chosen for reporting and discussing results will depend, in part, on how the results will be used. For example, results that can be compared with regulatory standards shall be reported in units consistent with the regulatory standards. This standard endorses the use of SI units or their derivatives whenever possible.
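Reporting a result in both conventional and SI units, as 3.9.4 and 3.12.2 encourage, reduces to applying a fixed conversion factor (1 Ci = 3.7e10 Bq by definition, so 1 pCi = 0.037 Bq); the function name and sample value here are hypothetical:

```python
# Illustrative sketch: dual-unit reporting per 3.9.4 / 3.12.2.
# The conversion factor is exact by definition (1 pCi = 0.037 Bq).
PCI_TO_BQ = 0.037  # becquerels per picocurie

def pci_per_liter_to_bq_per_liter(value_pci_l: float) -> float:
    """Convert a concentration from pCi/L to Bq/L."""
    return value_pci_l * PCI_TO_BQ

if __name__ == "__main__":
    result = 12.5  # hypothetical tritium result, pCi/L
    print(f"{result} pCi/L = {pci_per_liter_to_bq_per_liter(result):.4f} Bq/L")
```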

3.10 Conclusion

The Conclusion section should provide a concise summary of the Results and Discussion sections. The overall goals, objectives, and scope of the radiological environmental monitoring program should be re-stated. Explanations of how the goals and objectives have been met should be included in the conclusion. Any commitments and/or corrective actions to be implemented in subsequent monitoring and reporting periods which may impact future observations should be described.

3.11 Bibliography

3.11.1 The Bibliography section shall provide a listing of all outside sources of information mentioned in the report. This includes printed material, such as previous monitoring reports, journal articles, books, and reports, as well as oral communications which are mentioned in the report or from which data or other information is obtained in order to generate the report and interpret the results.

3.11.2 The bibliographic information shall be unambiguous. The correct edition or report number of serial publications shall be used. For example, references to NUREG-0837 (NRC TLD Direct Radiation Monitoring Network) shall include the volume, issue number, and pages. If only a chapter, a few pages, a table, or a figure from a report, book, or lengthy article is used in the report, indicate the portion used in the bibliographic information.

3.11.3 This standard makes no further recommendations with respect to bibliographic format.

3.12 First Appendix: Presentation of Radiological Environmental Monitoring Program Results

3.12.1 The data generated by the REMP shall appear in clearly formatted tables in the appendix. The data tables shall contain the following information:

(1) type of media and analysis performed;
(2) sample collection location (either a description or a clear reference to a description of the location in the text of the report);
(3) the date the sample was collected and the time period over which the sample was collected;
(4) the best estimates of the sample analyses determinations with the associated estimated errors;
(5) the factors included in the error analysis, which shall appear in the document and be referenced in this section; and
(6) any sampling irregularities, which shall be indicated in footnotes directly associated with the appropriate data table.

3.12.2 The units in which the results are reported should coincide with those required by the applicable regulatory body. If requirements placed on the report do not specify SI units, every effort should be made to include a conversion factor for converting reported results into SI units.

3.13 Second Appendix: Quality Assurance Program Description

3.13.1 Laboratory calibration/precision

A determination that the reporting laboratory's procedures are consistent over time and are accurate (i.e., in agreement with known values) is imperative as a first step to reporting measurement uncertainty.

Credibility of a radiation monitoring and measurement program can be established by a comprehensive inter-laboratory comparison program which is repeated at appropriate intervals of time. Reporting the results of participation in such comparison programs is required to establish the capability of the laboratory for conducting precise and accurate measurements, and makes excessive effort to assess systematic error unnecessary. Statements of uncertainty of individual measurements cannot be appropriately judged without some form of validation external to the laboratory that is conducting the monitoring program.

A brief summary of the laboratory quality assurance program results for the time period of the report should be presented. Results from previous periods may be necessary if there is a trend of decreasing quality. Also, the results from participation in past special intercomparison studies should be referenced as applicable. Any factors used to account for systematic biases of measurement procedures should be explained. Internal auditing procedures should be documented. ANSI N42.23-1996, Measurement and Associated Instrument Quality Assurance for Radioassay Laboratories, provides comprehensive guidance for developing a QA program.

Results from the intercomparison program may be presented in a tabular format such as that shown below.

Example Table for Intercomparison Program Analytical Results

Type of Analysis | Date | Known Value (± uncertainty) | Laboratory Reported Results (± measurement error)

3.13.2 Other useful information

Because many results of environmental measurements are expected to be near "zero," results from the analyses of "blanks" should also be reported in order to document any inherent positive or negative bias which may skew the results. Reporting of the results from the analyses of blanks should be presented in a similar manner to that of the interlaboratory comparison results.

4.0 Maps, Graphs, and Illustrations

4.1 Maps shall provide sufficient detail to show the location of major structures, boundaries, roads, rivers, bodies of water, and major communities in relation to control and indicator sampling sites. Other items that can be added, as appropriate, include airports, recreation areas, and secondary roads. The map shall clearly differentiate between indicator and control sites.

4.2 Maps should be designed using sixteen 22-1/2 degree sectors, as applicable. If the sectors are numbered, sector number one is the North sector and the other sectors are numbered in clockwise order.

4.3 Each sampling location shall have a unique designation. To the extent possible, use unique symbols on the maps to represent the media sampled or the type of monitoring being conducted, to provide a quick, concise view of the program.

4.4 If a graph contains more than one dependent variable, each dependent variable shall be uniquely identified.

4.5 Maps, graphs, and illustrations that are in color format should be reproducible in black and white format with no loss of information.

APPENDIX A guide to the principles of analyzing and reporting data 364 A.1 Purpose The purpose of this appendix is to provide a guide to the principles useful for clearly 366 reporting environmental radiological data. The application of these guidelines should 11

n produce greater uniformity among institutions in the methods used for summarizing and 368 reporting routine surveillance data. A limited number of pertinent literature references are included to aid those individuals responsible for organizing, analyzing, interpreting 370 and reporting environmental radiation data.

The philosophy of this section is that statistical innovation or complex modeling is not 372 needed for the purpose of summarizing and reporting routine environmental radiation  !

monitoring data, in contrast, research programs often do require complex analyses. i 374 Therefor, research programs should be understood to be separate initiatives with quite separate objectives and methodologies and should not be restricted to the procedures 376 and methods noted here.

A.2 Normative references
No American National Standard currently is in use for reporting routine environmental radiation data. However, the Department of Energy has published orders and guidance documents (see DOE Order 5400.1 and DOE/EH-0173T). A report issued by the Health Physics Society in 1980 (HPSR-1, 1980) was reviewed to determine its usefulness in providing guidance and methodology. This standard supports in full the methodology provided in Chapters 6 and 7 of that report. The main points of those chapters are reiterated here. Some additional considerations are also discussed in this Appendix.

A.3 What to report
A.3.1 Types of data
Which types of data (individual measurements, summary statistics, etc.) to report is a function of the objectives of the report being issued and will be specific to the program objectives of the reporting institution. In general, both summary statistics and detailed measurement data may be reported.

All environmental radioactivity data collected by publicly supported institutions should be published; however, not all reports must contain all data. The reporting institution should determine in which reports large amounts of data are to be made available. In the case where very large data sets are collected, only summary statistics may be required in some reports. In such a case, publication of individual data values may be reserved for reports that are available only by specific request.

Depending on the intended degree of statistical summary, various levels of mathematical manipulation of the data may be needed.

A.3.1.1 Individual data values
The publication of individual data values requires that the total estimated uncertainty of each value be reported. Section A.5 discusses procedures for determining this uncertainty. In many cases, the relative error of measurements of the same type (within the same monitoring program) will be the same. This will most likely be the case when counting times are determined by statistical precision requirements rather than elapsed time, and when sample preparation procedures are identical among a group of samples.

A.3.1.2 Summary statistics
The preparation of summary statistics sometimes connotes simple mathematical manipulations but may, in fact, require careful examination of the type of sampling program conducted and the degree to which random, systematic, or haphazard sampling was carried out. This Appendix is not intended to elucidate the details of all types of sampling programs and the statistical estimators that are valid for each, though some details are provided below in Sec. A.3.2.

A.3.2 Additional information to be reported
A.3.2.1 Type of sampling program
The type of sampling program should be determined and reported. If a data summary is part of the report objectives, the summary statistical estimators appropriate to the type of sampling plan should be used.

See Gilbert (1987), Chapters 4-9, for a discussion of sampling plans and statistical estimators.

A.3.2.2 Measurement conditions and detection limits
Special conditions of the measurement process which are relevant to interpretation of the data should be presented. For example, special conditions might include specifying whether measurements (in particular, in-situ measurements) were conducted in a low- or high-background area or if instrument performance was unusual for any reason.

A quantitative measure of the level at which the laboratory can detect the specific radioactivity of interest should also be reported. Various measures of detection limits have been discussed in the literature (see ANSI N13.30 and ANSI N42.23). However, reports issued to regulatory agencies and the public should include information from which it is possible to compare the amount of activity detected relative to the minimum amount detectable by the reporting laboratory at a specified confidence level (for the measurement process used). The precision of radioactivity measurement will depend on a number of factors. However, if the measurement process is specified (e.g., sample size and counting time), the amount of activity which can be detected with a specified level of confidence can be estimated. The background radiation of the laboratory is one important factor which determines that level.

Neither the LLD (lower limit of detection) nor the MDC (minimum detectable concentration), however, gives complete information about the capability of the detection process. The ability to detect true counts above background depends on the total radioactivity present in the sample as well as other factors. Both the LLD and the MDC are of interest to the reporting institution but are not necessarily appropriate to be reported. Rather, a value should be reported which indicates the amount of activity which can be detected at a given confidence level by the specific measurement process used.

A.3.2.3 Specific sample information
Specific information about the condition of the samples that is relevant to interpretation of the data should be presented. Examples include whether samples were measured wet or dry, whether liquid samples were filtered, etc. Supporting rationale for each measurement condition, particularly those which may be unusual, is also required.

A.4 Units of measurement
A.4.1 Primary units
The International System of Units (commonly called "SI units") or those units of measurement which are derived from the basic SI units should be used.

The use of SI units as the main reporting units would bring about consistency between laboratory and institutional reports and refereed publications (e.g., Health Physics, the journal of the Health Physics Society), as well as with laboratories worldwide. Moreover, SI units were recommended by the NCRP in 1985 (NCRP 1985) for gradual adoption by the radiation protection community in the U.S., with complete adoption suggested by December 1989. Although considerable disagreement over the use of SI units still exists, this standard adopts SI units as the units of choice. To the degree that supporting agencies require the submission of reports with other units, suitable conversions may be provided or alternate values enclosed in parentheses.

The conversions between SI and conventional units and definitions of those "derived units" (i.e., compounds of the fundamental units) useful in radiation protection are available in the literature (NCRP 1985; Taylor 1995).
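As an illustration (not part of the standard), the conversions for the non-SI radiation units named in A.4.2 can be tabulated directly; the function and dictionary names below are hypothetical, but the factors are the exact definitions given in NCRP Report No. 82:

```python
# Sketch: exact conversion factors from conventional to SI radiation units.
CONVENTIONAL_TO_SI = {
    "Ci":  ("Bq",   3.7e10),    # 1 curie = 3.7e10 becquerel (exact)
    "rad": ("Gy",   0.01),      # 1 rad = 0.01 gray (exact)
    "rem": ("Sv",   0.01),      # 1 rem = 0.01 sievert (exact)
    "R":   ("C/kg", 2.58e-4),   # 1 roentgen = 2.58e-4 coulomb/kg (exact)
}

def to_si(value, unit):
    """Return (value, unit) converted from a conventional unit to SI."""
    si_unit, factor = CONVENTIONAL_TO_SI[unit]
    return value * factor, si_unit
```

A report could then carry the SI value in the text with the conventional value in parentheses, as A.4.1 permits.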

A.4.2 Special units
HPSR-1 (1980) supports the continued use of familiar units that are multiples of basic SI units (e.g., minute (min), hour (h), day (d), and liter (L)). This standard does not disagree with the continued use of these units. However, it repeats the advice of HPSR-1 (1980) against using non-SI units for radiation quantities (e.g., curie, roentgen, rad, and rem). It is reiterated here that SI units or their derivatives should now be used for reporting environmental radiation data.

A.4.3 Examples of units
Similar to that presented in HPSR-1 (1980), the following summary gives the units most likely applicable to reporting environmental radiation surveillance data:

(1) Air concentrations (radioactivity, trace elements, etc.): Bq m⁻³
(2) Liquid concentrations (radioactivity, trace elements, etc.): Bq L⁻¹
(3) Solid concentrations (radioactivity, trace elements, etc.): Bq kg⁻¹
(4) Exposure rate: C kg⁻¹ s⁻¹ or C kg⁻¹ h⁻¹
(5) Absorbed dose: Gy
(6) Absorbed dose rate: Gy s⁻¹, Gy h⁻¹, or Gy y⁻¹
(7) Equivalent dose: Sv
(8) Equivalent dose rate: Sv y⁻¹

A.4.4 Notation
Scientific notation (i.e., 6.02 × 10²³) or exponential notation (6.02E+23) is recommended. If exponential notation is used and the exponent is less than ten, precede the exponent by a zero, i.e., 6.02E+03.

A.4.5 Significant digits
The precise number of significant digits to be used for reporting data cannot be standardized; the reporting institution must first determine the precision or total uncertainty of the measurements which it is reporting. In general, however, and as stated in HPSR-1 (1980), environmental radiation data will only require the use of 2 or 3 significant digits. The number of significant digits used should be supported in the text by clear statements relating to measurement precision and/or total uncertainty.

This standard supports the views of HPSR-1 (1980). In intermediate calculations, at least one more digit than that dictated by experimental precision should be maintained (also see Bevington 1969; Taylor 1982), because in computations one significant figure is sometimes lost. After the final quantity is derived (e.g., concentration), round to a number of digits appropriate to the precision of the measurement process.
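The rounding and notation conventions of A.4.4 and A.4.5 can be sketched as follows (function names are illustrative only; note that Python's exponential format already zero-pads exponents below ten):

```python
from math import floor, log10

def round_sig(x, digits=2):
    """Round x to the given number of significant digits (A.4.5 suggests 2 or 3)."""
    if x == 0:
        return 0.0
    return round(x, digits - 1 - floor(log10(abs(x))))

def fmt_exp(x, digits=2):
    """Exponential notation per A.4.4, e.g. 6.02E+03 for an exponent below ten."""
    return f"{x:.{digits}E}"
```

The final rounding is applied only to the derived quantity, with at least one extra digit carried through intermediate calculations as recommended above.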

A.4.6 Reporting style for units
This appendix does not support a single style for reporting units with exponents; however, a single, consistent usage by each reporting institution is necessary to avoid confusion. The report should use a style which is not ambiguous. The report should provide an explanatory note about the reporting style prior to any data presentation.

Styles for units include, but are not limited to, those shown in the following examples:

(1) Bq/m³
(2) Bq m⁻³

A.5 Deriving and reporting precision of individual measurements
A.5.1 Introduction
This section is not intended to teach the theory of error propagation. However, reporting institutions should derive in detail the error bounds of individual measurements. Such a process requires some background in the theory of statistics.

The reporting institution is responsible for reporting and substantiating the uncertainty of each measured value. The basic method of presentation is the best estimate bounded by the range of uncertainty: x ± δx. The value of δx will depend on many factors, including the confidence level that the reporting institution has set. This appendix does not require that measurements be reported to a specific confidence level, but rather that the reporting institution specify the factors used in developing its error statement so that a proper interpretation can be made without ambiguity.

Crucial to the process of determining measurement uncertainty is formulating an outline of the sequential steps of the measurement process which lead to the final results of each measurement, and specifying, either objectively or subjectively if necessary, the probability distribution associated with each step. That process is the same concept as discussed in HPSR-1 (1980) as a three-step process. From the outline of sequential steps and their associated uncertainties, the total uncertainty of the measurement may be determined analytically. The total uncertainty is the uncertainty of a single measurement estimated from the mathematical combination of variables (and their sequence), e.g., chemical yield, counting efficiency, etc., which are used to derive the calculation's endpoint (e.g., concentration).

Measurement precision, as well as the concept of accuracy, has been discussed at length in the literature (see, for example, Bevington 1969; Natrella 1963; Taylor 1982). The basic concept of precision is an assessment of random variations. This standard supports the current practice of stating the uncertainty of individual radiological measurements; however, such a practice can only be useful after establishing laboratory consistency and accuracy.

The generalized error propagation formula is often cited (e.g., Bevington 1969), from which the standard deviation of the final measurement can be theoretically derived. This formula is a first-order Taylor expansion of the functional form of x about the mean of x:

    σ_x² ≈ σ_u²(∂x/∂u)² + σ_v²(∂x/∂v)² + 2σ_uv²(∂x/∂u)(∂x/∂v)

The above formula can be used to determine the uncertainty (e.g., variance or standard deviation) of individual steps, or of the total formula for the reported quantity if it includes mathematical representations of all the physical steps of the measurement process.

The determination of measurement precision can be laborious for complex measurements, e.g., radiochemical techniques. For other measurement procedures, e.g., gamma spectrometry of radionuclides in liquid samples with single gamma emission energies, the random error components are relatively easy to quantify. Most quantities derived from measurements can be reduced to a combination of simple mathematical operations which represent those steps (e.g., dilution, chemical yield, etc.). Error propagation formulae for individual operations have been published (e.g., Ku 1966; Ku 1969; Bevington 1969; HPSR-1 1980; Taylor and Kuyatt 1994). In some cases, these can be properly combined to determine the overall uncertainty. However, derivation of total error requires thoughtful consideration and experience. It is recommended here that error formulations be derived by experienced personnel. In such cases, derivation from the Taylor expansion would generally be understood and can be carried out.

The error propagation formulae for the basic operations of addition, subtraction, multiplication, and division are provided below. It is important to note that these formulae assume independent errors. If this is not the case, it is the responsibility of the reporting institution to use more complex formulae to account for correlations and to substantiate the use or neglect of correlations.

Uncertainty in sums and differences: if

    S = (a ± δa) + (b ± δb) + ... + (m ± δm) − (n ± δn) − (o ± δo) − ... − (z ± δz)

and the errors in the variables a through z are known to be independent and random, then

    δS = [(δa)² + (δb)² + ... + (δm)² + (δn)² + (δo)² + ... + (δz)²]^(1/2)

Uncertainty in products and quotients: if

    Q = [(a ± δa)(b ± δb) ... (m ± δm)] / [(n ± δn)(o ± δo) ... (z ± δz)]

and the errors in the variables a through z are known to be independent and random, then

    δQ/Q = [(δa/a)² + (δb/b)² + ... + (δm/m)² + (δn/n)² + (δo/o)² + ... + (δz/z)²]^(1/2)

A.5.2 Counting error versus total error
"Counting error" refers to the statistical variation of the decay and counting process. Radioactive emissions are a binomial process which can be mathematically simplified and represented by Poisson statistics.

When the total counts are greater than a few dozen, Poisson statistics may be replaced by Gaussian statistics as a simplification; the mathematics is also more tractable. Counting statistics are often quoted to represent the possible variation of repeated measurements of the same sample and are useful for bounding the estimated true emission rate.

However, counting error represents the total error associated with a measurement only when counts or a count rate is reported. Most environmental radioactivity measurements require other information to convert the count rate to a more useful quantity. In general, it is quite difficult to determine a priori the extent to which counting error contributes to the total error. This fact alone justifies the necessity of estimating total error. For these reasons, this appendix agrees with other publications (e.g., ANSI N42.23-1996; HPSR-1 1980) on the necessity of reporting total estimated random error, not just counting error. The only exception to this rule is when the reporting agency can show that the amount of error contributed by all additional sources of variation is inconsequential.
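The quadrature combination of A.5.1, and the point of A.5.2 that counting error is only one term in the total error, can be sketched numerically. The measurement model below (net counts divided by counting time, efficiency, yield, etc.) is a hypothetical illustration, not a model prescribed by this standard:

```python
from math import sqrt

def quadrature(*terms):
    """Combine independent error terms in quadrature (formulae of A.5.1)."""
    return sqrt(sum(t * t for t in terms))

def total_relative_error(net_counts, other_relative_errors):
    """Relative total error of a quantity formed by products/quotients of
    net counts with other factors (efficiency, yield, volume, ...).
    Counting error follows Poisson statistics: sigma_N = sqrt(N)."""
    rel_counting = sqrt(net_counts) / net_counts
    return quadrature(rel_counting, *other_relative_errors)
```

For 400 net counts (5% counting error) combined with 3% and 4% relative errors in, say, efficiency and yield, the total relative error is about 7%, appreciably larger than the counting error alone.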



Total error most directly applies to measurements of individual samples. Summary statistics (of multiple samples) also have an associated "error term," but this actually refers to the variation of samples within a homogeneous (or otherwise defined) group. Error terms for grouped measurements, often called "sampling errors," are discussed in Section A.6.

A.5.3 Reporting of measurements near zero and negative values
The question frequently arises about how to report data values which are indistinguishable from zero or, in some cases, negative. In deciding how to report these values, consideration must be given to avoiding a bias in the average level of radioactivity. Assigning any value to negative or zero values (e.g., assigning the LLD value) introduces a bias because of the artificiality of the assigned number.

Most discussions of this topic (NUREG-0475, ANSI N42.23-1996, and HPS N13.30-1996, as well as HPSR-1 (1980)) recommend reporting the value determined by laboratory analysis regardless of its value. Reasons for not reporting negative values usually relate to the fact that the value is obviously an artifact of the calculation. However, assigning a value, e.g., zero or the LLD, is also artificial and introduces a bias as well.

The above-mentioned documents recommend against reporting values as "less than LLD" or "less than MDC." The reason provided is that these calculated limits are only a priori estimates of the detection process. However, if the reporting institution determines the level of activity which can be detected with 90% probability (or any specified confidence level), this standard suggests that the predetermined limit of activity detection can be used as a replacement for the numerical value in data tables (only). This advice is contrary to that provided in HPSR-1 (1980). However, if this is done, two conditions must be met. First, the detection limit must be clearly stated, the probability level given, and the measurements must have been made under the same conditions (sample size, counting time, etc.). Second, the original, measured value must be reported in the Appendix.

In summary, individual data may be reported as "non-detectable" or "below the LLD," etc., if sufficiently detailed information is provided regarding the detection limit. However, the actual measured value (not the LLD) should be retained and used in calculating summary statistics.
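The A.5.3 convention can be sketched as follows (the function names are hypothetical): a value below the predetermined detection limit is displayed as a "less than" entry in the table, while the measured value itself, even if negative, is retained for summary statistics.

```python
def table_entry(measured, detection_limit):
    """Display string for a data table; the limit and its confidence level
    must be stated alongside the table, per A.5.3."""
    if measured < detection_limit:
        return f"<{detection_limit}"
    return str(measured)

def mean_for_summary(measured_values):
    """Summary statistics use the actual measured values, never substituted
    limits, to avoid biasing the average level of radioactivity."""
    return sum(measured_values) / len(measured_values)
```

Substituting zero or the LLD for the negative value before averaging would introduce exactly the bias the section warns against.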

The difficulty in determining whether or not to report negative values can be alleviated by a clear and detailed statement of objectives (see Section 3.4 of this standard). If the purpose and level of detail of the monitoring report are specified clearly enough, ad hoc decisions, e.g., how to report negative values, will not be needed.

A.5.4 Searching for outliers
Determination of outliers is a quality control process, and it is beyond the scope of this appendix to discuss it in detail. It is the responsibility of the reporting institution to maintain quality control programs of various types, including inter-laboratory comparisons, preparing "acid blanks," counting "acid blanks" or "sample blanks," checking data calculations, checking data input, etc., and finally, searching for data values (after verification of laboratory procedures) which are unusual with respect to the overall group.

The definition of "outlier" can depend on program objectives. There are methods available, including quality control charts, probability plotting, etc., to search for unusual values. Searching for outliers is a requisite part of analysis before reporting of data.

Any rejected values should be noted in the text of the report. (Also see the discussion in DOE/EH-0173T.)

A.5.5 Reporting of uncertainties of final results
The reporting institution should use the various principles discussed here to report the estimated total uncertainty for each reported value. The reported total uncertainty will contain the effects of counting statistics as well as the uncertainties due to other steps in sample preparation and measurement. The reporting institution should provide an explanation of the methods and assumptions used to derive the total uncertainty, particularly if subjective uncertainty components are included in the total uncertainty estimate.

A.6 Deriving and reporting statistical summaries of measurements
Summary statistics may be useful to summarize data from large data sets, within groups which are homogeneous by definition, or in groups in which the samples are defined to be similar (e.g., plant species, soil types, locations within specified boundaries, etc.). Generally, statistical summaries are not produced for sets of data values which do not have a common and well-defined characteristic.

The most common type of statistical summary is a measure of the central tendency of the group and the degree of variation of the data. When statistical summary information is provided, the reporting institution must take care to mathematically define the variance estimator. However, the report should also define, in words, the concept of variation being presented. Whereas the uncertainty of a single measurement specifies the bounds in which the true value is expected to lie, and which cannot be determined perfectly because of certain instrumental limitations inherent in the measurement, the uncertainty of the mean value reflects the uncertainties of individual samples as well as the sample size. If measurement uncertainty is small, the uncertainty of the mean describes natural environmental variability. The meaning of the variance estimate should be made clear.

Various measures of central tendency are useful and widely familiar, e.g., mean, median, etc. However, measures of variation are numerous and less well understood. The reporting institution should take care to define the measure, assign the proper units, and report whether it is a squared estimate (e.g., variance) or not. Another important point to be specified is the confidence level of the estimate of variation. This standard does not require variation to be reported as 1σ, 2σ, etc. It does require that the reporting facility clarify its reported values and justify them according to the program's objectives.

Some consideration should be given to the effect of measurement uncertainties on summary statistics, e.g., the arithmetic mean (x̄) and the variance of the mean, s²(x̄). As described in Section 3.12, inter-laboratory comparisons can be used to determine the amount of systematic error, or bias. The logical situation to strive for is no systematic error (bias). In this case, measurement uncertainties should be random with zero mean, and the reported value of x̄ can be regarded as an unbiased estimate of μ, the true population mean. Also, if the measurement uncertainties are uncorrelated and if the fraction of the population sampled is approximately zero (commonly the case with environmental samples), the variance estimate is also unbiased. However, when the errors are positively correlated, s²(x̄) will underestimate the true variance, and x̄ will appear more precise than it is (Gilbert 1987; Cressie 1991). This may be the case, for example, with samples obtained within a very limited geographic area. See Section A.6.2 for further discussion of how spatial correlations can contribute to this effect.


A.6.1 Distributional assumptions
Summary statistics of measurements for different sample types, geographic regions, etc., are generally reported in addition to, or in replacement of, tables of individual data values. Thus, the reporting institution should give consideration to using moment estimators which properly describe the correct statistical distribution of the data. Often the arithmetic mean (x̄) and arithmetic variance (s²) are used. However, it is sometimes not made clear whether the distribution of values is normal or whether those moment estimates were used only because of convention. The reporting institution has the responsibility to examine the data and report summary statistics which meaningfully reflect the true central tendency and spread of the values. For example, lognormal distributions are quite commonly used to describe environmental concentrations. If a distribution of values is particularly skewed, it is all the more important to properly describe that distribution through proper statistics (e.g., medians, geometric means, geometric standard deviations, proper confidence limits, etc.).

Gilbert (1987), Table 12.2, should be consulted for proper formulae. Qualitative analyses, e.g., probability plotting, will be useful for determining normality or lognormality, though more formal methods exist to test the "goodness of fit" of data to theoretical distribution types. An example is the W test (Hahn and Shapiro 1967; Gilbert 1987) to determine lognormality for data sets of n ≤ 50. Other tests include the Kolmogorov-Smirnov and Lilliefors tests discussed by Conover (1980) and Gilbert (1987). These tests are not required for the purposes of this standard but may be useful in some instances for analyzing the data.

Measured concentrations of radioactive contamination or natural trace elements in the 704 environmer.t represent single values determined from a spatial continuum. These concentrations are related to nearby values as a result of the process which deposited 706- them. Spatial correlation has important consequences to estimating the local (or regional) spatial variance. In turn, the accurate evaluation of spatial variance has 708 implications for correctly stating confidence limits and can limit the capability for assigning probabilities associated with different levels of risk. Failure to realize the 710 presence of positive correlation in the data leads to a confidence interval that is too narrow as shown in the following example.

712 If "n" sample measurements are independent and identically distributed for a Gaussian distribution with unknown mean p and known variance c', the classical estimator of the 714- population mean is:

F=1"Tx e il 7 19 I

L _ _ _ . _ . _ . _ _ _ - _ _ _ _ ___-__ - . - . - - -

~

716 and the two-sided 95% confidence interval for p is

  • F 1.96 o/4 . 4 718 However, Cressie (1991) shows that for measurement data correlated in space with correlation p, the variance is 720 Var ( F ) =oo' { 1 + 2(p/(1 p))(1 1/n) - 2(p/(1-p))' (1-p")/n]

and gives the example for n=10, p=0.26, Var (Y)={1.608) co'/10, the two sided 722 confidence interval for p is I (2.485)oo / Ji0 .

Therefore, for the purposes of comparing current measurements with guidelines, standards, or historical values, recognizing the spatial correlation of measurements may be important. Gilbert (1987, Chapters 4, 8, 16, 17) and Cressie (1991) may be consulted for methods to determine the existence of, and the degree of, spatial correlation. Cressie (1991) shows how to compute the number of samples (n) required to reach the same precision, if spatial correlation exists, as compared to the same number of independent samples. A complete analysis of spatial correlation may be beyond the needs of many routine reporting situations. It is the responsibility of the reporting institution to determine whether the program's objectives require such analysis.
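The Cressie (1991) variance formula quoted in A.6.2 can be checked numerically; the sketch below (function name hypothetical) reproduces the worked example for n = 10 and ρ = 0.26:

```python
from math import sqrt

def variance_inflation(n, rho):
    """Factor by which Var(mean) exceeds sigma0^2/n for spatially
    correlated data (Cressie 1991, as quoted in A.6.2)."""
    r = rho / (1.0 - rho)
    return 1.0 + 2.0 * r * (1.0 - 1.0 / n) - 2.0 * r * r * (1.0 - rho ** n) / n

factor = variance_inflation(10, 0.26)   # ~1.608, matching the example
halfwidth_coeff = 1.96 * sqrt(factor)   # ~2.485, versus 1.96 under independence
```

Ignoring the correlation would thus understate the 95% confidence half-width by roughly 20% in this example.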

A.6.3 Weighted averages
The need to use weighted averages depends on the goals and methods of the sampling programs. The standard does not require the use of such techniques; however, in some cases, the reporting institution may have multiple measurements of the same sample or of aliquots removed from the same samples. If a weighted average is used, the reporting institution should justify the calculation. In general, a weighted average is computed as

    x̄_w = [Σ(i=1..N) w_i x_i] / [Σ(i=1..N) w_i]

where the weights w_i are the inverse squares of the total uncertainties of each x_i:

    w_i = 1/σ_i²

A.6.4 Summary statistics for specific sampling plans
This standard does not recommend any particular sampling plan, as this should be specific to the objectives of the monitoring program. However, haphazard sampling should not be conducted, and the type of sampling plan implemented should be reported. Furthermore, if summary statistics are reported, the reporting institution should state the definition of the groups (e.g., locations, types of plants, time periods, etc.) which are statistically summarized.

Formulae for means and standard deviations for groups of samples obtained in a variety of sampling plans can be found in the literature. Further detail should be sought from the literature; in particular, see Gilbert (1987).
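The weighted-average formula of A.6.3 can be sketched as follows, assuming independent measurements with known total uncertainties (the function names are illustrative); the second function gives the standard propagated uncertainty of the weighted mean, 1/√(Σw):

```python
from math import sqrt

def weighted_mean(values, sigmas):
    """A.6.3 weighted average with weights w_i = 1 / sigma_i^2."""
    weights = [1.0 / (s * s) for s in sigmas]
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

def weighted_mean_sigma(sigmas):
    """Uncertainty of the weighted mean, assuming independent errors."""
    return 1.0 / sqrt(sum(1.0 / (s * s) for s in sigmas))
```

Note how the more precise measurement dominates: averaging 10.0 ± 1.0 with 14.0 ± 2.0 gives 10.8, much closer to the better-known value than the unweighted mean of 12.0.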



REFERENCES

ANSI N42.23-1996, American National Standard, Measurement and Associated Instrumentation Quality Assurance for Radioassay Laboratories.

Bevington, P. R. Data Reduction and Error Analysis for the Physical Sciences. New York, McGraw-Hill Book Company. 1969.

Conover, W. J. Practical Nonparametric Statistics. New York, John Wiley & Sons. 1980.

Cressie, N. Statistics for Spatial Data. New York, John Wiley & Sons, Inc. 1991.

DOE Order 5400.1, General Environmental Protection Programs.

DOE/EH-0173T, Environmental Regulatory Guide for Radiological Effluent Monitoring and Environmental Surveillance. 1991.

Gilbert, R. O. Statistical Methods for Environmental Pollution Monitoring. Van Nostrand Reinhold Co. 1987.

Hahn, G. J. and Shapiro, S. S. Statistical Models in Engineering. New York, Wiley. 1967.


HPS N13.30-1996, Performance Criteria for Radiobioassay.

HPSR-1. Upgrading Environmental Radiation Data. Watson, J. E. (chairman). EPA 520/1-80-012. 1980.

Ku, H. H. Notes on the Use of Propagation of Error Formulas. J. Research NBS 70C(4):263-273. 1966.

Ku, H. H., Ed. Precision Measurement and Calibration: Statistical Concepts and Procedures. Washington, DC, National Bureau of Standards. 1969.

Natrella, M. G. Experimental Statistics. Washington, DC, National Bureau of Standards. 1963.

NCRP. SI Units in Radiation Protection and Measurements. National Council on Radiation Protection and Measurements. NCRP Report No. 82. 1985.

Operation of Nuclear facilities Task Group Report, September 1978.

780 Taylor, B. N., NIST Special Publication 811, 1995 ed., Guide for the Use of the

, Intemational System of Units (SI).

i 782 Taylor, b. N. and C. E. Kuyatt, (1994) NIST Technical Note 1297,1994 ed., Guidelines for Evaluating and Expressing the Uncertainty of MST Measurement Results.

784 Taylor, J, R. An introduction to Error Analysis. Oxford, Oxford University Press.1982.

i i

k 21
