ML12264A562

University of Michigan - Ford Nuclear Reactor, Quality Assurance Plan for Final Status Survey Plan
Site: University of Michigan
Issue date: 09/17/2012
From: M. Driscoll, University of Michigan
To: Document Control Desk, NRC/FSME

University of Michigan
Occupational Safety & Environmental Health
Campus Safety Services Building
1239 Kipke Drive, Ann Arbor, MI 48109-1010
Phone: 734 647-1143 • Fax: 734 763-1185

September 17, 2012

Document Control Room
U.S. Nuclear Regulatory Commission
Two White Flint North
11545 Rockville Pike (Mail Code: 03H8)
Rockville, Maryland 20852-2738

RE: University of Michigan - Ford Nuclear Reactor
Quality Assurance Plan for Final Status Survey Plan
Docket 50-2 / License R-28

Decommissioning Branch:

Please find enclosed a copy of the Quality Assurance (QA) Plan to be used in conjunction with the Final Status Survey (FSS) Plan currently being reviewed by the NRC Decommissioning Branch for the University of Michigan - Ford Nuclear Reactor (License R-28 / Docket 50-2).

Thank you for your time, effort, and consideration with respect to including this QA Plan with the NRC's review of the FSS Plan. Please do not hesitate to contact me at OSEH / Radiation Safety Service [(734) 647-2251] should you have any questions or comments regarding this QA Plan.

Sincerely,

Mark L. Driscoll
Director / Radiation Safety Officer
Radiation Safety Service / OSEH

MLD/mld
NRCFNRD&DFSSQAPlan091712.doc

cc: Terry Alexander, Executive Director, OCS
    Robert Blackburn, Manager, Laboratory Operations, MMPP
    Theodore Smith, FNR Project Manager, NRC Headquarters (Mailstop T-8F5)
    Jeremy Tapp, Health Physicist, NRC Region III
    FNR Decommissioning Files

Quality Assurance Plan for Final Status Survey
Ford Nuclear Reactor
University of Michigan
Revision 2

Change Number | Date of Change/Addition | Description of Change/Addition | Date Entered | Signature of Person Entering Change/Addition
0 | -- | Initial document distributed for use by the company | -- | --
1 | 08/24/12 | Revisions by FNR Decommissioning staff | 08/24/12 | --
2 | 09/05/12 | Revision of Organization Chart to reflect contracted FSS and additional minor changes | 09/05/12 | --


TABLE OF CONTENTS

1.0  INTRODUCTION
2.0  PURPOSE
3.0  SCOPE
4.0  PROJECT ORGANIZATION AND RESPONSIBILITIES
5.0  QUALITY ASSURANCE OBJECTIVES AND CRITERIA FOR MEASUREMENT DATA
6.0  SAMPLING AND FIELD MEASUREMENT PROCEDURES
7.0  TRAINING AND CERTIFICATION
8.0  FIELD RECORDS
9.0  SAMPLE MANAGEMENT
10.0 EQUIPMENT CALIBRATION
11.0 ANALYTICAL METHODS
12.0 DATA MANAGEMENT
13.0 ASSESSMENTS AND AUDITS
14.0 CORRECTIVE ACTION
15.0 BIBLIOGRAPHY

1.0 INTRODUCTION

The Ford Nuclear Reactor (FNR) facility at the University of Michigan (U-M) in Ann Arbor, Michigan, is being decommissioned. Radioactive materials have been removed and remaining surfaces decontaminated to reduce residual radiological contamination to levels that satisfy NRC criteria for termination of the license (R-28) and allow for future reuse of the facility without radiological restrictions. A final status survey (FSS) will be performed to demonstrate that the NRC criteria have been satisfied.

This Plan is a controlled document. Copies are distributed to all FNR Decommissioning Project personnel who may use the information to perform or review work.

This Plan is reviewed annually and revised, if necessary. Interim revisions may also be performed when significant modifications or additions are required. Permanent revisions are reviewed and approved by the Executive Director - Office of Campus Sustainability (OCS), prior to implementation and inclusion in the Plan.

2.0 PURPOSE

The purpose of this Quality Assurance (QA) Plan is to present the requirements and guidelines which will assure that the FSS is performed in accordance with the FSS Plan and that the resulting data and information adequately and accurately describe the as-left conditions of the FNR facility.

3.0 SCOPE

This QA Plan is specific to the FSS in support of decommissioning the University of Michigan's Ford Nuclear Reactor facility.

4.0 PROJECT ORGANIZATION AND RESPONSIBILITIES

Section 2.4 of the U-M Decommissioning Plan for the Ford Nuclear Reactor (Revision 1) described the organization and responsibilities for the overall decommissioning project.

Since that Decommissioning Plan was prepared, the Nuclear Reactor Laboratory Manager terminated employment with the U-M. Operational aspects of decommissioning, previously assigned to the Nuclear Reactor Laboratory Manager, are now the responsibility of the Decommissioning Manager. Duties and responsibilities of other positions, relative to the FSS, remain essentially unchanged.

Figure 1 illustrates the organizational structure for the FSS activities. The University of Michigan has contracted with Ameriphysics to implement the FSS and with DeNuke Contracting Services, Inc. to provide technical guidance regarding FSS activities. Major responsibilities of groups and individuals, relative to survey activities, are listed below.

Additional information regarding the organizational structure and responsibilities for the overall decommissioning project is presented in the Decommissioning Plan.


Figure 1. Organization Chart for the FNR Final Status Survey

[Organization chart; positions shown: Regents, University of Michigan; President; Vice President for Research; Executive Vice President / Chief Financial Officer; Associate Vice President, Facilities & Operations; Chair, Decommissioning Review Committee; Executive Director, Office of Campus Sustainability; Outside Contracts; Radiation Safety Officer; Decommissioning Manager; Decommissioning Staff (Technical, ES&H, Licensing, and Quality Management); Operations Contractor; FSS Contractor; Technical Consultant; Contractors, Laboratories, Vendors. Solid and dashed lines denote line function and oversight relationships.]

4.1 Executive Director of the Office of Campus Sustainability

The Executive Director of the Office of Campus Sustainability (OCS) has overall responsibility for FSS of the FNR, including:

  • The facility's license (compliance and amendments),
  • Successful completion of decommissioning activities,
  • Authorizing the expenditure of funds for decommissioning,
  • Requesting termination of the license for FNR,
  • Approval of contractors, subcontractors, and consultants,
  • Approval of budgets and schedules,
  • Serving as technical spokesman for the U-M on decommissioning activities,
  • Resolving conflicts between the Decommissioning Manager, RSO and Decommissioning Review Committee (DRC),
  • Ensuring that the conduct of decommissioning activities complies with all applicable licenses and registrations held by the University and with applicable federal and state regulatory requirements,
  • Serving as the point of contact with the FSS Contractor and Technical Consultant, and
  • Providing direction to the RSO and Decommissioning Manager regarding ES&H, QA/QC, and FSS activities.

4.2 Decommissioning Manager

The Decommissioning Manager has responsibility for:

  • Drawing upon other UM engineering, technical, and skilled trade resources as needed,
  • Assisting the Executive Director in resolving decommissioning issues,
  • Controlling and maintaining safety and protection of the environment during decommissioning activities,
  • Determining facility staffing and organization to support decommissioning operations,
  • Reporting performance to the Executive Director,
  • Acting as interface between contractor, subcontractors, or vendors and the Executive Director,
  • Coordinating contractor, subcontractor, or vendor activities,
  • Resolving facility or site issues,
  • Investigating adverse monitoring or audit findings, scheduling corrective action, including measures to prevent recurrence of significant conditions adverse to quality, and notifying the Executive Director of actions taken or planned, and
  • Assisting the Executive Director in ensuring that decommissioning activities comply with all applicable licenses or registrations held by the U-M and with applicable federal and state regulatory requirements.


The Decommissioning Manager shall have the authority to enforce safe performance of decommissioning activities and to shut down or suspend any operations or activities because of safety, environmental, licensing or regulatory issues, if immediate corrective action is not taken. Resumption of any activity shut down or suspended by the Decommissioning Manager shall require the approval of the Executive Director and the Decommissioning Manager.

4.3 Radiation Safety Officer

The RSO is responsible for:

  • Maintaining the radiation safety and health and QA/QC aspects of the FSS and ensuring compliance with programs, plans, or procedures,
  • Providing technical support to the Executive Director,
  • Managing, directing, and providing oversight of field operations to conform with FSS survey packages and the FSS Plan,
  • Reviewing FSS results, and
  • Ensuring the implementation of an industrial safety, industrial hygiene, and environmental protection program which satisfies all applicable licenses, permits, or registrations held by the U-M and complies with all applicable federal and state regulatory requirements.

4.4 FSS Contractor

The FSS Contractor responsibilities are:

  • Establishing survey unit boundaries and installing reference grid systems,
  • Identifying sampling and measurement locations,
  • Preparing FSS work packages,
  • Selecting instruments and performing operational checks,
  • Conducting surface scans, contamination measurements, and sampling in accordance with the survey design package,
  • Preparing records of field activities and results,
  • Converting field data to units for comparison with cleanup criteria,
  • Bringing non-compliant and questionable results to the attention of the RSO, and
  • Preparing a report of FSS results.

The RSO and any facility FSS staff shall have the authority to enforce safe performance of decommissioning activities and to shut down or suspend operations or activities because of either safety or environmental issues, if immediate corrective action is not taken. Resumption of any activity shut down or suspended by the RSO or a facility FSS staff member shall require the approval of the Executive Director or RSO.

4.5 Decommissioning Review Committee

The Decommissioning Review Committee (DRC) shall approve:

  • Proposed changes in the license or technical specifications,

" Proposed changes to the facility that can be implemented without the prior approval of the NRC as authorized by the license conditions implementing 10 CFR 50.59,

" Proposed changes in the Decommissioning Plan that can be implemented without the prior approval of the NRC, and

  • New procedures and proposed changes to the procedures for the following activities, which shall be in effect and followed:
1. Normal operation of all systems, structures, or components described in these technical specifications or which are important to safety,
2. Actions for responding to emergency conditions involving the potential or actual release of radioactivity, including provisions for evacuation, reentry, recovery, and medical support,
3. Actions to be taken to correct specific and foreseen malfunctions of systems, structures or components described in these technical specifications or which are important to safety,
4. Activities performed to satisfy a surveillance requirement contained in these technical specifications,
5. Radiation and radioactive contamination control,
6. Physical security of the facility, and
7. Implementation of the quality assurance for the calibration and response testing of radiation instrumentation utilized for direct measurement in support of characterization, release, final status survey, or other quality assurance activities.

These procedures shall be appropriate to protect the U-M community, the public, and personnel involved in decommissioning and to implement the quality assurance necessary to support a request for the termination of the license. Substantive changes to these procedures shall be made only with the approval of the DRC. Non-substantive changes to these procedures may be made with the approval of the Executive Director. All non-substantive changes made to procedures shall be documented and subsequently reviewed by the DRC.

The DRC, as an audit function, shall ensure that the following are independently monitored or audited:

  • Decommissioning operations to ensure they are being performed safely and in accordance with all applicable licenses and registrations held by the U-M and in compliance with applicable federal and state regulatory requirements (Radiological Protection Plan, Environmental Safety and Health Plan, etc.).
  • The quality assurance program to verify that performance criteria are met as well as to determine the effectiveness of the program in satisfying the quality assurance requirements of the Decommissioning Plan.


Monitoring or audits shall be performed annually, as a minimum, and should be scheduled by the Chair of the DRC, in a manner to provide coverage and coordination with ongoing activities, based on the status and importance of activities. Scheduled monitoring or audits should be supplemented by additional monitoring or audits of specific subjects when necessary to provide adequate coverage.

5.0 QUALITY ASSURANCE OBJECTIVES AND CRITERIA FOR MEASUREMENT DATA

The overall objective of the FSS is to obtain data that demonstrate that residual radiological contamination levels are below the criteria approved by the NRC for this project. The Data Quality Objectives (DQO) process provides input to the FSS design.

This DQO process was integrated into the development of the MARSSIM, which is the primary source of guidance and direction for the final status survey. The FNR FSS Plan therefore embodies the DQO process. The quality assurance objectives are specifications that data must meet to comply with project DQOs, and include quantitative parameters (precision, accuracy, measurement sensitivity, and completeness) and qualitative parameters (representativeness and comparability).

5.1 Precision

Precision is a measure of agreement or reproducibility among individual measurements for the same property under the same conditions. For radiological parameters, precision for each duplicate pair is measured using the relative percent difference (RPD). The RPD is calculated as shown in Equation (1):

RPD = [(C1 - C2) / ((C1 + C2)/2)] × 100          (1)

Where
  RPD = relative percent difference
  C1 = measured concentration of Sample 1
  C2 = measured concentration of Sample 2.

Laboratory sampling precision will be checked by obtaining a minimum of one replicate data point for every 20 data points collected in a given survey unit. Precision will be evaluated by calculating the RPD for each replicate pair. It is expected that the replicate pairs will generally have RPDs < 50%.
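For illustration only (not part of this Plan), the following minimal sketch shows how the Equation (1) precision check might be applied to replicate pairs, flagging any pair whose RPD exceeds the expected 50% level. The function name and concentration values are hypothetical.

```python
# Illustrative sketch of the Section 5.1 precision check: RPD for each
# replicate pair, flagged when it exceeds the expected 50% level.

def relative_percent_difference(c1, c2):
    """RPD per Equation (1), taken as an absolute value for flagging."""
    mean = (c1 + c2) / 2.0
    if mean == 0:
        return 0.0
    return abs(c1 - c2) / mean * 100.0

# Hypothetical replicate pairs (original measurement, replicate)
replicate_pairs = [(1.20, 1.35), (0.80, 0.95), (2.10, 3.60)]

for original, replicate in replicate_pairs:
    rpd = relative_percent_difference(original, replicate)
    flag = "REVIEW" if rpd > 50.0 else "OK"
    print(f"C1={original:.2f}, C2={replicate:.2f}, RPD={rpd:.1f}% [{flag}]")
```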

5.2 Accuracy

Accuracy is the relative agreement or non-agreement between a measured value and an accepted reference value. Accuracy reflects the measurement error associated with a measurement and is determined by assessing measured levels relative to known levels. Accuracy is defined as the measured value divided by the true value, expressed as a percent, as shown in Equation (2).

%R = [(Cs - Cb) / Ck] × 100          (2)

Where
  Cs = measured value
  Cb = background value
  Ck = known value.

Acceptable laboratory accuracy will be determined by the analysis of one laboratory reference sample per analytical batch. The accuracy of all analyses must be within historically derived, method-specific criteria. During the DQA process, the accuracy of the environmental measurements (in the form of bias that may be indicated by the measure discussed above) will be assessed to determine whether the accuracy of the data has any impact on hypothesis testing.
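For illustration only, the sketch below applies the Equation (2) percent recovery calculation to one laboratory reference sample per analytical batch. The batch identifiers, activity values, and acceptance window are hypothetical; actual limits are historically derived and method-specific, as stated above.

```python
# Illustrative sketch of the Section 5.2 accuracy check: percent recovery
# for one reference sample per batch, against hypothetical limits.

def percent_recovery(measured, background, known):
    """%R per Equation (2): (Cs - Cb) / Ck * 100."""
    return (measured - background) / known * 100.0

# Hypothetical batch reference-sample results (activity units illustrative)
batches = [
    {"batch": "A-01", "measured": 10.4, "background": 0.2, "known": 10.0},
    {"batch": "A-02", "measured": 8.1,  "background": 0.2, "known": 10.0},
]

LOW, HIGH = 80.0, 120.0  # hypothetical acceptance window, percent

for b in batches:
    r = percent_recovery(b["measured"], b["background"], b["known"])
    status = "OK" if LOW <= r <= HIGH else "CORRECTIVE ACTION"
    print(f'{b["batch"]}: %R = {r:.1f}% [{status}]')
```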

5.3 Sensitivity/Detection Limits

Sensitivity refers to the ability to detect a minimal amount of a substance and is typically expressed as the method detection limit, i.e., minimum detectable activity (MDA) or minimum detectable concentration (MDC). Detection sensitivities are chosen to be a fraction of the cleanup criteria to assure that the data demonstrate compliance at a high level of confidence. Target measurement sensitivities for laboratory analyses are < 25% of the default screening values for individual radionuclides (refer to Table 4-1 of the FSS Plan).

Target detection sensitivities for field instruments are ≤ 50% of the cleanup criteria for gross beta activity. These values are determined for the various instruments and survey techniques, following the guidance in MARSSIM and NUREG-1507, "Minimum Detectable Concentrations with Typical Radiation Survey Instruments for Various Contaminants and Field Conditions." Values are presented in Section 4.5.7 of the FSS Plan.

5.4 Completeness

Completeness is the measure of the amount of valid analytical data obtained compared to the total number of data points planned. Valid analytical data are those generated when analytical systems and the resulting analytical data meet all of the quantitative measurement quality objectives outlined for the project (i.e., all calibration verification, interference, and other checks not affected by the sample matrix meet acceptance criteria). It is important to understand that data that are flagged during the data validation process are not necessarily invalid data. Part of the DQA process is the review of flagged data to determine the negative impact, if any, the validation flags have on the intended use of the data. Therefore, the definition of "valid data" in the context of calculating completeness is "data that are acceptable for their intended purpose." Completeness of the reported data (expressed as a percentage) is calculated as shown in Equation (3).

C(%) = (Mv / Mt) × 100          (3)

Where
  Mv = number of valid sampling or analytical results obtained per analyte
  Mt = total number of samples submitted for analysis per analyte.

A certain amount and type of data must be collected for each final status survey unit to be valid. The statistically derived number of samples has been calculated in accordance with MARSSIM. Missing data may reduce the precision of estimates or introduce bias, thus lowering the confidence level of the conclusions. The completeness goal for each final status survey will be 95% (areal) for the field scanning and 85% (number) for measurements and sampling. The importance of any lost or suspect data will be evaluated in terms of the measurement location, analytical parameter, nature of the problem, decision to be made, and the consequence of an erroneous decision. Critical locations or parameters for which data are determined to be inadequate may be re-sampled.
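For illustration only, the sketch below applies the Equation (3) completeness calculation to hypothetical survey-unit tallies and compares the result with the 85% goal for measurements and sampling; the analyte names and counts are not from this Plan.

```python
# Illustrative sketch of the Section 5.4 completeness calculation,
# Equation (3): C(%) = Mv / Mt * 100, checked against the 85% goal.

def completeness(valid_results, total_submitted):
    """Percentage of valid analytical results obtained per analyte."""
    if total_submitted == 0:
        return 0.0
    return valid_results / total_submitted * 100.0

GOAL_SAMPLING = 85.0  # percent, per Section 5.4

# Hypothetical tallies: (analyte, valid results, samples submitted)
tallies = [("Cs-137", 19, 20), ("Co-60", 16, 20)]

for analyte, valid, total in tallies:
    c = completeness(valid, total)
    status = "meets goal" if c >= GOAL_SAMPLING else "evaluate lost/suspect data"
    print(f"{analyte}: completeness = {c:.0f}% ({status})")
```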

5.5 Comparability

Comparability is the degree to which one data set can be compared to another obtained from the same population using similar techniques for data gathering. Comparability will be achieved through the use of consistent sampling procedures, experienced sampling personnel, the same analytical method for like parameters, standard field and laboratory documentation, and traceable laboratory standards.

5.6 Representativeness

Representativeness is a measure of the degree to which data accurately and precisely represent a characteristic of a population parameter at a sampling point, a process condition, or an environmental condition. Representativeness is a qualitative term that should be evaluated to determine whether in-situ and other measurements are made, and physical samples are collected, in such a manner that the resulting data appropriately reflect the population parameter of interest in the media and phenomenon measured or studied.

The final status survey unit sampling program has been designed in accordance with the guidance given in MARSSIM to ensure that the appropriate statistically derived number of samples is collected during final status surveys. Sampling methods have been developed to ensure that samples collected are representative of the media. Field handling protocols (e.g., storage, handling in the field, and shipping) have been designed to preserve the integrity of the collected samples. Proper field documentation and QC efforts as outlined in this plan will be used to establish that protocols have been followed and that sample identification and integrity have been maintained.

5.7 Data Quality

The data generated from the soil sampling effort will be used to evaluate whether the site meets the cleanup criteria. Each parameter to be evaluated requires data of specific quality. To demonstrate compliance with the cleanup criteria, the data obtained must be of high quality. Laboratory analytical procedures and laboratory data reporting will follow the QA/QC protocols described in the "Environmental Measurements Laboratory Procedures Manual" (HASL-300), "Multi-Agency Radiological Laboratory Analytical Protocols" (MARLAP), and the task-specific laboratory statement of work (SOW) prepared by the project for these analyses.

6.0 SAMPLING AND FIELD MEASUREMENT PROCEDURES

Procedures for performing sampling and measurements have been developed specifically for FNR final status survey purposes; Appendix A contains a listing of those procedures.

For field activities not covered by these procedures, consensus or industry-accepted procedures, such as those of EPA, DOE/EML, ANSI, ASTM, and other standards organizations, may be used. Planned modifications to, or deviations from, established procedures, and use of procedures other than those identified here or in the survey design package for a particular survey unit, shall be approved by the Executive Director. The Executive Director is responsible for selection and justification of any additional procedures.

The Executive Director may initiate modifications to work plans, when necessary. Based on situations or conditions that may arise during field activities or findings as the survey progresses, the RSO may implement modifications to the survey design and/or procedures with concurrence of the Executive Director. Documentation of modifications is included in the project file, providing the following information:

  • Circumstances requiring the modification,
  • Alternative procedure or method used, and
  • Effective date of modification.

7.0 TRAINING AND CERTIFICATION

An integral part of the Quality Assurance program is a commitment to utilization of trained, qualified personnel. Survey team personnel receive specific training for all procedures which they are expected to perform. Training is provided by individuals who have demonstrated competence in the procedure. The RSO confirms the competence of trainers and trainees, relative to performance of specific FSS procedures.


8.0 FIELD RECORDS

Field survey records provide the direct evidence and support for the technical interpretations, judgments, and decisions regarding a project. They usually contain original data or information which would be difficult, if not impossible, to replace.

All data, notes, maps, calibrations, and other information pertinent to a survey project should be recorded and maintained. Records must be legible, thorough, and unambiguous. They are to be prepared in indelible ink; black ink is preferred. Sufficient information and data should be collected to enable an independent evaluation of the site status. Examples of information which should be recorded with survey data may include, but are not limited to:

  • Site/Project identification,
  • Measurement or sampling location (grid position and depth/height),
  • Instrument identification,
  • Names of survey personnel,
  • Date of data collection,
  • Counting interval, if appropriate,
  • Data units,
  • Unusual observations and situations, and
  • Name of data recorder.

When practical, survey data are recorded on standardized forms. Forms which are generally appropriate for typical survey activities are available. These forms may be modified or tailored to meet the needs of a specific project, application, or site.

Information requested on data record forms may be inappropriate or incorrect for specific applications. If so, handwritten changes should be made on the forms. When certain information on a form is not required, the space or columns should be eliminated, crossed through, or marked "N/A" (not applicable) as an indication that such information was not required, rather than having possibly been overlooked. Other information, for which forms are not a practical means of documentation, is also recorded in project files.

If data corrections are necessary, a single line is drawn through the entry; new data is then recorded and the change is initialed and dated. Data should not be obliterated by overwriting, erasing, or use of white-out.

Records include, but are not limited to, the following:

  • Survey work plans,
  • Project logbooks,
  • Data forms,
  • Calibration data,
  • Instrument QC data,
  • Chain-of-custody forms,
  • Training and certification documentation,
  • Laboratory analyses data,
  • Data review and validation, and
  • Audit Reports and follow-up documentation.

Records are reviewed by individuals not directly responsible for developing and/or recording the data. Reviews should be performed as soon after completion of the task as practical; it is prudent to conduct reviews of field data before onsite activities are terminated to assure all necessary data have been obtained. Reviews are documented by signing and dating the record. A project file entry or summary report may also be used to document review of large quantities of data.

9.0 SAMPLE MANAGEMENT

At collection, each sample is assigned a unique identifier. In addition to the sample identifier code, the name of collector(s) and date of collection will be noted in the field records.

A chain-of-custody record is maintained for all samples.

Samples are placed in containers selected on the basis of required sample volume, physical and chemical compatibility with the sample medium, anticipated contaminants, and structural integrity. Typically, polyethylene, plastic, metal, or glass containers are used.

10.0 EQUIPMENT CALIBRATION

Field equipment is maintained, calibrated, and operated in accordance with standard, accepted practices. Calibration data for field measuring equipment is maintained and may be included with the project survey records, where required.

Calibration and operational requirements for laboratory instruments used by contracted analytical services must be in accordance with industry-accepted practices.

11.0 ANALYTICAL METHODS

The FSS Contractor will arrange for analytical services consistent with HASL-300 and MARLAP. The Executive Director assists the Decommissioning Manager in the selection and oversight of analytical services, including review of quality assurance/quality control activities and performance, relative to technical specifications.


12.0 DATA MANAGEMENT

The quality and validity of data should be consistently and thoroughly evaluated and documented to assure that the objectives of the survey project are satisfied and conclusions developed from the data are supported.

12.1 Data Reduction

Data will be reported in standard units for comparison with applicable guidelines or criteria; the metric system is the system of choice, but English units may be acceptable for some applications. Typically, uncertainties, based only on counting statistics, are reported with data. Uncertainties resulting from other aspects of the measuring process may be included, if they are known.

12.2 Data Review and Validation

Review consists of an evaluation of data for completeness, consistency, procedural compliance, recording and transcription accuracy, and accuracy of processing. Validation is an independent assessment of the data and results to trace and justify activities, analyses, and decisions for defensibility, and to compare a body of data to a set of performance objectives. The extent and rigor of review and validation will depend upon the type of data and proposed uses of the data; the Executive Director determines the level of review and validation to be performed.

Review

  • Raw data

Raw data are measured values obtained from instrumentation or equipment, or transcriptions of values obtained from other sources.

Raw data will be reviewed for legibility, completeness, and to determine if appropriate procedures were followed. Reviews should be performed as soon after completion of a task as is practical; however, they must be completed before work on the project is considered to be complete, for example, prior to leaving a survey site or prior to report finalization, as applicable. This review is documented by signing and dating the data sheet. A project file entry or summary report may also be used to document the review of a large quantity of data.

  • Transcribed data

Transcribed data are data transferred or copied by hand entry, including computer input, from one location to another.

Transcribed data are reviewed for accuracy after completion of the task, and before data is used for any purpose.


At least 10% of transcribed items from a set of data are checked.

Documentation of reviews is performed by signing and dating the original or a photocopy of the most recent version of the data or by a signed and dated summary sheet.

  • Hand processed data

Hand processed data are obtained from hand calculations, using raw data and established constants. Hand processed data are reviewed for completeness and accuracy after completion of the task, and before the data are used for other purposes. For each set of data, at least two calculations of each equation should be checked. Equations not identified in approved procedures should be included in the documentation. The review is documented by signing and dating the data sheet.

  • Computer processed data

Computer processed data are obtained from computer calculations, using raw data and established constants. Computer programs established for performance of routine calculations should be checked by hand calculation of at least two sets of input values, prior to release of the program for general use. Equations not identified in approved procedures must be included in the documentation. Documentation consists of hand calculation sheets and computer printouts showing input parameters and results that have been signed and dated by the reviewer. Files containing the information are identified in the project file.

Problems identified during the review process should be resolved prior to release of data for further use.
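For illustration only, the sketch below shows how a routine calculation program might be checked by hand calculation of at least two sets of input values before release for general use, as described above; the program (here, the Equation (1) RPD), inputs, and tolerance are hypothetical.

```python
# Illustrative sketch: verifying a routine calculation program against
# hand calculations for two sets of input values (Section 12.2).

def program_rpd(c1, c2):
    """Routine calculation under verification (absolute RPD, Equation (1))."""
    return abs(c1 - c2) / ((c1 + c2) / 2.0) * 100.0

# Hand-calculated reference results for two hypothetical input sets
hand_checks = [
    ((1.0, 1.5), 40.0),   # |1.0 - 1.5| / 1.25 * 100 = 40.0
    ((2.0, 2.0), 0.0),    # identical values give RPD = 0
]

TOLERANCE = 0.01  # required agreement between program and hand calculation

for (c1, c2), expected in hand_checks:
    result = program_rpd(c1, c2)
    status = "PASS" if abs(result - expected) <= TOLERANCE else "FAIL"
    print(f"inputs=({c1}, {c2}): program={result:.2f}, hand={expected:.2f} [{status}]")
```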

Validation

Requirements for various levels of data validation are determined by the Executive Director and may include the following:

  • Use of correct procedures,
  • Training of personnel,
  • Acceptable equipment performance,
  • Complete data reviews,
  • Problem resolution, and
  • Complete documentation.


A minimum of 10% of analytical data will be validated. Deficiencies noted during the validation process are handled according to the corrective action procedure, unless another method is specified by the Executive Director. Validation activities should be performed by personnel independent of project survey or management roles.
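For illustration only, the sketch below shows one way to select at least 10% of analytical results for independent validation; the record identifiers and random-selection approach are hypothetical and are not specified by this Plan.

```python
# Illustrative sketch: selecting at least 10% of analytical records for
# independent validation (Section 12.2). Record IDs are hypothetical.
import math
import random

def select_for_validation(record_ids, fraction=0.10, seed=None):
    """Return at least `fraction` of the records (rounded up), chosen at random."""
    count = math.ceil(len(record_ids) * fraction)
    rng = random.Random(seed)
    return sorted(rng.sample(record_ids, count))

records = [f"FNR-FSS-{i:03d}" for i in range(1, 41)]  # 40 hypothetical results
selected = select_for_validation(records, seed=2012)
print(f"{len(selected)} of {len(records)} records selected: {selected}")
```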

12.3 Records Handling and Storage

Original field and laboratory data are generally the property of the U-M. This includes:

  • Survey plans,
  • Site log books,
  • Field data forms,
  • Calibration data,
  • Field drawings,
  • Chain-of-custody forms,
  • Training and certification documentation,
  • Source certification,
  • Laboratory analysis reports,
  • Audit reports and follow-up documentation, and
  • Performance evaluation results and associated documentation.

Upon completion of a survey project, records are placed in the project file and are the responsibility of the Executive Director.

Storage of records should be in an access-controlled area, constructed to assure protection of the records from loss or damage. The retention time for a particular record is determined based on the status of activities associated with the project. Disposal of records is under the authority of the Executive Director.

13.0 ASSESSMENTS AND AUDITS

Assessments are performed by the Executive Director and/or RSO on a continuous basis.

Audits are performed annually. The DRC assures that audits are conducted and assists the Executive Director in planning and implementing such audits.

Responses to audit findings are initiated in a timely manner. Findings will be tracked through completion or resolution.

Records of audit findings and corrective actions are retained by the Executive Director.

14.0 CORRECTIVE ACTION

Various control limits have been identified which, if not met, require corrective action. Instruments or equipment found to be operating outside acceptable operating ranges or found to be in use after the expiration of the calibration period are immediately removed from service and may not be returned to use until the deficiency has been corrected. Unplanned deviations from procedural requirements are documented. Such deficiencies/deviations should be reported immediately to the RSO and the Executive Director; the cause and possible impact on previously collected data should be determined. The findings of the evaluation are documented.


15.0 BIBLIOGRAPHY

Quality Assurance Program Requirements for Nuclear Facilities, ANSI/ASME NQA-1, 1989 Edition.

Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM), NUREG-1575, Revision 1, US Nuclear Regulatory Commission, 2000.

Minimum Detectable Concentrations with Typical Radiation Survey Instruments for Various Contaminants and Field Conditions, NUREG-1507, US Nuclear Regulatory Commission, 1997.

Environmental Measurements Laboratory Procedures Manual, HASL-300, 28th Edition, US Department of Energy.

Multi-Agency Radiological Laboratory Analytical Protocols (MARLAP), EPA-402-B-04, US Environmental Protection Agency, 2004.

APPENDIX A
Procedures Applicable to Final Status Survey of the Ford Nuclear Reactor

FSS-002  Mapping and Gridding for FSS
HP-211   Tennelec 5 XLB Gas-Flow Proportional Counter
HP-304   Radiation Monitoring Instrument Operation
HP-401   Survey Instrument Source Checking
HP-402   Calibration of Ludlum 2221 Scaler Ratemeter & Detectors
HP-101   Operational Radiation Surveys
HP-502   Soil Sampling
HP-503   HPGe Operation