ML19289E300

Responds to NRC Questions Re Pilot Study on Personnel Dosimetry. Concludes Most Processors Have Not Exerted Effort Required to Pass HPSSC Std. Proposes Three-Phase Course of Constructive Action
Person / Time
Site: University of Michigan
Issue date: 03/08/1979
From: Hudson G, Plato P
MICHIGAN, UNIV. OF, ANN ARBOR, MI
To: Ryan Alexander
NRC OFFICE OF STANDARDS DEVELOPMENT
References
NUDOCS 7904110103
Download: ML19289E300 (12)


Text


THE UNIVERSITY OF MICHIGAN
SCHOOL OF PUBLIC HEALTH
ANN ARBOR, MICHIGAN 48109
Department of Environmental and Industrial Health

MEMORANDUM

TO:    Robert Alexander
FROM:  Phillip Plato and Glenn Hudson
DATE:  March 8, 1979
RE:    Our response to your questions concerning the results of the pilot study.

Last week your office requested that we respond to three questions concerning the results to date of the pilot study of the HPSSC Standard. The questions are:

1. Are the processors consistently reporting higher or lower doses than we are delivering to the test dosimeters?

2. What are the specific reasons that we have identified for the poor performance of the processors?

3. What are our recommendations to improve the performance of the processors?

The following is our response to these questions.

Please let us know if you would like to discuss any of these responses in more detail or if we can provide additional information.

Phillip Plato
Glenn Hudson

PP/mf
Enclosure


I. RESULTS OF TEST #1

We identified 15 of the 58 participating processors that process more than 10,000 dosimeters per month.

The work load of each processor was obtained from the questionnaire we sent to each participant.

For each dosimeter tested, a performance index is calculated for the HPSSC Standard as:

    P = (reported dose - delivered dose) / delivered dose

Among all the processors, we recorded the percent of their dosimeters that had a performance index from 0% to +30%, +30% to +50%, greater than +50%, 0% to -30%, -30% to -50%, and less than -50%.
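As an illustration only (this sketch is not part of the memorandum; the dose values and function names are hypothetical), the following Python fragment shows how a performance index would be computed for one dosimeter and assigned to one of the six ranges tallied in Table 1:

    # Compute the HPSSC performance index P for one dosimeter and place it
    # in one of the six ranges used in Table 1.  All values are hypothetical.

    def performance_index(reported_mrem, delivered_mrem):
        """P = (reported dose - delivered dose) / delivered dose."""
        return (reported_mrem - delivered_mrem) / delivered_mrem

    def table1_range(p):
        """Return the Table 1 range that contains P."""
        if p < -0.5:
            return "P < -0.5"
        elif p < -0.3:
            return "-0.5 <= P < -0.3"
        elif p < 0.0:
            return "-0.3 <= P < 0"
        elif p <= 0.3:
            return "0 <= P <= 0.3"
        elif p <= 0.5:
            return "0.3 < P <= 0.5"
        else:
            return "P > 0.5"

    # Example: a dosimeter irradiated to a delivered dose of 100 mrem but
    # read out as 82 mrem underresponds by 18%.
    p = performance_index(reported_mrem=82.0, delivered_mrem=100.0)
    print(p, table1_range(p))    # -0.18  -0.3 <= P < 0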

Table 1 shows the results for each depth (shallow or deep) within each interval of each category. The table also shows a summary for each category that was calculated by averaging the fractions within each range of performance index.

For some categories, the processors as a whole are reporting doses that are consistently higher or lower than we are delivering.

For example, in Category V, 89.3% of all the dosimeters tested responded with doses less than the delivered dose.

Some intervals showed an even distribution of reported doses relative to the actual delivered doses. For example, in interval 2 of Category I (gamma-ray doses of 30 to 100 mrem, commonly observed among radiation workers),

43.4% of the dosimeters underresponded and 56.6% overresponded.

Table 1.  Distribution of performance index P among the 15 processors that routinely process more than 10,000 dosimeters per month.

                                        Percent of P Values Within a Specified Range**       Total No. of
Category and Interval*            P<-.5   -.5<=P<-.3   -.3<=P<0   0<=P<=.3   .3<P<=.5   .5<P   Dosimeters

I.   Gamma                         3.7%      3.2%        51.2%      33.3%       5.6%     3.0%      565
     1. 10-800 rad          D      6.0       0.8         57.9       27.1        2.2      6.0       133
     2. 30-100 mrem         D      3.7       3.7         36.0       37.5       12.5      6.6       136
     3. 101-300 mrem        D      0.0       3.3         59.4       31.3        6.0      0.0       150
     4. 301-10,000 mrem     D      5.5       4.8         50.7       37.0        2.0      0.0       146

II.  High Energy X Ray             3.3%      3.2%        33.4%      38.8%      10.5%    10.8%      420
     1. 10-800 rad          D      9.7       3.2         17.2       50.5       11.8      7.5        93
     2. 30-100 mrem         S      3.6       3.6         50.0       24.6        9.1      9.1       110
                            D      3.6       4.5         50.0       28.3        9.1      4.5
     3. 101-300 mrem        S      1.8       4.5         37.3       38.2        7.3     10.9       110
                            D      1.8       4.5         43.6       37.3       10.0      2.8
     4. 301-10,000 mrem     S      1.0       0.0         15.0       48.6       12.1     23.3       107
                            D      1.9       1.9         20.6       43.9       14.0     17.7

III. Low Energy X Ray             15.4%     15.2%        12.6%      23.5%       9.6%    23.7%      198
     1. 150-300 mrem        S     18.2       8.1         16.1       20.2       15.2     22.2        99
                            D     10.1      19.2         11.1       28.3        6.1     25.2
     2. 301-10,000 mrem     S     19.2      11.1         17.2       16.2       10.1     26.2        99
                            D     14.1      22.2          6.1       29.3        7.1     21.2

IV.  Beta                          2.1%      0.0%        26.4%      43.5%      11.7%    16.3%      239
     1. 150-300 mrem        S      1.7       0.0         27.7       40.3       14.3     16.0       119
     2. 301-10,000 mrem     S      2.5       0.0         25.0       46.7        9.2     16.6       120

 *  Within each interval, S = shallow and D = deep.
 ** Percentages shown for an entire category are the averages for all interval tests within the category.

Table 1 (continued)

[The continuation page repeats the same column headings; the rows for the remaining categories were not legible in this copy.]
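The category summary rows in Table 1 are unweighted averages of the interval rows, as stated in the table footnote. As a hypothetical illustration (not part of the memorandum; the percentages are copied from Table 1), the Python fragment below reproduces the Category IV summary row and the underresponse/overresponse split quoted in the text for interval 2 of Category I:

    # Category IV (Beta), shallow dose: interval percentages from Table 1,
    # in the column order  P<-.5, -.5<=P<-.3, -.3<=P<0, 0<=P<=.3, .3<P<=.5, .5<P
    beta_intervals = [
        [1.7, 0.0, 27.7, 40.3, 14.3, 16.0],   # 1. 150-300 mrem (S)
        [2.5, 0.0, 25.0, 46.7,  9.2, 16.6],   # 2. 301-10,000 mrem (S)
    ]

    # Category summary = unweighted average of the interval percentages.
    summary = [sum(col) / len(beta_intervals) for col in zip(*beta_intervals)]
    print(summary)   # about [2.1, 0.0, 26.35, 43.5, 11.75, 16.3], i.e. the
                     # 2.1%, 0.0%, 26.4%, 43.5%, 11.7%, 16.3% row within rounding

    # The under/overresponse split quoted in the text is the sum of the three
    # negative ranges versus the three positive ranges.  For interval 2 of
    # Category I (30-100 mrem, deep): 3.7 + 3.7 + 36.0 = 43.4% under, and
    # 37.5 + 12.5 + 6.6 = 56.6% over.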

Mixture of Sources

Most processors do much better in the three mixture categories than they do in the categories that use only one radiation source. Unfortunately, this appears to be due to compensating errors on their part. For example, in Category VIII, many processors report a high gamma-ray dose and a small or zero neutron dose to a given dosimeter. When these two components are added together for the HPSSC Standard, the total reported dose is often close to our total delivered dose.

Summary of Problems

The results to date of the pilot study show that most processors are making many or all of the errors described above in spite of the fact that they have been given ample time to prepare for the pilot study. Every processor had all the information necessary to calibrate for the pilot study a minimum of six months before Test #1 began. They also had a minimum of three months between Test #1 and Test #2 to make further adjustments in their calibration procedures.

The problems described above are relatively simple to correct. The question is, why aren't more processors making a conscientious effort to calibrate in compliance with the HPSSC Standard? Some possible, but unresearched, reasons are:

1. Some processors wanted to see how their current calibration procedures would handle Test #1.

2. Some processors delayed sending their reported doses to us for Test #1, which greatly reduced or eliminated the time they had to take corrective action before Test #2 began for them.

3. Some processors do not understand the HPSSC Standard well enough to know what to do to pass. This ignorance is due to a lack of time spent reading the Standard and not to the complexity of the Standard.

4. Some processors feel that to comply with the Standard, they would have to duplicate the sources and facilities of the testing laboratory. They perhaps are not aware that X-ray machines and radionuclide sources are available at local hospitals, universities, and national laboratories.

5. Some processors viewed the pilot study as yet one more survey of the industry and not as a preview of a mandatory testing program.

We sensed this lack of interest and concern among some processors in spite of the fact that we made the purpose of the pilot study clear to all 120 processors that were invited to participate. Since some processors did not take the pilot study seriously, they put little effort into calibrating their sources according to the requirements of the Standard.

III. RECOMMENDATIONS

It is our opinion, based on the pilot study data received to date, that most of the processors have not exerted the effort required to pass the HPSSC Standard. We do not know if these processors are unwilling to exert more effort than they have or if they are willing to improve but lack the knowledge to initiate improvement steps. We therefore propose a three-phase course of constructive action to begin at the conclusion of the pilot study (September 27, 1979).

The following is an outline of work that we propose to do following the current pilot study with financial support from the NRC. Details concerning timing and cost will be discussed if this outline or a modified outline seems appropriate.

Phase I: Mandatory Testing Program

The NRC and other relevant regulatory agencies would jointly announce their intent for a mandatory testing program, and the necessary legal procedures would be initiated. All processors would be informed as to the estimated time schedule for the rule-making procedures, the number of tests to be passed per year, penalties for failure, etc. Phase I is necessary to convince all processors that the HPSSC Standard will be used in an enforcement capacity by the NRC and other regulatory agencies and will not simply remain as yet another passive ANSI standard.

Phase II: Technical Assistance

Beginning at the same time as Phase I, the NRC would financially support The University of Michigan to provide necessary technical assistance to all processors on a voluntary basis. The technical assistance would involve basic information on calibration procedures so that, given a reasonable amount of competence on their part, a processor could pass the HPSSC Standard. This assistance would include the following:

1. Drs. Plato and Hudson would visit each processor that requested technical assistance. The visit would include a detailed explanation and discussion of the Standard, a review of the processor's performance in Test #1 and Test #2 of the pilot study, a discussion of the instruments and procedures that the processor should use to calibrate according to the Standard, and suggestions for acquisition of the necessary radiation sources (e.g., purchase a source, rent time on a source from a local hospital or university, etc.). These visits would involve discussions of basic radiation dosimetry but would not include the type of information that most commercial processors consider proprietary (e.g., how to interpret optical density or thermoluminescence behind each of several filters within a dosimeter).

These visits would enable us to question each processor in detail concerning their calibration needs. For example, we strongly suspect that some processors that participated in the pilot study were:

a. submitting prototype personnel dosimeters to see how they might perform.

b. submitting dosimeters for irradiations with sources for which the dosimeters were never intended to be used (e.g., X rays) to see if the dosimeters are versatile.

c. submitting environmental dosimeters to obtain free calibration information.

It is not fair for these processors to influence decisions concerning use of the HPSSC Standard, but at the moment, we cannot tell exactly if and when these misuses of the pilot study have occurred.

2. To the extent possible, we would take calibration instruments to the processors to demonstrate proper procedures for calibration, including traceability to NBS, repetitive measurements, quality assurance procedures, etc.

3. To the extent possible, we would provide calibration irradiations for the processors. For example, many processors have never calibrated for doses greater than 3 to 5 rad. Although the pilot study provided all participating processors with some calibration data points in the accident range of 10 to 800 rad, additional calibration data are probably needed. Also, almost all processors could use additional calibration data with a californium-252 source.

Phase III: Pilot Test #3

At the conclusion of Phase II, we would administer a third three-month pilot test to any processor that voluntarily chose to participate. Financial support for Phase III would be provided by the NRC. Test #3 would serve two purposes. First, it would determine if the combined efforts of Phases I and II had been successful. Second, since it would be completed before the end of Phase I, it would reflect the conditions of dosimetry processors in a format that could be used to predict the consequences of finalizing Phase I.

GENERAL SUMMARY

Our recommendation for methods to improve processor performance discussed in part III of this report is actually a beginning of a request by us for an extension of the present pilot study contract. For Phase I of our recommendation, our involvement would be the preparation of the value/impact statement. We would have complete responsibility for Phases II and III. We would enjoy discussing the details of our recommendation with you at your convenience.