ML20101B001

Surrebuttal Testimony of JJ Regan Re Aspects of Industrial Training Programs at Facility. Certificate of Svc & Svc List Encl. Related Correspondence
Person / Time
Site: Three Mile Island
Issue date: 12/17/1984
From: Regan J
AFFILIATION NOT ASSIGNED
To:
Shared Package: ML20101A998
References: SP, NUDOCS 8412200020


Text


December 17, 1984

UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION

BEFORE THE ATOMIC SAFETY AND LICENSING BOARD

In the Matter of                        )
                                        )
METROPOLITAN EDISON COMPANY             )    Docket No. 50-289 SP
                                        )    (Restart - Management Phase)
(Three Mile Island Nuclear              )
 Station, Unit No. 1)                   )
                                        )

SURREBUTTAL TESTIMONY OF DR. JAMES J. REGAN

Q.1. Have you read the Rebuttal Testimony of Dr. Ronald A. Knief and Mr. Bruce P. Leonard dated November 28, 1984?

A.1. Yes.

Q.2. At page 3, Dr. Knief and Mr. Leonard discuss validation methods applied by GPU Nuclear to the TMI-1 licensed operator training program. Quoting from your deposition, they state that "relating the content of training to the characteristics of a job 'isn't done all that often.'" Do you agree with that statement as applied to industrial training programs such as that in place at Three Mile Island?

A.2. No. In my deposition I was referring to the fact that relating training content to the characteristics of a job is not done in many areas of training or education. It is rarely done in primary, secondary, or college settings, for example, because the jobs for which this training is intended are remote in time and quite varied. The military has systems of feedback from the job to the training activities, but the relationship of job success to particular training courses is difficult to determine because of differing activities between the end of the training and the start of the job for which the training was provided.

My statement referred to the full range of educational and training activities, for which, as illustrated by these examples, job performance is often very difficult to measure or to key to training in any meaningful way. In those cases, the content of training is not closely related to the characteristics of the job because it is not feasible to do so, not because this is a novel idea or because it is unimportant.

The situation is different in an industrial setting such as Three Mile Island. The relationship between training and the job is much closer. Industrial training is much more frequently keyed to specific jobs, and the jobs and training are followed up to relate them closely to each other. As I also said in my deposition at page 144, relating training content to job characteristics is central to establishing that the training program is effective and efficient. The fact that this is not done in non-industrial contexts does not diminish its importance here. As I said in my testimony at page 9, assessment of the training program against the operational performance of the individuals, teams, and systems involved in the program is the only reliable means of measuring the effectiveness of training.

I should add that in that same part of their Rebuttal Testimony, Dr. Knief and Mr. Leonard note that I said that the incorporation of advances in educational technology does not ensure that a training program is a good one. I agree that the mere presence of such features as computer-assisted instruction or specially designed texts would not, without further information, be conclusive evidence of optimum training. On the other hand, it is also true that the absence of all or most of these features would strongly suggest an inadequate training program. I have not been able to determine from the information that I have seen, and particularly from the testimony that I have seen, whether the training program at Three Mile Island makes sufficient use of such features to be termed an adequate training program.

Q.3. In the discussion at and following Question and Answer 5 of their Rebuttal Testimony, Dr. Knief and Mr. Leonard discuss various means by which GPUN has purportedly validated its training program. Does this testimony demonstrate that GPUN adequately relates the content of training to the characteristics of the job?

A.3. No. Dr. Knief and Mr. Leonard say that GPUN relies upon job and task analysis to assure that the training program is closely enough related to the jobs for which the operators are being trained. This type of job/task analysis, which I understand GPUN has not yet completed, is the front end of a process (such as the TSD) of developing a training program. Task analysis is not enough, however, to assure that training is adequately preparing the incumbents for their jobs. As I discussed in my testimony, this assurance requires measurement and analysis of the jobs and job performance, which must then be tied back into the training program.

In particular, the discussion of the Training System Development (TSD) and Systematic Approach to Training (SAT) models at page 5 and following does not provide enough information to evaluate the adequacy of their implementation at Three Mile Island. These are but two of a number of procedures, all of which have several features in common. Other similar procedures include the military's Instructional Systems Development (ISD) (probably the most widely used), Systems Analysis of Training (SAT) (Bryan, Regan, 1971), and Training Situation Analysis (TSA) (Regan, 1961). These procedures address the need to analyze jobs, to develop training objectives, to design training, to evaluate training and performance, and to relate these activities closely to each other. All of these procedures tend to focus on what to do and to say very little about how to do it. Since, for the most part, these procedures are not reduced to a detailed set of how-to-do-it rules, significant training of those who use the procedures is important to assure that the procedures are used correctly. Finally, implementation of these procedures is important, but it varies widely and is often found wanting. Accordingly, any attempt to rely upon these procedures must demonstrate both that the users have been adequately trained in their use and that the procedures are being implemented correctly. Simply invoking a procedure does not assure that the training program is adequate.

Moreover, the rebuttal testimony itself may reveal misunderstanding on the part of GPUN with respect to the TSD approach. In Answer 8, Dr. Knief and Mr. Leonard explain that the development and implementation phases of the approach were conducted effectively, but that the analysis, design, and evaluation phases needed work. It is difficult to understand how these three aspects could be found wanting while the others were adequate. In the TSD model, training content and technology are determined by the completion of steps 1 and 2 (analysis and design), and verified by step 5 (evaluation) after steps 3 and 4 (development and implementation, which GPUN states were adequate at the time). In other words, it is a linear process in which the adequacy of later steps cannot be determined if the earlier steps were not adequate.


Q.4. At pages 12-13 of their testimony, Dr. Knief and Mr. Leonard argue that GPUN cannot rely upon certain statistical approaches for validating tests or standardized methods for designing training programs, so that they must develop and evaluate their training program by other means. Do you agree?

A.4. I agree that there may be practical limits on the use of statistical or other approaches. That is one reason why I said in my testimony that a procedure such as the Instructional Quality Inventory (IQI) can serve as an intermediate indication of the effectiveness of training in the absence of adequate statistical measures.

The emphasis in the rebuttal testimony on the fact that GPUN cannot rely upon such tools as statistical reliability highlights the need to implement procedures such as the IQI. It is precisely because these tools are not available that it is even more important to be explicit in determining that the training and measures of training are explicitly keyed to tasks, objectives, training content, and job performance.

Q.5. Have you read the Rebuttal Testimony of the Reconstituted OARP Committee, filed on November 28, 1984?

A.5. Yes.


Q.6. In Answer 24, the Committee states that it disagrees with your view that all of the issues that you raise "must be examined in evaluating a training program such as one for a nuclear power plant." What is your response to that statement?

A.6. This is an area in which I fundamentally disagree with the Committee. If someone is charged with the responsibility of evaluating a training program to determine whether that program produces people who will perform well in the jobs for which they are being trained, I believe that the review and evaluation would have to include the elements that I have identified. That is particularly true where the review and evaluation are intended to be relied upon by a government agency in the context of a decision that may have a significant impact on the public health and safety.

Of course, I would not argue that the issues must be examined in the precise manner that I describe, or that they must be defined precisely as I have defined them. But it would be necessary to address the issues as I have discussed them in order to reach a reasonably reliable conclusion about the adequacy of a training program, and particularly about the quality of the performance that can be expected from trainees.

I would have been glad to address any particular points that the Committee believes need not be considered in a review of the TMI-1 training program. However, the Committee in this response has not identified any that it believes are not necessary or that it believes should be addressed in some manner other than the one I suggested. This points up one of the difficulties that I have with the Committee's work. In order to undertake a reliable review of this training program, one should at least develop a model of how to go about such a review and then tailor the model to the program. Otherwise, the program itself tends to direct the review and to bias conclusions in favor of what is already in the program, as opposed to what should be in the program.


Q.7. In Answer 25, the Committee argues that documentation, standardization, and formality in general are unnecessary and counterproductive for a small program such as this, where trainees and managers are in close and constant contact. Do you agree?

A.7. No. I disagree both with the way that the Committee has characterized my testimony and with the Committee's substantive conclusions on this point.

First, I have not commented on how the activities that I discussed in my testimony should be documented, nor have I proposed that the training program should follow a standardized format (ISD as opposed to TSD, for example), because there are a number of formats for arriving at a valid training program.

Thus, the Committee's discussion of my testimony creates an impression of formalism and bureaucratic burden that is greater than I intended.

Far more important, however, I strongly disagree with the Committee's conclusion that the small size of the program and the close relationships of the participants render documentation and formality of evaluation unnecessary. To the contrary, this is precisely the type of situation in which it is vital to follow explicit, repeatable, state-of-the-art training procedures in order to assure that evaluations are objective rather than intuitive.

At Three Mile Island, there are relatively few people involved in the program, and even those people change roles, as

when operators write questions for the examinations. In this situation, as I discussed at page 12 of my testimony, for example, there is an even greater danger than in larger programs that extraneous considerations such as personal relationships will significantly influence evaluations of trainees or job incumbents. In addition, skilled practitioners (subject matter experts) such as Senior Reactor Operators are not necessarily good teachers or good evaluators. The skills required to learn a job, to do a job, and to teach others to do a job are not the same.

Thus, in addition to subject matter experts, educational specialists and repeatable, state-of-the-art, explicit procedures are the ingredients for designing, executing, evaluating, and revising an adequate industrial training program such as the one at Three Mile Island.

At the top of page 16, the Committee makes a comment that leads me to reemphasize the points I have just made. The Committee argues that GPUN training program managers are extremely familiar with the environments from which trainees are drawn, so that the skill and knowledge of incoming students is usually well understood. This reflects the assumption that people from similar backgrounds, such as the Navy nuclear program, are similarly skilled and equal performers, or that their competence can be judged solely from their backgrounds.

That simply is not the case. People with similar backgrounds exhibit substantial individual differences, which should be assessed through some objective measure. The fact that the Committee, and perhaps the company, do not recognize the need to measure these differences emphasizes the need to rely upon explicit and objective measures rather than upon the intuition of program managers.

At the end of Answer 25, the Committee states that none of the methods that I suggest should be used exclusively. I certainly agree. What is necessary is a proper mix of all of the methods.

Finally, I have now seen the rating definitions referred to in Answer 26 of the Committee's testimony. Such definitions would not substantially improve the usefulness of the ratings (many are dictionary definitions of traits), although they might be somewhat helpful if supplemented with behavioral examples. I still believe that rating activities of job holders, rather than traits or constructs, is easier to do, more likely to be consistent, more useful to management, and fairer to all concerned.


December 17, 1984

UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION

In the Matter of                        )
                                        )
METROPOLITAN EDISON COMPANY             )    Docket No. 50-289
                                        )    (Restart Remand on
(Three Mile Island Nuclear              )     Management)
 Station, Unit No. 1)                   )
                                        )

CERTIFICATE OF SERVICE

I hereby certify that copies of the SURREBUTTAL TESTIMONY OF DR. JAMES J. REGAN were served on those indicated on the accompanying Service List. Service was made by deposit in the United States mail, first class, postage prepaid, on December 17, 1984, except that those indicated by an asterisk were served by hand.

William S. Jordan, III

UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION

BEFORE THE ATOMIC SAFETY AND LICENSING BOARD

In the Matter of                        )
                                        )
METROPOLITAN EDISON COMPANY             )    Docket No. 50-289
                                        )    (Restart Remand on
(Three Mile Island Nuclear              )     Management)
 Station, Unit No. 1)                   )
                                        )

SERVICE LIST

Administrative Judge Gary J. Edles, Chairman
Atomic Safety & Licensing Appeal Bd.
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

*Jack R. Goldberg, Esq.
Office of the Executive Legal Dir.
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

*Administrative Judge John H. Buck
Atomic Safety & Licensing Appeal Bd.
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Deborah Bauser, Esquire
Shaw, Pittman, Potts & Trowbridge
1800 M Street, N.W.
Washington, D.C. 20036

Administrative Judge Christine N. Kohl
Atomic Safety & Licensing Appeal Bd.
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Ms. Louise Bradford
TMI Alert
1011 Green Street
Harrisburg, PA 17102

*Administrative Judge Ivan W. Smith, Chairman
Atomic Safety & Licensing Board
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Joanne Doroshow, Esquire
The Christic Institute
1324 North Capitol Street
Washington, D.C. 20002

*Administrative Judge Sheldon J. Wolfe
Atomic Safety & Licensing Appeal Bd.
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Mr. and Mrs. Norman Aamodt
R.D. 5
Coatesville, PA 19320

Administrative Judge Gustave A. Linenberger, Jr.
Atomic Safety & Licensing Board
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

*Lynne Bernabei, Esq.
Government Accountability Project
1555 Connecticut Ave.
Washington, D.C. 20009

Docketing and Service Section
Office of the Secretary
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Michael F. McBride, Esq.
LeBoeuf, Lamb, Leiby & MacRae
1333 New Hampshire Ave., N.W. #1100
Washington, D.C. 20036

Michael W. Maupin, Esq.
Hunton & Williams
707 East Main Street
P.O. Box 1535
Richmond, VA 23212

Thomas Y. Au, Esq.
Office of Chief Counsel
Department of Environmental Resources
505 Executive House
P.O. Box 2357
Harrisburg, PA 17120
