ML20133N756
| ML20133N756 | |
| Person / Time | |
|---|---|
| Site: | Prairie Island |
| Issue date: | 08/07/1985 |
| From: | Morisseau D Office of Nuclear Reactor Regulation |
| To: | Booher H Office of Nuclear Reactor Regulation |
| References | |
| NUDOCS 8508130605 | |
| Download: ML20133N756 (43) | |
Text
DISTRIBUTION:
Central Files
JPersensky
DMorisseau

August 7, 1985

MEMORANDUM FOR:
Harold R. Booher, Chief, Licensee Qualifications Branch, Division of Human Factors Safety

THRU:
Julius J. Persensky, Section Leader Personnel Qualifications Section Licensee Qualifications Branch, DHFS FROM:
Dolores S. Morisseau, Training and Assessment Specialist, Personnel Qualifications Section, Licensee Qualifications Branch, DHFS
SUBJECT:
OBSERVATION OF INPO ACCREDITATION TEAM VISIT AT PRAIRIE ISLAND NUCLEAR POWER PLANT

Introduction

During the week of July 15 - July 19, Bill Guldemond, Region III, and I were NRC observers during the INPO Accreditation Team Evaluation of the following Prairie Island Nuclear Power Plant Training Programs:
Licensed RO and SRO Training Program
Licensed Operator Requalification Training Program
Electrical Maintenance Training Program
Mechanical Maintenance Training Program
Non-Licensed Operator Training Program

During his opening remarks at the Prairie Island Training Center, the Team Leader emphasized that the training programs were to be evaluated against INPO's criteria and not compared to those of other plants. Team members were to gather data and prepare a report based on that data. The Team Manager and the Accreditation Department Manager or his assistant will return in three to four weeks to ensure the technical accuracy of the report. The utility will prepare responses to the report, especially with respect to areas of concern identified during the team evaluation.
The responses will be incorporated into the report and the report submitted to INPO's Accreditation Board. The Board then deliberates, makes a decision, and INPO forwards the decision to the utility. This process will take five to six months. The Assistant Team Manager, Ron Fritchley, noted that the team does not make the decision regarding accreditation but has input to the deliberation process through its report based on its data collection. He also assured Prairie Island training personnel that the NRC observers were there
OFFICIAL RECORD COPY
to observe INPO, not Prairie Island. A listing of the INPO Evaluation Team members is enclosed with this report.
During this opening session, Prairie Island training personnel gave a slide presentation on the use of Job/Task Analysis in the development of craft training programs. It was noted that only parts of the program had gone through all the steps of JTA development.
The Accreditation Process

The INPO Evaluation Team is divided into two areas - Process and Program. Evaluators in the Process group gather data concerning the management and facilities that support the training operation. Program Evaluators look at the technical content of the program. The task of the team is to look for evidence that supports the material in the utility's Self-Evaluation Report, which is submitted to INPO prior to the team visit. Before data collection begins, the Peer Evaluators, training and technical personnel from several utilities, receive training from INPO Team Leaders. They are briefed on interview techniques and data gathering strategies. Team members in the Process and Program areas were urged to support each other's data collection by avoiding duplication of effort wherever possible and by conducting joint interviews. Team members were instructed to ensure that they obtained a good representative sample of lesson plans and other supporting documents when gathering data.
Each day, the Process and Program Teams met separately. Team members submitted a Daily Evaluation Update (form enclosed) to their respective Team Leaders, noting areas of possible concern. Team Leaders would review the concerns, especially with respect to relating concerns to specific criteria. After the review, the two groups, Process and Program, would meet jointly and exchange their findings for the day, comparing notes, filling in gaps in other team members' findings, and planning their activities for the following day.

The Team Leader and Team Manager also used this time to bring up areas of particular concern or areas that needed emphasis during the team's data collection. Team members were instructed to particularly look at exam security and feedback mechanisms throughout the system at Prairie Island.
Interviews

I sat in on a number of interviews during the week. Team members used interview questions (enclosed) that were tied to the INPO Accreditation Criteria. (Note that Prairie Island was being evaluated against INPO's old criteria. According to the Team Leader, this will be the last time that this occurs.) The first interview I attended was conducted by the Process Team Leader and a Peer Evaluator from his team. The purpose of the interview was
to gather data about the methodology used for determining the completeness of the task list for the licensed and non-licensed operator training programs.
The Peer Evaluator followed his list of questions; his Team Leader probed for more detail when he felt he wanted more information. Both interviewers were trying to determine exactly how the decision was made as to which tasks were appropriate and which were not. They probed for the mechanisms and cross checks that were used in finalizing the task list. When they spotted inconsistencies or gaps in the available documentation, they asked for more documentation to support the methodology.
The second interview I observed was conducted by the Training Staff Evaluator. He addressed his questions to the training staff concerns identified in Prairie Island's Self-Evaluation Report. In particular, he was concerned with how instructors received feedback from trainees and the Operations Department. He asked for actual forms that had been filled out, although he did not attempt to evaluate the contents of the comments. In addition to thorough questions about the feedback mechanisms, he asked the respondent to describe a scenario for when an instructor had been identified as having an instructional or technical weakness. He also attempted to determine the causes for a lack of instructor preparation time, e.g., insufficient staff, inability to fill slots, inability to get new slots approved. When he interviewed a second member of the training staff management, he used the same set of questions to obtain the needed information. He also asked a number of questions about the multi-path system for certifying instructors. He continued to expand on his questions until he was sure he understood the system and the lines of responsibility for certifying instructors. Another concern to which he addressed questions was instructor development. Specifically, he asked questions about programs being implemented to bring instructors up to speed in ISD/TSD skills.
I also observed interviews in the plant with management in the maintenance area. These questions were primarily addressed to the implementation of the use of qual cards for maintenance tasks.
Observation of Classes

One Team member who sat in on classes not only followed along with the lesson plan, but used the Training Department form for evaluation of the class and the instructor. Throughout the week, Team Leaders reminded Evaluators of the importance of ensuring that classes followed lesson plans. Emphasis was also placed on the quality of the lesson plans themselves, and particularly on the quality of learning objectives. I also observed one Team member who sat in on a class, followed the lesson plan, and interviewed the class members after the lecture was over. He asked if they saw any difference in the material in the new requalification training cycle. He also asked questions related to the quality of feedback mechanisms, quality of instruction, and the relevance of the content of the instruction.
Conclusions

The INPO Evaluation Team members are clearly qualified to evaluate their specific areas of concern.
The process is extremely labor intensive. It appears to be more efficient than was observed in previous visits in that Evaluators conducted their interviews in teams of Process and Content Evaluators whenever possible. When it was not possible to do this, an Evaluator would ask his counterpart on the other part of the team to track down some parts of his or her documentation.
The wrap-up meetings at the end of the day were an effective method of avoiding duplication of efforts. Often one Team member's concerns or open items could be resolved by another Team member's efforts for the day.
Peer Evaluators indicated that they had sufficient time to prepare for the Prairie Island evaluation visit.
The INPO Team clearly communicates all open items to the utility at the exit interview.
Concerns

INPO's guidelines for the content of specific programs may provide guidance to Technical Evaluators, but they do not appear to be tied into the guidelines and criteria for accreditation. There is also no provision in the guidelines to evaluate retraining in basic job skills for continued training of maintenance personnel.

My overall concern is that this evaluation was premature. Prairie Island barely has training programs in place that follow the TSD model. Some of the training procedures had only been completed several days before the Team's arrival. Throughout the interviews with utility personnel, it became clear that all the programs developed on the TSD model had barely been implemented. Personnel could not tell how well programs were working because there were no results as yet. This was especially evident in the area of maintenance programs. INPO personnel indicated that this was because there was not sufficient information in the utility's self-evaluation report to determine how recently TSD procedures had been put in place. They stated that even if this was a premature evaluation, it would provide an opportunity to help the utility get its programs implemented correctly. I asked why that couldn't have been done in one of INPO's preliminary assistance visits. A certain amount of defensiveness on the part of Prairie
Island's training personnel was noted at debriefings each morning.
If this had been an assistance visit rather than an evaluation for accreditation, this might not have occurred. If INPO reviews a self-evaluation report and finds it lacking information, they should request clarification or additional information before evaluating.
The NRC observer from Region III indicated that he believes that INPO's exclusion of management training from their evaluations is a weakness in their overall program. He also indicated that INPO's concentration on their criteria to the exclusion of all else results in too narrow a focus. Areas of concern that don't match any specific criterion may have long-range implications for the successful implementation of training programs. I concur. More than once during the wrap-up meetings each day, the Team Manager would remind Evaluators that if an item of concern couldn't be tied to a criterion, then it shouldn't be a concern of the Team. I understand that INPO must evaluate against their established criteria, but other concerns that may have long-range implications for training should at least be noted. INPO's response to this was that we should keep in mind that this was not a "one-shot" effort, i.e., they could resolve problems that crop up at some later visit.
At least among that segment of the public that concerns itself with the nuclear industry, the INPO accreditation of nuclear training programs has been of great interest. INPO has indicated that accreditation will result in excellence in training. If accreditation is awarded to utilities that are only in the early stages of implementing programs on the TSD model, the public may be led to an inappropriate conclusion. There is no evidence to support the judgment that a utility training program is excellent if it can achieve accreditation based on an incomplete system that has yet to turn out any end products, i.e., a group of trainees who have completed an entire training cycle based on the TSD model. We mentioned this concern to the INPO Team Leaders. Their response was that in the interest of meeting the goal of accrediting all utility programs by the end of 1986, it was necessary to lower their benchmark. They stated that they would raise the benchmark for the next round of evaluations for accreditation. They also reiterated the comment made at the May 22 training session in Atlanta, i.e., it doesn't matter if all the criteria are not met, as long as the overall objective is achieved. This seems to be counter to the emphasis placed on the importance of being able to tie identified concerns to specific criteria. Either the criteria are important or they are not.
INPO has stated their wish that NRC observers be present through the entire process. In the interest of conserving resources, I believe it would be just as beneficial if we observed the process after it was in progress for one or two days. There is more to be gained by observing
the process after the team has gathered some initial data. It is essential to be present at the latter half of the process to determine how concerns are resolved and how open items and concerns are presented to the utility at the exit debriefing.
Dolores S. Morisseau, Training and Assessment Specialist, Personnel Qualifications Section, Licensee Qualifications Branch, DHFS
Enclosure: As stated
Assignments for the Prairie Island Accreditation Team Visit

Ashley Erwin - Team Manager
Ron Fritchley - Team Manager in Training
Mike Sakmar - Lead Process Evaluator and Process Evaluator - Training Staff
Charlie Felton - Lead Process Evaluator in Training
Dan Poling - Process Evaluator - Organization and Administration, and Resources and Facilities
Al Haeger (Peer - Commonwealth Edison Co.) - Process Evaluator - Program Development, Non-Licensed Operator Training
Pat Wilson - Process Evaluator - Program Development, Licensed Operator Training
Ron Hollen (Peer - Kansas Gas & Electric) - Process Evaluator - Program Development, Mechanical/Electrical Training
W. A. Pruett - Lead Program Evaluator and Program Evaluator - Licensed Operator Requalification Training
Gary Pollard - Lead Program Evaluator in Training
John Richmond - Program Evaluator - Non-Licensed Operator Training
Clyde Brewer (Peer - Tennessee Valley Auth.) - Program Evaluator - Licensed Operator (RO, SRO/SS) Training
Joe D. Carter (Peer - Baltimore Gas & Electric) - Program Evaluator - Electrical Maintenance Training
Quinton Williams (Peer - Alabama Power Co.) - Program Evaluator - Mechanical Maintenance Training

OBSERVERS

Cordell Reed - Vice President, Commonwealth Edison Company, and Chairman, INPO Accrediting Board
Dolores Morisseau - NRC Headquarters Staff
Bill Guldemond - NRC Region 3 Staff
DAILY EVALUATION UPDATE

Please complete this form prior to the daily group meeting.

DATE:
EVALUATOR:
CRITERIA AREA ASSIGNED:

CONCERNS (Briefly state your area(s) of concern, what appears to be the root problem(s), evidence of the problem, and be prepared to reference the applicable INPO Accreditation Criteria.)
No.   Concern   Acc. Criteria   Update

(Continued On Reverse)

No.   Concern   Acc. Criteria   Update
TRAINING OBSERVED - CLASSROOM/SIMULATOR/LABORATORY/OJT

POSITIVE POINTS

ITEMS FOR OTHER TEAM MEMBERS
INPO ACCREDITATION EVALUATOR'S DATA GATHERING QUESTIONS
Training Organization Goals, Objectives, and Plans

Criteria: III.A.2.a. (1), (2), (3)

a. Training organization goals, objectives, and plans
(1) Goals, objectives, and training plans are clearly and concisely stated.
(2) The goals, objectives, and training plans are reviewed periodically and revised as needed.
(3) The training organization's performance is evaluated against established goals and objectives.
Questions:

1. Does a formal, written statement of goals and objectives exist? What are they?
2. Are the objectives specific, measurable, and results oriented?
3. Is there a training plan to accomplish the goals and objectives? What does it or other site documents say as to how objectives will be operationalized?
4. What use is made of the goals and objectives? By whom? Are the goals and objectives understood by all?
5. How often are the goals and objectives revised? Who revises, based on what input?
6. What methods of organizational evaluations are specified?
7. How are the various reviews used to evaluate performance? Does the evaluation measure the progress of the training plan? Is there a direct relationship between stated goals and objectives and the measurement tools?
8. What is the mechanism for the feedback of evaluation results? What is the cycle? What happens if a goal or objective is not met?
Organizational Structure

Criteria: III.A.2.b. (1), (2), (3)

b. Organizational structure
(1) The responsibilities and authority of training personnel are clearly defined in writing.
(2) The relationship of the training organization to the remainder of the corporate structure is clearly defined in writing.
(3) The training organization is structured to enable training personnel to perform their assigned tasks.

Questions:
1. Do written job descriptions exist? Are job functions, duties, responsibilities, accountabilities, qualifications, and authority covered?
2. Who writes these job descriptions? Who approves them?
3. Does the organizational chart agree with content of the position descriptions?
4. How does training effectively support production and operations as organized?
5. How is the organization planned and directed to ensure cooperative decisions?
6. Is each organizational unit assigned specific responsibilities? What are they? Does each organizational unit have sufficient authority to carry out its responsibilities?
7. How is training involved in training decisions? Does an effective decision-making process exist? How does it work?
Procedures

Criteria: III.A.2.c. (1), (2)

c. Procedures
(1) Written training procedures are established and implemented for the following:
(a) development, implementation, and revision of training programs and materials
(b) maintenance of training records
(c) evaluation of training effectiveness
(d) development and evaluation of the instructional staff
(e) assessment of trainee knowledge and skills
(f) use of feedback from on-the-job performance of trainees
(2) Training procedures are reviewed and revised as needed.

Questions:
1. Does a procedure exist for each criterion item?
2. How are policies and procedures formalized, published, and distributed? Is sufficient detail provided to be of useful guidance to staff?
3. Are procedures content specific to ensure uniformity and consistency? Do actual practices agree with written policies and procedures?
4. Do procedures specify the operation of training programs in a systematic manner?
5. How are procedures reviewed and revised? How often? By whom?
Recordkeeping

Criteria: III.A.2.d. (1), (2), (3), (4)

d. Recordkeeping
(1) Records of training programs and trainee performance are maintained for effective entry and retrieval.
(2) Records relating to training programs permit review of content, schedule, and results of past and current programs.
(3) Individual trainee records include a history of trainee performance and permit verification of attainment of required qualifications.
(4) Retention periods are specified for training records.

Questions:
1. What procedures control recordkeeping? Who has access to records? Are back-up files maintained? Where are records kept?
2. What is included in a program record? Determine completeness.
3. What is included in a trainee record? Determine completeness. Can a trainee record permit review of pre-entry qualifications and testing, training performance, and on-the-job performance (appropriate to training)?
4. What is the retention period? How is it determined?
Contracted Training

Criteria: III.A.2.e. (1), (2)

e. Contracted training
(1) Contracts for training programs and services are consistent with the training organization's goals, objectives, plans, and policies.
(2) Programs offered under contract remain under the control of the sponsoring utility and are monitored by it to ensure that the INPO Accreditation Criteria are met. Utility monitoring and control should include the following:
(a) approval of training program content, evaluation instruments, performance standards, and instructional materials
(b) approval of instructor qualifications
(c) periodic monitoring of instruction both on-site and off-site by supervisory personnel
(d) retention of all training records
(e) trainee evaluation of instruction submitted to the utility

Questions:
1. How do the contracts support the training organization goals, objectives, plans, and policies?
2. How does the utility monitor and control the above mentioned items if contracted?
3. Do procedures exist to cover all aspects of contracted training? Describe.
Physical Facilities and Equipment

Criteria: III.B.2.a. (1), (2), (3), (4), (5), (6), (7)

a. Physical facilities and equipment
(1) Classroom facilities meet current training needs.
(2) Laboratories and shop facilities are available for scheduled training and are appropriately equipped to provide hands-on training for trainees.
(3) A real-time, full-scope control room simulator is available for providing hands-on training in recognition and control of normal, abnormal, and emergency plant conditions for the operator's plant. When a plant-specific simulator is not available, the utility should determine the most appropriate simulator available for training.
(4) Office space and furnishings are adequate to accommodate the needs of current training personnel.
(5) The training staff has available necessary audiovisual aids and equipment.
(6) Tools and equipment used in training are similar to those used on the job by the trainees.
(7) Facilities and equipment are adequately maintained.

Questions:
1. How are training needs met with respect to classroom facilities, laboratories, and shop facilities?
2. Does the facility environment facilitate effective learning?
3. What method determines classroom requirements?
4. What is the availability of these facilities?
5. What simulator is used in training? How is it used? What is its availability?
6. What methods are used to determine training requirements for a simulator?
Trainee Evaluation

Criteria: II.B.3
A systematic method exists for evaluating and documenting the level of competence of individuals and should include written examinations; oral examinations; practical ability demonstrations; and final, comprehensive examinations, as appropriate.

Criteria: III.C.2.e. (1), (2), (3), (4), (5), (6), (7)

e. Trainee evaluation
(1) Trainees are pretested routinely or otherwise evaluated to ensure that they meet training program prerequisites.
(2) Procedures exist for granting exemptions from training based on previous training or experience when documentation or assessment indicates the individual possesses the required knowledge and skills.
(3) Trainee performance is evaluated regularly against established learning objectives.
(4) Trainee evaluation is conducted throughout each training program.
(5) Trainees whose performance is unsatisfactory are given remedial training, recycled through training, or removed from training.
(6) Procedures exist to ensure that trainees are consistently evaluated, while minimizing possible compromise of examinations prior to administration.
(7) Follow-up evaluations of trainees' on-the-job performance are conducted routinely.

Questions:
1. Are the examination methods used appropriate to the required performance? Do the examination methods accommodate adult learning and working styles (e.g., use of oral exams, practical demonstrations of work performance, questions placed in the context of task performance)?
2. Is testing and feedback conducted on a routine and frequent basis to inform trainees of their status and progress?
3. Are exemptions granted as the result of thorough testing on the basis of competencies gained by experience or background? Are all trainees required to undergo all of the training program? Are entry level skills evaluated? Is course or trainee program scope adjusted as the result of pre-entry evaluation?
4. Are trainees provided with the course learning objectives, evaluated against the objective standards, and informed of status and progress?
5. Is trainee evaluation uniform throughout all aspects of the training program for all trainees? Are criteria for quizzes the same as for exams?
6. Is remedial training provided when needed? Are the program standards made clear to the trainees prior to entering the program? Are alternatives provided to the trainee who fails? When trainees fail, is the program evaluated to determine if problems could be solved?
7. Do procedures exist to assure that each trainee has a fair and equal chance to be evaluated against written standards? What measures are taken to protect against compromise of examinations?
8. Does the program evaluate on-the-job performance of trainees and feed back the results to improve program content and instructional practice of both supervisor and performer?
Program Evaluation

Criteria: III.C.2.g. (1), (2), (3), (4), (5), (6)

g. Program evaluation
(1) A systematic evaluation process exists for all training programs. This evaluation process includes the following:
(a) achievement of stated goals and objectives
(2) Compatibility of the trainees and the program
(3) Development and modification of program content
(4) Sequence of instruction and time allocations
(5) Instructional strategy and materials
(6) Feedback from on-the-job performance

Questions:
1. What questions were posed for the evaluation?
2. What methods and instruments (tests, questionnaires) were used to gather data for the evaluation?
3. What methods are used to analyze and interpret the evaluation data?
4. What were the specific outcomes of the evaluation?
5. Does the program evaluate trainee overall performance, opinions, and preferences to determine if approaches, pace, and scope are appropriate to learner characteristics?
6. How was the evaluation used to confirm or change the program objectives? Content? Instructional materials? Program management? Evaluation process?
7. Does evaluation include a measure of instructional time vs. actual performance levels acquired or improved? Is the sequence of instruction checked to assure enabling objectives are in place for each required terminal performance?
8. Are instructional strategies evaluated to assure that the appropriate practice for required performance is taking place?
9. Are instructional materials evaluated? Are the results used to revise the materials?
Staff Size and Workload

Criteria: IV.A.2.a.-c.

A. Staff size and workload
(a) To implement the training programs, the training staff includes an adequate number of qualified personnel to perform the following activities:
(1) management
(2) supervision
(3) instruction
(4) program development
(b) The workload of the instructional staff allows time for the following:
(1) instructor preparation
(2) materials development
(3) program improvement
(4) trainee evaluation
(5) trainee counseling
(6) staff professional development
(c) The instructor-to-trainee ratio is consistent with the training situation.

Questions:
1. Does the organizational structure and position descriptions provide for the above mentioned activities?
2. Does a manpower plan exist for the training organization?
3. Does a training plan allocate manpower to permit the functions listed above?
4. Does the training schedule reflect time allocation to permit the functions above?
5. Are workload standards established for each instructor?
6. Does the instructional workload address number of class contact hours, number of preparations, and the nature of the instruction demanded by the subject matter?
7. Are there procedures established to deal with class size?
8. Are the purpose of the course, the complexity of the course material, the adequacy of the facility, the competency of the instructor, the characteristics of the trainee, and the cost of the program considered when determining class size?
Staff Qualifications

Criteria: IV.B.2.a.-f.

B. Staff qualifications
(a) The duties and responsibilities of the training staff are clearly defined.
(b) The training manager has appropriate educational, technical, and experience qualifications for the position.
(c) The training/instructional supervisor has appropriate educational, technical, and experience qualifications for the position.
(d) Technical instructors (utility and contracted training) possess the following qualifications:
(1) technical training and experience consistent with the subject matter taught
(2) instructional skills appropriate for assigned instructional duties
(e) Staff members responsible for program development have appropriate education, training, and experience qualifications for the position.
(f) When instructors have not yet attained the required instructional qualifications or only instruct occasionally, the training quality is maintained through appropriate and qualified assistance and supervision.

Questions:
Are there descriptions of job functions, duties, accountabilities, and authority for i
1.
responsibilities, each position?
l Does the incumbent training manager and supervisor possess quilifications consistent with the requiremen:s 2.
delineated for the positions?
l Are the qualifications delineated in writing and do : hey and experience 3.
specify the educational, technical, requirements for the positions?
i I
L t
i i
4. Do the incumbent instructor and program development staff possess qualifications consistent with the requirements delineated for the positions?
5. Are the qualifications delineated in writing, and do they specify the technical training and experience and instructional skills for their positions?
6. Does a procedure exist describing how non-qualified instructors are used, how assistance is provided to them, and who is responsible for supervision and evaluation?
C. Staff Development and Evaluation
Criteria: IV.C.2.a.-e.
(a) Initial and continuing training of instructional staff members is developed and implemented.
(b) Criteria and procedures for evaluating and ensuring the qualifications of the training staff are established and implemented.
(c) Instructor performance is evaluated at least annually.
(d) Opportunities for continuing development are provided for training staff in response to both individual and organizational needs.
(e) Those who regularly provide technical training maintain their technical qualifications and their familiarity with job requirements through in-plant activities.
Questions:
1. How does the instructional skills training prepare an individual to plan, develop, conduct, and evaluate instruction?
2. Is there technical skills training to develop the technical qualifications for specialized subject area assignments?
3. How does the continuing training help instructors to maintain, improve, and advance their instructional skills?
4. How does the continuing training help instructors maintain, improve, and advance the technical knowledge for which qualification is to be maintained?
5. How do instructors maintain familiarity with job requirements, plant changes, operating experiences, and technical specifications?
6. What are the selection criteria and evaluation methods and procedures?
PROGRAM CONTENT: CLASSROOM
1. How is the classroom training program purpose, content, and schedule described?
2. Is the content described in the procedure the same as the content presented during the last program?
3. How is the program subject matter sequenced? Is it arranged in an orderly sequence and in small enough segments to achieve the stated learning objectives?
4. Is the students' classroom performance used to evaluate the training program? Is the students' job performance used to evaluate the training program? Are the results of the evaluations used to modify the program content? Describe the evaluation methods.
5. Are the program entry level requirements consistently applied and documented?
6. How is contracted training program content monitored and controlled?
7. How is classroom training integrated into the total training system? Who has the responsibility of scheduling to ensure that this program does not conflict with other programs using the same services?
8. How are plant modifications, industry events, plant events, and procedure changes incorporated into classroom training?
9. Does the classroom content support the OJT Training and Simulator Training programs?
PROGRAM CONTENT: ON-THE-JOB TRAINING
1. How is the OJT program purpose, content, and schedule described? How is the OJT conducted?
2. How were the tasks developed that are listed on the qualification guide?
3. Are the tasks listed on the qualification guide performance based? (Provide copies of guides.)
4. Do the qualification guides include the desired performance criteria for successful completion of each task? If not, how are the performance criteria described?
5. What type of training or certification does each evaluator and instructor receive?
6. How are plant modifications, industry events, plant events, and procedure changes incorporated into OJT training?
7. Describe the methods used by the Plant Training Staff to monitor and control the on-the-job training program.
8. Describe feedback used to ensure OJT is performance based. Are evaluations performed after the operator is working in the position?
PROGRAM CONTENT: LESSON PLAN
1. Complete the attached checklist.
2. Are the learning objectives written to the appropriate level for the target population? Do they support the stated training objectives (goals)?
3. Does the lesson plan content support the stated learning objectives?
4. Are the learning objectives complete?
5. Are the learning objectives performance based? Are the objectives derived from a JTA?
6. Are exercises included in the lesson plans to further explain the learning objectives?
7. Does trainee material exhibit a professional quality and aid the student in mastering the learning objectives?
EXAMINATION CONTENT: WRITTEN
1. Do the written examination questions address the lesson plan objectives?
2. Does an exam bank of questions exist so that a variety of exams can be given?
3. Are written examinations supported by complete answer keys?
4. Is the written examination grading consistent and to the appropriate level?
5. Does the content of the written examinations reflect the scope and depth of the material presented?
6. How are the written examination results recorded and maintained?
7. How are the written examination results used?
8. How is the security of the written examinations maintained?

ORAL EXAMINATIONS
1. How are the oral examination questions developed and documented?
2. How is the consistency of the oral examination questions maintained between examiners?
3. How is the consistency of grading maintained?
4. Does the content of the oral examinations reflect the scope and depth of the material presented?
5. How are the oral examination results used?
6. How is the security of the oral examinations maintained?
PROGRAM CONTENT: SIMULATOR
1. How is the simulator program purpose, content, and schedule described?
2. Is the content described in the procedure the same as the content used in the last program?
3. How is the program subject matter sequenced? Is it arranged in an orderly sequence and in small enough segments to achieve the stated learning objectives?
4. Does the simulator program identify certain tasks that must be completed successfully prior to the completion of the course? (Minimum acceptable criteria.)
5. What methods are used to control simulator operation and instructor performance? (Attach documents.)
6. Describe or attach a copy of documents used by the simulator staff to conduct training. (Exercise guides, lesson plans.)
7. How are plant modifications, industry events, plant events, and procedure changes incorporated into the simulator training?
8. Describe the methods used by the plant staff to monitor the simulator training program.
9. What methods are used to mitigate the effect of plant and simulator differences?
10. How is the contracted training program content monitored and controlled?
11. What training is provided to simulator instructors to enhance their instructional abilities?
12. Is the students' simulator performance used to evaluate the training program? Is the students' job performance used to evaluate the training program? Are the results of the evaluations used to modify the program content? Describe the evaluation methods.
EXERCISE GUIDE CONTENT (SIMULATOR):
1. Are the exercise guide learning objectives based on a JTA?
2. Do the exercise guides contain, as a minimum, those items listed in the INPO Guidelines (INPO 82-005)?
3. Are the objectives written to the appropriate level for the target population?
4. Does the exercise guides' content support the stated learning objectives?
5. Do the exercise guides describe the simulator instructor's performance?
6. Is enough detail provided in the exercise guide to ensure that each training group receives similar training?
SIMULATOR EXAMINATION:
1. How are exercises developed for the simulator examination?
2. How is the examination graded? Have specific criteria been identified to establish a pass/fail grade?
7. How is the simulator modified to keep it as identical as possible to the simulated plant?
8. How are plant differences handled on the non-plant-specific simulator?
9. Assess the adequacy of space and furnishings with respect to personnel needs.
10. What aids and equipment are available? What production capabilities are available? Assess the adequacy of aids and equipment with respect to training needs, staff capabilities, and physical conditions.
11. What methods are used to determine training requirements for tools and equipment? How are these modified to stay similar with plant tools and equipment? What is the availability of these tools and equipment?
12. What procedures are followed to maintain facilities and equipment? How is the effectiveness of the maintenance program assessed?
Reference Materials
Criteria: III.C.2.b.(1), (2), (3)
b. Reference materials
(1) Technical materials such as the following are available for instructor and trainee reference to support training:
    (a) academic texts
    (b) engineering handbooks
    (c) power plant equipment technology texts and training materials
    (d) nuclear power plant technology texts and training materials
    (e) plant-specific documents
    (f) trade, technical, and engineering journals
    (g) technical regulations, guides, codes, and standards
    (h) training-related regulatory guides, industry standards, bulletins, and guidelines
    (i) materials describing significant industry events
    (j) reference materials on instructional technology, industrial training, and related topics
(2) Technical reference materials have the following characteristics:
    (a) coverage of topics at the level appropriate for the programs and the trainees
    (b) application to the systems and equipment at the plant
    (c) current information with respect to plant modifications
(3) Superseded reference materials are removed from use.
Questions:
1. How are technical materials selected, acquired, and organized? How are they maintained in order to furnish easy access and fulfill program objectives? How are instructors involved in building technical materials?
2. How are materials used and by whom? Evaluate the adequacy of service.
3. What measures are used to ensure the relevance of resources to the program content and performance required? Are resources current and broad in coverage? Evaluate the adequacy with respect to diversity and depth of materials.
4. What means exist to identify and remove outdated reference materials? What is the review cycle?
Program Development
Criteria: III.C.2.a.(1), (2)
a. Program development
(1) The development and modification of training programs are performed systematically.
    (a) Program content and depth of knowledge are based on an analysis of the knowledge and skills required for the job position that trainees will occupy.
    (b) Programs are developed giving consideration to the entry skills and knowledge of trainees.
    (c) On-the-job performance problems are analyzed to determine training needs.
    (d) Industry events are analyzed to determine training needs.
    (e) Alternative means of accomplishing training needs are considered, as necessary.
(2) Both training personnel and plant technical personnel participate in identifying training needs and developing training programs.
Questions:
1. What evidence exists that indicates that a systematic effort is used to develop and modify training programs?
2. What process is used for determining program content?
3. How are the entry level knowledge and skills of the trainee used in developing and modifying training programs?
4. How are on-the-job performance data used in program development and modifications?
5. How are industry events analyzed and used as sources of data for program development and modification?
6. What evidence exists that indicates that alternatives to training as well as alternative training methods are considered when training programs are developed and modified?
7. To what extent do both training and plant personnel have input into training program development and modification?
Learning Objectives
Criteria: III.C.2.b.(1), (2), (3)
b. Learning objectives
(1) Learning objectives are developed from the systematic analysis of knowledge and skills required for the job position and are stated clearly and concisely.
(2) Learning objectives are specific, measurable, and job-performance-based, and they are arranged in a logical sequence.
(3) Learning objectives are used to accomplish the following:
    (a) determine instructional delivery methods and materials
    (b) determine acceptability of trainee performance
Questions:
1. Are behavioral learning objectives (which specify what the trainee, not the instructor, is supposed to do) developed from an analysis of job performance? Are the objectives written with conditions-actions-standards?
2. Are conditions-actions-standards clearly stated so that the trainee can determine when he/she has acquired the correct performance?
3. Are the learning objectives arranged in a logical step-by-step sequence to show the entry level, the learning steps, and the required terminal performance?
4. Does the sequence of objectives include a TPO? Does the sequence of objectives include enabling objectives for each TPO?
5. Is there a meaningful relationship between the objectives to permit a determination of the lower level skills and knowledge needed for a higher level complex performance?
6. Do higher level complex performances include making judgments and decisions related to job performance in an emergency situation?
7. Are objectives used to select teaching practices and trainee exercises appropriate to the performance to be learned?
8. Are criterion-referenced test items written to cover the objectives' performance and standard?
9. Is the performance required in the test actually taught in the course?
10. Is content written to cover the performance called for by the objectives? Do the objectives cover all the learning required by the content?
11. How is job and task analysis reflected in the content? Is the flow from JTA to content and classroom instructional practice evident?
Instructional Materials
Criteria: III.C.2.d.(1), (2)
d. Instructional materials
(1) Course descriptions exist for each training program and describe an acceptable level of trainee performance and the method of evaluation.
(2) Lesson plans exist for all courses and should include the following:
    (a) suggested instructor teaching activities appropriate for the content and objectives
    (b) appropriate instructor references, trainee texts, and trainee references
    (c) suggested evaluation methods and instruments for determining trainee performance
    (d) trainee text materials appropriate for the trainees who use them
    (e) audiovisual materials to support the objectives of each program
Questions:
1. Are written descriptions of the course objectives, course content, test items, instructional methods and techniques, and provisions for feedback written and used by the instructor?
2. Does a written lesson plan exist which is used by the instructor and describes:
    o the teaching strategy?
    o the learning activity?
    o the content being covered?
    o available instructional materials including texts, references, and media?
    o evaluation methods including feedback and correction, questions and topics for discussion, and quiz and/or testing methods?
    o supplementary text and/or handout materials used as part of the lesson or for additional study?
    o audiovisual materials and specific objectives for their use in the lesson?
3. Do learning strategies include a mixture of appropriate techniques such as the use of small groups, worksheets, transparencies, slides, motion pictures/videotape, computer-based instruction, part-task simulators, or actual equipment?
Instruction
Criteria: III.C.2.c.(1), (2), (3), (4), (5)
c. Instruction
(1) Methods of instruction are appropriate for the content being taught and the learning objectives.
(2) Subject matter is arranged and taught in an orderly sequence.
(3) Training activities permit trainees to become involved directly in the learning process; the methods of instruction permit maximum instructor-trainee interaction.
(4) Opportunities exist for trainees to work with instructors to obtain remedial help.
(5) New techniques for the improvement of instruction are encouraged.
t Questions:
1.
Are the methods of instruction appropriate to the content and the performance that must be learned?
2.
Has the subject matter content'been arrang'ed in a logical sequence or a sequence building from simple to complex performance?
F 3.
Does the trainee have an opportunity to practice the l
performance and obtain feedback on hcw well he/she is doing?
4.
Do lesson plans (handouts, materials, etc.) require specific trainee responses as a means of encouraging i
j involvement in the practice?
~
5.
Do lesson plans require trainees to be responsible for their own learning program by independent study and investigation?
t 6.
Is the feedback on performance prompt and appropriate?
Are opportunities for tutorial ce remedial assistance from qualified personnel made available?
f-I
~
L
7.
Are different modes of instruction for. the content or
{.
performance made available?
Are learner characteristics assessed and is provision made to accomodate different learning styles?
8.
Are the instructional approaches evaluated at this point?
Are the evaluation results used by the program to develop alternatives which improve the efficiency or effectiveness of the learning experience?