2010-07 Draft Operating Test Comments
ML102230004
Person / Time
Site: Comanche Peak
Issue date: 06/21/2010
From: Kelly Clayton
Operations Branch IV
To: Luminant Generation Co
References
50-445/10-302, 50-446/10-302
Download: ML102230004 (3)


Text

Attachment 10 (OBDI 202 - IOLE Process), Page 1 of 3

CP-2010-07 DRAFT OPERATING TEST COMMENTS - ADMIN JPMS

Matrix columns: JPM#; 1. Dyn (D/S); 2. LOD (1-5); 3. Attributes - IC Focus, Cues, Critical Steps, Scope (N/B), Over-lap; 4. Job Content Errors - Job-Link, Minutia; 5. U/E/S; 6. Explanation (see below for instructions).

Rows: RO (A1), RO (A2), RO (A3), RO (A4), SRO (A5), SRO (A6), SRO (A7), SRO (A8), SRO (A9). The matrix is blank, annotated "N/A since no Admin JPMs administered."

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating tests. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.

1. Determine whether the task is dynamic (D) or static (S). A dynamic task is one that involves continuous monitoring and response to varying parameters. A static task is basically a system reconfiguration or realignment.
2. Determine the level of difficulty (LOD) using the established 1-5 rating scale. Levels 1 and 5 represent an inappropriate (low or high) discriminatory level for the license being tested.
3. Check the appropriate box when an attribute weakness is identified:
* The initiating cue is not sufficiently clear to ensure the operator understands the task and how to begin.
* The JPM does not contain sufficient cues that are objective (not leading).
* All critical steps (elements) have not been properly identified.
* Scope of the task is either too narrow (N) or too broad (B).
* Excessive overlap with another part of the operating test or written examination.
4. Check the appropriate box when a job content error is identified:
* Topics not linked to job content (e.g., disguised task, not required in real job).
* Task is trivial and without safety significance.
5. Based on the reviewer's judgment, is the JPM as written (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory?
6. Provide a brief description of any U or E rating in the explanation column.
7. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.
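The matrix above, read together with items 1 through 6, amounts to a small structured record per JPM. Purely as an illustration (this sketch is not part of the NRC form or NUREG-1021), the following minimal Python model shows one row of that matrix and the checks implied by items 2, 5, and 6; all class, field, and method names are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class Rating(Enum):
    """Item 5: overall disposition of the JPM as written."""
    UNACCEPTABLE = "U"   # requires repair or replacement
    EDITORIAL = "E"      # in need of editorial enhancement
    SATISFACTORY = "S"


@dataclass
class JPMReview:
    """One row of the JPM review matrix (columns 1-6 above)."""
    jpm_id: str                           # e.g. "RO (A1)" or "S1"
    dynamic: Optional[bool] = None        # column 1, Dyn: True = dynamic (D), False = static (S)
    lod: Optional[int] = None             # column 2, LOD on the 1-5 scale
    attribute_flags: set[str] = field(default_factory=set)    # column 3, e.g. {"Cues", "Scope (N)"}
    job_content_flags: set[str] = field(default_factory=set)  # column 4, e.g. {"Job-Link"}
    rating: Optional[Rating] = None       # column 5, U/E/S
    explanation: str = ""                 # column 6, required for any U or E rating

    def discrimination_warning(self) -> Optional[str]:
        """Item 2: LOD 1 or 5 is an inappropriate discriminatory level for the license."""
        if self.lod in (1, 5):
            return f"{self.jpm_id}: LOD {self.lod} is an inappropriate discriminatory level"
        return None

    def missing_explanation(self) -> bool:
        """Item 6: any U or E rating needs a brief description in the explanation column."""
        return self.rating in (Rating.UNACCEPTABLE, Rating.EDITORIAL) and not self.explanation
```

On this particular examination the record stays empty, since no Admin JPMs were administered.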

OBDI 202 - IOLE Process, Page 2 of 3

CP-2010-07 DRAFT OPERATING TEST COMMENTS - CONTROL ROOM/IN-PLANT SYSTEMS JPMS

Matrix columns: JPM#; 1. Dyn (D/S); 2. LOD (1-5); 3. Attributes - IC Focus, Cues, Critical Steps, Scope (N/B), Over-lap; 4. Job Content Errors - Job-Link, Minutia; 5. U/E/S; 6. Explanation (see below for instructions).

Rows: S1 through S8 and P1 through P3. The matrix is blank, annotated "N/A since no JPMs administered."

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating tests. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.

1. Determine whether the task is dynamic (D) or static (S). A dynamic task is one that involves continuous monitoring and response to varying parameters. A static task is basically a system reconfiguration or realignment.
2. Determine the level of difficulty (LOD) using the established 1-5 rating scale. Levels 1 and 5 represent an inappropriate (low or high) discriminatory level for the license being tested.
3. Check the appropriate box when an attribute weakness is identified:
* The initiating cue is not sufficiently clear to ensure the operator understands the task and how to begin.
* The JPM does not contain sufficient cues that are objective (not leading).
* All critical steps (elements) have not been properly identified.
* Scope of the task is either too narrow (N) or too broad (B).
* Excessive overlap with another part of the operating test or written examination.
4. Check the appropriate box when a job content error is identified:
* Topics not linked to job content (e.g., disguised task, not required in real job).
* Task is trivial and without safety significance.
5. Based on the reviewer's judgment, is the JPM as written (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory?
6. Provide a brief description of any U or E rating in the explanation column.
7. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.
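Item 7 describes a two-color resolution workflow: initial comments in black text, resolutions in blue text, with every JPM used on the exam ending at a (S)atisfactory resolution. As an illustration only (not part of the form), here is a small sketch of how that bookkeeping could be represented; the names are hypothetical and the "U"/"E"/"S" strings mirror the codes defined in item 5.

```python
from dataclasses import dataclass


@dataclass
class ReviewComment:
    """One initial review comment (black text) and its resolution (blue text), per item 7."""
    jpm_id: str
    initial_rating: str          # "U", "E", or "S", as defined in item 5
    initial_comment: str = ""
    resolution_rating: str = ""  # conceptually the blue text; "S" once the comment is addressed
    resolution_note: str = ""

    def is_resolved(self) -> bool:
        # A JPM used on the exam must end with a (S)atisfactory resolution.
        return self.initial_rating == "S" or self.resolution_rating == "S"


def unresolved(comments: list[ReviewComment]) -> list[str]:
    """Identifiers of JPMs (or scenario sets) still lacking a (S)atisfactory resolution."""
    return [c.jpm_id for c in comments if not c.is_resolved()]
```

The Scenario Set 2 comment on page 3, for example, starts as an E (needs two more I/C items) and ends resolved as S after the items were added.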

OBDI 202 - IOLE Process, Page 3 of 3

CP-2010-07 DRAFT OPERATING TEST COMMENTS - SCENARIOS

Matrix columns: Scenario Set; 1. ES; 2. TS; 3. Crit; 4. IC; 5. Pred; 6. TL; 7. L/C; 8. Eff; 9. U/E/S; 10. Explanation (see below for instructions).

Scenario Set 1: no weaknesses marked; rated S.

Scenario Set 2: ES marked (X); rated E - "Needs two more I/C items to have enough for each position if one applicant takes credit for another's during the actual run of scenarios. ES-301-5 form needs to be changed accordingly." Resolution: S - "Added 2 more I/C items."

Scenario Set 3: ES marked (X); rated S.

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating test scenario sets. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.

1. ES: ES-301 checklists 4, 5, & 6 satisfied.
2. TS: Set includes SRO TS actions for each SRO, with required actions explicitly detailed.
3. Crit: Each manipulation or evolution has explicit success criteria documented in Form ES-D-2.
4. IC: Out-of-service equipment and other initial conditions are reasonably consistent between scenarios and not predictive of scenario events and actions.
5. Pred: Scenario sequence and other factors avoid predictability issues.
6. TL: Time line is constructed, including event- and process-triggered conditions, such that the scenario can run without routine examiner cuing.
7. L/C: Length and complexity for each scenario in the set are reasonable for the crew mix being examined, such that all applicants have reasonably similar exposure and the events are needed for evaluation purposes.
8. Eff: Sequence of events is reasonably efficient for examination purposes, especially with respect to long delays or interactions.
9. Based on the reviewer's judgment, rate the scenario set as (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory.
10. Provide a brief description of the problem in the explanation column.
11. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.
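As with the JPM matrices, the scenario-set matrix is a small structured record per set. The following sketch is illustrative only (hypothetical names, not part of the form); the ES markings for sets 2 and 3 follow the reconstructed table above.

```python
from dataclasses import dataclass, field

# Criteria 1-8 above, keyed by the column abbreviations used in the matrix.
SCENARIO_CRITERIA = {
    "ES": "ES-301 checklists 4, 5, and 6 satisfied",
    "TS": "SRO TS actions for each SRO, explicitly detailed",
    "Crit": "explicit success criteria documented in Form ES-D-2",
    "IC": "initial conditions consistent between scenarios and not predictive",
    "Pred": "sequence and other factors avoid predictability issues",
    "TL": "time line lets the scenario run without routine examiner cuing",
    "L/C": "length and complexity reasonable for the crew mix being examined",
    "Eff": "event sequence reasonably efficient, without long delays",
}


@dataclass
class ScenarioSetReview:
    """One row of the scenario-set matrix: marked weaknesses plus the U/E/S rating."""
    set_id: str
    marked: set[str] = field(default_factory=set)  # subset of SCENARIO_CRITERIA keys (an "X" in the matrix)
    rating: str = ""                               # item 9: "U", "E", or "S"
    explanation: str = ""                          # item 10: brief description of the problem


# The three sets recorded in the matrix above.
reviews = [
    ScenarioSetReview("1", rating="S"),
    ScenarioSetReview("2", marked={"ES"}, rating="E",
                      explanation="Needs two more I/C items; ES-301-5 form to be updated"),
    ScenarioSetReview("3", marked={"ES"}, rating="S"),
]
```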