| ML102460270 | |
| --- | --- |
| Person / Time | |
| Site: | Palo Verde |
| Issue date: | 11/02/2009 |
| From: | Garchow S, Operations Branch IV |
| To: | Arizona Public Service Co |
| References | |
| Download: | ML102460270 (4) |
Text
PV-2009-11 DRAFT OPERATING TEST COMMENTS ADMIN JPMS

| JPM# | Dyn (D/S) | LOD (1-5) | IC Focus | Cues | Critical Steps | Scope (N/B) | Overlap | Job-Link | Minutia | U/E/S | Explanation (see below for instructions) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| RO (RA1) | S | 3 | | | | | | | | S | |
| RO (RA2) | S | 3 | | | | | | | | S | |
| RO (RA3) | S | 2 | | | | | | | | S | |
| RO (RA4) | S | 3 | | | | | | | | S | |
| SRO (SA1) | S | 2 | | | | | | | | S | |
| SRO (SA2) | S | 2 | | | | | | | | S | |
| SRO (SA3) | S | 3 | | | | | | | | S | |
| SRO (SA4) | S | 3 | | | | | | | | S | |
| SRO (SA5) | S | 3 | | | | | | | | S | |
Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating tests. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.
- 1. Determine whether the task is dynamic (D) or static (S). A dynamic task is one that involves continuous monitoring and response to varying parameters. A static task is basically a system reconfiguration or realignment.
- 2. Determine level of difficulty (LOD) using established 1-5 rating scale. Levels 1 and 5 represent inappropriate (low or high) discriminatory level for the license being tested.
- 3. Check the appropriate box when an attribute weakness is identified:
- The initiating cue is not sufficiently clear to ensure the operator understands the task and how to begin.
- The JPM does not contain sufficient cues that are objective (not leading).
- All critical steps (elements) have not been properly identified.
- Scope of the task is either too narrow (N) or too broad (B).
- Excessive overlap with other part of operating test or written examination.
- 4. Check the appropriate box when a job content error is identified:
- Topics not linked to job content (e.g., disguised task, not required in real job).
- Task is trivial and without safety significance.
- 5. Based on the reviewer's judgment, is the JPM as written (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory?
- 6. Provide a brief description of any U or E rating in the explanation column.
- 7. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.
PV-2009-11 DRAFT OPERATING TEST COMMENTS CONTROL ROOM/IN-PLANT SYSTEMS JPMS

| JPM# | Dyn (D/S) | LOD (1-5) | IC Focus | Cues | Critical Steps | Scope (N/B) | Overlap | Job-Link | Minutia | U/E/S | Explanation (see below for instructions) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| JS1 | D | 2 | | | | | | | | S | |
| JS2 | D | 3 | | | | | | | | S | |
| JS3 | D | 3 | | | | | | | | S | |
| JS4 | D | 3 | | | | | | | | S | |
| JS5 | D | 3 | | | | | | | | S | |
| JS6 | D | 3 | | | | | | | | S | |
| JS1 | S | 3 | | | | | | | | S | |
| JS2 | S | 3 | | | | | | | | S | |
| JP1 | S | 3 | | | | | | | | S | |
| JP2 | S | 3 | | | | | | | | S | |
| JP3 | S | 3 | | | | | | | | S | |

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating tests. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.
- 1. Determine whether the task is dynamic (D) or static (S). A dynamic task is one that involves continuous monitoring and response to varying parameters. A static task is basically a system reconfiguration or realignment.
- 2. Determine level of difficulty (LOD) using established 1-5 rating scale. Levels 1 and 5 represent inappropriate (low or high) discriminatory level for the license being tested.
- 3. Check the appropriate box when an attribute weakness is identified:
- The initiating cue is not sufficiently clear to ensure the operator understands the task and how to begin.
- The JPM does not contain sufficient cues that are objective (not leading).
- All critical steps (elements) have not been properly identified.
- Scope of the task is either too narrow (N) or too broad (B).
- Excessive overlap with other part of operating test or written examination.
- 4. Check the appropriate box when a job content error is identified:
- Topics not linked to job content (e.g., disguised task, not required in real job).
- Task is trivial and without safety significance.
- 5. Based on the reviewer's judgment, is the JPM as written (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory?
- 6. Provide a brief description of any U or E rating in the explanation column.
- 7. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.
PV-2009-11 DRAFT OPERATING TEST COMMENTS SCENARIOS

| Scenario Set | 1. ES | 2. TS | 3. Crit | 4. IC | 5. Pred | 6. TL | 7. L/C | 8. Eff | 9. U/E/S | 10. Explanation (see below for instructions) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | | | | | | | | | S | |
| 2 | | | | | | | | | S | |
| 3 | | | | | | | | | S | |
| 4 | | | | | | | | | S | |

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating test scenario sets. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.
- 1. ES: ES-301 checklists 4, 5, & 6 satisfied.
- 2. TS: Set includes SRO TS actions for each SRO, with required actions explicitly detailed.
- 3. Crit: Each manipulation or evolution has explicit success criteria documented in Form ES-D-2.
- 4. IC: Out of service equipment and other initial conditions reasonably consistent between scenarios and not predictive of scenario events and actions.
- 5. Pred: Scenario sequence and other factors avoid predictability issues.
- 6. TL: Time line constructed, including event and process triggered conditions, such that scenario can run without routine examiner cuing.
- 7. L/C: Length and complexity for each scenario in the set are reasonable for the crew mix being examined, such that all applicants have reasonably similar exposure and events are needed for evaluation purposes.
- 8. Eff: Sequence of events is reasonably efficient for examination purposes, especially with respect to long delays or interactions.
- 9. Based on the reviewer's judgment, rate the scenario set as (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory.
- 10. Provide a brief description of problem in the explanation column.
- 11. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.