ML091271100

| Person / Time | |
|---|---|
| Site: | San Onofre |
| Issue date: | 02/04/2009 |
| From: | Mckernon T, NRC Region 4 |
| To: | Southern California Edison Co |
| References: | 50-361/09-301, 50-362/09-301 |
| Download: | ML091271100 (3) |
Text
SO 2009-02 DRAFT OPERATING TEST COMMENTS: ADMIN JPMS
(Column groups: 1. Dyn; 2. LOD; 3. Attributes: IC, Cues Focus, Critical Steps, Scope, Overlap; 4. Job Content Errors: Job-Link, Minutia; 5. U/E/S; 6. Explanation. Numbers correspond to the instructions below.)

| JPM# | Dyn (D/S) | LOD (1-5) | IC | Cues Focus | Critical Steps | Scope (N/B) | Overlap | Job-Link | Minutia | U/E/S | Explanation (see below for instructions) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| SRO (A1a) | | | | | | | | | | | None |
| SRO (A1b) | | | | | | | | | | | None |
| SRO (A2) | | | | | | | | | | | None |
| SRO (A3) | | | | | | | | | | | None |
| SRO (A4) | | | | | | | | | | | None |

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating tests. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.
- 1. Determine whether the task is dynamic (D) or static (S). A dynamic task is one that involves continuous monitoring and response to varying parameters. A static task is basically a system reconfiguration or realignment.
- 2. Determine level of difficulty (LOD) using established 1-5 rating scale. Levels 1 and 5 represent inappropriate (low or high) discriminatory level for the license being tested.
- 3. Check the appropriate box when an attribute weakness is identified:
- The initiating cue is not sufficiently clear to ensure the operator understands the task and how to begin.
- The JPM does not contain sufficient cues that are objective (not leading).
- All critical steps (elements) have not been properly identified.
- Scope of the task is either too narrow (N) or too broad (B).
- Excessive overlap with other part of operating test or written examination.
- 4. Check the appropriate box when a job content error is identified:
- Topics not linked to job content (e.g., disguised task, not required in real job).
- Task is trivial and without safety significance.
- 5. Based on the reviewer's judgment, is the JPM as written (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory?
- 6. Provide a brief description of any U or E rating in the explanation column.
- 7. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.

Page 1 of 3 OBDI 202 - IOLE Process
SO 2009-02 DRAFT OPERATING TEST COMMENTS: CONTROL ROOM/IN-PLANT SYSTEMS JPMS
(Column groups: 1. Dyn; 2. LOD; 3. Attributes: IC, Cues Focus, Critical Steps, Scope, Overlap; 4. Job Content Errors: Job-Link, Minutia; 5. U/E/S; 6. Explanation. Numbers correspond to the instructions below.)

| JPM# | Dyn (D/S) | LOD (1-5) | IC | Cues Focus | Critical Steps | Scope (N/B) | Overlap | Job-Link | Minutia | U/E/S | Explanation (see below for instructions) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| S1 | | | | | | | | | | | None |
| S2 | | | | | | | | | | | None |
| S3 | | | | | | | | | | | None |
| S4 | | | | | | | | | | | None |
| S5 | | | | | | | | | | | None |
| S6 | | | | | | | | | | | None |
| S7 | | | | | | | | | | | None |
| P1 | | | | | | | | | | | None |
| P2 | | | | | | | | | | | None |
| P3 | | | | | | | | | | | None |

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating tests. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.
- 1. Determine whether the task is dynamic (D) or static (S). A dynamic task is one that involves continuous monitoring and response to varying parameters. A static task is basically a system reconfiguration or realignment.
- 2. Determine level of difficulty (LOD) using established 1-5 rating scale. Levels 1 and 5 represent inappropriate (low or high) discriminatory level for the license being tested.
- 3. Check the appropriate box when an attribute weakness is identified:
- The initiating cue is not sufficiently clear to ensure the operator understands the task and how to begin.
- The JPM does not contain sufficient cues that are objective (not leading).
- All critical steps (elements) have not been properly identified.
- Scope of the task is either too narrow (N) or too broad (B).
- Excessive overlap with other part of operating test or written examination.
- 4. Check the appropriate box when a job content error is identified:
- Topics not linked to job content (e.g., disguised task, not required in real job).
- Task is trivial and without safety significance.
- 5. Based on the reviewer's judgment, is the JPM as written (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory?
- 6. Provide a brief description of any U or E rating in the explanation column.
- 7. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.

Page 2 of 3 OBDI 202 - IOLE Process
SO 2009-02 DRAFT OPERATING TEST COMMENTS: SCENARIOS

(Columns 1 through 9 correspond to the numbered instructions below; column 10 is the Explanation column.)

| Set | 1. ES | 2. TS | 3. Crit | 4. IC | 5. Pred | 6. TL | 7. L/C | 8. Eff | 9. U/E/S | 10. Explanation (see below for instructions) |
|---|---|---|---|---|---|---|---|---|---|---|
| | | | | | | | | | | N/A |

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating test scenario sets. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.
- 1. ES: ES-301 checklists 4, 5, & 6 satisfied.
- 2. TS: Set includes SRO TS actions for each SRO, with required actions explicitly detailed.
- 3. Crit: Each manipulation or evolution has explicit success criteria documented in Form ES-D-2.
- 4. IC: Out of service equipment and other initial conditions reasonably consistent between scenarios and not predictive of scenario events and actions.
- 5. Pred: Scenario sequence and other factors avoid predictability issues.
- 6. TL: Time line constructed, including event and process triggered conditions, such that scenario can run without routine examiner cuing.
- 7. L/C: Length and complexity for each scenario in the set is reasonable for the crew mix being examined, such that all applicants have reasonably similar exposure and all events are needed for evaluation purposes.
- 8. Eff: Sequence of events is reasonably efficient for examination purposes, especially with respect to long delays or interactions.
- 9. Based on the reviewer's judgment, rate the scenario set as (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory.
- 10. Provide a brief description of problem in the explanation column.
- 11. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.

Page 3 of 3 OBDI 202 - IOLE Process