
2009-02 - Draft Operating Test Comments
ML091271100
Person / Time
Site: San Onofre (Southern California Edison)
Issue date: 02/04/2009
From: T. McKernon, NRC Region IV
To: Southern California Edison Co
References: 50-361/09-301, 50-362/09-301
Download: ML091271100 (3)


Text

SO-2009-02 DRAFT OPERATING TEST COMMENTS - ADMIN JPMS

Matrix column headings: JPM#; 1. Dyn (D/S); 2. LOD (1-5); 3. Attributes (IC Focus, Cues, Critical Steps, Scope (N/B), Overlap); 4. Job Content Errors (Job-Link, Minutia); 5. U/E/S; 6. Explanation (see below for instructions).

SRO (A1a): None
SRO (A1b): None
SRO (A2): None
SRO (A3): None
SRO (A4): None

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating tests. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.

1. Determine whether the task is dynamic (D) or static (S). A dynamic task is one that involves continuous monitoring and response to varying parameters. A static task is basically a system reconfiguration or realignment.
2. Determine the level of difficulty (LOD) using the established 1-5 rating scale. Levels 1 and 5 represent an inappropriate (low or high) discriminatory level for the license being tested.
3. Check the appropriate box when an attribute weakness is identified:
  • The initiating cue is not sufficiently clear to ensure the operator understands the task and how to begin.
  • The JPM does not contain sufficient cues that are objective (not leading).
  • All critical steps (elements) have not been properly identified.
  • Scope of the task is either too narrow (N) or too broad (B).
  • Excessive overlap with other parts of the operating test or written examination.
4. Check the appropriate box when a job content error is identified:
  • Topics not linked to job content (e.g., disguised task, not required in real job).
  • Task is trivial and without safety significance.
5. Based on the reviewer's judgment, is the JPM as written (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory?
6. Provide a brief description of any U or E rating in the explanation column.
7. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.

SO-2009-02 DRAFT OPERATING TEST COMMENTS - CONTROL ROOM/IN-PLANT SYSTEMS JPMS

Matrix column headings: JPM#; 1. Dyn (D/S); 2. LOD (1-5); 3. Attributes (IC Focus, Cues, Critical Steps, Scope (N/B), Overlap); 4. Job Content Errors (Job-Link, Minutia); 5. U/E/S; 6. Explanation (see below for instructions).

S1: None
S2: None
S3: None
S4: None
S5: None
S6: None
S7: None
P1: None
P2: None
P3: None

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating tests. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.

1. Determine whether the task is dynamic (D) or static (S). A dynamic task is one that involves continuous monitoring and response to varying parameters. A static task is basically a system reconfiguration or realignment.
2. Determine the level of difficulty (LOD) using the established 1-5 rating scale. Levels 1 and 5 represent an inappropriate (low or high) discriminatory level for the license being tested.
3. Check the appropriate box when an attribute weakness is identified:
  • The initiating cue is not sufficiently clear to ensure the operator understands the task and how to begin.
  • The JPM does not contain sufficient cues that are objective (not leading).
  • All critical steps (elements) have not been properly identified.
  • Scope of the task is either too narrow (N) or too broad (B).
  • Excessive overlap with other parts of the operating test or written examination.

4. Check the appropriate box when a job content error is identified:
  • Topics not linked to job content (e.g., disguised task, not required in real job).
  • Task is trivial and without safety significance.
5. Based on the reviewer's judgment, is the JPM as written (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory?
6. Provide a brief description of any U or E rating in the explanation column.
7. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.

SO-2009-02 DRAFT OPERATING TEST COMMENTS - SCENARIOS

Matrix column headings: Scenario Set; 1. ES; 2. TS; 3. Crit; 4. IC; 5. Pred; 6. TL; 7. L/C; 8. Eff; 9. U/E/S; 10. Explanation (see below for instructions).

Scenario Set: N/A

Instructions for Completing Matrix

This form is not contained in or required by NUREG-1021. Utilities are not required or encouraged to use it. The purpose of this form is to enhance regional consistency in reviewing operating test scenario sets. Additional information on these areas may be found in Examination Good Practices Appendix D. Check or mark any item(s) requiring comment and explain the issue in the space provided.

1. ES: ES-301 checklists 4, 5, & 6 satisfied.
2. TS: Set includes SRO TS actions for each SRO, with required actions explicitly detailed.
3. Crit: Each manipulation or evolution has explicit success criteria documented in Form ES-D-2.
4. IC: Out-of-service equipment and other initial conditions reasonably consistent between scenarios and not predictive of scenario events and actions.
5. Pred: Scenario sequence and other factors avoid predictability issues.
6. TL: Timeline constructed, including event- and process-triggered conditions, such that the scenario can run without routine examiner cuing.
7. L/C: Length and complexity for each scenario in the set is reasonable for the crew mix being examined, such that all applicants have reasonably similar exposure and events are needed for evaluation purposes.
8. Eff: Sequence of events is reasonably efficient for examination purposes, especially with respect to long delays or interactions.
9. Based on the reviewer's judgment, rate the scenario set as (U)nacceptable (requiring repair or replacement), in need of (E)ditorial enhancement, or (S)atisfactory.
10. Provide a brief description of the problem in the explanation column.
11. Save initial review comments as normal black text; indicate how comments were resolved using blue text so that each JPM used on the exam is reflected by a (S)atisfactory resolution on this form.