ML070430242

Meeting Slides from Category One Public Management Meeting Regarding Status of the Licensee's Corrective Actions on the Turkey Point Substantive Cross-Cutting Issue
Person / Time
Site: Turkey Point
Issue date: 02/12/2007
From: Florida Power & Light Co
To: Office of Nuclear Reactor Regulation
Shared Package: ML070430223
References
Download: ML070430242 (28)


Text

Management Meeting
NRC Region II
Turkey Point Nuclear Plant Units 3 and 4
Briefing on Substantive Cross-cutting Issue
February 1, 2007
Enclosure 2 1

Agenda

  • Introductions
  • Corrective Action Program Progress
  • Focus on Human Performance
  • Closing Comments 2

WANO Index

Quarter   U3 Actual   U4 Actual
1Q04      97.2        98.6
2Q04      94.8        98.6
3Q04      99.7        100.0
4Q04      86.1        98.4
1Q05      80.8        97.0
2Q05      83.0        87.2
3Q05      82.9        77.8
4Q05      73.4        65.6
1Q06      65.9        63.1
2Q06      87.3        62.8
3Q06      92.4        65.9
4Q06      92.4        76.7

Equipment Reliability Index 4

Management Meeting Corrective Action Program Root Cause Evaluation 5

CAP Root Cause

  • Corrective Action Program has not been core business

- Contributing Factors

  • Program ownership and accountability
  • Competing priorities and increased CR generation
  • Previous corrective actions were not effective
  • Organizational and Programmatic (O&P) issues not adequately addressed 6

CAP Root Cause

  • Organization has not consistently demonstrated conservative decision making

- Contributing Factors

  • Assumptions that causes for events are known
  • Production focus
  • Taking actions that present the least organizational resistance 7

Contributing Cause

  • Program and process weaknesses contributed to ineffective issue resolution

- Contributing Factors

  • Insufficient investigation of repeat events
  • Improperly justified CR evaluation and action extensions
  • Improper closure of some corrective actions without adequate justification 8

Corrective Actions

  • Corrective actions aimed at driving behavior changes to increase program ownership and effectiveness

- Accountability (top to bottom)

- People (best and brightest)

- Training (the right picture) 9

Corrective Actions

  • Enhanced station level monitoring by the CROG

- Monthly review of program indicators and PI shortfall action plans

- Quarterly review of selected analyses that include O&P aspects

- Quarterly review of NRC Findings

- Quarterly review of station trend reports

- Action plans for Red and select Yellow Department Health Indicators 10

Corrective Actions

  • Implement Department level monitoring to ensure consistent CAP application
  • Scope:

- SL1 & SL2 CR evaluations

- Closed SL1 & SL2 actions

- Sampling of SL3 CR evaluations and actions

- Department indicators, Health Index, & trends

- Evaluation & action backlogs, timeliness, and due date extensions 11

Corrective Actions

  • CAPCOs to monitor and mentor CAP expectations

- Strengthen skill set

  • Event code classification
  • Identification of repeat issues
  • Root and apparent cause analysis

- Mentor department personnel

- Monitor closure of evaluations and actions

- Manage Department backlogs

- Train Department personnel 12

Corrective Actions

  • Revise Fleet procedure

- Identification of repeat issues

- Risk assessment for due date extensions

- Institutionalize CAP Health Index

- Include Training deficiencies in Significance Table

  • Other actions

- Employee communication plan

- Perform extent of condition reviews to determine station risk for repeat events

- Continue to screen work requests on MSPI systems through the new risk based process

- Six Sigma process reviews 13

Performance Indicators

  • Station CAP Health Index
  • RCE and ACE evaluation quality
  • Average age of open evaluations & actions

  • Number of extensions
  • CR action backlogs
  • Number of repeat events 14

CAP Health Index

Site CAP Health Index (Overall Performance) - December 2006. Overall rating: Red. Each performance indicator is listed with its definition, weight, and December 2006 score.

Quality (Yellow): Quality of CAP evaluations is paramount in determining the proper corrective actions.

  • Quality of Cause Analysis: Percentage of Apparent and Root Cause evaluations passed first time during the review process. Weight 2.0, Score 60.0%
  • Repeat Events: Number of repeat events. Repeat events are defined as "Two or more independent occurrences of the same condition which are the result of the same basic causes for which previous corrective actions to prevent or minimize recurrence failed (typically within a two-year period)." This applies to RCE and ACE evaluations only (0 MSPI). Weight 2.0, Score 3
  • Quality of Closure Reviews: Percentage of CAPRs, Routine C/As, and Effectiveness Reviews passed during the month. Weight 2.0, Score 95.1%

Timeliness (Red): Timely resolution of problems can minimize repetitive problems.

  • Overdue Condition Report Evaluations: Percent of CR evaluations submitted by due date. Weight 1.0, Score 74.4%
  • Overdue Condition Report Actions: Percent of CR actions submitted by due date. Weight 1.0, Score 73.3%
  • Average Age of Open Evaluations: Average age in days of open SL 1-3 CR evaluations (CAQ and Non-CAQ). Weight 1.0, Score 77.65
  • Average Age of Open Actions: Average age in days of open SL 1-3 routine non-outage corrective actions and corrective actions to prevent recurrence (CAQ and Non-CAQ). Weight 1.0, Score 146

CAP Management (Green): Efficient management of the Corrective Action Program ensures timely correction of problems and prevents repeat events.

  • CR Action Backlog - CAQ: Open SL-1,2,3 CAQ corrective actions as a percentage of the total number of actions generated in the previous 12 months. Weight 2.0, Score 29.0%
  • CR Action Backlog - NCAQ: Open SL-1,2,3 Non-CAQ corrective actions as a percentage of the total number of actions generated in the previous 12 months. Weight 1.5, Score 26.7%
  • Average CR Cycle Time: Average number of days elapsed between CR initiation and record closure for SL 1-3 CAQ and Non-CAQ non-outage CRs closed during the last 12 months (rolling 12-month average). Weight 1.0, Score 126.0

15
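The health index above pairs each performance indicator with a weight and a score, but the slides do not state how those values are combined into the category and overall Red/Yellow/Green ratings. The Python sketch below shows one plausible weighted-average roll-up for the Quality category; the normalization ranges, the color thresholds, and every name in it are assumptions for illustration, not the station's documented method.

```python
# Hypothetical roll-up of a weighted health index category. The slides give a
# weight and a score per indicator but not the aggregation method, so the
# normalization ranges, the Red/Yellow/Green bands, and all names below are
# assumptions for illustration only.

def normalize(score, worst, best):
    """Map a raw indicator score onto a 0-100 scale where 100 is best."""
    frac = (score - worst) / (best - worst)
    return 100.0 * max(0.0, min(1.0, frac))

# (weight, raw score, assumed worst value, assumed best value) per indicator,
# using the December 2006 Quality-category values from the slide.
quality_indicators = [
    (2.0, 60.0, 0.0, 100.0),  # Quality of Cause Analysis: 60.0% passed first time
    (2.0, 3.0, 10.0, 0.0),    # Repeat Events: 3 events (fewer is better; 10 assumed worst)
    (2.0, 95.1, 0.0, 100.0),  # Quality of Closure Reviews: 95.1% passed
]

def category_index(indicators):
    """Weighted average of the normalized indicator scores."""
    total_weight = sum(weight for weight, *_ in indicators)
    weighted_sum = sum(weight * normalize(score, worst, best)
                       for weight, score, worst, best in indicators)
    return weighted_sum / total_weight

def rating(index, green_at=80.0, yellow_at=60.0):
    """Assumed color banding for a 0-100 category index."""
    if index >= green_at:
        return "Green"
    return "Yellow" if index >= yellow_at else "Red"

index = category_index(quality_indicators)
print(f"Quality category index: {index:.1f} ({rating(index)})")
```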

CAP Indicators

QUALITY OF CAUSE ANALYSIS / INVESTIGATION - STATION AVERAGE

Definition / Goal: Percentage of Root Cause and Apparent Cause evaluations that met established grading criteria as determined by CAPCOs, CROG, or PID. A sample of 25% of closed evaluations is reviewed against the criteria specified in the CAP Handbook. The percent of reviewed evaluations accepted is reported as a 3-month rolling average. Three consecutive data points below the station goal will result in the sample size being increased to 50% (a sketch of this sampling logic follows the chart below).

[Chart: monthly Quality of Cause Analysis / Investigation results for Jan-06 through Dec-06, plotting the number of evaluations closed, reviewed, and rejected against the 3-month rolling average percent accepted and the station goal of > 80% accepted.] 16
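The definition above combines three rules: review a 25% sample of closed evaluations, track the percent accepted as a 3-month rolling average, and increase the sample to 50% after three consecutive data points below the station goal of 80%. A minimal Python sketch of that bookkeeping follows; the monthly values are made up, and whether the escalation rule applies to the raw monthly points or to the rolling average is an assumption (the sketch applies it to the rolling average).

```python
# Minimal sketch of the sampling rules described in the indicator definition.
# The monthly percent-accepted values are hypothetical; only the 25% sample,
# the 3-month rolling average, and the escalation to a 50% sample after three
# consecutive points below the 80% goal come from the slide.

STATION_GOAL = 80.0  # percent of reviewed evaluations accepted

def rolling_average(values, window=3):
    """Rolling average over up to `window` most recent monthly values."""
    return [sum(values[max(0, i - window + 1):i + 1]) /
            len(values[max(0, i - window + 1):i + 1])
            for i in range(len(values))]

def next_sample_fraction(rolling_points, goal=STATION_GOAL):
    """25% sample normally; 50% once three consecutive points fall below goal."""
    consecutive_below = 0
    for point in rolling_points:
        consecutive_below = consecutive_below + 1 if point < goal else 0
        if consecutive_below >= 3:
            return 0.50
    return 0.25

monthly_percent_accepted = [72.0, 75.0, 78.0, 83.0, 86.0, 90.0]  # hypothetical
points = rolling_average(monthly_percent_accepted)
print([round(p, 1) for p in points])
print("Sample fraction for next month:", next_sample_fraction(points))
```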

CAP Indicators

QUALITY OF CAUSE ANALYSIS / INVESTIGATION FOR MSPI SYSTEMS

Definition / Goal: Percentage of SL1, SL2, and SL3 MSPI evaluations that met established grading criteria as determined by CAPCOs, CROG, or PID. 100% of closed MSPI evaluations are reviewed against the criteria specified in the CAP Handbook.

[Chart: monthly results for MSPI evaluations, Sep-06 through Dec-06, plotting the number reviewed and the percent of reviewed evaluations accepted against the station goal of > 80% accepted; monthly data labels read 9, 19, 39, and 14.] 17

Effectiveness Measures

  • RCE corrective action implementation
  • CAP self-assessment
  • Trend reports (site-wide & department)
  • NRC identified cross-cutting aspects
  • PI&R inspection results 18

Management Meeting Focus on Human Performance 19

HU Root Cause

  • Performed a root cause evaluation

- Behaviors not reinforced

- Resource requirements not adequately addressed

  • Key actions

- Provide HU training to Senior Managers

- Develop department-specific HU plans and supporting training; activities include:

  • Department HU advocates
  • Coach the coach program
  • Embed HU into initial and continuing training
  • Evaluate and develop plans to reduce procedure related errors 20

Closing the Gaps

  • Using training to reinforce the right behaviors and decisions

- Labs instilling good behaviors

- Just in time training used to prepare for plant evolutions

- Simulator runs are being used to assist in decision making for plant problems

- Procedure errors are being caught and corrected

- Observation program is improving

- Many of these are being driven through improved use of the corrective action program 21

Closing the Gaps

  • Closing the Operational Gaps

- Operability screening

- Operational Decision Making

- Increased involvement in the work control process

- Improved risk management strategies in outage design and implementation, including schedule adherence

- Mispositioning events

- Procedure backlogs 22

Closing the Gaps

  • Effective use of the corrective action program to monitor and trend issues
  • Improve use of plant OE
  • Improving Engineering briefings
  • Human Performance walkdowns
  • Increasing manager & supervisor involvement in the field
  • Performance indicators and effectiveness measures 23

Performance Indicators

  • Station level indicators

- Station Human Performance Clock Resets

- Department Human Performance Clock Resets

- Field observations

  • Stakeholder indicators

- Human Performance related LERs

- Findings with HU cross-cutting aspects

  • Industry indicators

- HU Causal Factor Trends vs. Industry Median

  • Process indicators
  • Personnel indicators 24

Focused Oversight

  • Improved use of management observation program
  • Corrective action oversight group
  • Human Performance oversight group
  • 3 Tier Committee System for Training
  • Training Corrective Action Review Board
  • Effective use of mentors and oversight 25

Summary

  • Reducing human performance events is key to reducing risk
  • Observation program is providing mentoring
  • Station management is providing focus on the strategies

- Integrated communication plan

- Consistent reinforcement of expectations for change in culture 26

Closing Comments

  • Root cause evaluations for PI&R cross-cutting issue and HU weaknesses complete
  • Actions being implemented - progress evident
  • Recognize key leadership role - site and corporate
  • Performance Improvement Team remains in place 27

Management Meeting Open Discussion Questions 28