ML19332B551

| Field | Value |
|---|---|
| Site | 05000000, Brunswick |
| Issue date | 03/07/1989 |
| From | Jordan E., NRC Office for Analysis & Evaluation of Operational Data (AEOD) |
| To | Stello V., NRC Office of the Executive Director for Operations (EDO) |
| Shared Package | ML19332B552 |
| References | FOIA-89-306, NUDOCS 8903140558 |
| Download | ML19332B551 (17) |
UNITED STATES
NUCLEAR REGULATORY COMMISSION
WASHINGTON, D.C. 20555

March 7, 1989

MEMORANDUM FOR: Victor Stello, Jr.
                Executive Director for Operations

FROM: Edward L. Jordan, Director
      Office for Analysis and Evaluation of Operational Data

SUBJECT: DIAGNOSTIC EVALUATION OF BRUNSWICK STEAM ELECTRIC PLANT

In accordance with your memorandum of January 3, 1989, Subject: Diagnostic Evaluations, Enclosure 1 is our preliminary plan for the Brunswick Diagnostic Evaluation. These plans have been provided to NRR and Region II and will be discussed with Tom Murley and Stewart Ebneter. Any substantive change in these plans will be provided to you prior to the team's arrival on-site.

Also, Enclosure 2 is a suggested memorandum for your signature that includes your approval of the Schedule of Principal Activities, the Team Organization, and the Evaluation Methodology for the Brunswick Diagnostic Evaluation.

Edward L. Jordan, Director
Office for Analysis and Evaluation of Operational Data

Enclosures:
As stated
PRELIMINARY DIAGNOSTIC EVALUATION PLAN
FOR
BRUNSWICK STEAM ELECTRIC PLANT

March 1989
BRUNSWICK DIAGNOSTIC EVALUATION PLAN

1.0 Facility

Name: Brunswick Steam Electric Plant, Units 1 and 2
Licensee: Carolina Power and Light Company (CP&L)
Docket Nos.: 50-325 and 50-324
Location: 20 miles south of Wilmington, NC

2.0 Dates of Principal Activities

Notification of Licensee: January 6, 1989
Initial Trip to Carolina Power & Light Company: January 19, 1989
Team Meetings (Bethesda): March 23, 1989, 1 p.m.-5 p.m.; March 24, 1989, 9 a.m.-1 p.m.; April 5, 1989, 9 a.m.-1 p.m.
Team Preparation: March 27-April 7, 1989
Initial Onsite Activities: April 10-21, 1989
Follow-Up Onsite Activities: May 1-5, 1989
Functional Area Report Outlines Completed: May 12, 1989
Initial Draft Report Completed: May 26, 1989
NRC Senior Management Briefing: Week of May 29, 1989
EDO Briefing: Week of June 5, 1989
Exit Meeting with Licensee: Week of June 5, 1989
Submit Report for Management Review: June 2-16, 1989
Final Report Issued: June 16, 1989
3.0 Team Organization

Team Manager: R. Lee Spessard, Director, DOA/AEOD
Team Leader: John Craig, Chief, SPLB/NRR
Diagnostic Assistant: Henry Bailey, DEIIB/DOA/AEOD
Management/Organization: Jesse Crews, RV
Management/Organization: Fred Allenspach, LBEB/NRR
Management/Organization: Dr. Robert Matlock, Contractor
Management/Organization: Dr. Jonathan Wert, Contractor
Operations/Training: Bruce Little, RIII
Operations/Training: Dave Hill, RIII
Operations/Training: Eric Leeds, DEIIB/DOA/AEOD
Operations/Training: Bill Thurmond, TTC/DOA/AEOD
Maintenance: Kevin Wolley, DEIIB/DOA/AEOD
Surveillance/Testing: Chris Vandenburgh, RSIB/NRR
Design/Engineering Support: Ron Lloyd, DEIIB/DOA/AEOD
Design/Engineering Support: Robert Gura, Contractor
Design/Engineering Support: Gary Overbeck, Contractor
Quality Programs: Robert Perch, DEIIB/DOA/AEOD

4.0 Overall Diagnostic Evaluation Team Goals

Provide information to supplement SALP, PIs, and assessment data to NRC senior management.
Evaluate actions and involvement of licensee management and staff in safe plant operation.
Evaluate the effectiveness of licensee safety improvement programs.
Determine root cause(s) of safety performance problems.
5.0 Selection Basis for Brunswick Diagnostic Evaluation

For the SALP period ended August 1988, declining performance was noted in three of the areas evaluated. These areas were Engineering/Technical Support and Safety Assessment/Quality Programs, both of which were assigned Category 3 ratings, and Operations, which was assigned a rating of Category 2 with a declining trend.
Brunswick has a long-standing problem with valve operability. In January 1988 an Augmented Inspection Team (AIT) was dispatched to Brunswick to investigate multiple containment isolation valve failures on Unit 2. Another AIT was sent to Brunswick in July 1988 to investigate multiple equipment failures, including HPCI and RHR injection valves, while shutting down Unit 1 for a planned outage. These problems, combined with repetitive safety system failures, gave rise to serious concerns about the licensee's commitment to effective root cause analysis and the capability of the engineering support staff. In addition, repetitive equipment failures require the plant operators to work around equipment problems and have resulted in a lack of confidence in the plant.
The managers running Brunswick in 1982 were responsible for significant improvements in performance but since that time appeared to have become complacent with respect to continued improvement and maintaining high standards. This laissez-faire attitude resulted in reactive management rather than active pursuit and resolution of new problems. Another by-product of this complacency has been poor communications between various levels of management and the staff.
In 1988, the licensee took positive steps toward resolving these problems. Several management changes were made, including replacement of the Site Manager, Plant Manager, and the Operations and Technical Support Managers. A Corporate Management Oversight Team was established in July to evaluate Brunswick management and staff in the short term, and a long-term Corporate and Plant Management Program Plan was begun in September. The long-term plan included a nuclear management appraisal to be conducted by Cresap, an independent management contractor.
A diagnostic evaluation will provide additional information to NRC senior management regarding the involvement of licensee management and staff in safe plant operation, effectiveness of improvement programs, and the determination of root causes for safety performance problems observed at Brunswick.
6.0 Goals and Objectives for the Brunswick Diagnostic Evaluation

6.1 Operations/Training, Maintenance, Surveillance/Testing

Evaluation Goals

Assess the effectiveness (including strengths, weaknesses, problems, and issues) of operations, maintenance, and surveillance. Identify the root causes for the identified problems and the areas in need of improvement.
Objectives

Identify additional specific performance/programmatic problems and their causes.

Identify communication, coordination, or cooperation problems and their causes.

Identify problems/causes associated with management oversight, involvement, leadership, and communications or organizational climate.

Assess the effectiveness and the prospects for sustained and permanent improvement due to any new programs, including those initiated as a result of licensee self-assessments and other industry (e.g., JVMA) assessments.
6.2 Quality Programs

Evaluation Goals

Assess the effectiveness (including strengths, weaknesses, problems, and issues) of the quality programs and administrative controls affecting quality for the Brunswick Plant. Identify the root causes for the identified problems and the areas in need of improvement.
Objectives

Identify problems/causes associated with the capabilities of the corporate and plant quality assurance (Q/A) organizations to identify and report on substantive plant problems and issues.

Identify problems/causes associated with the ability of the line organization to identify and act upon substantive recurring problems.

Assess the management attitude and commitment to high-quality performance of all safety-related work activities.

Assess the effectiveness and the prospects for sustained and permanent improvement due to any new programs.

Assess the effectiveness of the on-site and corporate Nuclear Safety Section in identifying issues and effecting timely corrective action.
6.3 Engineering/Technical Support

Evaluation Goals

Assess the effectiveness (including strengths, weaknesses, problems, and issues) of the engineering/technical support provided by nuclear engineering and plant technical support groups. Identify any problem areas or issues adversely affecting the delivery of quality services in these areas. Determine the root causes for any identified problems and issues and the areas in need of improvement.

Objectives

Identify communication, coordination, or cooperation problems/causes associated with the engineering/technical group interface.

Assess the effectiveness and the prospects for sustained and permanent improvement due to any new programs or directives in these areas.
6.4 Management Controls and Organizational Climate and Culture

Evaluation Goals and Objectives

Evaluate how licensee corporate and plant management react and contribute to safety-related problems that affect plant operations. Findings will be related to inadequate planning, staffing, organizing, directing, and controlling of plant activities.

Evaluate organizational climate and culture by investigating selected factors influencing corporate and plant performance.
Examples:

o Organizational communications
o Employee involvement in problem solving
o Work team effectiveness and cohesiveness
o Adequacy of problem-solving activities
o Appropriateness of organizational goals
o Organizational morale and trust
o Corporate and plant loyalty and commitment
o Motivation
Emphasis will be on the factors that contribute to safety-related weaknesses in plant performance.
7.0 Evaluation Methodology

The diagnostic evaluation of Brunswick will involve the overlapping and phased implementation of information collection, problem/issue identification, and cause determination steps in and across several functional areas in each of four levels. An overview of the evaluation methods is shown in Figure 1.
The first level, and the foundation for the diagnostic evaluation methodology, principally involves the finding or verification of specific technical (performance) problems or issues and a determination of their specific proximate causes. The second level involves the identification or verification of programmatic problems, issues, strengths, and weaknesses in the corporate and plant safety programs, policies, and administrative procedures, their implementation, and their relationship to the first-level technical safety problems in each of the functional areas, as well as their possible relationships to higher-level problems, issues, and weaknesses. The third level involves the identification of problems, issues, strengths, and weaknesses associated with corporate and plant management practices (i.e., resource allocation, leadership, communications, involvement, and oversight) and organizational climate/safety attitudes within the plant staff and the corporate support organizations. The results of these evaluations in each functional area are expected to provide detailed information on the strengths and weaknesses associated with the performance of the Brunswick Plant safety program and its implementation; specific problems and issues; and the broadest-level root causes (Level 4).
Daily team meetings will be held during all phases to share observations, findings, and concerns and to coordinate team efforts in response to issues developed at each level.
[Figure 1: Diagnostic Evaluation Process - Evaluation Sequence. The figure depicts the four levels of the evaluation: Level 1, Performance-Based Review; Level 2, Program Adequacy; Level 3, Management Oversight and Climate; Level 4, Root Causes.]
7.1 Level 1 - Performance-Based Review

The first level of the Brunswick Diagnostic Evaluation provides the performance-based foundation for the diagnostic evaluation. It consists of data collection, analysis, and evaluation activities to find or verify specific technical problems or issues associated with the implementation of the Brunswick plant and Carolina Power & Light Company corporate safety program. These specific performance-based technical problems will be found or verified by detailed technical assessments. The Level 1 assessment actually starts during the initial in-office reviews of licensee correspondence and records. Areas are selected that require more detailed onsite verification. This Level 1 assessment will initially focus on the Service Water System (SWS). The assessment will involve reviews of SWS design control and modifications, equipment procedures and records, system walkdowns, activity observations, and technically oriented interviews with working-level, supervisory-level, and department-level staff. Control room shifts and shift turnover activities will be observed by the team members assigned to operations.
In parallel with shift coverage, the remaining DET members will continue their individual evaluation activities.
Based on the individual reviews in these functional areas and related team discussions, it is expected that a significant number of specific technical problems and issues will be identified or verified and characterized with respect to their nature and safety significance.
The team's initial finding may result in an additional area or system being chosen for review or it may result in a more detailed review of the SWS.
Examples of areas or systems which may be identified for additional review include but are not limited to motor operated valves (MOVs), the electrical power system, instrument air system or containment heat removal systems.
Each functional area will evaluate the effectiveness of these actions to the extent they are appropriate to the functional area(s).
While compliance with specific regulatory or licensing requirements or licensee commitments is not the primary focus of the Level 1 review, identification of compliance deficiencies should be considered an important secondary purpose. Such regulatory requirements and commitments should be reviewed as guideposts in directing and motivating the investigative process in the determination of higher-level problems and issues and their associated causes.
Following technical problem identification and characterization, the proximate cause(s) should be determined. Proximate cause(s) (e.g., incomplete or inadequate procedures, procedure implementation) should be determined based on interviews, observations, and document reviews. To the extent possible and where time allows, these proximate cause(s) for each problem or issue should be pursued to determine the root cause.
It is expected that the functional area inspections performed in Level 1 will provide performance issues which can be used in the assessment of the effectiveness of Brunswick Plant and corporate programs (Level 2) in each of the functional areas. For example, the presence or absence of significant performance issues found from this assessment will provide a basis to evaluate the actual performance of the quality programs of the line and Q/A organizations.
7.2 Level 2 - Program Adequacy

The second level of the Brunswick diagnostic evaluation matrix is aimed at identifying the strengths, weaknesses, problems, and issues associated with the licensee's programmatic controls over each of the functional areas addressed in the evaluation plan. These evaluations should be motivated in large part by the deficiencies identified during the first-level review. Based on our specific objectives, areas of high interest would be engineering, quality programs, surveillance, maintenance, plant operations, and training. Programmatic issues from the SWS evaluation will be identified.

Following the identification and characterization of programmatic strengths and weaknesses, a review of new improvement programs should be undertaken to determine the adequacy of the programs and their implementation. Regardless of the existence or lack of new improvement programs, an interview-oriented investigation should be conducted to determine (to the extent possible) the origins, scope, and nature of root causes of the problems. This information will be used to determine the improvements needed to fully address the programmatic-level deficiencies. The results of this phase also provide focus to help determine higher-order issues and their associated causes. For example, programmatic issues from each functional area will be used to address the adequacy of corrective action for recurring problems and deficiencies.
Functional area inspectors will pursue performance issues and, where possible, examine Q/A audits for the functional area to identify apparent Q/A programmatic deficiencies or weaknesses. Once the performance issues and the related Q/A programmatic weaknesses have been identified, the team specialists in Q/A should pursue the issues to determine if management and/or organizational causes exist.
7.3 Level 3 - Management Oversight and Climate

The third level of the evaluation methodology matrix is intended to evaluate both corporate and plant management effectiveness (i.e., strengths and weaknesses) and organizational climate and culture in each of the functional areas.
7.3.1 Management

Emphasis should be placed on identifying root causes for the programmatic and personnel performance problems and issues from a management oversight, involvement, communication, leadership, and direction perspective. The methodology used to evaluate management effectiveness will, in general, consist of interviewing working-, supervisory-, and management-level personnel. In general, interviews should be used to confirm an issue which is either preidentified or emerges during the two-week period.

The management effectiveness review will also include licensee staff interviews necessary to evaluate the root causes of any problems or issues adversely affecting the effectiveness of the Brunswick quality verification organization.
7.3.2 Organizational Climate and Culture

This evaluation is intended to identify any organizational climate problems or issues which appear to be strong contributors to personnel performance problems or issues.
The focus of this area will be on the employee's qualifications, morale, motivation, and attitude toward plant policies and requirements. The methodology used to identify and understand the causes for organizational climate issues relevant to safety program performance will consist of predeveloped interviews. Where significant organizational climate issues relevant to personnel performance are suspected, interviews will be used to validate their presence (conducted at the employee and supervisory level) and to understand their root causes (conducted at working through senior management levels at the plant and corporate offices).

In all cases, questions relating to organizational climate will be developed by the team management specialists. When no preidentified organizational climate issues exist, interviews should be conducted by each team member. In cases where organizational climate issues are suspected, questions and interviews should be conducted/directed by the team management analysis specialists together with DET management and leaders. For significant new issues in this area, the approach taken should be similar to that for new issues in Level 3.
7.4 Level 4 - Root Cause(s)

The fourth level of the evaluation methodology is intended to identify the root cause(s) for identified problems [at the highest level (1, 2, or 3) of the evaluation at which they occur] that, if corrected, will prevent recurrence. In addition to the above criterion for root cause, there are two other criteria that should be applied:

(1) Correction of the identified cause must be within the licensee's control.

(2) Correction of the identified cause must be consistent with the overall objectives of the plant (e.g., to produce power safely and economically).

Under this definition, the most immediate cause is the proximate cause, and this cause is usually at the performance level. In the search for the root cause, a problem and cause list is compiled. This list frequently shows that a cause at Level 1 becomes a problem at Level 2, and so on.
For example, "training deficiency" is a cause subcategory at the performance level that is commonly used in the industry. The problem at the programmatic level might be a training deficiency caused by a deficient training program. The problem at the management level might be a deficient training program caused by a lack of resource allocation. The root cause might be lack of management support for training.
It should be emphasized that root causes may be found at any evaluation level. Causes are identified for problems in each functional area. A list of these causes from all the functional areas in the evaluation is compiled for each evaluation level. A pattern of causes at one level may indicate a problem at a higher level. For example, poor procedures in several functional areas may indicate a programmatic problem in the procedure preparation program. The management and organization functional area evaluates the problems and causes from each of the other functional areas in the search for higher-level causes.
8.0 Evaluation Preparation

All DET members will participate in a team meeting from 1:00-5:00 p.m. on Thursday, March 23, 1989, and 9:00 a.m.-1:00 p.m. on Friday, March 24, 1989. This meeting will be held in the Maryland National Bank Building in Bethesda, Maryland. Contact Henry Bailey (492-9006) for entry, if required. At the team meeting, site-specific information (licensee procedures, previous inspection reports, etc.) will be distributed and discussed to assist DET members in the preparation process.

Each DET member should present their individual evaluation plan to the team leader at the meeting on April 5, 1989. Each DET member should prepare a set of interview questions to supplement the core set of questions to be supplied by the management interview consultants. DET management will contact each team member during the preparation phase to provide guidance and review their individual detailed evaluation plans.
9.0 Diagnostic Evaluation Documentation

As issues are developed during the evaluation, each DET member will document their issues in detail using the Diagnostic Evaluation Observation (DEO) form. Completed DEO forms should be given to the team leader and revised as new information becomes available. The DEOs will be used to brief licensee management and NRC management at the conclusion of the evaluation.
At the end of the second week of the evaluation effort, the leader for each functional area will document the composite results of the diagnostic evaluation in their area by completing the Performance Evaluation Notebook (PEN). The PEN for each area will provide the results in the areas of performance, programs, management and organization, and root causes. The PEN for each functional area will be presented by each functional area leader at the final team meeting before leaving the site.
At the conclusion of the evaluation, a draft report is to be issued to the team leader, using the format of the Fermi DET report, by May 19, 1989. Writing styles (including the level of detail to be presented) should resemble that presented in the Fermi DET report, using DET Guideline 3. Copies of the Fermi and McGuire DET reports will be available at the March 23 team meeting.
10.0 Coordination and Logistics

Travel arrangements, working hours, assignment of rental cars, motel reservations, licensee background material, and conduct of the diagnostic evaluation will be discussed at the team preparation meeting on March 23-24, 1989.

Security clearances and radiation training requirements must be satisfied to receive unescorted access to the Brunswick Steam Electric Plant. Any problems should be immediately discussed with the team leader.
UNITED STATES
NUCLEAR REGULATORY COMMISSION
WASHINGTON, D.C. 20555

MEMORANDUM FOR: Edward L. Jordan, Director
                Office for Analysis and Evaluation of Operational Data

FROM: Victor Stello, Jr.
      Executive Director for Operations

SUBJECT: DIAGNOSTIC EVALUATION OF THE BRUNSWICK STEAM ELECTRIC PLANT

I have reviewed and approved your plans for the Brunswick Diagnostic Evaluation as summarized below:
Schedule of Principal Activities

Team Preparation: March 27 - April 7, 1989
Onsite Activities: April 10 - April 21, 1989, and May 1 - May 5, 1989
NRC Management Briefing: Week of June 5, 1989
Exit Meeting with Licensee: Week of June 5, 1989
Issue Evaluation Report: June 16, 1989

Team Organization

Team Manager: R. Lee Spessard, Director, DOA/AEOD
Team Leader: John Craig, Chief, SPLB/NRR
Diagnostic Assistant: Henry Bailey, DEIIB/DOA/AEOD
Management/Organization: Jesse Crews, RV
Management/Organization: Fred Allenspach, LBEB/NRR
Management/Organization: Dr. Robert Matlock, Contractor
Management/Organization: Dr. Jonathan Wert, Contractor
Operations/Training: Bruce Little, RIII
Operations/Training: Dave Hill, RIII
Operations/Training: Eric Leeds, DEIIB/DOA/AEOD
Operations/Training: Bill Thurmond, TTC/DOA/AEOD
Maintenance: Kevin Wolley, DEIIB/DOA/AEOD
Surveillance/Testing: Chris Vandenburgh, RSIB/NRR
Design/Engineering Support: Ron Lloyd, DEIIB/DOA/AEOD
Design/Engineering Support: Robert Gura, Contractor
Design/Engineering Support: Gary Overbeck, Contractor
Quality Programs: Robert Perch, DEIIB/DOA/AEOD

Additional members may be added in the near future at your discretion as the evaluation proceeds.
Evaluation Methodology

The Diagnostic Evaluation Team (DET) will ascertain the current status of plant performance in the functional areas of design and engineering support, operations/training, maintenance, surveillance/testing, quality programs, and management and organization through the performance of observations, interviews, and document reviews. The evaluation will consider activities conducted at the corporate headquarters as well as at the plant site. If significant problems are noted, emphasis will be placed on determining the root cause(s). As necessary, the evaluation process will progress from the identification of problems, proximate causes, and related programmatic issues to the consideration of management and organizational strengths and weaknesses.

Following the onsite evaluation activities, the DET will prepare an evaluation report for submittal to me in accordance with NRC Manual Chapter 0520, "NRC Diagnostic Evaluation Program."
Victor Stello, Jr.
Executive Director for Operations

cc: S. D. Ebneter, RII
    T. E. Murley, NRR
    J. M. Taylor, EDO