ML073170716

Letter from FENOC to NRC Submittal of the 2007 Engineering Programs Effectiveness Independent Assessment Report for the Davis-Besse Nuclear Power Station
Person / Time
Site: Davis-Besse
Issue date: 11/08/2007
From: Bezilla M
FirstEnergy Nuclear Operating Co
To: Caldwell J
Division of Nuclear Materials Safety III
References
Download: ML073170716 (89)


Text

FENOC
FirstEnergy Nuclear Operating Company
5501 North State Route 2
Oak Harbor, Ohio 43449

Mark B. Bezilla
Vice President - Nuclear

Docket Number 50-346
License Number NPF-3
Serial Number 1-1507

November 8, 2007

Mr. James L. Caldwell, Administrator
United States Nuclear Regulatory Commission
Region III
2443 Warrenville Road, Suite 210
Lisle, IL 60532-4352

Subject:

Submittal of the 2007 Engineering Programs Effectiveness Independent Assessment Report for the Davis-Besse Nuclear Power Station

Dear Mr. Caldwell:

The purpose of this letter is to submit the assessment report for the 2007 Engineering Programs Effectiveness Independent Assessment for the Davis-Besse Nuclear Power Station (DBNPS). This submittal is in accordance with the Nuclear Regulatory Commission (NRC) letter dated March 8, 2004, Approval to Restart the Davis-Besse Nuclear Power Station, Closure of Confirmatory Action Letter, and Issuance of Confirmatory Order, which requires submittal of the assessment results within forty-five (45) days of the completion of the assessment.

The on-site activities of the Engineering Programs Effectiveness Independent Assessment were conducted from September 10 to September 21, 2007, in accordance with the Assessment Plan submitted via letter Serial Number 1-1495, dated June 12, 2007. The final debrief of results was presented to the DBNPS management on October 5, 2007, marking the end of the assessment. The enclosed report contains the results of the Independent Assessment. No issues rising to the level of an Area for Improvement were identified in the Independent Assessment; therefore, no action plans are included to address areas for improvement.

RECEIVED NOV 13 2007

Docket Number 50-346 License Number NPF-3 Serial Number 1-1507 Page 2 of 2
If you have any questions or require additional information, please contact Mr. Raymond A. Hruby, Jr., Manager - Site Regulatory Compliance, at (419) 321-8000.

Sincerely yours,

Mark B. Bezilla

Attachments:
Commitment List
2007 Independent Assessment, Engineering Programs Effectiveness, Davis-Besse Nuclear Power Station

cc: USNRC Document Control Desk
DB-1 NRC/NRR Project Manager
DB-1 Senior Resident Inspector
Utility Radiological Safety Board

Docket Number 50-346 License Number NPF-3 Serial Number 1-1507, Page 1 of 1
COMMITMENT LIST
The following list identifies those actions committed to by the Davis-Besse Nuclear Power Station (DBNPS) in this document. Any other actions discussed in the submittal represent intended or planned actions by the DBNPS. They are described only for information and are not regulatory commitments. Please notify the Manager - Site Regulatory Compliance at (419) 321-8000 at the DBNPS with any questions regarding this document or associated regulatory commitments.

COMMITMENTS: None
DUE DATE: N/A

Docket Number 50-346 License Number NPF-3 Serial Number 1-1507
2007 INDEPENDENT ASSESSMENT OF THE ENGINEERING PROGRAMS EFFECTIVENESS AT THE DAVIS-BESSE NUCLEAR POWER STATION
(85 pages follow)

Independent Assessment
Engineering Programs Effectiveness
Davis-Besse Nuclear Power Station
COIA-ENG-2007
September 10 - September 21, 2007
Prepared by:

THE MARATHON CONSULTING GROUP
1000 Abbey Court
Alpharetta, GA 30004
Phone 678-879-8700
WWW.MARATHONINC.COM

Team members:

John Garrity - President and CEO, The Marathon Consulting Group, Team Leader
Paul Borer - Senior Vice President, The Marathon Consulting Group
Harold Baumberger - Vice President, The Marathon Consulting Group
Rod Filipek - Supervisor, I&C/Digital Design Engineering, St. Lucie Nuclear Power Plant, FPL
Mark Flaherty - Manager, Engineering Services, Calvert Cliffs Nuclear Power Plant, Constellation Energy
Joseph Pechacek - Manager, Program and Component Engineering, JA Fitzpatrick Nuclear Power Plant, Entergy

Submitted by: Reviewed and Accepted by:

John H. Garrity, Team Leader

Section 1 Assessment Results and Conclusions ... 2
1.1 Executive Summary ... 2
1.2 Introduction ... 6
1.3 Scope of Assessment ... 7
1.4 Methodology ... 10
1.5 Conclusions ... 13
1.5.1 Overall Rating of Engineering Programs Effectiveness ... 13
1.5.2 Assessment Ratings by Assessment Areas ... 13
1.5.3 CR Summary ... 47
1.5.4 Findings ... 49
1.6 References ... 55
1.6.1 List of persons interviewed and meetings observed ... 55
1.6.2 Reference Documents ... 56
1.7 Team Members' Biographies ... 66
Section 2 Assessment of Internal Self-Assessment Performance ... 78
Appendix 1 Action Plans ... 79
Appendix 2 Independent Assessment Plan submittal ... 80

Section 1 Assessment Results and Conclusions
1.1 Executive Summary
The Engineering Programs Independent Assessment Team found the engineering programs at Davis-Besse (DB) to be effective overall, and found performance in each of the six areas designated for assessment to be effective.

The team reviewed engineering work products in a number of areas in depth, and did not find any discrepancies that were considered to be either significant in terms of the validity of the work product, or indicative of a systematic deficiency in engineering work performance or quality management.

Findings were categorized into three types, defined as an Area of Strength (AS), an Area for Improvement (AFI), or an Area in Need of Attention (ANA):

An Area of Strength is an identified performance, program, or process element within an area of assessment that is significant in obtaining desired results.

An Area for Improvement is an identified performance, program, or process element within an assessed area that requires improvement to obtain the desired results with consistency and effectiveness. All Areas for Improvement identified in the Assessment Report will be addressed by the Action Plan(s) submitted to the NRC.

An Area in Need of Attention is an identified performance, program, or process element within an area of assessment that, although sufficient to meet its basic intent, requires management attention to achieve full effectiveness and consistency. Areas in Need of Attention are not addressed by Action Plan(s) submitted to the NRC, but are considered for entry into the Corrective Action Program.

2007 Findings
The Team's findings in 2007 consisted of:

0 Areas of Strength (AS)
0 Areas for Improvement (AFI)
5 Areas in Need of Attention (ANA)


The five Areas in Need Of Attention (ANAs) identified in 2007 are designated as:

1 ANA - Waivers of Design Interface Evaluation (DIE) Reviews
2 ANA - Equipment Reliability (ER) Program - Forecast of Preventive Maintenance (PM) Strategy Workload, Single Point Vulnerability (SPV) Analysis
3 ANA - System Health Activities - Plant Computer, 480 V AC, Radiation Monitors
4 ANA - Analysis of Process-Related Weaknesses
5 ANA - Assessment Strategy and Implementation

These findings are described in Section 1.5.4 of this report.

By comparison, the Team's findings in 2006 consisted of:

2 Areas of Strength (AS)
0 Areas for Improvement (AFI)
7 Areas in Need of Attention (ANA)

Strengths
The Team did not identify new strengths in Engineering Programs this year.

However, the Team reviewed the strengths identified in 2006 (1 AS - DIE Process and 2 AS - Margin Management), the strength identified in 2005 (1 AS - Improved Engineering Performance and Environment), and those identified in 2004: (1 AS) Rapid Response Team Effectiveness in Supporting Resolution of Urgent/Emergent Issues, Especially As Needed By The Plant; (2 AS) Internalization Of Engineering Principles And Expectations; and (3 AS) Engineering Assessment Board Influence On Quality Of Engineering Work Products. All of these previously identified strengths continue in 2007.

Overall Conclusions
The Independent Assessment Team made several overall conclusions from the 2007 Assessment:

Davis-Besse Engineering Programs continue to be effective in both technical and organizational aspects.

Improvements have been noted over the four annual assessments. DB Engineering has addressed problems, often self-identified, and has improved performance in technical quality of engineering work products and in throughput of engineering work processes.


The nature of work observed has transitioned from post-restart backlog reduction plus steady state plant support to predominantly steady state plant support. The resources made available from reduction of backlog work have been used to reduce the amount of engineering work contracted outside the Company and to increase the effort applied to improvement items, e.g., PM Strategy development.

Condition Reports (CRs) Written During the Assessment
Five CRs were written during the assessment:

CR 07-26522 NORM-ER-3103-D Rev.3 Does Not Indicate Replacement Frequency
The PM template for breakers does not contain a replacement task and frequency for molded case circuit breakers. The June 24, 2004 Technical Bulletin TB-04-03 indicates the life of Westinghouse supplied molded case circuit breakers for mild environment applications is 20 years.

CR 07-26574 COIA-ENG-2007 - C-EE-02.01-010-Rev 30 (DC Calc) Addendum A03, Calculation Error
Plant staff noted an error in the application of calculation assumption 2.5 related to the percentage of solenoid loads to be considered in relay panels. The assumption specified the use of 67% of the solenoid loads while the calculation had actually used 100%. This error did not adversely affect the conclusions, and a Post-It-Note was initiated to correct the error in the next revision. Discussion with plant staff determined that a Condition Report should have been initiated to document this error. Correction of errors, even though in the conservative direction, is outside the purpose of a Post-It-Note. This CR was initiated to document the calculation error.

CR 07-26779 COIA-ENG-2007 - NNI Module Not Included for Calibration Under PM 4222
PMA DB-REV-06-0048 requested a revision to NNI PM 4222 to incorporate calibration of seven NNI modules associated with the Reactor Pressure Control String. This was a corrective action associated with resolution of CR 05-05988. The PMA was rejected because the modules were thought to be included in another PMA, 05-1184. However, Memory Module 3-8-9 had not been included in PMA DB-REV-05-1184, and therefore the corrective action would not have been fully implemented.

CR 07-27185 COIA-ENG-2007 - Snapshot Self Assessment Not Finalized
In preparation for the NRC inspection on the Heavy Load Control of Reactor Vessel Head, a Snap Shot Self Assessment was performed during the first part of July 2007. Self Assessment number DB-SA 068 has been assigned to this activity. To date, the self assessment report has not been written. NOBP-LP-2001 does not specify a time by which the report needs to be completed. However, it is recognized that the longer the time between the completion date and the final report date, the greater the potential of not being able to capture all of the applicable information.

CR 07-27389 COIA-ENG-2007 - AOV Program Self Assessment Implementation Not Evaluated
The AOV program tri-annual performance assessment was performed by participating on a snapshot assessment team at Beaver Valley. While the general aspects of the fleet AOV program procedure and practices could be evaluated there, the team could not find any evidence from interviews and document review that the implementation aspects of the program at Davis-Besse were thoroughly assessed.

1.2 Introduction
The Confirmatory Order Modifying License dated March 8, 2004, required FENOC to conduct independent assessments of the effectiveness of the engineering program annually for a period of five years. The assessment conducted by the Independent Assessment Team and reported in this document is the fourth annual independent assessment of the engineering program.

The plan for this Independent Assessment was formulated in accordance with the guidance of FENOC's procedure DBBP-VP-0009, Management Plan for Confirmatory Order Assessments, Rev 3, effective 12/2/2005, and also with benefit of the guidance of FENOC's procedure NOBP-LP-2001, FENOC Self Assessment/Benchmarking, Rev 9, effective 10/10/2006. The Assessment Plan was submitted to the USNRC via serial letter 1-1495 dated June 12, 2007 (see Appendix 2).

The members of the Independent Assessment Team were drawn from the nuclear power industry. There were three team members from operating US nuclear plants and three from the Marathon Consulting Group. The resumes of the team members are included in the Assessment Plan and also presented in Section 1.7 Resumes of this report. The Team members were:

John Garrity - The Marathon Consulting Group, Team Leader
Paul Borer - The Marathon Consulting Group
Harold Baumberger - The Marathon Consulting Group
Rod Filipek - St. Lucie Nuclear Power Plant, FPL
Mark Flaherty - Calvert Cliffs Station, Constellation Nuclear
Joseph Pechacek - JA Fitzpatrick Nuclear Power Plant, Entergy

• Left during week due to unanticipated home station duty requirements

The Independent Assessment Team commenced work on the Davis-Besse (DB) independent assessment during the week of June 25, 2007, with information gathering activities and discussions with FENOC management. The team gathered information from FENOC relevant to the DB assessment and posted this information to an Internet FTP site established for this purpose over a period of several weeks. The weeks of August 20 and August 27 were devoted to intensive review of FENOC documents and formulation of interview strategies, questions, and interview lists. The Team spent the weeks of September 10 and September 17 at the Davis-Besse site conducting initial and follow-up interviews and reviewing additional FENOC-supplied material.


1.3 Scope of Assessment
The scope of the Engineering program assessment included primarily activities and performance since the 2006 Independent Assessment. Assessment information was drawn from a variety of sources, including:

Documents supplied by FENOC, including procedures, performance data and reports, program descriptions, engineering work products such as modification packages, calculations, etc., Corrective Action Program (CAP) work items and records, and assessments (partial list of documents provided in Section 1.6.2)

Assessments performed by others such as NRC, INPO, and independent assessors and reviewers
FENOC task, project, program, and business plans and status reports
Interviews with FENOC personnel (interview list provided in Section 1.6.1)

The assessment concentrated on engineering performance in six areas of interest:

1. Modifications
2. Calculations
3. System Engineering
4. Implementation of the Corrective Action Program by Engineering
5. Effectiveness of Assessment Activities
6. Corrective Action Taken in Response to Findings Identified in the 2006 Independent Assessment

Within each of these areas, sub-areas were identified for review. These sub-areas are shown below:
1. Plant Modification Process The team will perform a review of activities to assess the effectiveness of the plant modification process:
a. Selection and prioritization of potential modifications, including assessment of delayed modifications on plant and operating personnel
b. Owner acceptance sub-process (review of contracted work)
c. Quality of modification packages since 2006 assessment (Permanent and Temporary Modifications)
d. Closeout of modification packages and supporting document updates
e. Effectiveness of modifications
f. Interaction and support from parallel processes
g. Workload management
2. Calculation Process The team will assess the following attributes of the plant calculation process:
a. Workload management, including appropriateness of work priorities
b. Acceptance criteria and owner acceptance sub-process (review of contracted work)
c. Margin management and allocation
d. Linkages and consistency with other calculations
e. Preservation of design bases
f. Documentation/traceability/attribution
g. Calculation health and improvement program
h. Interaction and support from parallel processes
i. Systems descriptions design information
j. Engineering rigor and attention to detail
3. System Engineering Programs and Practices The team will investigate the following items:
a. System Engineering alignment and plant support
b. System Health evaluation and reporting
c. Process for prioritizing, communicating, and resolving health deficiencies and program deficiencies
d. Equipment Reliability Improvement Program as reflected in FENOC Excellence Plans
e. Maintenance Rule system monitoring and trending
f. Experience and expertise, including use of operating experience
g. Margin awareness and margin allocation
h. Interaction and support from parallel processes
i. Access to knowledge of Engineering information in calculations
j. Workload management
4. Implementation of the Corrective Action Process by Engineering The Assessment Team will assess the following:
a. Condition Report ownership and appropriate initiator involvement
b. Quality of root and apparent causes produced by Engineering and associated management behavior and guidance
c. The Assessment Team will review selected Condition Reports related to Engineering Section performance initiated since the 2006 Independent Assessment of Engineering Performance and independently assess the corrective actions taken
5. Effectiveness of Davis-Besse Assessment Activities

The Assessment Team will evaluate the effectiveness of the Davis-Besse Nuclear Power Station's assessment activities associated with the implementation of Engineering programs as follows:

a. Planning of assessments over the short and long term for ongoing assessment of Engineering performance
b. Review the results of the Davis-Besse Quarterly Quality Assessments that evaluated Engineering; Determine if the assessments were comprehensive and if effective actions were taken to correct problems or weaknesses identified.
c. Evaluate the effectiveness of self-assessment capability by reviewing corrective actions associated with self-assessment reports, audits (including audits of the offsite safety committee activities), and evaluations conducted of Engineering program implementation.
d. Determine if the Engineering staff is aggressive in correcting self-assessment and assessment findings, and determine whether the corrective actions are adequate, timely, properly prioritized, and that effectiveness reviews are ensuring the desired results.
e. Determine the receptivity and responsiveness of management and staff to issues raised in self-assessments and assessments.

6. Corrective actions taken in response to the Areas in Need of Attention (ANAs) identified during the 2006 Independent Assessment of the Davis-Besse Engineering Program Effectiveness

The Assessment Team will evaluate the responses to the seven (7) Areas in Need of Attention (ANAs) identified during the 2006 Independent Assessment:

1 ANA - Inattention to detail in calculations
2 ANA - Implementation of requirements from calculations
3 ANA - Equipment Reliability Program
4 ANA - Red plant health systems
5 ANA - ECP Revision Reviews
6 ANA - Follow-ups to assessments and last year's COIA-ENG-2005
7 ANA - Management of engineering workload

1.4 Methodology
The assessment was performed in accordance with the sequence of steps summarized below. These steps were developed in preparation for the 2004 COIA of engineering programs and have been utilized in each successive assessment.

1. Develop the assessment scope, including areas to be assessed and assessment topics under each area. This step included consideration of FENOC management's views, FENOC's procedural and business planning guidance for assessments in general, and the need to meet the particular assessment requirements for Davis-Besse.
2. Develop the assessment plan, including the overall objectives and approach, the framework for conducting the assessment, and including review and comments by FENOC engineering and corporate management and staff.
3. Determine the team size and composition requirements
4. Recruit the team, including industry peers.
5. Develop a document library and means to provide access to team members.

This included collecting documents from FENOC's corporate offices and the Davis-Besse site such as procedures, performance reports, engineering work products, and organizing them for access by team members through a website established for this purpose.

6. Develop a list of plant personnel to be interviewed and typical interview questions or areas of inquiry. A list of plant personnel to be interviewed was developed by defining the organizational positions to be interviewed for each assessment area and topic, and selecting one or more team members to represent that interview area of interest.
7. Develop the detailed interview schedule. Plant administrative support personnel scheduled interviews and published schedules notifying interviewees and team members of the time, date, location, subject, and participants of each interview. Typically an interview was scheduled for an hour, and interviewees were scheduled to meet with one or two Team members. Follow-up interviews were scheduled during the assessment as needed. Approximately sixty formal interviews were conducted, with approximately sixty different individuals interviewed, and additional follow-up discussions were held as necessary. The first week on site was dedicated to interviews and assessment of the areas of modifications, corrective action program use by engineering, and system engineering, while the second week focused on the areas of calculations, effectiveness of assessment activities,

and corrective action taken in response to ANAs identified in the 2006 independent assessment.

8. Assemble the team and provide orientation. The team assembled for an orientation session the Sunday evening before the assessment. The interview schedules were briefed, any new documents received were noted, and the overall assessment schedule was discussed. The assessment plan and scope, the background for and development of the assessment scope, and the guidance provided for focused self-assessments by the FENOC fleet procedure, were discussed.
9. Obtain badges for unescorted access to the plant.

10. Conduct interviews and document reviews. During the assessment period, results of interviews and document reviews were summarized on daily records of facts and observations. Items of interest were those thought to require further follow-up or having the potential for becoming findings. The daily records were collected, consolidated, and distributed to team members on a daily basis.

11. Organize items of interest. Toward the end of each of the assessment weeks, items of interest from daily records were binned to identify evolving issues in the form of potential Areas of Strength, Areas For Improvement, and Areas in Need of Attention in each of the assessment areas. Potential findings were documented.

12. Provide regular counterpart briefings. The Team briefed site counterparts on a regular basis to keep the site staff informed of items of interest and potential findings, and also to support generation of Condition Reports when appropriate (five were generated during the assessment).

13. Consolidate items of interest into Areas of Strength (AS), Areas for Improvement (AFIs), and Areas in Need of Attention (ANAs). Near the end of each assessment week, issue summaries were developed to reflect available information and to support generation of management briefing and exit talking points.

14. Brief plant engineering management at exit. Site management was briefed at a formal exit on Friday of the second week of the assessment. FENOC key corporate, site, and engineering personnel were included in this briefing. The briefings were conversational in style, with a team member for each assessment area discussing the significant findings in his area. For each potential finding, the issue and appropriate examples or other supporting information were presented and questions were answered. The daily counterpart briefings and management pre-exit briefings assured that the site personnel being briefed already knew of all findings and that appropriate CRs had been generated.


15. Provide assessment preliminary findings. Site management briefing summaries and talking point outlines were provided to FENOC in electronic file form after the assessment was complete. (At this stage, the findings were still considered draft, but useful information for FENOC).
16. Provide report for Davis-Besse. This report is the report for information and action by Davis-Besse and FENOC.


1.5 Conclusions
The Assessment Team's conclusions are summarized in this section. These findings are based on extensive working field notes and Team discussions conducted each day during the assessment period and after.

1.5.1 Overall Rating of Engineering Programs Effectiveness
The Independent Assessment Team rates the effectiveness of Engineering Programs as Effective, with no identified Areas for Improvement and several Areas in Need of Attention.

Davis-Besse Engineering Programs continue to be effective in both technical and organizational aspects.

Improvements have been noted over the four annual assessments. DB Engineering has addressed problems, often self-identified, and has improved performance in technical quality of engineering work products and in throughput of engineering work processes.

The nature of work observed has transitioned from post-restart backlog reduction plus steady state plant support to predominantly steady state plant support. The resources made available from reduction of backlog work have been used to reduce the amount of engineering work contracted outside the Company and to increase the effort applied to improvement items, e.g., PM Strategy development.

Specific findings in the 2007 independent assessment included:
0 Areas of Strength (AS)
0 Areas for Improvement (AFI)
5 Areas in Need of Attention (ANA)

1.5.2 Assessment Ratings by Assessment Areas
This Section (1.5.2) presents the Independent Assessment Team's conclusions about the effectiveness of Engineering performance in each of the six assessment areas.

The System Engineering area had two ANA findings. There was one ANA finding in each of the Calculations, Corrective Action Program, and Assessments areas. There were no findings in the Modifications and the Follow-Up to 2006 assessment areas. All the Findings are described in section 1.5.4, and those descriptions are referenced under the heading "Findings for This Area" in the discussion of each of the six assessment areas.


Each of the five findings for 2007 related to a single assessment area, i.e., there were no cross-cutting findings (findings applicable to more than one area). In the 2006 assessment, four (4) findings were considered cross-cutting findings applicable to two or more areas.

The distribution of Findings between unique and cross-cutting has changed since last year when four of the ANA findings were considered cross-cutting. (In 2005 only one Finding was unique and five were cross-cutting.) This indicates that the assessment findings are becoming less systemic within the Engineering organization. This supports the team's assessment that the Engineering department continues to improve and issues are becoming more isolated.

1.5.2.1 Modifications Area
Effectiveness Rating
Overall, the team rated the modification process Effective. This is based on the quality of ECPs, interviews with engineers and managers, and Engineering Assessment Board (EAB) performance indicator trends.

There are no Findings associated with this area. The Engineering organization continues to produce quality modifications. The finding from the 2004 COIA, Modification Tracking and Closeout, continues to be addressed and further improvement in the reduction of the backlog of open modifications was noted.

The number of open modifications has been reduced. The milestone for the issuance of all modifications one year prior to the 15th refueling outage was met.

This allows sufficient time for outage and installation planning and should help increase modification effectiveness.

Source Information The Independent Assessment Team conducted interviews of selected Engineering and Site personnel and reviewed selected documents from the reference library (See Sections 1.6.1 & 1.6.2).

The team reviewed selected Engineering Change Packages (ECPs), interviewed design and system engineers and managers, fleet oversight staff, and the EAB member.

Documents Reviewed
ECP 07-0034 Isolate Faulted Heater Banks/Restore SCR Bank
ECP 07-0019 Abandon Four Freeze Protection Circuits
ECP 06-0143 Alloy 600 Mitigation
ECP 07-0062 HPI Ball Valve Addition
ECP 04-0271 CS Thermal Relief Check Valve
ECP 07-0131 Circ Water Piping Repair
ECP 06-0012 Replacement Pump MSD Demin Bodyfeed Pump
EDG Governor Replacement

Observations
The plant staff did an excellent job of prioritizing the modifications for the upcoming refueling outage. There are about 17 major modifications scheduled for the 15th refueling outage, and all ECP packages were produced one year before the original outage start date.

The use of outside vendors to develop modification packages or to produce calculations has been greatly reduced in 2007. As a result, the past issue of dealing with vendor quality problems in the acceptance process is not causing quality issues this year.

Several ECP packages were reviewed, focusing on descriptions, 10CFR50.59 screens, regulatory applicability determinations, and various design interface documents. The assessment team concluded the technical content of ECPs and associated documents was acceptable.

The EAB Quarterly Report for the period April 1 through June 30, 2007 was reviewed. The report results were then discussed with the responsible Engineering managers. EAB review scope includes all ECPs and associated calculations, selected 50.59 evaluations and selected Operability Evaluations.

The EAB evaluated 67 products during this period, and has documented a sideways trend in FENOC design engineering product quality. The EAB grades continue to be satisfactory, but below the station stretch goal. Common quality problems identified by EAB are the lack of clear articulation of the problem setup and lack of clearly defined assumptions. The EAB does participate in quarterly engineering department continuing training to share lessons learned. Action has been taken to address in-house and vendor quality issues in the past COIAs.

Vendor usage by the design department has been minimal, with only four vendor documents reviewed by the EAB.

The number of Engineering Change Process documents in Design Engineering is now about 300, compared to about 500 in early 2006, about 700 in early 2005, and about 900 in early 2004. The total engineering backlog is actively tracked and has been reduced by about 400 items in 2007, with a current population of about 1170 items.


The modification closeout process was reviewed to ensure timely drawing and record updates. Currently there are approximately 17 packages in the closeout process. The modification closeout process has improved over the past three years and is keeping this backlog to a manageable level.

The list of temporary modifications was reviewed to ensure plans were in place to remove them during the next refueling outage. Current plans are to exit the outage with only two temporary modifications in place.

There is a strong commitment to proactively address staffing issues that will occur due to retirements over the next few years. The department has hired or plans to hire several engineers in 2007 and 2008. There is also a strong engineering co-op student program with a local university, which is the source of several new hires. The director has authorization to increase his headcount over baseline in 2008 to accommodate the new hires. There are also plans to supply seven SRO candidates from engineering over the next four years. These candidates will eventually fill key engineering or plant staff positions.

Specific Issues for This Area
None

Findings for This Area
None

Cross Cutting Findings Applicable to This Area
None

CRs issued during assessment of this area
None

1.5.2.2 Calculations Area
Effectiveness Rating
Overall, the team rated the calculation area as Effective based on the quality of work products reviewed and the continuing progress being made. Backlogs have been lowered near target levels in plant engineering and technical services engineering, and continue to improve in design engineering.

Source Information

The Independent Assessment Team conducted interviews of selected Engineering and Site personnel and reviewed selected documents from the reference library (See Sections 1.6.1 and 1.6.2).

In particular, the team reviewed the plant Design Basis Assessment Reports (DBAR), with emphasis on the Calculation Health and Calculation Quality sections, Condition Reports related to calculations, and a sample of new and revised staff and vendor calculations issued since the last assessment.

Interviews were conducted with engineers related to the work products reviewed.

Finally, the team independently reviewed thirteen calculations performed since last year for conformance to standards and expectations with respect to technical rigor.

Calculations reviewed included:

C-EE-002.01-010, Rev 30, Addendum A03 - DC Calc - Battery and Charger Sizing, Short Circuit, and Voltage Drop
C-EE-002.01-010, Rev 30, Addendum A02
C-EE-002.01-010, Rev 30
C-EE-015.03-008, Rev 4, Addendum A14 - AC Power Systems Analysis
C-EE-015.03-008, Rev 4, Addendum A28
C-EE-015.03-008, Rev 4, Addendum A29
C-ME-011.01-141, Rev 1 - Service Water System NPSH Analysis
C-ME-040.01-004, Rev 0 - Boric Acid Addition Tank Vortex Formation
C-NSA-011.01-016, Rev 1 - Service Water System Design Basis Flowrate Analysis and Testing Requirements
C-NSA-011.01-017, Rev 1 - Pump Curve Acceptance Requirements for Service Water Pump 2
C-NSA-049.01-004, Rev 1 - Vortex Formation w/ECCS Suction from BWST
C-NSA-050.03-028, Rev 1, Addendum A01 - Auxiliary Feedwater Minimum Performance
C-NSA-052.01-003, Rev. 8, Addendum A06 - HPI Pump Acceptance Criteria

Condition Reports reviewed: 06-06652, 06-07328, 06-11640, 07-15233, 07-16283, 07-16351, 07-12143, 07-12498, 07-21042, 07-21095, 06-11245, 07-14605, 07-12143, 07-20103

Follow up was performed related to one CR that appeared to have implications related to the effectiveness of the design review process:

CR 07-24820 identified an error in the methodology and calculation of allowable stresses for determining the acceptability of a pipe patch to be welded to a Closed Cooling Water pipe. The methodology error involved the failure to include a 1.5 stress intensification factor and the use of an incorrect formula for allowable shear stresses. The calculation error involved the incorrect calculation of a seldom-used trigonometric function. The above errors were not caught during the design review, and ultimately were identified by a Nuclear Regulatory Commission inspector. This CR was thoroughly investigated and determined to be due to a human performance error resulting from a lack of questioning attitude. The team interviewed the design reviewer to understand the causes of the failures noted. The team concluded that, although errors were made, the design review process was adhered to and appears sound.


The errors identified in the CRs noted above are considered to be isolated and not indicative of adverse trends in calculation rigor and quality. Plant staff's

identification of the error in the DC Calculation is a noteworthy example of the application of a questioning attitude while updating a calculation.

The quality of calculations is also monitored by using Engineering Assessment Board (EAB) scores presented in the DBAR Calculation Quality Section. EAB scores show a mixed trend since the last assessment, with declining performance (rising scores, above the goal of 1.0) in the last quarter of 2006 and with 2007 scores generally lower and below goal (August 2007 year-to-date average of 0.57), but overall not achieving scores as good as the last assessment period (late 2005 and first half of 2006).

The EAB Fourth Quarter 2006 Report noted:

The Twelve Week Rolling Average Score for all products reviewed by EAB had been less than or equal to 0.5 for every month since November of 2004, until this quarter.

The average score this quarter was 0.98. Potential contributing causes for this decline may be attributed to:

1. A learning curve by new design personnel.
2. NOBP-CC-2005 Engineering Assessment Board was implemented at Davis-Besse on 9/6/06. This is a fleet-wide business practice for implementation of EAB.

The grading criteria contained in this business practice are different than the grading previously used.

3. Many of the modification packages and calculations which are required to support 15RFO were provided to EAB for review this quarter. These packages were prepared to meet deadlines for issue. Time pressure for issuance of these packages may be part of the reason for poor quality.
4. EAB has waived review of many of the more simple, non-safety packages which previously had been reviewed. This affects the average since before these would likely have received a good grade and helped the average.

Condition Report 07-13550 has been initiated to evaluate and determine any corrective actions to the declining EAB scores.

The Apparent Causes of CR 07-13550 were determined to include (1) more stringent fleet-wide grading criteria, (2) inadequate use of human performance tools and (3) unclear guidance for design input parameters. Corrective actions are addressing human performance issues and guidance for design inputs. The assessment team examined this area and concurs with the Davis-Besse analysis and actions.

EAB scores in 2007 have improved from the 0.98 score of the fourth quarter of 2006, and the 2007 year-to-date average is 0.57 through the first eight months.

Although the scores remain above the pre-fourth-quarter 2006 scores of between 0.2 and 0.4, the differences are judged to be attributable to a change in the grading criteria, and not indicative of a decline in performance.


Overall, it is concluded that the calculation area has made continued progress since the last assessment. The technical rigor of calculations has remained excellent. Findings noted represent opportunities to improve and are not considered significant weaknesses or shortcomings.

Specific Issues for This Area
The following observations are provided for consideration. None of these rose to the level of a finding:

• Filenet search results for calculations list Addenda and Post-It-Notes from previous revisions that have already been incorporated in the current Revision. In some calculations, this can produce a long list of Addenda and Post-It-Notes that may have to be reviewed to determine all changes to the current calculation.

Consideration should be given to statusing incorporated Addenda and Post-It-Notes as superseded, or to adding a note: Incorporated into Revision xx.

• Several cases were noted where changes to System Descriptions to add information or update design parameters as the result of revised calculations were considered enhancements. The policy is that System Descriptions are for use as a roadmap or tool, but not as the definitive reference for design basis information. It is the team's opinion that treating System Description impacts as enhancements may send a message that they are low priority. Keeping the descriptions up-to-date is considered important in maintaining a complete and consistent design basis and overall configuration control. Consider adding more emphasis to the maintenance of System Descriptions.

• Training related to the use of the Calculation Utility is not included in the Engineering Personnel Job Familiarization Guide (JFG). It is understood that this training requirement is being added to the next revision of the JFG.

Findings for This Area
There was one Finding uniquely associated with the Calculation assessment area:

1 ANA - Waivers of DIE Reviews
Findings are described in detail in Section 1.5.4.

Cross Cutting Findings Applicable to This Area

None

CRs issued during assessment of this area
CR 07-26574 COIA-ENG-2007 - C-EE-02.01-010-Rev 30 (DC Calc), Calculation Error
CRs initiated during the 2007 assessment are described in detail in Section 1.5.3.

1.5.2.3 System Engineering Area
Effectiveness Rating
The Independent Assessment Team rates the System Engineering area as Effective.

Source Information The Independent Assessment Team conducted interviews of selected Engineering and Site personnel and reviewed selected documents from the list of documents provided in advance by FENOC (See Sections 1.6.1 and 1.6.2).

The team reviewed recent and past Plant Health Reports, and interviewed system engineers responsible for the following plant systems:

Med Voltage AC
Boric Acid Addition
Doors and Hatches
480 V AC
Freeze Protection/Heat Trace
Plant Computer
ICS
NNI
Radiation Monitoring, Process and Area
Component Cooling Water
Feedwater
HPI-EC
Turbine Generator
Containment Spray

In addition, the team reviewed recent and past Fleet Engineering Programs Health Reports, and interviewed Davis-Besse program owners responsible for the following engineering programs:

Boric Acid Corrosion Control
Equipment Reliability
Service Water System Reliability
Preventive Maintenance


Plant Engineering supervisors and the Plant Engineering manager were interviewed, as were selected management personnel from the Plant organizations responsible for operations and maintenance.

Observations
In 2007, as in past assessment periods, System Engineering was generally praised as effective and responsive to the problems and support assistance needs of Operations and Maintenance. Both the technical expertise and the dedication of the system engineers to support of plant activities were remarked upon by interviewees, and the Team found the same.

System engineers interviewed regarding the status and health of their systems were knowledgeable and engaged in system health monitoring and reporting.

Maintenance Rule systems overall health was reported to be Green for the current quarter (2Q 2007), improved from overall health White at the time of the 2006 assessment. This was attributable to completion of significant system health improvement related work as well as a change in the calculation of individual and overall system health. The change in the overall calculation of system health was stimulated by information obtained through benchmarking and discussions with industry groups. This change took effect after the 1Q 2007 Plant Health Report was published, and the 2Q 2007 Plant Health Report incorporates the new rating calculation protocols.

At the time of the 2006 assessment, eight systems were designated as in red health condition. At the time of the 2007 assessment, two systems remained in the red system health designation status. A review of system health information and interviews with the system owners of the 2006 red systems were conducted.

A summary of the recent history of red system health rankings is shown below:


System health recovery plans for the two remaining red systems (Plant Computer and 480 VAC) were reviewed and discussed with the responsible system engineers.

The Plant Computer health recovery plan was out of date. Actions different from the ones described in the recovery plan were being implemented, and the plan was scheduled to be updated in October. The new intentions involve expedited procurement and installation of several key plant computer system elements instead of refurbishment of the older existing elements. The Team found no fault with the newer intended course of action, and concurs that the plan should be updated as soon as practicable to reflect the new course of action. In the meantime, additional management attention should be provided for the activities being carried out, to substitute for the attention normally provided through the plan development and approval process.

The 480 VAC system recovery plan (Rev 3 8/9/07) was generally satisfactory, but there were two elements the Team felt require attention.

First, several Maintenance Work Orders for replacement of molded case circuit breakers were within about six weeks of scheduled execution, but at the T-6 point, engineering and procurement restraints still existed that threatened execution of the work. No specific steps were being taken to overcome the restraints and it was judged unlikely that the scheduled work would be accomplished. This seemed contrary to recent actions taken to ensure system health related work is periodically reviewed for restraints and action taken when restraints are identified.

Second, in the plan section "Corrective Actions and Estimated or Actual Completion Dates," the action to be taken for the item "For the five breaker failure issues: Molded Case Circuit Breakers ECD 10/01/07" was unclear. The action item indicates that "The MCCB PM Orders that replace the greater than 20 year old MCCBs will be validated to be properly prioritized in accordance with existing FENOC guidance as Orders accomplishing MR (a)(1) Action Plan corrective actions."

How and when this action would lead to actual breaker replacement could not be determined. The Team suggests that actual breaker replacement is the action needed and the action that would serve as the completion criterion, not validation of prioritization, which leaves the estimated completion date for the MCCB replacements indeterminate.

The health recovery plan issues discussed above contributed to Finding 3 ANA on system health.


A single point vulnerability study is not included in the current Equipment Reliability Excellence Plan. SPV reviews in the past have identified the EHC and ICS as presenting reliability challenges, and there are long-term projects to replace these systems with single failure proof digital designs. Interviews with station personnel responsible for the Equipment Reliability program, the Preventive Maintenance program, and for conduct of the activities presented in the Equipment Reliability portion of the Engineering Excellence Plan indicated additional SPV study is not thought to be of significant importance at present.

Most stations are performing or have performed some form of SPV study to identify plant equipment issues that cannot be addressed by revising PM regimes, but rather will require some other form of mitigation, such as adoption of challenge-limiting operating procedures and reduction of vulnerabilities through design changes. The lack of a plan to conduct single point vulnerability studies contributed to Finding 2 ANA on the Equipment Reliability program.

The health recovery plans were generally found to be suitable vehicles for identifying and guiding the work necessary to improve system performance and health to higher levels. The plans now often address activities necessary to progress to excellent (green) system health, and now often address threats to improving or continuing good system performance.

Completion of work identified in system health recovery plans since the 2006 assessment was markedly greater than in the period prior to the 2005 assessment.

The measures taken to support preparation and execution of work to improve system health were observed to be widely utilized and responded to. In particular, the designation of work orders in SAP as system health related, and presentation of these designations in the work management process work week restraint reports was judged to be having a significant beneficial impact. All personnel reviewing upcoming work weeks for work scope and work status now can see immediately which work orders have been designated as system health related, and thus are better able to take actions to make those orders ready by removing restraints, and to better judge schedule change requests which might reschedule system health related orders. This information is not only available through the work week restraint reports, but also more generally available through the SAP data query and sorting tools so individuals across the site can better monitor and respond to readiness activities they are responsible for.

Although problems with the generation of the Plant Health Report observed during the last assessment appear to have been resolved to a significant extent, some plant health report areas continue to be populated with data the system engineers do not trust and do not rely on. The data presented in the Report Card section of each report are suspect. In several instances, data in the Report Card section was contradicted by the system engineer in the SE's discussion of the

Nuclear Safety, Material Condition and Operational Focus elements of overall system performance. For example, in the 2Q 2006 Plant Health Report, page 203, the system engineer's discussion of Operational Focus scoring elements indicates 0 Open Operator Work Arounds when the corresponding Report Card data table on page 201 says 1. Similarly, where the SE's discussion indicates 5 open TMs, the Report Card data table indicates 7. The data included in the Report Card section of the System Health Reports should be able to be relied on by the system engineers as they craft their discussions in the Analysis section of the report. Instead, they have to independently obtain the correct information by other means.

The Team sought the data in the report card section of the plant health report in the form of a spreadsheet or database, with the intention of determining the dominance of the various factors across the systems. Contact with the FENOC fleet responsible individual indicated the report card data are not captured in one location in the form of a database or spreadsheet and are thus unavailable for analysis.

The Plant Engineering section addressed the Equipment Reliability program area in interviews and follow-up discussions.

Notable progress has been made in this area since the last assessment.

Criticality classification and validation for all approximately 66,000 components has been completed (7,241 components are critical, and 12,700 are non-critical, for a total of about 20,000 critical and non-critical components overall. This is consistent with other plant classification outcomes).

Multiple phases of analysis of the critical and non-critical components are planned. Phase 1 analysis, currently underway, involves the development of 42 PM templates and the analysis of about 14,000 critical and non-critical components covered by these templates (about 70% of the total number of critical and non-critical components).

At the time of this assessment all 42 Phase 1 PM templates had been completed and released for use in analysis of components.

At the time of this assessment, analysis of components in the phase 1 set was complete for 2,881 critical and 3,961 non-critical components for a total of 6,482 components analyzed to date with the 42 phase 1 PM templates (out of the 14,000 components of phase 1, and out of the 20,000 total population of critical and non critical components to be analyzed through all phases). Analysis of all components addressed by thirty-four phase 1 templates has been completed (4,294 components). Analysis of the remaining phase 1 components is proceeding using the other eight phase 1 templates but is not complete. The number of components analyzed to date (6,482) includes those analyzed to these eight templates.


The analysis to date has resulted in over 2,150 PM revision requests. Relatively few of these have been fully implemented and fewer still have been executed.

Completion of this phase 1 of the PM Strategy will involve generation of additional PM revision requests beyond the 2,150 or so developed to date.

Completion of later phases of the PM Strategy program will involve development of additional PM templates and PM revision requests. These PM revision requests will have to be implemented (processed through the work planning and scheduling processes to the point where they are incorporated into work week and outage scopes and schedules and made fully ready to work) and executed (carried out in the field such that the revised PMs have been performed on all affected equipment). Both implementation and execution involve potentially significant use of resources. Many stations find that their implementation and execution forces are initially challenged by the work flowing out of the PM basis analysis, and are unable to conduct the implementation and execution work without considerable delay while they prioritize, organize, and resource the work to get it all done.

At this point, a rough extrapolation of the number of PM changes indicates a projected yield of about 4,600 PM revisions from phase 1 analysis, and about 6,600 PM revisions from analysis of all phases. It should be possible to estimate the average planner and scheduler labor input to process a PM revision request and then estimate the total and to-go resource requirement for the implementation work. It should also be possible to estimate the average yield of the implementation of a PM revision request in terms of the craft and support labor required to execute the planned and scheduled work orders for PM revision requests. These estimates would give an indication of the implementation and execution resources that could be required to carry out the overall PM upgrade.
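As an illustrative cross-check of these projections, assuming the revision-request yield observed to date scales linearly to the remaining components (a simplifying assumption for illustration, not the station's stated projection method):

    2,150 revision requests / 6,482 components analyzed to date ≈ 0.33 revision requests per component
    0.33 x 14,000 phase 1 components ≈ 4,600 revisions (phase 1)
    0.33 x 20,000 total critical and non-critical components ≈ 6,600 revisions (all phases)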

The FENOC fleet is developing performance indicators for monitoring the PM strategy work. Indicators that allow the implementation and execution forces to see in advance how much and what types of work they will have to perform will allow them to get ready to do the work before it all flows out of the analysis phase in engineering. The results of the analysis performed to date can be used to support the implementation work and execution work by determining the yield of work output produced on average thus far and extrapolating that yield forward to predictions of the total of implementation and execution work that will result when the analysis is complete.

Specific Issues for This Area The following observations are made with respect to this area. None of these observations rise to the level of a finding, but are provided for consideration in the plant's improvement efforts.


The Plant Computer health improvement plan should be updated, and in the meantime, management attention to the work being carried out should be heightened.

The 480 VAC health improvement plan should be reviewed for work restraints that might not be removed in time to support implementation, and the criteria for advancement into the next health status level should be reviewed to ensure they are objective and quantified.

A single point vulnerability study should be considered.

Projection of the number and nature of PM revision requests and related implementation documents, as well as the number and nature of field tasks that are eventually likely to result should be made available for station personnel who will soon have to plan and carry out the implementation and execution work to use in preparing to do the work without undue delay.

Data anomalies in the Plant Health Report should be resolved.

Findings for This Area

The following findings are applicable only to this one assessment area:

2 ANA ER Program - Forecast of PM strategy workload, SPV analysis
3 ANA System health activities - Plant Computer, 480 V AC, Radiation Monitors

Findings are described in detail in Section 1.5.4

Cross Cutting Findings Applicable to This Area

None

CRs issued during assessment of this area

CR 07-26522 NORM-ER-3103-D Rev.3 Does Not Indicate Replacement Frequency
CR 07-26779 COIA-ENG-2007 - NNI Module Not Included for Calibration Under PM 4222

CRs initiated during the 2007 assessment are described in detail in Section 1.5.3

1.5.2.4 Use of the Corrective Action Process (CAP) by Engineering

Area Effectiveness Rating

The Independent Assessment Team's overall rating for the Corrective Action area is Effective. Progress continues to be made on corrective action backlogs. Engineering's implementation of the CAP is very good to excellent.

Source Information

The Independent Assessment Team members reviewed a number of applicable Condition Reports in their assessment of the areas of Modifications, Calculations, and System Engineering. In addition to the insights provided with respect to the areas under review, this also provided insight into Engineering's use of the Corrective Action Program. Additionally, a sample of Engineering root cause analyses, limited and full apparent cause analyses, and a sample of "AF" Condition Reports closed since the last assessment were reviewed. The team also used reviews of Condition Reports and Cause Analyses related to Engineering performed in the 2007 COIA of the Corrective Action Program.

The team also reviewed the DBAR section related to Design Engineering Condition Report (CR) Backlog Reduction to determine progress being made with respect to backlog reduction of investigations and corrective action completion/resolution. Similar statistics were obtained for Plant Engineering from the available management reports. The engineering assessment used the results of the 2007 Confirmatory Order Independent Assessment of the Corrective Action Program, completed early this year, where appropriate. The engineering assessment focused on CAP implementation and did not assess the CAP processes that are common to all station organizations and that had previously been assessed.

Observations for This Area

Efforts to reduce the Corrective Action backlogs in Engineering are progressing satisfactorily and workdown curve targets are being met. Plant Engineering and Technical Support Engineering are very near their end targets. Design Engineering has further to go, with some impact noted from moving up 15RFO, but continues to make good progress.

The team found that Engineering was promptly initiating Condition Reports when appropriate. One exception was noted in that a CR was not issued to document an error in DC Calculation C-EE-020.01-010, Revision 30. Plant staff identified this calculation error during preparation of Revision 30, Addendum A03. A Post-It-Note was issued to address correcting the calculation in the next revision. The team questioned whether a CR should have been initiated, and after this discussion the station issued CR 07-27754 to document this error. It was noted that this error was in the conservative direction, that is, once corrected would increase overall calculation margin. This failure to initiate a CR when appropriate is considered an isolated instance.


Condition reports appeared to be appropriately classified as Significant or Adverse. Issues that are not considered Adverse to Quality are documented and tracked through SAP notifications. The types of actions assigned included root cause analyses (SR or AR), full apparent cause analyses (AA), limited apparent cause analyses (AL), investigations with no causal evaluation required (AF), and items closed to trending with any corrective action already complete (AC). The items chosen for root cause, full apparent cause, and limited apparent cause analyses appeared appropriate. Two root cause evaluations were reviewed:

CR 07-18074, HPI Train 1 Discharge Piping Air Intrusion

Root cause analysis of this event determined that the root cause was multiple valve failures (five valves total) that allowed gas from the Core Flood Tanks' nitrogen blanket to leak into the HPI discharge piping.

Contributing causes identified were (1) lack of a questioning attitude (the excess use of nitrogen was assumed to be a packing leak that had previously occurred, without sufficient investigation) and (2) procedure content (the HPI discharge venting procedure implemented to address operating experience was not adequate to detect the presence of gas in the HPI discharge line).

The identified causes were examined in detail and the team believes that the root cause analysis missed opportunities to identify additional or alternate causes. Although the fact finding and chronological history appear to have identified the pertinent supporting information, the cause determinations were not to a depth that contributed to long-term resolution of the issue and station improvement. The root cause (five leaking valves) was a restatement of the results of the Problem Solving Team and added no new information or insight. The question of why five valves in series would fail (e.g., maintenance, design, operation, foreign material, common mode failure) was not addressed.

Additionally, no corrective action was included to investigate the failure mode after valves were repaired or replaced, and determine the failure mechanism. The failure mode was suggested as seat wear, but there was no physical evidence to either support or refute this, nor was there information provided to support or rule out other possible failure modes.

Although lack of a questioning attitude was identified as a contributing cause, there appear to have been multiple instances of a lack of questioning attitude, including (1) the assumption that the excess nitrogen use was the recurrence of a known problem, (2) management's failure to question that assumption, (3) operators' failure to question, and their ceasing to report, continuing abnormal nitrogen additions, and (4) the failure to question the lack of excess nitrogen in the containment atmosphere during a containment entry.

A process issue not identified relates to unresolved issues and assumed but not proven causes. A basic management principle for dealing with these types of issues is to (1) determine what the worst case scenario is and rule it out, and (2) put in place a monitoring program to determine whether conditions remain as expected or have departed from expectations.

Failure to monitor the condition and establish criteria, and failure to reassess the assumed cause, were significant contributors to, if not the root cause of, this event. Since the station has an Operational Decision Making Issue (ODMI) procedure that invokes these processes, an opportunity existed to examine why the process was not implemented (e.g., did the issue not meet the threshold?) and whether a lesson learned is that it should have been (e.g., should the threshold be adjusted?). An opportunity to examine and improve a process was missed.

Finally, the contributing cause of procedure content appears to be related to the implementation of the Operating Experience (OE) program, specifically INPO SOER 97-1. There was no discussion or examination of whether the procedure content met the intent of the OE item. If the intent of the OE was merely to remove any trapped gases, then the procedure met the intent. However, if the intent of the OE item was to identify unwanted intrusion of air/gas, the OE item was inadequately implemented, since it was difficult, if not impossible, to detect whether gas or water was coming out of the vent. An opportunity to examine the OE process effectiveness was missed.

Despite the shortcomings in the causal analysis, many corrective actions were included that address causes discussed above but not identified (or developed) in the report. The training material for engineers addressing lack of a questioning attitude/lessons learned from the event did identify the multiple opportunities, the lack of continuing monitoring, and the failure to reexamine the assumed cause.

Additionally, an interim action to install a sixth valve in line with the five leaking valves did potentially address a possible design issue with the existing valves by installing a ball valve rather than another gate valve similar to those that failed.

Overall, the team assessed that this Root Cause Analysis missed several opportunities to identify causes in a depth consistent with station expectations, particularly in the area of examining process implementation and effectiveness (ODMI and OE).

CR 06-11269, EDG Vent Damper May Not Be Structurally Adequate for Design Tornado DP

The root cause of this event was determined to be that the original design specification for the HVAC dampers did not identify tornado d/p requirements and that tornado d/p requirements had been interpreted to apply only to the support structure (walls/doors) and dampers specifically designed for that purpose.

Extent of Condition reviews also identified similar issues with some other HVAC dampers in non-safety-related rooms.

Potential issues with failed HVAC and room overheating were examined for each instance as part of an operability review. The Operability Determination identified compensatory measures to be taken in the event of a tornado to ensure operability of the HVAC system dampers.

The team assessed this Root Cause Analysis to be clear, concise, and well-written. The cause identified appears appropriate and corrective actions were developed that will contribute to both short and long term resolution of this issue.

One full apparent cause evaluation was reviewed:

CR 07-24820, Errors in LP Condenser Piping Repair Calculation (C-CSS-043.03-006)

The causal analysis relating to calculation errors in a repair patch for a safety-related system was reviewed. The team was initially somewhat skeptical that two engineers could independently make the same math and methodology errors without some breakdown of the independence of the design review. However, the causal analysis did an excellent job of explaining exactly what happened. Interviews with the independent reviewer confirmed to the team's satisfaction that this issue was solely a human performance issue as identified in the causal analysis, and not a weakness in the design review process.

Overall, the team judged this apparent cause analysis to be clear, concise and meeting station expectations.

One limited apparent cause evaluation was reviewed:

CR 06-06652, CST Anti-Vortex Procedure Direction

This CR was related to a failure to update Operations Procedure OP-06233 to change the required level for the changeover of CST suction for the AFWPs to prevent loss of AFWP suction. This level was previously 3 feet based on AFWP net positive suction head requirements, but the CST vortex calculation identified that vortexing (leading to air entrainment) may occur starting at a level of 3.44 feet. This issue was identified during the last Engineering COIA, and the causal analysis was performed after that assessment.

The cause was determined to be procedure content (Cause Code 802) in that the content of the procedure was not properly coordinated with the changes in calculation C-ME-037.01-003 Rev 1 and calculation C-ICE-037.01-001 Rev 0. This cause is a restatement of the problem and does not address why the procedure change was not coordinated with the calculation change.

The process for identifying impacts of design (including calculation) changes on other organizations is the Design Interface Evaluation (DIE) review process. Although not identified or discussed in the causal analysis, a review of Calculation C-ME-037.01-003 Rev 1 indicates that the mandatory DIE reviews by Operations, Maintenance, and Systems Engineering were waived by the principal engineer and the waiver approved by the supervisor. This review process is intended to permit the reviewing organization to identify potential impacts of the proposed change. Failure to send the DIE to Operations, in this case, prevented Operations from performing this review and presumably identifying the needed change. (It was also noted that a later Condition Report, CR 06-11157, found that the System Descriptions affected by this calculation were also not updated.)

Overall, the team felt that the causal analysis was not performed in sufficient depth to identify that a process issue (waiving a DIE review when an impact existed), rather than a procedure content issue, was the cause of this problem. Though limited apparent cause analyses are not intended to reach the depth of analysis of a root or full apparent cause, the team feels that an opportunity to identify (and correct) a process issue with waiving DIE reviews was missed.

One " A F Investigation was reviewed:

CR 07-23106 documents that commitments to NUREG 0612 with respect to Heavy Loads had not been incorporated in the USAR. The investigation reviewed RIS 2005-25 Supplement 01 and concluded that all analyses of new safety issues performed by the licensee at the NRC's request (in this case, the Davis-Besse responses to the NRC's request for information related to the methodology for compliance with NUREG 0612) are required to be included in the USAR. The corrective action specified was to update the USAR to capture the analysis performed for the RV Head Drop over fuel assemblies. There were no investigations or corrective actions related to determining the adequacy of the process for identifying needed USAR changes.

Although such an investigation or corrective action is not required for an "AF" type CR, the team felt that a more in-depth investigation or an update to require some form of causal analysis may have been warranted in this case. It is recognized that the current USAR update process might be significantly different from the one in place at the time of the response to the request for information related to NUREG 0612. However, an opportunity was missed to examine the current USAR change process and determine whether additional guidance may be needed to capture USAR changes such as this one.

Specific Issues for This Area

The team assessment was structured to avoid duplication with the recently completed Corrective Action COIA. The team focused on implementation of the program by Engineering. Process issues found in the Corrective Action COIA should be considered as equally applicable to Engineering.

There were no observations in this area.

Findings for This Area

4 ANA Analysis of process related weaknesses

Findings are described in detail in Section 1.5.4

Cross Cutting Findings Applicable to This Area

None

CRs issued during assessment of this area

None

1.5.2.5 Effectiveness of Assessment Process

Area Effectiveness Rating

Overall, the team rated the self-assessment process as Effective. This is based on the quality of self-assessments, interviews with engineers and managers, and the receptivity and responsiveness management exhibits toward the self-assessment process.

Source Information

The Independent Assessment Team conducted interviews of selected Engineering and Site personnel and reviewed selected documents from the reference library (see Sections 1.6.1 and 1.6.2).


The team reviewed the following self-assessments:

Number          Title
DB-SA-07-002    Design Eng 2006 2nd Half IPA
DB-SA-07-007    Plant Eng 2006 2nd Half IPA
DB-SA-07-014    Tech Svcs 2006 2nd Half IPA
DB-SA-07-052    Design Eng 2007 1st Half IPA
DB-SA-07-058    Plant Eng 2007 1st Half IPA
DB-SA-07-065    Tech Svcs 2007 1st Half IPA
DB-SA-07-035    DB AOV Program Assessment
DB-SA-07-26     Cathodic Protection System
DB-SA-07-070    DB 2007 Collegial Assessment of Performance
DB-PA-06-04     DB Fleet Oversight Quarterly Performance Report
DB-PA-07-01     DB Fleet Oversight Quarterly Performance Report
DB-PA-07-02     DB Fleet Oversight Quarterly Performance Report

Observations

In 2007, 11 engineering-related self-assessments were scheduled and eight reports were available at the time of the 2007 COIA. Of the three reports unavailable for review, one is the self-assessment for the Component Design Basis Review, which is nearly complete and on track for completion in 2-3 weeks; the second was an assessment of Cyber Security, which has restricted distribution; and the third, of Reactor Head Movement Control, is overdue.

A review of eight assessments (six IPAs and two snapshots) was performed and the following was noted:

The Independent Performance Assessments appeared to be critical and resulted in identification of areas to improve. The corrective actions are tracked by condition reports or SAP notifications (13 condition reports and five SAP notifications were generated).


The use of external assessment team members to gain outside perspective of engineering effectiveness was quite limited.

This COIA is one of only two focused assessments scheduled for Davis-Besse engineering in 2007.

The implementation of some snapshot self-assessments needs attention (further explained in 5 ANA below):

> A snapshot self-assessment was performed in preparation for a NRC inspection for movement of heavy loads in early July 2007.

As of September 20, 2007, the self assessment has not been formally documented with a written report.

The AOV program tri-annual performance assessment was performed by participating on a snapshot assessment team at Beaver Valley. While the general aspects of the fleet AOV program procedure and practices could be evaluated there, the team could not find any evidence from interviews and document review that the implementation aspects of the program at Davis-Besse were thoroughly assessed.

In May and June 2007, a collegial assessment against INPO 05-005, Guidelines for Performance Improvement at Nuclear Power Stations, was performed. Several performance gaps were identified, among them the need for a prioritized, long-range, living self-assessment plan. This COIA team has raised issues similar in nature for the past three years; see the following quote from the 2006 report: "The assessment team was not able to determine the station's strategy for selecting assessment areas. The Fleet and station develop schedules for focused and snapshot self-assessments but there is no strategy guidance that we have found after inquiring for the past two COIA assessments. It was noted that in 2005 several focused self-assessments were related to engineering subjects, but in 2006 there are none scheduled." This year, the team probed to find ownership for, and definition of, the strategy for engineering self-assessments. At the final Friday debrief, the Director of Engineering stated the Engineering department would own the strategy in the future.

The Design Basis Assessment Report is produced quarterly and assesses the health of design engineering section design basis, engineering programs, calculations, drawings, design criteria manual, engineering quality, and engineering workload activities. This practice provides management with a thorough periodic review (self assessment) of key design basis health issues.

The team observed part of the meeting of the Company Nuclear Review Board configuration control and equipment reliability sub-committee. The sub-committee is staffed with knowledgeable personnel who asked probing questions, mentored where appropriate, and helped prepare the engineering managers for the challenges of the future.

Integrated Performance Assessments are performed by each Engineering Section twice each year. This practice establishes a consistent process for trending human, organizational, and program performance by binning and analyzing condition reports and other assessment information. The aggregate information is presented to the management team and later rolled up to the fleet level. It should provide early warning of isolated issues that may be occurring but are spread across the fleet and hard to detect or correlate.
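A minimal sketch of the kind of binning described above is shown below (in Python). The condition report numbers, cause codes, and trending threshold are hypothetical illustrations, not the station's actual CREST data or data model.

    from collections import Counter

    # Hypothetical condition report records: (CR number, CREST cause code)
    condition_reports = [
        ("CR 07-00001", "802"),   # procedure content
        ("CR 07-00002", "B01"),   # document content
        ("CR 07-00003", "802"),
        ("CR 07-00004", "802"),
    ]

    # Bin CRs by cause code and flag codes that recur often enough to suggest a trend.
    counts = Counter(code for _, code in condition_reports)
    threshold = 3   # assumed trending threshold
    for code, count in counts.most_common():
        flag = "potential adverse trend" if count >= threshold else ""
        print(f"Cause code {code}: {count} CRs {flag}")

Rolling such counts up across sections, and then across the fleet, is what allows issues that are individually isolated at one site to become visible as a trend.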

Three DB Fleet Oversight Quarterly Reports were reviewed for engineering issues and are listed above. These were Quarterly Quality Assessment reports for Q4-2006, Q1-2007, and Q2-2007. The assessments covered the following engineering areas: equipment performance trending, problem solving and decision making, and selected modifications. Condition reports were generated as necessary.

The receptivity, responsiveness, and aggressiveness of management and staff in resolving issues raised in self-assessments were evaluated by conducting interviews of engineers, oversight personnel, and managers. In addition, several condition reports and SAP notifications for 2005 and 2006 self-assessment items were reviewed to evaluate aggressiveness in addressing issues. The sample revealed that 10 of 11 were complete and closed. Overall, the results of the interviews and the CR/notification review of previous years' self-assessments indicated management was aggressively addressing the issues generated by the self-assessments.

Specific Issues for This Area

None

Finding for This Area

5 ANA Assessment strategy and implementation

Findings are described in detail in Section 1.5.4

Cross Cutting Findings for This Area

None

CRs issued during assessment of this area

CR 07-27185 COIA-ENG-2007 Snap Shot Self Assessment Not Finalized

CR 07-27389 COIA-ENG-2007 - AOV Program Self Assessment Implementation Not Evaluated

CRs initiated during the 2007 assessment are described in detail in Section 1.5.3

1.5.2.6 Follow-up to ANAs from 2006

Area Effectiveness Rating

The Independent Assessment Team rates DB Engineering Performance in this area as Effective. Opportunities for improvement were noted.

Source Information

The team reviewed the actions taken on the last assessment's Areas in Need of Attention (ANAs), including those documented on Condition Reports initiated following the last assessment.

There were no Areas For Improvement (AFIs) from the 2006 COIA-ENG assessment; therefore, no Condition Reports or Action Plans were required.

However, FENOC's DB CNRB recommended during the February 2006 meeting that DB use CRs to track action on the findings of the 2005 assessment, and this practice was continued in response to the 2006 assessment findings. The team considers this a good practice.

Seven CRs were initiated to track response to the 2006 findings. These CRs covered each of the ANAs from the 2006 assessment.

The team reviewed these seven CRs as well as additional related documents provided by FENOC for the assessment library (see Section 1.6.2). The team also interviewed individuals responsible for the Investigation Summary and for authorizing closure of the CR corrective actions for the Condition Reports issued to resolve the 2006 Findings.

The following seven CRs were initiated to address findings from the 2006 assessment:

CR07-12143 COIA-ENG-2006 Area In Need of Attention - Inattention to Detail In Calculations (ANA 1)

CR 07-12147 COIA-ENG-2006 Area In Need of Attention - Implementation of Requirements from Calculations (ANA 2)

CR 07-12148 COIA-ENG-2006 Area In Need of Attention - Equipment Reliability Program Implementation (ANA 3)

CR 07-12150 COIA-ENG-2006 Area In Need of Attention - Red Status Plant Health Systems (ANA 4)


CR 06-6388 COIA-ENG-2006 Area In Need of Attention - Design Verification and Review of ECR Revisions (ANA 5)

CR 07-12151 COIA-ENG-2006 Area In Need of Attention - Follow-ups to Assessments and COIA-ENG-2005 (ANA 6)

CR 07-12153 COIA-ENG-2006 Area In Need of Attention - Management of Engineering Workload (ANA 7)

The following CRs were related to issues raised in the 2006 assessment:

CR 06-6388 Inconsistent practice in reviews and design verification for ECP revisions.

CR 06-6652 CST vortex calculation for AFW Pump suction, Procedural Direction

Numerous additional documents were reviewed during the assessment of this area. Many are mentioned in the following discussion, and most are listed in the document library list in Section 1.6.2.

Observations

Follow-up of findings from COIA-ENG-2006

Finding   Short Name                                                  CR Number     CAs
1 ANA     Inattention to detail in calculations                       CR 07-12143   2
2 ANA     Implementation of requirements from calculations            CR 07-12147   7
3 ANA     Equipment Reliability Program                               CR 07-12148   none
4 ANA     Red plant health systems                                    CR 07-12150   2
5 ANA     ECP Revision Reviews                                        CR 06-6388    2
6 ANA     Follow-ups to assessments and last year's COIA-ENG-2005     CR 07-12151   5
7 ANA     Management of engineering workload                          CR 07-12153   4

There were also two CRs initiated during the 2006 assessment:

CR 06-6388 Inconsistent practice in reviews and design verification for ECP revisions.

CR 06-6652 CST vortex calculation for AFW Pump suction.

Twenty-two CAs were initiated from the seven CRs. All CAs had been closed before the 2007 assessment site visit began. The CAs were found to be generally responsive to the problems identified in the CRs, which were, in turn, generally responsive to the findings of the 2006 assessment. The team noted that one 2006 finding did not result in any corrective actions (ANA 3). This is discussed below.

The CAs are discussed in relation to each of the respective Findings.

Review of response to 1 ANA Inattention to detail in calculations.

This finding was addressed by CR 06-12143, for DBDM, CR category AF, closed 5/22/07.

The CR problem statement captured the problem the team identified, and the CR investigation was thorough and included consideration of additional instances of similar problems. The investigation included review of the errors to identify human performance tools associated with each error.

Each individual error identified in the CR was resolved as indicated in the Origination section of the CR. Resolution of most individual items was complete, with a couple of items yet to complete but being tracked to completion.

Corrective Action #1 was to present the errors identified in the CR, together with the applicable human performance tools, to Design Engineering personnel on 4/26/07. The presentation material is available as an attachment to CR 06-12143 in CREST.

Corrective Action #2 was to define the variables associated with volumetric flow rate and partial volume in Calculation 015.044 Rev 01 with a post-it-note (PIN) completed 4/6/07.

Review of response to 2 ANA Implementation of requirements from calculations

This finding was addressed by CR 07-12147, for DBDM, CR category AF, closed 9/21/07, although the CAs for this CR were all closed prior to that date.

The CR problem statement captured the problem the Team identified, and the CR investigation was thorough. The investigation covered each of the three examples provided by the Team.

There were seven corrective actions for this CR.

Corrective Action #1 called for signs to be posted at each gate into the dry Fuel Storage Area prohibiting flammable materials and combustible liquids from being stored on the pad. This action was closed on 3/24/07. However, despite this action and the procedural requirements against flammables and combustible liquids on the pad, some combustible liquids were subsequently discovered to have been placed in C-Vans stored on the pad. This resulted in a non-cited violation from the NRC. The team views this problem not as the direct responsibility of Engineering, but rather as a more general shortcoming in the performance of the Plant/Site organization in imposing and adhering to procedure requirements and obeying signage, likely involving change management processes and techniques. The shortcomings of previous actions taken to exclude combustibles from the dry cask spent fuel storage pad were not addressed in the context of change management processes.

Corrective Actions 2-7 involved presentation of issues and lessons learned from this CR regarding the use of the DIE process and SAP notifications to communicate requirements of calculations and analyses to users to ensure effective and timely implementation. The presentation material is attached to the CR in CREST.

The team notes that the more general issue of using the DIE process to achieve knowledge of, and control to, engineering requirements for operation and maintenance of the plant is the subject of 2007 finding 1 ANA Waivers of DIE Reviews.

Review of response to 3 ANA Equipment Reliability Program

This finding was addressed by CR 07-12148, for Plant Engineering, CR category AF, closed 5/8/07. The CR description of the problem quoted the following from the 2006 COIA assessment report: "the equipment reliability program has taken too long to be implemented and is therefore not providing the benefits needed in protecting against equipment performance degradation and equipment failures," and it went on to add

"Failures of equipment attributable to PM scope omissions or excessive intervals have been previously identified by other assessments (INPO, DB-SS-06-26 Common Cause Review AFI #2)."

The Plant Engineering CR resulted in the conclusions "Process should remain as is" and "No further action is required at this time," and no corrective actions were generated to resolve this finding. The Team took this response to indicate that Davis-Besse felt the Equipment Reliability Program contains the elements needed for success in this area, and that these elements would be implemented in due time in accordance with the ER section of the Engineering Excellence Plan and CAP items generated from other assessments.

The team reviewed DB-SS-06-26 Davis-Besse Common Cause Review July 2006 approved August 24, 2006, which was referenced in the response to CR 07-12148. This Snapshot Assessment was a review of site CRs with either root or apparent cause evaluations completed between 6/1/2005 and 6/30/2006. The assessment found seven common factors applicable to a majority of root or full apparent cause evaluations, including two factors not previously identified as trends, one of which was preventive maintenance deficiencies.

An AFI was written for the PM deficiency, which indicated that PM deficiencies were a common factor in seven root or full apparent cause evaluations completed within the past year and in another two of the limited apparent cause evaluations, and directed the "Plant Engineering organization to develop comprehensive actions to improve implementation of the PM process such that consequential events are mitigated" (Pg. 2).

CR06-03303 was initiated to address the problem stated by the AFI as PM deficiencies were a common factor in seven of the 69 root or full apparent cause evaluations and in another two of the 707 limited apparent cause evaluations completed within the last thirteen months.

Nine examples of preventive maintenance deficiencies addressed by root or full apparent cause evaluations were explicitly addressed in the problem statement and investigation summary for this CR. Each of the examples was itself the subject of a previously issued CR. Each of the seven CRs with a root or full apparent cause evaluation was determined to have led to corrective action, including development of PM measures to prevent recurrence; thus, the PM deficiencies found in the cause analyses for the individual CRs were being remedied through specific PM revision requests. There were two CRs with limited apparent cause evaluations (CR06-01977 and CR06-01179). These CRs were found to have led to appropriate corrective actions (a PM revision for CR06-01179, no PM revision for CR06-01977). Although CR06-03303 resulted in no additional corrective actions (beyond those initiated for the nine CR examples), the team concluded that this was an acceptable outcome, especially in light of the progress observed in the ER Program area described further below.

The CAs specified for the nine CRs used as examples in CR 06-03303 were reviewed in varying levels of depth to determine the effectiveness of their implementation through to revisions of PM tasks.

One CR resulted from this review. CR 05-0988 addressed an instance of pressurizer spray valve RC2 unexpectedly opening for approximately 53 seconds on 12/28/05. The analysis concluded that PM revisions for several NNI modules in the Reactor Pressure Control system were warranted. Seven modules were named for institution of periodic inspection and calibration, including module 3-8-9 (Memory - SCR controller). PMA DB-REV-06-0048 was issued on 1/27/06 to bring about these PM tasks for all seven modules. This PMA was closed on 7/24/2006 when it was recognized that PMA DB-REV-05-1184, initiated 10/25/2005, addressed (almost) the same subject.

The earlier PMA in fact addressed only six of the seven modules addressed by PMA DB-REV-06-0048, and did NOT include module 3-8-9 (in the section 42 revision description). Thus the PM for module 3-8-9 was dropped. CR 07-26779, initiated during the 2007 COIA, documents this omission.

The general response to CR 07-12148, the CR written to address the 2006 COIA finding, is believed by the Team to signal DB's view that, since considerable initiative has been undertaken to address individual equipment failures associated with PM deficiencies of sufficient significance to require at least a limited cause analysis, and since ER Program implementation progress in the past year has been significant, no additional corrective actions were required to address ANA finding #3 of the 2006 assessment report.

The Team concurs with respect to the response to the CR. The ER Program progress achieved since the 2006 assessment includes:

Completion of validation of classification of station equipment according to equipment criticality definitions based on INPO AP-913.

Generation of a number of PM Templates applicable to Davis-Besse (Phase 1)

Analysis of the PM regimes for many critical and non-critical components associated with the templates

Generation of PM revision requests for a number of the completed analyses

Transmittal of PM revision requests from engineering analysis workers to implementation workers for development of revised or new maintenance work orders and order execution schedules

Beginning of execution of the new and revised work orders in the field

Additional information about the Team's review and assessment of the ER program can be found in section 1.5.2.3 System Engineering

Review of response to 4 ANA Red plant health systems

This finding was addressed by CR 07-12150, for DBPE, CR category AF, closed as of 7/2/07. The CR problem statement captured the problem the Team identified.

The investigation noted a number of actions that had been taken by the time the CR was written to improve overall system health, generally in the areas of (1) improved identification of system health deficiencies and threats to system health, (2) achieving authorization and priority for needed tasks, and (3) supporting decisions to make priority tasks ready for execution and to carry them through to completion. It went on to indicate that "...the communication of work priority from the system expert to work management is the outstanding aspect that needs to be addressed."

There were two corrective actions.

The first corrective action addressed development of a process for System Engineering to communicate to Work Management the priority of orders that would affect system health (health improvement items). This process involves designation of a system health color by the system engineer in SAP for work orders necessary to move a system from one health color level to another, and establishment of a schedule note indicating the order is associated with system health improvement. This information is prominently presented in the restraint code reports generated by Work Management, which are held in hand by personnel reviewing work week scope and content and thus are reliably used as decision input. Conversations with system engineers and work week management personnel indicate they are highly aware of this information and understand the need to keep work so designated in work week scopes. This information also supports outage work scope review and decisions about outage scope made using the scope control and schedule change request processes.
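A simplified illustration of how such health-flagged orders might be pulled out for work week scope review follows (in Python). The order numbers, system names, and field names are purely hypothetical and do not represent the station's SAP data model.

    # Hypothetical work order records; "health_color" marks orders a system engineer
    # has designated as necessary to move a system to a better health color.
    work_orders = [
        {"order": "WO-1001", "system": "Plant Computer", "health_color": "yellow"},
        {"order": "WO-1002", "system": "480 V AC", "health_color": None},
        {"order": "WO-1003", "system": "Radiation Monitors", "health_color": "red"},
    ]

    # Work week reviewers would retain any order carrying a system health designation.
    health_improvement_items = [wo for wo in work_orders if wo["health_color"]]
    for wo in health_improvement_items:
        print(f"{wo['order']} ({wo['system']}): retain in work week scope - health improvement item")

The value of the designation is that it travels with the order through scoping and scheduling, so the priority does not depend on the system engineer being present at every review.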

The second corrective action called for addition of anticipatory factors such as parts unavailability, obsolescence, age related degradation and assessment of impact of deferred work to the System Health reports. NOBP-ER-3009 Rev 2 (dated 6/25/07) accomplished the addition of obsolescence information. The 2Q 2007 Plant Health Report shows inclusion of information of these types.

These changes have supported, and will continue to support, more reliable and timely execution of work items supporting system health improvement.

In addition, as discussed in section 1.5.2.3, significant progress has been made since the last assessment in executing maintenance work orders and plant modifications to resolve longstanding equipment reliability and condition issues; further, significant additional work order and modification executions are scheduled during the remainder of the current operating cycle and during 15RFO. Reviews indicate a high level of readiness of these work items, providing confidence that the problem resolutions and plant health improvements they represent will occur as planned and scheduled.

Review of response to 5 ANA ECP Revision Reviews

This finding was addressed by CR 06-6388, for DBDM, CR category AF, closed 5/25/07. The CR problem statement captured the problem the Team identified. The investigation for this CR concluded that the problems identified were attributable to knowledge deficiencies on the part of some engineering staff (not deficiencies in the intent or delineation of instructions).

There were two corrective actions.

The first called for a presentation to be made to the Design and Rapid Response Engineering staff to address the procedural requirements and expectations. This was completed in December 2006.

The second corrective action called for Fleet review of the issues to evaluate the need for procedure changes. This action is complete, with no changes found to be required.

Review of response to 6 ANA Follow-ups to assessments and last year's COIA-ENG-2005

This finding was addressed by CR 07-12151, for DBTS, CR category AF, closed 7/2/07.

This ANA addressed the effectiveness of the DB response to the findings of the 2005 assessment. In 2006, the Team reviewed the resolution of the 2005 findings and found that some findings, or elements of findings, were not being captured and/or acted on, resulting in forgone opportunities for performance improvement.

CR07-12151 captured the problem in the description of condition by quoting from the Team's 2006 report, so the problem statement reflected the issue presented by the Team in 2006.

The investigation section of the CR for the 2006 finding of shortcomings in the responses to the 2005 findings essentially quoted more extensively from the Team's 2006 report. The investigation did not address the matter of why, or in what way, the responses to the 2005 findings came to have shortcomings that could lead to forgone opportunities for performance improvement. It instead addressed five specific examples of problems from the 2005 report identified in the 2006 report as having not been satisfactorily resolved.

There were five corrective actions for this CR.

CAs 1, 2, and 3 relate back to the copper dust in containment issue of 2005. In the 2005 and 2006 reports, the Team suggested some general principles of problem management using examples from the copper dust issue (for one, describing the intended achievable ultimate end state to aid in determining whether the problem has been taken to its achievable end; for another, using decision-tree type tools to anticipate potential contingencies and to predetermine the decisions to be made, the decision criteria, and the information that would be needed to support decisions). DB responded only to the examples, not the more general matters. The Team will not be pursuing these matters further at this time.

CA 4 makes note of a measure previously articulated by DB concerning an enhancement of the DIE process to much more effectively communicate the results of engineering evaluations to Operations, Maintenance, and others when the engineering evaluations result in requirements for control of the plant configuration that must be executed by the plant. The CA description directs that a (SAP) Notification be created to:

"Add to section 4.2.1.2 of NOP-CC-2004, Design Interface Reviews and Evaluations, the intent of the following statement:

Design inputs/assumptions under control of Operations or other organizations should be flagged in the body of the engineering document, restated in the results/conclusions of the document, and discussed on the DIE cover page. Additionally, the notifications or condition reports generated to track procedure changes as a result of the DIE review should be reviewed and/or augmented by Engineering to provide assurance that the applicable engineering requirements are propagated effectively."

Notification 600366019 was created in response and assigned to Fleet. The revision is planned for inclusion in the procedure revision expected by the end of 2007.

The Team anticipates completion of action on this item such that it becomes effective at the DB site.

CA number 5 was not implemented as written because data identifying the specific instances of SAP notifications addressing self-assessments found in 2005 to have been inadequately resolved could not be recovered. However, the DB individual assigned to the CA acted commendably in trying to determine if he could find additional instances of the problem the Team identified. He reviewed a number of self-assessments and traced the findings into SAP, finding that all seemed to be progressing to resolution properly. The Team, in 2007, requested a search for any SAP notifications that were opened and closed on the same day without justification, as potential additional examples of a problem using SAP that caused items to be closed prematurely and without having been accomplished.

Three were found, but based on preliminary information on those three, it seems unlikely that this remains a problem.

With all of the above taken into account, the Team found that the results of Engineering's use of the CAP to capture and resolve the 2006 findings were much improved. This is attributed to DB engineering management's awareness of the problem and the actions they took to ensure that the CRs for the 2006 findings would be improved in quality and effectiveness.

Review of response to 7 ANA Management of engineering workload

This finding was addressed by CR 07-12153, for DBTS, CR category AF, closed 6/27/07.

CR07-12153 captured the problem identified in the finding.

There were four corrective actions for this CR.

The first three CAs called for work management tools to be selected, an implementation plan to be developed, and the software and plan to be applied as a pilot project in the Rapid Response organization. These actions were substantially completed. CA #4 was a place-holder with no unique action to be taken.

The engineering work management process is, as the response to the CR indicates, a work in progress. The Team found that many engineering personnel, from individual engineers to managers, were utilizing the SAP and P5 tools to varying degrees in managing their individual and group workloads. Progress has been made, benefit is beginning to be realized, and management will need to extend the plan and its deployment to improve the results.

Review of responses to CRs generated during the 2006 Assessment

There were two CRs generated during the 2006 assessment:

CR 06-6388 Inconsistent practice in reviews and design verification for ECP revisions.

This CR was written during the 2006 assessment, is the response to 5ANA, and is discussed above.

CR 06-6652 CST vortex calculation for AFW Pump suction.


There is a discussion of the cause determined by the limited apparent cause analysis performed for this CR in section 1.5.2.4 above, in the results of the assessment of area 4, Use of the CAP by Engineering. In that section it is cited as an example in support of 4 ANA Analysis of process related weaknesses. In brief, this CR was found to address the specific instances cited, but to be substantially lacking in effectiveness in investigating and resolving the significant underlying process problems the specific examples represented.

Findings for This Area

There were no Findings uniquely associated with this assessment area.

Cross Cutting Findings Applicable to This Area

None

CRs issued during assessment of this area

None

1.5.3 CR summary

This section summarizes CRs written during the assessment related to assessment reviews, discussions, and findings. Five CRs were written during the Independent Assessment.

CR 07-26522 NORM-ER-3103-D Rev.3 Does Not Indicate Replacement Frequency

The PM template for breakers does not contain a replacement task and frequency for molded case circuit breakers. The June 24, 2004 Technical Bulletin TB-04-03 indicates the life of Westinghouse supplied molded case circuit breakers for mild environment applications is 20 years.

CR 07-26574 COIA-ENG-2007 - C-EE-02.01-010-Rev 30 (DC Calc) Addendum A03, Calculation Error

Plant staff noted an error in the application of calculation assumption 2.5 related to the percentage of solenoid loads to be considered in relay panels. The assumption specified the use of 67% of the solenoid loads while the calculation had actually used 100%. This error did not adversely affect the conclusions, and a Post-It-Note was initiated to correct the error in the next revision. Discussion with plant staff determined that a Condition Report should have been initiated to document this error. Correction of

Table: 2007 Condition Reports and areas of assessment

1.5.4 Findings

This section presents the Findings of the Independent Assessment Team and shows the relationship between findings and the six assessment areas.

The table below shows a list of the 2007 findings and relates them to the assessment areas.


2007 Findings

1 ANA Waivers of DIE Reviews
2 ANA ER Program - Forecast of PM strategy workload, SPV analysis
3 ANA System Health Activities - Plant Computer, 480 V AC, Radiation Monitors
4 ANA Analysis of Process Related Weaknesses
5 ANA Assessment strategy and implementation

Discussion of findings

None of the five ANAs were applicable to more than one assessment area. By comparison, three of the seven 2006 ANAs were applicable to more than one assessment area, and five of the six ANA findings in the 2005 assessment were applicable to more than one assessment area. Thus the 2007 findings were somewhat narrower in applicability.

Three of the 2007 findings are related to findings of the 2006 COIA-ENG assessment. These relationships are not considered strong enough to be considered repeat findings. Rather, they are thought to be similar in some respects, but in all cases lesser in scope and significance, than the related findings from 2006.

The 2007 finding 1 ANA Waivers of DIE Reviews is related to the 2006 finding 2 ANA Implementation Of Requirements From Calculations in that the instances cited in this year's findings have involved information not being translated into engineering and plant documents through the DIE process, although for different reasons.

The 2007 finding 2 ANA ER Program - Forecast of PM strategy workload, SPV analysis is related to the 2006 finding 3 ANA Equipment Reliability Program, although the 2007 finding addresses issues of reduced significance and scope.


The 2007 finding 3 ANA System Health Activities - Plant Computer, 480 V AC, Radiation Monitors is related to the 2006 finding 4 ANA Red Plant Health Systems, although the issues are smaller in number and scope.

Findings statements

The following section contains the findings statements and their bases.

1 ANA Waivers of DIE Reviews

The waivers of mandatory Design Interface Evaluation (DIE) reviews by the principal engineer have resulted in several instances where impacts of calculation alterations on plant procedures or documentation have not been identified and updated in a timely manner.

Waiving of mandatory DIEs is allowed with supervisory approval, and this approval was obtained in each of the instances noted.

The justification for waiving mandatory DIE reviews was that the necessary changes to plant procedures and documentation are identified in a parallel process or review, e.g., as a result of the CRs that initiated the calculation alteration or as part of the DIE review of an ECP package. The team found that the assumption that the other process or review would identify the impacts of calculation alterations is not necessarily valid (see examples below).

The principal engineer and the design organization may not always be in a position to determine all the impacts a calculation change may have. A verification that the output documents assigned a Design Index Number (DIN) in the calculation are not affected is not sufficient to identify whether any additional documents may be affected. It is noted that many System Descriptions are not listed as output documents, although the calculation is listed as a System Description reference.

Specific examples include:

Mandatory DIE review by Systems was waived for calculation C-ME-011.01-141. The System Description value (SD-18, Table 2.1-4) for the pump submergence requirement is 4.33 feet, while the calculation result is 4.14 feet. Additionally, this calculation is reference 4.1.117 to the SD; reference 4.1.117 needs to be added to the list of data sources in the table.

Mandatory DIE review by Systems was waived for calculation C-NSA-049.01-004 Rev 1. The calculation analyzed the acceptability of tank low-level setpoints to prevent vortexing. Follow-up with the principal engineer identified that there were System Description impacts from the calculation change. This calculation is not referenced in the SD. Notification 600352155, dated 12/6/06, identified the need to update several System Descriptions to include anti-vortexing requirements and reference the appropriate calculations. In this case, a DIE review form sent to PE would have served to alert the preparer of the System Description changes to any additional changes resulting from Rev 1 (issued May 2007).

2 ANA ER Program - Forecast of PM strategy workload, SPV analysis

Single Point Vulnerability (SPV) analysis, and a forecast of the PM strategy workload on the PM implementation and execution areas, are not planned or evident.

Industry practice in the equipment reliability area generally involves conducting single point vulnerability analysis to inform plans to improve reliability where PM changes are not practicable. Davis-Besse does not have comprehensive SPV results or plans to conduct SPV analysis, according to the FENOC Engineering Excellence Plan and interviews with Davis-Besse station personnel.

The PM Strategy work underway at DB at present is recognized to be somewhat behind many in the industry. Engineering analysis of critical and non-critical components is about 30% complete. The results of that analysis, the thousands of PM revision requests it will generate, and the thousands of work order and schedule changes and field task executions that will be required to give the revised PMs full effect on station equipment will be a formidable burden for the station's work management and field forces. Those organizational elements need planning information to organize the resources that will be needed to carry out the work without undue delay. Davis-Besse Engineering is not currently forecasting the workloads that will impact the organizations downstream of the engineering analysis in the ER process.

3 ANA System health activities - Plant Computer, 480 V AC, Radiation Monitors

Supplemental management attention to system health activities and plans is needed for the Plant Computer, 480 V AC, and Radiation Monitors systems.

These plans need to be revised to reflect actual plans and intentions, to remove ambiguities, and to consider contingencies and uncertainties such as the continuing availability of parts and services, system expertise, and uncertainties in approval of future phases of replacement and/or upgrade.

The PC plan has been overtaken by events, and is not being followed to a significant degree.

The 480 VAC plan contains ambiguities and calls for actions that could only be scored complete through highly subjective judgments. In addition, some near-term scheduled activities were found to be restrained with no plan to relieve the restraints.


The Radiation Monitoring System health is subject to threats and uncertainties in several dimensions, such as continuing availability of parts, availability of current service providers, maintenance of internal system expertise, and future authorizations for the next phases of subsystem upgrades and replacements.

4 ANA Identification of Process Weakness in Causal Analyses and Investigations

Some causal analyses and investigations have not been performed in sufficient depth to identify process weaknesses.

Examples include:

CR 07-18074, HPI Train 1 Discharge Air Intrusion, Root Cause Analysis, did not identify issues related to the Operational Decision Making process as a root or contributing cause and did not investigate potential weaknesses in Operating Experience program implementation. Additionally, the root cause was identified as five in-series valve failures. This root cause did not go further in depth to determine the cause of the valve failures (design, maintenance, etc.). Seat wear was identified as the failure mode, but there was no evidence provided to support this hypothesis or to discount other plausible failure modes. No corrective action was put in place to further evaluate the failure mode when the failed valves were repaired or replaced.

CR 06-6652, CST Vortex Procedure Deficiency, Limited Apparent Cause identified procedure content as the cause, but did not investigate or identify why the DIE process was not effective, or in this case, not used.

CR 06-11157, CST Usable Volume not Updated in System Description, CF Investigation, determined the CREST Cause Code to be Work Practices/Workmanship, in that Plant Engineering staff failed to identify the needed System Description update as part of the DIE process. The investigation did not identify that the Systems mandatory DIE review was waived by the principal engineer. The process issue related to the waiver of the DIE was not investigated or addressed.

CR 07-23106, USAR Description for Control of Heavy Loads, AF Investigation, determined that the USAR description of the control of heavy loads was minimal and did not contain information supplied in a response to an NRC request for information with respect to methods for implementation of NUREG 0612, as required. The corrective action was to update the USAR with respect to this area. The investigation did not examine the processes used to identify needed USAR changes, nor the engineering-regulatory compliance interface with respect to this process.

The CREST Cause Code assigned was B01, Document content. The team felt that an opportunity was missed to examine the current process for identifying needed USAR changes with respect to how that process currently handles responses to NRC requests for information and how that process is implemented in engineering. It is recognized that this process may be a regulatory compliance process rather than an engineering process.

Failure to identify or consider related process issues in causal analyses and investigations may result in failing to self-identify and address needed process improvements, either directly through corrective actions or indirectly through identification of adverse trends from the trending of CREST Cause Codes.

This issue is similar, in some respects, to the comment in the 2006 Engineering COIA, Corrective Action area, related to the treatment of some process and program issues as CF, not requiring causal analysis or consideration of process-related issues, and solely addressing the specific deficiencies identified.

The 2006 COIA comment was not deemed to rise to the level of a finding due to the relatively few instances identified.

5 ANA Assessment Strategy and Implementation

Engineering management needs to improve implementation of selected snapshot self-assessments and also to develop a long-range strategy for assessment of various Engineering processes and programs.

Two of the three snapshot self-assessments reviewed were in need of improvement in implementation and timeliness.

The self-assessment of the AOV program took credit for the tri-annual review of the DB AOV process and implementation by going to Beaver Valley and participating in their snapshot assessment. The assessment team member then wrote a report about a couple of fleet programmatic issues, but did not assess AOV program implementation at DB.

The pre-inspection snapshot self-assessment for RPV head movement control, conducted in July, has still not been formally documented in a written report.

Consideration should be given to establishing a long-range strategy for Engineering assessments; that is, determining the areas of focus for the future and choosing whether the assessments should focus on compliance with current procedures and practices or on performance gaps, using standards of industry excellence as a benchmark. A plan that includes a mix of IPAs, pre-inspection self-assessments, and snapshot and focused self-assessments on selected topics should be developed in accordance with the long-range strategy. The strategy and plan should be periodically reviewed and adjusted as necessary. This review and adjustment could be done as part of the IPA process.


1.6 References

1.6.1 List of persons interviewed and meetings observed

Persons Interviewed (Last name, First name - Position)
Andrews, D - Senior Nuclear Staff Specialist
Bair, Dick - Staff Nuclear Engineer
Barteck, Gabe - Staff Nuclear Engineer
Beier, Mike - Staff Nuclear Specialist
Bezilla, Mark - Site Vice President, DB Nuclear
Bleau, Claire - Supervisor, Electrical / I&C Engineering (Design)

Bodi, Jim - Nuclear Engineer
Boles, Brian - Director, Site Maintenance
Bondy, Don - Staff Nuclear Specialist
Byrd, Ken - Manager, Design Engineering
Chimahusky, E - Supervisor, Nuclear Operations Oversight
Chowdhary, Jason - Staff Nuclear Engineer
Chung, George - Staff Nuclear Engineer
Dejong, Bill - Staff Nuclear Engineer
Dunn, K - Supervisor, Nuclear Document Control
Fehl, John - Staff Nuclear Engineer
Gatter, Shane - Adv Nuclear Engineer
Grabnar, John - Director, Site Engineering
Gulvas, Tom - Staff Nuclear Engineer
Haley, Dan - Staff Nuclear Engineer
Harder, Lynn - Manager, Site Radiation Protection
Hengge, Craig - Supervisor, Nuclear Rapid Response
Hennessy, Brian - Supervisor, Nuclear Performance Improvement
Hook, Jon - Supervisor, Nuclear Mechanical/Structural Engineering
Hruby, Raymond - Manager, Performance Improvement
Imlay, Dave - Manager, Technical Services
Isherwood, D - Senior Nuclear Specialist
Johnson, Eric - Staff Nuclear Engineer
Kaminskas, Vito - Director, Site Operations
Kemp, Jessica - Senior Nuclear Engineer, Eng Analysis
Kendrick, Gary - Manager, Site Work Management
Kline, Bill - Fleet Engineering Programs Manager
Kreft, Tim - Staff Nuclear Engineer
Laurer, Tim - Supervisor, Nuclear FIN Maintenance
Mainhardt, Peter - Staff Nuclear Engineer
Mallernee, Jane - Senior Nuclear Specialist
Marley, Jim - Staff Nuclear Engineer
Meyers, Mark - Nuclear Journeyman Electrician
Migas, Andy - EAB Chairman

Miller, Andy - Adv Nuclear Engineer
Moore, Connie - Supervisor, Nuclear Configuration Control
Nasser, Dirul - Senior Nuclear Engineer
Nesser, Kathy - Senior Nuclear Specialist
Osting, Steve - Staff Nuclear Engineer
Otermat, Jon - Staff Nuclear Engineer
Patchett, Wayne - Nuclear Engineer
Plymale, Scott - Manager, Plant Engineering
Price, Clark - Director, Performance Improvement
Quaderer, Aaron - Adv Nuclear Engineer
Schreiner, Dennis - Senior Consultant
Sharp, Rick - Staff Nuclear Specialist
Slauterbeck, Keith - Staff Nuclear Engineer
Slosneric, Steve - Staff Nuclear Engineer
Smith, Robert - Staff Nuclear Engineer
Syrowski, Jim - Staff Nuclear Engineer
Thompson, Blaine - Senior Nuclear Engineer
Wax, Mark - Senior Nuclear Engineer
Wuokko, Dale - Supervisor, Nuclear Compliance
Zawacki, Mike - Senior Nuclear Engineer
Zellers, Kevin - Supervisor, Nuclear Engineering Analysis

Meetings observed
Engineering Support Training
Engineering Briefing
Engineering Overview
MAOM
Plant Health Committee

1.6.2 Reference Documents

The information listed below was provided in advance of or during the assessment by FENOC for the use of the Independent Assessment Team. Some document titles were changed to support organization of the documents within the data file document library, or to make the titles more indicative of the contents.

A number of INPO documents were reviewed at the site. These documents remained in the control of FENOC personnel and were obtained under non-disclosure agreements. These documents are not individually listed.

Llbrary # Document Name 10 FENOC englneering assessment planning Information 10.001 COIA-ENG-2006 DB Report 061012 FiNAL.doc Page 56

Library # Document Name 10.002 Final plan for 2007 Eng COlA 060507 w JHG signature.doc 10.003 Project contacts 070822.~1s 10.004 Three Plant FSA REPORT 041124.doc 10.005 COIA-ENG-2005 DE Report 122905 final.doc 10.006 COIA-ENG-2004 DE IAR.doc 10.007 2007 Engineering COlA 1 Grabnar ovewiew.ppt 11 INPO reference material 11.001 SOER02-4.doc 11.002 CR 2002-00891 RCA RPV head degradation Rev 1.pdf 12 assessment plans, reports 12.001 List of DE Self-Assmts 2006 and 2007 070820 markup.xls 12.002 DE-SA-07-002 Design Eng 2d Half 2006 IPA.pdf 12.002a DEE-06-00099 Design Eng 2006-1 IPA.pdf 12.004 DE-SA-07-004 Maint 2006-2 IPA.pdf 12.005 DE-SA-07-005 Operations 2006-2 IPA.pdf 12.006 DE-SA-07-006 Outage Mgmt 2006-2 IPA.pdf 12.007 DE-SA-07-007 Plant 8 Equip Reliab Eng 2d Half 2006 IPA.pdf 12.011 DE-SA-07-011 Site Proj 2d Half 2006 IPA.pdf 12.013 DE-SA-07-013 Supply Chain 2006-2 IPA.pdf 12.014 DE-SA-07-014 Tech Sew 2d Half 2006 IPA.pdf 12.016 DE-SA-07-016 Work Mgmt 2006-2 IPA.pdf 12.017 DE-SA-07-017 DE Site Summary 2006-2 IPA.pdf 12.019 DE-SA-07-019 Training Effectiveness EvaLpdf DE-SA-07-020 Reactor Oversight Process Cross-Cutting Aspects.pdf 12.020 12.021 DE-SA-07-021 Eng Pkg Review for 10CFR50.54(q).pdf 12.022 DE-SA-07-022 CAP 1mplementation.pdf 12.026 DE-SA-07-26 Cathodic Protection.pdf 12.027 DE-SA-07-027 Lg PM Driver FEG Windows.pdf 12.030 DE-SA-07-030 FENOC Approach to NElC Audits.pdf 12.032 DE-SA-07-032 CAP CR Initiation Comparison.pdf 12.033 DE-SS-07-033 Prep for WAN0 Peer Rvw.pdf 12.035 DE-SA-07435 AOV Triannual.pdf 12.037 DE-SA-07-037 Assmt of Updated 02-07 TRNG Warning Flags.pdf 12.038 DE-SA-07-038 Assmt of Ind Fundamentais.pdf DE-SA-07-043 Collective Review of Three Station Events for 12.043 Organizational Learning.pdf 12.052 DE-SA-07-052 Design Eng 2007-01 IPA.pdf 12.054 DE-SA-07-054 Maintenance2007-1 IPA.pdf 12.055 DE-SA-07-055 Operations 2007-1 IPA.pdf 12.056 DE-SA-07-056 Outage Mgmt 2007-1 IPA.pdf 12.057 DE-SA-07-057 Procedures 2007-1 IPA.pdf 12.058 DE-SA-07-058 Plant Eng 2007-1 1PA.pdf 12.063 DE-SA-07-063 Site Projects 2007-1 IPA.pdf 12.065 DE-SA-07-065 Tech Svcs Eng 2007-1 IPA.pdf 12.067 DE-SA-07-067 Work Mgmt 2007-1 lPA.pdf DE-SA-07-070 Collegial Assessment of Performance Improvement 12.070 1mplementation.pdf 12.101 DES-06-00423 Supply Chain 2006-1 IPA.pdf 12.101 DE-SA-05-05 DE Fuse Control Focused Self-Assessment.pdf DE-SA-06-06 Diesel Fuel Storage and Transfer Self-Assessment.pdf 12.101 12.101 DE-SS-06-10 14 RFO Modifications Snap-Shot Assessment.pdf Page 57

Library # Document Name 12.101 DB-SS-06-26 Common Cause Rvw 07-2006 .pdf 12.101 DB-SS-06-26 DE Condition Report Common Cause Review.pdf 12.101 DE-SS-06-28 Cross-Cutting Aspects-NRC insp Rpt Findings.pdf 12.101 DE-SS-06-31 Tech Svcs 2006.pdf 12.101 DE-SS-06-34 DB Site Focused SAs have CRs 2006.pdf 12.101 DE-SS-06-35 Sys Waikdowns.pdf DE-SS-06-36 Corrective Action Program (CAP) Assessment Results 12.101 Review (Snap Shot).pdf 12.101 DB-SS-06-37 CAP Timeiiness.pdf 12.101 PRS-06-00033 Site Projects 2006-1 iPA.pdf 12.101 SA2004-0020 DE AOV Program On-Going Self Assessment.pdf 12.101 TSS-06-0050 Pit Eng-Tech Svcs 2006-1 iPA.pdf 12.201 DB Oversight 2006 2nd Qtr Report DB-C-06-02.pdf 12.202 DB Oversight 2006 3rd Qtr Report DE-C-06-03.pdf 12.203 DB Oversight 2006 4th Qtr Report DB-PA-06-04.pdf 12.204 DB Oversight 2007 1st Qtr Repolt DE-PA-07-01.pdf 12.205 DE Oversight 2007 2nd qtr report DB-PA-07-02.pdf 12.301 RAS-06-00259 DB Site Summary 2006-1 iPA.pdf 13 INPO reports on FENOC not listed 14 engineering procedures 14.00a New Procedures since Sept2006.xis DBBP-VP-0009-R3 R3 Management Pian for Confirmatory Order 14.001 Independent Assessmentspdf 14.002 NOP-LP-2001r15 CA Program.pdf 14.002-1 NOP-LP-2001 rev14 Corrective Action Program.pdf 14.004 NOBP-SS4001-R2 Change Management Guide.pdf 14.005 NOBP-CC-2005rO0 Eng Assmt Board.pdf 14.006 NOBP-CC-2003ArO2 Prelim Cost Est.pdf 14.007 NOEP-CC-2003Br02Conceptual Design Pkg.pdf 14.008 NOBP-CC-2003Cr02 Project Team.pdf 14.009 NOEP-CC-2003Dr02Waikdowns.pdf 14.010 NOEP-CC-2003-RZ Config Mgt Database Controi.PDF 14.011 NOBP-CC-3002-R3 Processing Caics.PDF.pdf 14.012 NOBP-CC-7001rll Procurement Pkgs.pdf 14.013 NOBP-CC-7002rO4 Enhanced Procurement.pdf 14.014 NOEP-ER-1002-R4 Proj Apprvl and Resource Allocation.PDF 14.015 NOBP-ER-1004-R2 Fleet Value Rating Methodoiog.pdf 14.015-1 Form NOEP-ER-1004-01 Rev0 FVR worksheet.doc 14.016 NOEP-ER-3002-R3 Plant Health Committee.pdf 14.017 NOEP-LP-2001rO9 FENOC SelfAssessment-Benchmarking.pdf 14.018 NOBP-LP-20071-04 CR Process Effectiveness Review.pdf 14.019 NOEP-LP-2008106 FENOC CARB.pdf 14.020 NOEP-LP-2010rO5 CREST Trending Codes.pdf 14.021 NOEP-LP-201lrO6 FENOC Cause Analysis.pdf 14.022 NOBP-LP4003ArO3 FENCO 5059 User Guidelines.pdf 14.023 NOBP- LP4003E-R1 50.59 Mentoring Review Committee.PDF 14.024 NOBP-SS-2101r02 Peer Groups and Teams.pdf 14.025 NOBP-SS-3401-R6 Document Hierarchy.pdf 14.026 NOP-WM-2001r06 Work Mgmt Scheduling Process.pdf 14.027 NOP-CC-2001r05 Design Verification.pdf 14.028 NOP-CC-2002103 Design input.pdf 14.029 NOP-CC-2003rll Engineering Changes.pdf 14.029-1 NOP-CC-2003 Design Report forms 02-21.zip Page 50

Llbraty # Document Name 14.029-1 NOP-CC-2003-03 R06 Installation and Test Requirements.doc 14.029-2 NOP-CC-2003-09 R04 Initiation Report.doc 14.029-3 NOP-CC-2003-11 ROB Engineering Change Package Matrix.doc 14.029-4 NOP-CC-2003-17 R04 Engineering Change Cancellation.doc 14.029-5 NOP-CC-2003-19 R01 Design Report.doc 14.030 NOP-CC-2004-R5 Design Interface Reviews and Evaluations.pdf 14.031 NOP-CC-3002rO4 Calculations.pdf 14.032 NOP-CC-7002r08 Procurement Engineering.pdf 14.033 NOP-ER-1001-RO Cont quip Perf Improvement.pdf 14.034 NOP-ER-3001rO3 Problem Solving and Decision Making.pdf 14.035 NOP-LP-2006r02 CNRB.pdf 14.037 NOP-LP4003r04 Eva1 of Changes-Tests-Experiments.pdf 14.036 NOPL-SS-3201-R1 Document Hierarchy.pdf 14.039 NOPL-CC-0001-R1 Eng Principles and Expectations.pdf 14.040 NOPL-ER-0001-RO Equipment Reliability Policy Statemempdf 14.041 NOPL-LP-2003-RZ SCWE Policy.pdf 14.042 NOBP-CC-2004-RO Engineering Change Risk Analyskpdf 14.044 ESI-001 R03 System Engineer Qual Card.pdf NOBP-TR-ll ll-Ol-RO Eng suppt personnel training sylabus Rev02.doc 14.045 14.045-1 NOBP-TR-1111-R1 FENOC Training Program Descriptions.pdf 14.046 NOBP-CC-1004 Calc Utility.PDF 14.047 EN-DP-01150 System Description Procedure-R3,PDF 14.046 NOP-CC-2004-05r07 Design Interface Summary.doc 14.049 NOP-CC-2004-07-RO3 Design Interface Evaluation.doc 14.050 NOP-CC-2004-02r07 Design Interface Review Checklist.doc 14.051 NOBP-CC-2005 RO Fleet EAB procedure.pdf 14.052 DB-DP-00023 R6 Labels 8 Signs.pdf 14.053 DE-DP-00307 R3 Ctrl of Positionable Comp.pdf 14.055 NG-EN-00307 R9 Configuration Mgmt.pdf 14.056 NG-EN-00309 R1 Plant Modification.pdf 14.057 NOBP-LP-2018rO2 Integrated Perf Assmt-Trending.pdf 14.056 NOBP-CC-1003-RO Design Basis Info for Atlas.pdf 14.059 NOBP-CC-1005-RO FENOC Latent Issues Review.pdf 14.060 NOPL-SS-3201-Rl Document Hierarchy.pdf 14.061 NOP-SS-6001 Rev1 FENOC Activity Tracking.pdf 14.062 NORM-CC-2001 Engineering Change Process Flowcharkpdf 14.063 DBBP-DCU-0010 R11 EC Closeout Process.pdf 14.064 NOBP-LP-2001 FENOC SA-Benchmark.pdf 14.065 NOP-WM4300rO6 Order Execute Process.pdf 14.066 NOP-WM4305rO1 Order Closure.pdf 14.067 NOP-OP-1010rOl Operational Decision-Making.pdf 14.066 NOP-WM-1001rO8 Order Planning Process.pdf 14.069 NOBP-ER-3009-R1 FENOC Plant Health Report Program.pdf 14.069a NOBP-ER-3009 FENOC Plant Health Report Program Rev 2.doc 14.070 DBBP-PES-0005rO3 System Walkdowns.pdf 14.071 DB-FP-00007 ROB Control of Transient Combustibles.pdf NOBP-ER-3903 R3 Component Template Implementation ER Workbench 14.072 Module 3 14.073 DE-OP-06913 R16 Seasonal Plant Preparation Checklist 14.074 DE-OP-06331 R16 Freeze Protection 8 Electrical Heat Trace NOBP-ER-3005 R01 FENOC Equipment Vulnerability Review Process 14.075 Page 59


15 Engineering program documents 15.001 N i t used 15.002 Not used 15.003 Current ECPs.xls 15.004 Closed ECPs.xis 15.005 15RFO Mods from SAP.xls 15.006 TM List-Gerren.xls 15.007 TM list closed since 060901 and currently open 15.008 Not used 15.009 Temporary Mods list 08-29-07.xls 16 engineering work products 16.1.C-EE-002-01-010R30A02DC Calc - BatterylCharger size, Short 16.1.C-EE-002-01-010R30A02 Circuit, Voltage Drop.pdf 16.1.C-EE-00201-01OR30A03 DC C a b Battery and Chargersizing. Short 16.1.C-EE-00201-01OR30A03 Circuit, and Voltage Drpo.pdf 16.1.C-EE-O1503-006R04A14 16.1.C-EE-01503-008R04A14AC Power Systems Analysis.pdf 16.1.C-EE-O1503-008R04A26 16.1.C-EE41503-008R04A28 AC Power Systems Analysis.pdf 16.1.C-EE-O1503-006R04A29 16.1.C-EE41503-008R04A29 AC Power Systems Analysis.pdf 16.1.C-ME-011-01-141RO1 Service Water System NPSH Analysis.pdf 16.1 .C-ME-O11-01-141R01 16.1.C-ME-04001-004RO Boric Acid Addition Tank Vortex Formation.pdf 16.1 .C-ME-04001-004RO 16.l.C-NSA-011-01-016R01A01Service Water System Design Basis 16.1.C-NSA-O11-01-016R01A01 Fiowrate Analysis and Testing requirementspdf 16.1.C-NSA-011-01-017ROl Pump Curve Acceptance Criteria for Service 16.1.C-NSA411-01-017R01 Water Pump 2.pdf 16.1.C-NSA-04901-004ROl Vortex Formation w ECCS Suction from 16.1.C-NSA-04901-004ROl BWST.pUf 16.1.C-NSA-050-03-028R01AOl Auxiliary Feedwater (AFW) Minimum 16.1.C-NSA-050-03-028R01AOl Performance.pdf 16.1.C-NSA-05201-003RO6A06 16.1.C-NSA-05201-003RO8A06 HPI Pump Acceptance Criteria.pdf 16.1.C-NSA-054-02-026R0224 Month pressure temperature curve 16.1.C-NSA-054-02-026R02 data.pdf 16.1.C-NSA-064-017R02RPS P-T Curve Adjusted for lnstrunent 16.1.C-NSA-064-017R02 Uncertainly.pdf 16.2. ECP02-0738-00 16.2. ECP02-0736-00 EDG Gov Rpicmt.pdf 16.2.ECP02-0817 16.2.ECP02-0817-ARC-B.pdf 16.2.ECP02-0817-00 16.2.ECP02-0817-00 RCP Rotating Assembiy.pdf 16.2.ECP04-0271-00 16.2.ECP04-0271-00CS Piping Therm Rlf Path.pdf 16.2.ECP06-0012-00 16.2.ECP06-0012-00 P221-MP221 EQUIVALENT CHANGE.pdf 16.2.ECP07-0062-00 16.2.ECP07-0062-00 1-inch CCB-19 J&tion.pdf 16.2.ECR04-0365-00 16.2.ECR04-0365-00 MU212 and 212A Drn 0rifice.pdf 16.3.ProblemSoivingDMPlan 16.3.ProblemSoivingDMPlan RCP Bearing Temp 07-13004.pdf 16.401 OE23342 and 23622 FAC Reheat Leak.doc 16.501 SE walkdown report Freeze Protection / Heat Trace 070606 17 NRC reports 17.001 NRC Restart Confirmatory 0rder.pdf 17.002 Davis Besse Inspection Findings Summary 2007.doc 18 Root cause analyses and CR Information 18.001 DBDM CARF Report all CAS 2351 items.xis 18.002 DBDM CR Report all CRs 1732 items.xls 18.003 DBDM Open CA Report 070628.~1s 18.COIA 2005.CR05-05393 CR05-05393 AFI CM 3.1 unevaluated design changes.pd1 Page 60

Library # Document Name 16.COIA 2005.CR06-02311 CR06-02311 Attention to Detail in SE.pdf 16.COIA 2005.CR06-02312 CR06-02312 ERFOM DBPE.pdf 16.COiA 2005.CR06-02411 CR06-02441 ERFOM DBDM .pdf 16.COIA 2005.CR06-02422 CR06-02422 Copper Oxide in Containment.pdf 18.COIA 2005.CR06-05628 CR 05-05826 2005 COlA Vendor Quality.pdf CR06-6366 DESIGN VERIFICATION AND REVIEW OF ECR 16.COIA 2006.CR06-6368 REVISONS.pdf 16.COIA 2006.CR06-6388-1 Design Engineering Section Meeting Dec 2006.ppt 16.COIA 2006.CR06-6652 CR06-6652 CST Anti-Voltex Proc Direction.pdf 18.COIA 2006.CR07-12143 CR07-12143 INATTENTION TO DETAIL IN CALCULATIONS.pdf Human Performance orientation re 2006 COlA eng programs item l.pdf 16.COIA 2006 07-12143-1 CR07-12147 IMPLEMENTATION OF REQUIREMENTS FROM 16.COIA 2006.CR07-12147 CALCULATIONS.pdf 18.COIA 2006 07-12147-2 Eng Design Section Head Meeting 070426.pdf 18.COIA 2006 07-12147-4 Design Engineering Section Meeting 070427.pdf 16.COIA 2006 07-12147-5 Plant Engineering Section Meeting 070723.pdf 16.COIA 2006 07-12147-6 Technical Services Section Meeting 070712.pdf 16.COIA 2006.CR07-12146 CR07-12148 ER Program 1mplementation.pdf 18.COIA 2006.CR07-12150 CR07-12150 Red Plt Health Sys.pdf CR07-12151 FOLLOW-UPS TO ASSESSMENTS AND COIA-ENG-16.COIA 2006.CR07-12151 2005.pdf 16.COIA 2006.CR07-12153 CR07-12153 MANAGEMENT OF ENGINEERING WORKLOAD.pdf 16.CR06-01753 CR06-01753 Station Not In Compliance w-DB-FP-000 16.CR06-02340 CR06-02340 POTENTIAL VIOLATION OF 10CFR72-122C.pdf 16.CR06-02554 DOCUMENT INDEX FOR PROCUREMENT PACKAGE 16.CR06-02554 84277206 REVISION 0002 WAS NOT CURRENT.pdf 18.CR06-02686IBC DATA PACKAGE UPDATING DISCREPANCY.pdf 16.CR06-02666 16.CR06-02726 ERRORS IN DBAB LIBRARY CONTROLLED COPY OF 16.CR06-02726 FHAR.pdf 16.CR06-03008 INCORRECT STATEMENTS IN AFW CALCULATION C-16.CR06-03008 NSA-050.03-026.pdf 16.CR06-03379 18.CR06-03379 File Pmc Missing Pages.pdf 18.CR06-06427 CR06-06427 NRC NON-CITED VIOLATION-10CFR72.212.pdf CR06-11269 CDBI-EDG VENT DAMPERS MAY NOT BE 16.CR06-11269 STRUCTURALLY ADEQUATE FOR DESIGN TORNADO DP.pdf 16.CR06-2221 18.CR06-2221 Filenet Proc Missing Pages.pdf 16.CR06-6126 16.CR06-6128 FAC Prog Deficiency Eval.pdf 18.CR06-9236 COIA-2006-CAP CR 05-00738 REPEAT OCCURRENCE 16.CR06-9236 DUE TO UNTIMELY CORRECTIVE ACTION.pdf 18.CR06-9917 16.CR06-9917 HTRO-1 Normal Drain Line Vibration.pdf 18.CR06-02132 Maintenance Rule A(1) evaluation for Medium Voltage 16.CR06-02132 AC System 18.CR07-13550CR 05-00738 REPEAT OCCURRENCE DUE TO 16.CR07-13550 UNTIMELY CORRECTIVE ACTION.pdf 18.CR07-13738 DB-SA-07-02, IPA DESIGN ENGINEERING EMERGING 18.CR07-13738 TREND ON CLOCK RESET DATA.pdf 18.CR07-13743 DB-SA-07-02, IPA DESIGN ENGINEERING EMERGING 16.CR07-13743 TREND QUALITY OF NRC SUBMITTALS.pdf


18.CR07-13744 DB-SA-07-02, IPA DESIGN ENGINEERING INPO AFI 16.CR07-13744 CM.3-1. EXTENT OF CONDITION.pdf Page 61

Llbraly # Document Name 18.CR07-13746 DB-SA-07-02, IPA DESIGN ENGINEERING INPO AFI 18.CR07-13746 CM.3-1. REVISE TM PROCEDURE.pdf 18.CR07-13749 DB-SA-07-02. IPA DESIGN ENGINEERING -

18.CR07-13749 SEARCHABLE VERSION OF THE USAR NOT UPDATED.pdf 18.CR07-13799 18.CR07-13799 DB-SA-07-02 DE IPA Latent Doc Errorspdf 18.CR07-13888 INSTRUMENT AIR RELIEF VALVE IA 2780 DOES NOT 18.CR07-13886 MEET PURCHASE SPECIFICATI0N.pdf 18.CR07-14101 18.CR07-14101 DB-SA-07-14 TS IPA ER AFl.pdf 18.CR07-14130 18.CR07-14130DB-SA-07-14 TS IPA Temp Mod AFl.pdf 18.CR07-14131 18.CR07-14131 DB-SA-07-14 TS IPA AFl.pdf 18.CR07-15051 PLANT EQUIPMENT INSTALLED WITHOUT AN 18.CR07-15051 APPROVED DESIGN DOCUMENT.pdf 18.CR07-15336 CR07-15336 NRC NCV Combustibles Near Dry Fuel Storage.pdf 18.CR07-15971VIBRATION AND NOISE FROM MAIN STEAM TO #235 18.CR07-15971 REDUCER PIPING.pdf CR07-18074 HPI TRAIN 1 DISCHARGE PIPING- POTENTIAL AIR 18.CR07-18074 INTRUSION.pdf 18.CR07-19389 18.CR07-19389 PI5468 OVER RANGED AGANpdf 18.CR07-20754 PROCEDURE NONCOMPLIANCE IDENTIFIED 18.CR07-20754 DURING WALKDOWN FOR EOC.pdf 18.CR07-21386 INACCURATE BARRIER EVALUATION PERFORMED 18.CR07-21386 FOR DOOR 427.pdf 18.CR07-22099 LICENSING AND DESIGN BASIS FOR PSH 2929 AND 18.CR07-22099 PSH 2930 SETPOINT.pdf 18.CR07-25792 Not Qualified in FITS for Root Cause Method Used on 18.CR07-25792 CR07-18974 18.CR07-25988 DB-SA-07-043 AFI - Usinrg Operating Experience 18.CR07-25986 Effectively to prevent events 18.CR07-27586 Not Qualified in FITS for Root Cause Method Used on 16.CR07-27586 CR07-15275 18.Notification 600269668 Notification 600269666 Address NWI from SS Self-Assessment.pdf 18.Notification 600333089 18.Notification600333089 AIF 0055548 Notification.pdf 1&Notification 600333824 18.Notification 600333824 MOV Program Enhancement.pdf Notification 600363287 DB-SA-07-02, concerns regarding SAP activity 16.Notification 600363287 tracking..pdf Notification 600363417 modules in SAP that some personnel in Design 18.Notification 600363417 Engineering are not familiar with.pdf 18.Notification 600363446 Notification 600363446 evaluate full text search criteria in Filenet.pdf Notification 600392126 re-evaluate the appropriateness of the specified 18.Notification600392126 Milton Roy pump selected.pdf 19 General procedures 19.001 Post Maintenance Test Manual Rev 29 NOP-WM-1003 R3 Nuclear Maintenance Notification Initiation, Screening.

19.002 and Minor Deficiency Monitoring Processes 20 Organrational Charts and contact llsts 20.001 Draft-DB-Org-Chart-Rev-52.ppt 21 Performance lndlcators Program health reports 21.004 2006 3rd qtr fleet program health.zip 21.005 2006 4th qtr fleet program healthzip 21.006 2007 1st qtr fleet program healthzip 21.007 2007 2nd qtr fleet program healthzip Plant Health reports Page 62

Library # Document Name 21.008 2007-01 Plant Health Report.pdf 21.009 2007-02 Plant Health Report.pdf 21.010 System Health Report 2006-3.pdf 21.011 System Health Report 2006-4.pdf PHC meeting minutes 21.001 PHC Min 2007-06-13 FinaLdoc 21.002 PHC Min 2007-07-11 FinaLdoc 21.003 PHC Min 200747-25 Final.doc 21.004 PHC Min 2007-08-06 Drakdoc 21.005 PHC Min 2007-08-22 Draftdoc Monthly Performance Reports (MPRs) 21.034 0609 DB MPR.pdf 21.035 0610 DB MPR.pdf 21.036 0611 DB MPR.pdf 21.037 0612 DE MPR.pdf 21.038 0701 DB MPR.pdf 21.039 0702 DB MPR.pdf 21.040 0703 DE MPR].pdf 21.041 0704 DB MPR .pdf 21.042 0705 DE MPR.pdf 21.043 0706 DB MPR.pdf 21.043a 0707 DB MPR.pdf CNRB meeting reports 21.044 CNRB Nov 2005.zip 21.045 CNRB Feb 2006.pdf 21.046 CNRB July 2006.pdf 21.047 CNRB Meeting Minutes Feb 2007.pdl CNRE engineering focus area and subcommittee DRAFT 2007 sept 21 048 agenda.doc Outage performance 21.060 15RFO Milestone Schedule UPDATED.xls EAB results 21.070 EAB Log.xls 21.071 EA61 Log.xls 21.072 Not used 21.073 Not used 21.074 EAB third quarter 2006 report- final.doc 21.075 EAB fourth quarter 2006 report- Anal.doc 21.076 EA6 first quarter 2007 report- final.doc 21.077 EA6 second quarter 2007 report- final.doc DBARs for 2007 COlA eng programs 21.101 DBAR 2005 1st Qtr (FINAL) 21.102 DEAR 2005 2nd Qtr (Final) 21.103 DBAR 2005 3rd Qtr (FINAL) 21.104 DEAR 2005 4th Qtr (FINAL) 21.105 DEAR 2006 1st Qtr (FINAL) 21.106 DEAR 2006 2nd Qtr (FINAL) 21.107 DBAR 2006 3rd Qtr (FINAL) 21.108 DEAR 2006 4th Qtr (FINAL) 21.109 DBAR 2007 1st Qtr (FINAL) 21.110 DBAR 2006 2nd Qtr (FINAL)

GARB minutes 21.201 081002 CARE Minutes.doc Page 63

Libraly Y Document Name 21.202 061016 CARB Minutes.doc 21.202a 61023 CARB Minutes.doc 21.203 061030 10-30-06 CARB Minutes (l).pdf 21.204 061030 CARB Minutes.doc 21.205 061 113 CARB Minutes.doc 21.206 061 127 CARB Minutes.doc 21.207 061 129 CARB Minutes.doc 21.208 061204 CARB Minutes.doc 21.209 061211 CARB Minutes (l).pdf 21.210 061211 CARB Minutes.doc 21.211 061218 CARB Minutes (l).pdf 21.212 061218 CARB Minutes (2).pdf 21.213 061218 CARB Minutes.doc 21.214 070312 CARB Minutes (l).pdf 21.215 070312 CARB Minutesdoc 21.216 070525 CARB Mintues (l).pdf 21.217 070525 CARB Mintues (2).pdf 21.218 070525 CARB Mintues.doc 21.218 070820 CARB Minutes.doc 21.219 070530 CARB Minutes.doc 21.220 070604 CARB Minutesdoc 21.221 070618 CARB Minutesdoc 21.222 070702 CARB Minutes.doc 21.223 070709 CARB Minutes.doc 21.224 070716 CARB Minutes.doc 21.225 070723 CARB Minutes.doc 21.226 070801 CARB Minutes.doc 21.227 070813 CARB Minutes.doc 22 Business and performance ImprovemenUactlon plans 22.001 FENOC Business 8 Excellence Plans 2007-2011.pdf 22.002 PGER Excellence Plan.xls 22.003 Not used 22.004 Not used 22.005 BACC Long Range Improvement Plan 23 General Information 23.001 NRC Eng IP 37001 50-59 Process.pdf 23.002 NRC IP 40500 Problem Resolution.pdf 23.003 NRC IP 37551 System Engineering .pdf 23.004 NRC IP 37550 T-Mods.pdf 24 Information provided by Industry peem 25 Exponent Repon and related correspondence 25.001 -

070320 Davis Besse Exponent Repotpdf 25.002 070402 DB RFl.pdf 25.003 070404 NEIL Letter SubmittaLpdf 25.004 070502 DB Response to RFl.pdf 25.005 070514 Demand for Info n-cl241.pdf 25.006 070521 Letters to Employees and Public.pdf 25.007 070613 Demand For Info Response - DB .pdf 25.008 070815 FENOC CAL (reg sensitivity).pdf 26 Management reports and lnformatlon 26.001 Plant Status - Thursday September 6 2007.msg 26.002 070906 MAOM-package.pdf 26.003 Rad Monitor equipment list - G Chung.XLS Page 64

1.7 Team Members Biographies

The following biographies are included:

John Garrity - The Marathon Consulting Group, Team Leader
Paul Borer - The Marathon Consulting Group
Harold Baumberger - The Marathon Consulting Group
Rod Filipek - St. Lucie Nuclear Power Plant, FPL
Mark Flaherty - Calvert Cliffs Nuclear Power Plant, Constellation Energy
Joe Pechacek - JA Fitzpatrick Nuclear Power Plant, Entergy

John H. Garrity
President and Chief Executive Officer, Marathon Consulting Group

1994-present: Marathon Consulting Group; President and CEO -

- Responsible for Marathon client service operations and selected personal consulting engagements. Engaged in expert consulting in the areas of process performance monitoring and improvement, management mentoring, process-centered team formation and compensation, configuration management, business plan and corporate strategy development, process improvement training, and project management training. Also conducted root cause and collective significance analyses of client situations, and participated in or led high-impact teams to resolve problems.

- Team Leader, Davis-Besse Independent Assessment of Engineering Programs Effectiveness, 2004, 2005, and 2006

1993-1994: New York Power Authority; Resident Manager - Placed in charge after the unit was shut down under an NRC confirmatory action letter and placed on the problem plant list.

Responsible for developing and executing a plan to resolve problems in the context of intense political pressure and company senior management turnover. Numerous escalated enforcement actions stemming from earlier periods were mitigated by effective, aggressive management investigations and corrective actions.

1992: TVA Bellefonte; Site Vice President - Responsible for all ongoing activities necessary to reactivate the project from deferred status.

1990-1992: TVA, Watts Bar; Site Vice President - Responsible for all activities necessary to progress completion of the Watts Bar units, including engineering, construction, startup, operational readiness, and commissioning. Formulated management objectives for restart of construction following stand-down and significant regulatory involvement. Reengineered design engineering and construction processes, restarted construction, and outsourced construction labor, engineering, and management. Instituted management performance accountability through a site-wide self-monitoring program based on principles of TQM. Significant improvement of site nuclear performance left the site positioned for successful completion. Credibility with NRC restored. Significant process performance improvement results in engineering design, engineering analysis, construction engineering, construction, and corrective action.

1990: Maine Yankee Atomic Power Co; Assistant to President - Special projects assignment, including work on low level waste disposal options available to company and state.

1989-1990: Maine Yankee Atomic Power Co; Vice President Engineering and Licensing - Responsible for nuclear engineering, plant engineering, licensing, and operations support.

1988-1989: Maine Yankee Atomic Power Co; Assistant Vice President Engineering and Quality Programs - Responsible for quality assurance, nuclear engineering, licensing and plant engineering.

1984-1988: Maine Yankee Atomic Power Co; Plant Manager/Senior Site Manager -

Responsible for site operations.


John H. Garrity (continued)

1984: Maine Yankee Atomic Power Co; Assistant Refueling Manager - Special assignment, monitored several dozen engineering projects and coordinated activity with the overall refueling effort.

1980-1984: Maine Yankee Atomic Power Co; Director, Nuclear Engineering and Licensing - Responsible for overall coordination of reload design, plant safety analysis and nuclear engineering analysis of plant systems, emergency planning, and radiological monitoring.

1975-1980: Central Maine Power Co.; Principal Nuclear Engineer for Central Maine Power Co. (1976-1980), project engineer for two new reactor sites (1975)

1970-1974: Maine Yankee Atomic Power Co.; performed primary/reactor and secondary plant systems performance monitoring (1973-1974), Reactor Engineer and Startup Test Supervisor for commissioning of the Maine Yankee reactor (1970-1972)


Paul J. Borer
Senior Vice President, Marathon Consulting Group

2002-present: Marathon Consulting Group

- Performed safety culture and engineering effectiveness assessments, an assessment of a foreign nuclear station's overall processes and performance, executive oversight of a multi-year program to introduce US system engineering process concepts to a foreign nuclear power station fleet, and USNRC Inspection Procedure 95-003 (Supplemental Inspection for Repetitive Degraded Cornerstones) process readiness reviews

- Team Member, Davis-Besse Independent Assessment of Engineering Programs Effectiveness in 2004 and 2005.

1986-2002: Institute of Nuclear Power Operations (INPO) - Held the following positions:

Senior Representative for Assistance - Management consulting role.

Responsible for formulating performance improvement plans for several nuclear stations. Provided direct feedback to senior station management on performance issues. Prioritized deployment of INPO assistance resources.

- Division Director, Plant Operations Division - a technical INPO division responsible for evaluation of Operations, Chemistry, and Radiation Protection areas. Involved in setting standards for evaluations, responsible for the evaluator training program, and assisting the industry in attaining standards of excellence.

- Detroit Edison Vice President - Nuclear Generation (on loan from INPO, 1997-1998) - Responsible for all aspects of Operation, Maintenance, and Engineering of a large-scale BWR. Led a plant staff of approximately 500.

- Vice President, Nuclear Engineering - New York Power Authority (on loan from INPO, 1993-1994). Responsible for Design Engineering at two nuclear generating stations. Developed and implemented a plan to deploy corporate design engineering resources to the stations in order to be more responsive to station needs.

- Department Manager - Managed four INPO departments (Emergency Preparedness, Operating Experience Applications, Technical Support, and Operations) - Responsible for the evaluation of their respective areas of plant performance and various assistance programs. Also functioned as a Team Manager and led teams of 15-20 INPO and industry professionals during performance-based nuclear plant and corporate evaluations.

Held a Senior Reactor Operator's License (Boiling Water Reactor); Licensed Professional Engineer (Mechanical).

1985: Engineering, Planning, and Management, Inc.; Project Manager - Responsible for the overall conduct of work, sales, budget, schedule, client relationship, and quality of products for EPM clients in the Southeastern U.S.

1983-1984: Smith Barney, Harris Upham, and Company; Account Executive -

Responsible for retail securities sales, client development, securities research, financial planning advice.


Paul J. Borer (continued)

1976-1983: Cooper Nuclear Station; Served in various management positions (Operations Manager, Engineering Manager, Chemistry and Radiation Protection Manager), all reporting to the site manager.

1970-1976: U.S. Navy; Completed the Naval Nuclear Power Training Program and served aboard a nuclear submarine.


Harold E. "Rusty" Baumberger
Vice President and Director, Performance Assessment, Marathon Consulting Group

1996-present: Marathon Consulting Group; Responsibilities include the following:

Vice President and Director, Performance Assessment - Responsible for business areas of independent assessment, INPO evaluation and NRC inspection support, Design Basis assessments, Corrective Action Program assessments, and Maintenance Rule implementation. Also serves as Marathon's Quality Assurance Manager.

Team Member, Palo Verde ImPACT Review Team, root cause evaluation and corrective action program area, responsible for review of programs and practices in preparation for NRC 95003 Inspection.

Consultant, responsible for assessment of the Perry Corrective Action Program for effectiveness of past actions, current status, performance monitoring and goals, and ongoing 95003 recovery plans.

Team Member - Davis-Besse Independent Assessment of Engineering Programs Effectiveness in 2004, 2005, and 2006.

Project Lead of the Master Equipment List (MEL) Update Project at Millstone -

Managed the validation and update of the MEL database.

Executive Lead, Transition for the Vermont Yankee Nuclear Power Corporation -

Managed the implementation of the sale agreement and transition of the Vermont Yankee station to new ownership. Reported directly to the President & CEO.

Quality Assurance Manager - Developed and implemented Quality Assurance Program, obtained NUPIC certification, trained and certified lead auditors.

Provided interface with client QA Managers.

Configuration Management Supervisor at Cooper Nuclear Station - Worked in environment of high regulatory scrutiny to improve Engineering performance and develop recovery strategies. Responsible for maintaining Design Basis and resolving Design Basis and Configuration Control issues. Managed Modification Process, Design Criteria Program, Equipment Classification Program, Equipment Data File, and Drawing Control Program.

Served as a Safety System Functional Evaluation team member in the area of Operations at Beaver Valley - Reviewed the 4kV Electrical Distribution and Emergency Diesel Generator systems for Unit 2.

Provided expert consulting related to INPO-related issues at River Bend -

Participated in major assessment covering the new INPO Performance Objectives, existing INPO findings, and items from the Long Term Performance Improvement Program.

Participated in a component-level design basis review of non safety-related systems and outage work at Dresden - Documented review of over 7000 components against Design Basis, FSAR requirements, original system and component specifications, and vendor-supplied data.

Performed assessment of Design Basis programs at Vermont Yankee including Design Basis document program development.


Harold E. "Rusty" Baumberger (continued)

- Participated on corporate Engineering Independent Safety Assessment Response Team at Maine Yankee.

1990-1996: Independent Consultant; Provided services to nuclear utilities and Department of Energy (DOE) contractors in management, safety review, quality assurance and performance areas. Performed audits and independent assessments of overall performance, outage management, maintenance, and configuration management programs.

1988-1990: Liberty Consulting Group; Senior Consultant - Led evaluations of management capability at nuclear power plants in all areas of facility operation.

Conducted assessment of plant performance against INPO standards.

1980-1988: Institute of Nuclear Power Operations (INPO); Evaluator/Senior Evaluator - Performed evaluations of more than 50 commercial nuclear power stations in areas of maintenance, Engineering Support, and Organization and Administration. Participated in accreditation reviews of utility training programs.

1977-1980: Nuclear Power Consultants; Consultant - Provided services to nuclear utilities and government agencies conducting reviews and audits in areas of operations, maintenance, engineering, quality assurance, nuclear fuel fabrication and procurement, and licensing. Project manager for the update of the Fort St. Vrain Final Safety Analysis Report. Participated in the review of Ontario Hydro's heavy water production costs and uranium fuel requirements for the Province of Ontario.

1967-1977: U. S. Naval Submarine Service; Naval Nuclear Propulsion Officer -

Responsible for supervision, operation, and maintenance of the nuclear propulsion plant and ship's auxiliary systems. Certified Navy Nuclear Propulsion Engineer Officer.

Participated in refueling, pre-operational testing, and startup of two reactors following extended outages, including one after a change of NSSS.


Rod Filipek
Supervisor, Instrument and Control (I&C)/Digital Design Engineering, St. Lucie Nuclear Power Plant, Florida Power & Light

January 2006 - Present: St. Lucie Nuclear Power Plant, Florida Power and Light; I&C and Digital Design Engineering Supervisor - Supervise the production of I&C design modifications and support the recently installed Distributed Control System (DCS).

Acted as Shift Design Engineering Manager during the last two Unit outages.

January 2004 - December 2005: St. Lucie Nuclear Power Plant, Florida Power and Light; Procurement Engineering and Configuration Management Supervisor - Supervised Procurement Engineering and Configuration Management activities for the station.

Team Lead for Configuration Management Self-Assessment, Mid-cycle INPO Plant Evaluation Team Member for St. Lucie.

October 1990 - December 2003: St. Lucie Nuclear Power Plant, Florida Power and Light; I&C Design Engineering Supervisor - Supervised production of I&C design modifications for the station. For periods of time was also the Electrical Design Engineering Supervisor.

October 1989 - October 1990: St. Lucie Nuclear Power Plant, Florida Power and Light; I&C Design Engineer - Prepared I&C design modifications for the station.

December 1985 - October 1989: Fermi II Nuclear Power Plant, Detroit Edison Company; I&C Engineer - Head of I&C Plant Maintenance and Technical Department. System Engineering Supervisor - Supervised I&C and Electrical System Engineering groups.

Member of the Detroit Edison Company Nuclear Speakers Bureau.

April 1979 - December 1985: Fermi II Nuclear Power Plant, General Electric Company; Co-Startup Test Engineer - Supported pre-operation and startup testing. Obtained General Electric (GE) Boiling Water Reactor Senior Reactor Operator certification and Professional Engineering License.

January 1978 - April 1979: General Electric Company; Field Engineering - Provided training and testing support at the GE Power Generation Control Complex and training facilities in San Jose, California.

July 1977 - January 1978: General Electric Company; Field Engineering - Training and field training assignments.


Joseph A. Pechacek
Manager, Program and Component Engineering, James A. Fitzpatrick Nuclear Power Plant, Entergy Nuclear Operations

1995-present: Entergy Nuclear Operations, James A. Fitzpatrick Nuclear Power Plant;

- Manager-Program and Component Engineering - Responsible for leadership of the Program and Component Engineering Department consisting of twenty engineers and technicians. Department responsibilities include performance monitoring and trending of station components; administration and maintenance of the station's preventive maintenance program; conduct of equipment failure evaluations; and administration and maintenance of engineering programs, e.g., fire protection, Motor-Operated Valve (MOV)/Air-Operated Valve (AOV), Flow Accelerated Corrosion (FAC), and Appendix J.

Direct reports consist of three engineering supervisors in the areas of Code Programs, Component Engineering and Procurement Engineering.

- Manager-Design Engineering - Provided leadership of Design Engineering Department consisting of twenty-five engineers, technicians, drafters and clerical staff. Department responsibilities included development of both commercial and nuclear design changes, maintenance of engineering configuration, and maintenance of plant design basis. Direct reports consisted of five engineering supervisors in the areas of Mechanical, Civil /

Structural, Electrical, Instrumentation and Controls, and Engineering Configuration.

- Manager - Engineering Support - Provided leadership of Engineering Support Department consisting of fifteen engineers, technicians and drafters. Department responsibilities included Procurement Engineering, Modification Design, Vendor Manual and Plant Equipment Database maintenance, and design drafting. Direct reports consisted of three supervisors.

- Fire Protection and Safety Coordinator - Provided leadership for the implementation and maintenance of the station Fire Protection and Industrial Safety Programs in accordance with NRC, NEILMSO, Factory Mutual (FM), New York State Uniform Fire Code, and NYOSH/OSHA guidelines and regulations. Responsibilities included supervision of five Fire Protection and Industrial Safety Technicians and oversight of day-to-day fire protection and industrial safety activities.

- Senior Fire Protection Engineer - Maintained the station Fire Hazards Analysis (FHA); 10CFR50, Appendix R Safe Shutdown Analysis; and Fire Protection Design Basis Document. Performed reviews and developed evaluations for NFPA code deviations and fire barrier deviations per NRC Generic Letter 86-10. Performed hydraulic calculations of fire suppression systems and assessed plant features using computer fire models.

Prepared 50.59 Nuclear Safety Evaluations to support resolution of Final Safety Analysis Report (FSAR) deviations.


Joseph A. Pechacek (continued)

1994-1995: IBEX Engineering Services; Fire Protection Supervisor - Served as Fire Protection Supervisor at the James A. Fitzpatrick Nuclear Power Plant. Supervised a staff of four Fire Inspectors and approximately thirty fire watch personnel.

1992-1994: Engineering, Planning, and Management, Inc. (EPM); Fire Protection Engineer - Evaluated plant fire protection systems and features for compliance with NFPA codes, evaluated findings, and recommended corrective actions.

1989-1992: Hobson and Woese, Inc.; Fire Protection Engineer - Designed fire suppression and detection systems for commercial and institutional projects, which included preparation of specifications and drawings. Performed fire hazards analyses, which included the use of compartment fire and heat transfer models.


Mark D. Flaherty
Manager, Engineering Services, Calvert Cliffs Nuclear Power Plant, Constellation Energy

April 2006 - present: Calvert Cliffs Nuclear Power Plant, Constellation Energy; Manager, Engineering Services - Responsible for providing engineering services to the site, including system, design, program, and equipment reliability functions. Manages a staff of over 100 engineers, technicians, and supervisors.

February 2006 - April 2006: Constellation Energy; Vice President, Technical Services (Acting) - Responsible for providing oversight for corporate technical functions including Fuels, Corporate Engineering, Probabilistic Risk Assessment, and Licensing. Supervised managers of identified corporate functions; participated in senior leadership meetings and councils.

June 2004 - Feb 2006: Constellation Energy; Manager, Fleet Licensing - Responsible for: interfacing with Nuclear Regulatory Commission (NRC) management; creating and implementing standard Licensing processes and procedures; providing interface to INPO and NEI; serving on site Nuclear Safety Review Boards. Supervised three site Licensing Directors and staff, including two corporate personnel. Managed oversight of the successful recovery of the Nine Mile Point License Renewal Project - $3M effort. No NRC violations greater than green at any site during this time period; closed two existing white findings.

July 2001 - June 2004: R.E. Ginna Nuclear Power Plant, Rochester Gas and Electric Corporation (RG&E); Manager, Nuclear Safety and Licensing - Responsible for interfacing with NRC personnel, including preparing License Amendment Requests (LARs) and responding to correspondence (e.g., Orders, Bulletins, etc.). Supervised a staff of eight personnel (imaging, licensing, risk, and software engineers). Managed conversion of the configuration management computer system from mainframe to local server-based - $0.5M project.

October 1998 - December 2001: R. E. Ginna Nuclear Power Plant, RG&E; Manager, Configuration Support Engineering - Implemented the Design Basis Document (DBD) Program, which provided electronic copies of design-related information on employee computers - $3.5M project. Supervised a staff of eight personnel (imaging, design, risk, and software engineers). Implemented new 10CFR50.59 and 10CFR50.65(a)(4) Programs. Shift Technical Advisor (STA), 2000. INPO Plant Evaluation Team Member for Point Beach, 2000. Participated as an International Atomic Energy Agency (IAEA) PRA expert to Krsko (Croatia) and Dukovany (Czech Republic) in 1991 and 1992, respectively.

February 1997 - October 1998: R. E. Ginna Nuclear Power Plant, RG&E; Senior Licensing Engineer - Senior Reactor Operator (SRO) Certification. Developed and implemented risk models for Ginna Station using Equipment Out Of Service (EOOS) software.


Mark D. Flaherty (continued)

February 1989 - February 1997: R. E. Ginna Nuclear Power Plant, RG&E; Licensing Engineer - Developed and managed the Improved Technical Specifications (ITS) Program (Ginna Station was the first and oldest Westinghouse plant to convert) and received the Senior Nuclear Executive Award - $1.5M project. Developed probabilistic risk assessment (PRA) models and programs; managed the program beginning 1995; $3M project. Responsible for multiple licensing tasks (research licensing basis, LARs)

September 1986 - February 1989: Davis-Besse Nuclear Power Station, Toledo Edison Company; Probabilistic Risk Analysis Engineer

Section 2 Assessment of Internal Self-Assessment Performance

This topic is an explicit assessment area in the 2007 Independent Assessment plan and is addressed in Section 1.5.2.5.

Appendix 1 Action Plans

DBBP-VP-0009 Rev 3, Management Plan for Confirmatory Order Independent Assessments, requires Action Plans to be developed to address the Independent Assessment Report's Areas for Improvement (AFIs). No AFIs were identified in the 2007 Independent Assessment of Engineering Programs; therefore, no action plans are required.


Appendix 2 Independent Assessment Plan Submittal

NUMBER:

COIA-ENG-2007

ASSESSMENT AREAS:

Engineering program effectiveness of modifications, calculations, system engineering, and corrective action program utilization.

PURPOSE:

The purpose is to provide an independent and comprehensive assessment of the Engineering program effectiveness at the Davis-Besse Nuclear Power Station. The assessment will be performed in accordance with the requirements of the March 8, 2004, Confirmatory Order Modifying License No. NPF-3, and Davis-Besse Business Practice DBBP-VP-0009, Management Plan for Confirmatory Order Independent Assessments. The assessment will be used to identify areas for improvement, requiring corrective actions with action plans. The assessment will also be used to assess the rigor, criticality, and overall quality of available Davis-Besse internal self-assessment activities in the Engineering program areas listed above. The final assessment report will provide an overall concluding statement on the Engineering program effectiveness as rated utilizing the assessment categories of DBBP-VP-0009.

SCOPE:

The Independent Assessment Team will assess the following Engineering program areas:

1. Plant Modification process
2. Calculation process
3. System Engineering Programs and Practices
4. Implementation of the Corrective Action Program (CAP) by Engineering
5. Effectiveness of self-assessments
6. Corrective actions taken in response to the Areas in Need of Attention (ANAs) identified during the 2006 Independent Assessment of the Davis-Besse Engineering Program Effectiveness

The Assessment Team will assess the conduct of the following activities:
1. Plant Modification Process The team will perform a review of activities to assess the effectiveness of the plant modification process:
h. Selection and prioritization of potential modifications, including assessment of the impact of delayed modifications on the plant and operating personnel
i. Owner acceptance sub-process (review of contracted work)
j. Quality of modification packages since 2006 assessment (Permanent and Temporary Modifications)


k. Closeout of modification packages and supporting document updates
l. Effectiveness of modifications
m. Interaction and support from parallel processes
n. Workload management
2. Calculation Process The team will assess the following attributes of the plant calculation process:
k. Workload management, including appropriateness of work priorities
l. Acceptance criteria and owner acceptance sub-process (review of contracted work)
m. Margin management and allocation
n. Linkages and consistency with other calculations
o. Preservation of design bases
p. Documentation/traceability/attribution
q. Calculation health and improvement program
r. Interaction and support from parallel processes
s. Systems descriptions design information
t. Engineering rigor and attention to detail
3. System Engineering Programs and Practices The team will investigate the following items:
k. System Engineering alignment and plant support
l. System Health evaluation and reporting
m. Process for prioritizing, communicating, and resolving health deficiencies and program deficiencies
n. Equipment Reliability Improvement Program as reflected in FENOC Excellence Plans
o. Maintenance Rule system monitoring and trending
p. Experience and expertise, including use of operating experience
q. Margin awareness and margin allocation
r. Interaction and support from parallel processes
s. Access to knowledge of Engineering information in calculations
t. Workload management
4. Implementation of the Corrective Action Process by Engineering The Assessment Team will assess the following:
d. Condition Report ownership and appropriate initiator involvement
e. Quality of root and apparent causes produced by Engineering and associated management behavior and guidance
f. The Assessment Team will review selected Condition Reports related to Engineering Section performance initiated since the 2006 Independent Assessment of Engineering Performance and independently assess the corrective actions taken.
5. Effectiveness of Davis-Besse Assessment Activities The Assessment Team will evaluate the effectiveness of the Davis-Besse Nuclear Power Station's assessment activities associated with the implementation of Engineering programs as follows:
f. Planning of assessments over the short and long term for ongoing assessment of Engineering performance
g. Review the results of the Davis-Besse Quarterly Quality Assessments that evaluated Engineering; determine if the assessments were comprehensive and if effective actions were taken to correct problems or weaknesses identified.
h. Evaluate the effectiveness of self-assessment capability by reviewing corrective actions associated with self-assessment reports, audits (including audits of the offsite safety committee activities), and evaluations conducted of Engineering program implementation.
i. Determine if the Engineering staff is aggressive in correcting self-assessment and assessment findings, and determine whether the corrective actions are adequate, timely, and properly prioritized, and whether effectiveness reviews are ensuring the desired results.
j. Determine the receptivity and responsiveness of management and staff to issues raised in self-assessments and assessments.
6. Corrective actions taken in response to the Areas in Need of Attention (ANAs) identified during the 2006 Independent Assessment of the Davis-Besse Engineering Program Effectiveness The Assessment Team will evaluate the responses to the seven (7) Areas in Need of Attention (ANAs) identified during the 2006 Independent Assessment:

1 ANA Inattention to detail in calculations
2 ANA Implementation of requirements from calculations
3 ANA Equipment Reliability Program
4 ANA Red plant health systems
5 ANA ECP Revision Reviews
6 ANA Follow-ups to assessments and last year's COIA-ENG-2005
7 ANA Management of engineering workload

INDEPENDENT ASSESSMENT TEAM:

John Garrity, Marathon Consulting Group, Team Leader
Paul Borer, Marathon Consulting Group
Harold Baumberger, Marathon Consulting Group
Rod Filipek, I&C Design Supervisor, St. Lucie Station, Florida Power & Light Co.

Joe Pechacek, Manager, Engineering Programs, Fitzpatrick Nuclear Station, Entergy Northeast

Mark Flaherty, Manager, Engineering Services, Calvert Cliffs Nuclear Power Plant, Constellation Energy

Biographies attached.

SCHEDULE:

July 10, 2007: Send selected documentation to team members to begin off-site preparations.

July 16, 2007 to September 7, 2007: Offsite (in-office) review in preparation for onsite assessment.

September 9, 2007: Assessment team will assemble at the plant for final assessment preparations.

September 10, 2007 to September 21, 2007: Conduct onsite assessment and provide Davis-Besse with preliminary results prior to leaving the site.

October 1, 2007: Draft team assessment report and final debrief (marking the completion of the assessment) will be provided to Davis-Besse.

October 12, 2007: Final team assessment report provided to Davis-Besse.

November 19, 2007: Final Davis-Besse assessment report and action plans (if required by findings) will be submitted to the NRC within 45 days of the completion of the on-site assessment.

ASSESSMENT METHODS:

The Independent Assessment Team will use DBBP-VP-0009, Management Plan for Confirmatory Order Independent Assessments. The assessment methodology may include, but is not limited to, any combination of the following:

Observing activities
Interviewing personnel
Reviewing documentation
Evaluating or performing trend analysis
Reviewing procedures, instructions, and programs
Comparing actual performance levels with pre-established performance indicators

The following general standards will apply to the assessment of Davis-Besse Engineering program implementation:

Modifications and Calculations reflect in-depth reviews of problems and resolutions that support a high level of nuclear safety.

Engineers demonstrate knowledge and understanding of the design basis, including maintenance of design basis documentation.

System engineers demonstrate intolerance for failures of critical equipment.

Engineers maintain clear ownership of corrective actions from initiation through resolution.



A rigorous approach to problem solving and application of engineering procedures and methods is used.

The Assessment Team will review the referenced procedures/documents during the preparation period prior to site arrival.

The Assessment Team will identify in its final report, as applicable, areas of strength, areas in need of attention, and areas for improvement as defined in Davis-Besse Business Practice DBBP-VP-0009. The Team will provide an overall concluding statement on the Engineering program effectiveness as rated utilizing the assessment categories of DBBP-VP-0009.

REFERENCES:

Confirmatory Order dated March 8, 2004
DBBP-VP-0009, "Management Plan for Confirmatory Order Independent Assessments"
NOP-CC-2003, Engineering Changes
NOP-CC-3002, Calculations
NOP-LP-7-001, Condition Report Program
Action items from NRC inspection reports issued since September 22, 2006, that are applicable to the areas assessed (i.e., condition reports, corrective actions, responses to findings and non-cited violations)
Applicable self-assessments performed since September 22, 2006
QA Quarterly Assessments/Reports for the past three quarters
CNRB meeting minutes from the last three CNRB intervals
Applicable Section or area Performance Indicators

ASSESSMENT PLAN APPROVALS:

Prepared by: [signature] Date: 6/7/07
John H. Garrity, Assessment Team Lead

Approved by: [signature] Date:

Approved by: [signature], Executive Sponsor Date: [illegible]