ML20102B648

Procedure EA-062, Program for Completion of Engineering Assurance In-Depth Technical Audits, Millstone Unit 3 Project
Person / Time
Site: Millstone
Issue date: 02/19/1985
From:
NORTHEAST NUCLEAR ENERGY CO.
Shared Package: ML20102B643
References
EA-062, EA-62, NUDOCS 8503040239



EA-062
PROGRAM FOR COMPLETION OF THE ENGINEERING ASSURANCE IN-DEPTH TECHNICAL AUDITS
MILLSTONE UNIT 3 PROJECT
NORTHEAST UTILITIES SERVICE COMPANY

OVERVIEW
This plan describes a two-phase program for completion of the Engineering Assurance In-depth Technical Audits as discussed with the NRC on January 11, 1985. The Program provides for the performance of four in-depth technical audits of the Millstone Unit 3 Project engineering and design activities.

Three audits have been completed to date.

Completion of the audit activities involves the performance of the fourth audit (Phase I) and the evaluation and assessment of the results of the four audits in order to form a conclusion as to the adequacy of the design process as implemented for Millstone Unit 3 (Phase II).

A detailed plan for completion of both phases of the program is described below.

The completion schedule provides for initiating phase II for the three technical audits that have been completed such that the preliminary results of the evaluation may be used to adjust or expand the scope of the fourth technical audit as necessary.

In addition, the fourth technical audit is scheduled such that preliminary results of the NRC CAT Inspection (scheduled for the first quarter 1985) which relate to the design process may also be used to adjust or expand the scope of the fourth technical audit as necessary.

PHASE I - AUDIT PLAN FOR PERFORMANCE OF THE FOURTH IN-DEPTH TECHNICAL AUDIT OF MILLSTONE UNIT 3 PROJECT

PURPOSE
The purpose of the audit is to assess the adequacy of the design process by evaluating the design of the Containment Recirculation Spray System (RSS), associated structures, and interfaces.

Adequacy of design changes generated by the Site Engineering Group (SEG) and other specific activities will also be evaluated.

SCOPE AND APPROACH
The RSS will form the basis of the audit.

The design will be reviewed to determine if the following attributes are met:

o The design is consistent with the FSAR, including system function, compliance with documents committed to in the FSAR, and compliance with correct design practices.

o The design is in compliance with NSSS requirements and criteria.

o Technically adequate calculations are available to support the design.

o Diagrams, specifications, and drawings are technically complete and consistent with each other.

The adequacy of design changes generated by the SEG will also be evaluated.

Emphasis will be placed on changes to the RSS. However, changes to other systems will be evaluated as necessary to obtain a reasonable sample size.

In order to evaluate certain activities, it is anticipated that review of material other than that related to the RSS will be necessary. These activities include:

o Stress reconciliation (piping)
o Environmental Qualification
o Structural Load Tracking
o Electrical Separation
o Conduit Support Design

AUDIT TEAM
The team will function under the direction of the SWEC Engineering Assurance Division (EA), Boston.

The team will be composed of NUSCO and SWEC personnel.

The SWEC team members will be off-project experienced technical personnel.

PREPARATION

o Review applicable FSAR sections, Westinghouse (NSSS) documents, and related project procedures to become familiar with system function, design basis requirements, and project-specific methods.

o Identify and assemble key documents necessary for the audit (e.g., FSKs, Design Criteria).

o Determine status of design documents (for example, determine which pipe stress analysis problems have been stress reconciled).

o Discuss and establish means to evaluate interfaces between disciplines.

o Refine scope to indicate specific interfacing systems, structures, and components to be evaluated.

o Refine scope to account for preliminary results of the evaluation of the first three technical audits and the results of the NRC CAT Inspection that relate to the design process.

o Using the Review Plans provided by Engineering Assurance as guidance, each team member is to develop a Review Plan specific to the discipline and areas to be audited. The Review Plan is to reflect the scope of the audit and the means for evaluating discipline interfaces, and identify the detailed attributes to be pursued during the audit.

PERFORMANCE
The team members must annotate the review plans to specifically and completely identify the documents reviewed (including issue and revision identification) and to document, in detail, the results of the review for each attribute.

During the performance of the audit, the Audit Team is to inform the Project of potential concerns or requests for needed information using an "ACTION ITEM" form.

The Project engineering staff will be expected to promptly respond to each Action Item, providing the information requested or a response to the potential concern. A status log is to be maintained to track and account for all Action Items.

(See Attachment 2 for guidance in generating Action Items.)
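The status log called for above is essentially a small tracking record. As an illustration only, the sketch below shows one way such a log could be kept; the field names and structure are hypothetical and are not taken from procedure EA-062:

```python
from dataclasses import dataclass

# Hypothetical sketch of an Action Item status log; the fields shown are
# illustrative and do not come from procedure EA-062 itself.
@dataclass
class ActionItem:
    number: int
    description: str
    raised_by: str          # audit team member who generated the item
    response: str = ""      # Project engineering staff response
    closed: bool = False

class StatusLog:
    """Tracks and accounts for all Action Items issued during the audit."""
    def __init__(self):
        self.items: list[ActionItem] = []

    def add(self, item: ActionItem) -> None:
        self.items.append(item)

    def respond(self, number: int, response: str) -> None:
        # Record the Project's response and close the item
        for item in self.items:
            if item.number == number:
                item.response = response
                item.closed = True

    def open_items(self) -> list[ActionItem]:
        # Items still awaiting a Project response
        return [i for i in self.items if not i.closed]
```

In this sketch each Action Item stays open until the Project's response is recorded, so the open-item query gives the Team Leader the status needed for the periodic meetings.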

To facilitate the review of the system and the evaluation of activities, a site walkdown will be included.

Periodic status meetings are to be held by the Team Leader and SWEC Project Engineer. The purpose of these meetings is to discuss the progress of the audit and the status of any open Action Items.

REPORTING
At the conclusion of the audit, and prior to issuance of the report, a meeting will be held by the Team Leader with the SWEC Project Engineer and appropriate SWEC and NUSCO management personnel to discuss the results of the audit.

The audit report will generally follow the outline below:

1. INTRODUCTION
2. PURPOSE
3. SCOPE
4. SUMMARY OF RESULTS AND OVERALL CONCLUSIONS
5. AUDIT OBSERVATIONS
6. SUMMARY BY DISCIPLINE
   6.1 Control Systems
   6.2 Electrical
   6.3 Engineering Mechanics
   6.4 Materials Engineering
   6.5 Power
   6.6 Structural

Each team member is specifically responsible for preparing any needed Audit Observations and preparing the summary section for the discipline audited (Sections 6.1 through 6.6).

See Attachment 3 for guidance in generating Audit Observations.

The report will be reviewed by the Audit Team Leader and approved by the Chief Engineer Engineering Assurance Division.

PHASE II - PLAN FOR THE EVALUATION OF IN-DEPTH TECHNICAL AUDIT RESULTS FOR MILLSTONE 3 PROJECT

PURPOSE
The purpose of this plan is to describe the method to be used to evaluate the combined results of the Millstone Unit 3 In-depth Technical Audits in order to form a conclusion as to the adequacy of the design process as implemented on Millstone Unit 3.

BACKGROUND
In order to put the evaluation plan for analyzing the Engineering Assurance Technical Audit results on the Millstone Unit 3 Project in perspective, it is important to review some background information on how each Engineering Assurance Technical Audit is pursued and culminated.

The purpose of the in-depth technical audit is to evaluate the technical adequacy of engineering and design documents and to evaluate their degree of compliance with the FSAR, applicable codes, standards, and other licensing commitments.

Findings from the individual technical audits are evaluated for the determination of root cause, extent of conditions and corrective and preventive actions as part of the audit follow-up.

SWEC Engineering Assurance verifies that these actions are appropriate and have been completed during the individual audit follow-up.

EVALUATION PLAN
The findings from each of the technical audits and the findings related to the design process from the NRC CAT Inspection will be summarized and grouped to determine the overall significance and impact, when viewed as a composite, that these findings have on the adequacy and implementation of the design process.

The findings will first be categorized by type.

These categories will be selected as representing specific activities or functions of the design process and will provide a framework for judging the adequacy of the design process and its implementation.

Listed below are examples of finding types which will form the basis for the categorization.

These finding types were established based on the results of SWEC in-depth technical audits and NRC IDI Inspections.

1. Design Process Implementation Deficiencies
2. Design Process or Method Inadequate
3. Inadequate Interface Control
4. SAR Related Deficiency
5. Design Change Deficiency
6. Document Control Deficiency
7. Test Requirement or Implementation Deficiency
8. Construction/Site QC Deficiency
9. Vendor or Site Contractor Deficiency
10. NSSS Deficiency

To facilitate the evaluation, the database for these findings will also include, as applicable, the following:

o Responsible Discipline
o Document Type
o Cause
o Extent of Conditions
o Corrective Actions
o Preventive Actions

After the findings have been categorized and grouped by discipline, they will be reviewed and screened to determine if any findings can be eliminated from further consideration because they are minor, editorial, or administrative in nature and do not provide evidence of inadequacies in the design process or represent generic implementation concerns, and therefore do not warrant additional analysis. The rationale for eliminating findings from further consideration will be documented.
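The categorization-and-screening step described above amounts to filtering a set of finding records while recording why each eliminated finding was dropped. The sketch below is purely illustrative; the record fields and the screening rule shown are hypothetical examples, not taken from this procedure:

```python
# Illustrative sketch of the finding-screening step; the fields and the
# screening rule are hypothetical, not drawn from EA-062 itself.
FINDING_TYPES = [
    "Design Process Implementation Deficiencies",
    "Design Process or Method Inadequate",
    "Inadequate Interface Control",
    # ...remaining categories from the list above
]

def screen_findings(findings):
    """Split findings into those retained for further analysis and those
    eliminated as minor/editorial/administrative, keeping the rationale."""
    retained, eliminated = [], []
    for f in findings:
        if f.get("minor") and not f.get("generic_concern"):
            # The rationale for eliminating a finding must be documented
            eliminated.append(
                {**f, "rationale": "minor/editorial; no design-process implication"}
            )
        else:
            retained.append(f)
    return retained, eliminated

findings = [
    {"id": 1, "type": FINDING_TYPES[0], "discipline": "Electrical",
     "minor": True, "generic_concern": False},
    {"id": 2, "type": FINDING_TYPES[1], "discipline": "Structural",
     "minor": False, "generic_concern": True},
]
kept, dropped = screen_findings(findings)
```

Only the retained findings would then be evaluated within their finding types to judge the adequacy of the design process.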

The remaining findings will be reviewed to evaluate the adequacy and implementation of the overall design process. Particular emphasis will be placed on the adequacy of the design to permit safe operation and shutdown of the facility.

This review will be accomplished by evaluating the findings within finding types (activity or function of the design process) and by evaluating the similarity of findings and the extent of corrective or preventive action as the means of determining either that further action (e.g., audit, design review, etc.) is necessary or that sufficient basis exists for establishing confidence in the adequacy of the design process and its implementation.

A summary report will be issued presenting the conclusions reached during the analysis of the combined audit results.

The report will address whether sufficient evidence exists from the technical audits to give additional confidence that the Millstone Unit 3 facility, as designed, is in compliance with the FSAR commitments and NRC requirements and regulations. This report will also present recommendations for areas that require additional actions to confirm the adequacy of the design, if the results so dictate.

The evaluation will be performed and a draft report prepared by SWEC, Engineering Assurance Division.

NUSCO will review and approve the evaluation report and submit the final report to the NRC.

TENTATIVE SCHEDULE OF PHASE I AND PHASE II ACTIVITIES
Attachment 4 lists the summary of activities for both the upcoming Engineering Assurance Technical Audit and the Evaluation of Technical Audit Results.

This summary of activities also includes anticipated NRC involvement in the process.

ATTACHMENT 1
Page 1 of 1

AUDIT TEAM

DISCIPLINE / NAME AND ORGANIZATION / LOCATION / TITLE

EMD
Electrical
Power
Control Systems
Structural
Materials Engineering
EA (Audit Team Leader)
EA (Audit Coordinator)


ATTACHMENT 2
Page 1 of 1

GENERATION OF ACTION ITEMS

An Action Item can be generated to identify deficiencies or to request information.

It is difficult to define precise criteria to apply in determining if an Action Item should be generated. Three considerations are: the significance of individual discrepancies, the number of discrepancies, and the urgency of the information needed by the evaluation team member.

An Action Item is to be written when one or more of the following needs exist:

1. Need to identify a technical concern.

2. Need to identify a potential technical concern when there is no information readily available to substantiate or alleviate the concern.

3. Need to identify a significant program aspect or practice that is, or appears to be, incorrect or inadequate.

4. When it is deemed necessary for the project to investigate to determine the cause and extent of discrepancies.

5. When it is deemed appropriate to evaluate the Project's proposed actions to correct discrepancies and prevent recurrence.

It is generally not necessary to generate an Action Item if a minor discrepancy is observed and the discrepancy appears to be isolated or random.

Several minor discrepancies, however, would generate an Action Item.

NOTES

1. Review Plans must indicate all discrepancies observed regardless of significance or number, and even if an Action Item was not generated. The Audit Team Leader will make the final decision on when an Action Item is written. His decision will be based on the above written guidance, as well as objectivity and fairness to the issue in question at that time.

2. Generation of, and obtaining a response to, an Action Item does not necessarily negate the need for an Audit Observation.


ATTACHMENT 3
Page 1 of 3

AN APPROACH TO DRAFTING AN AUDIT OBSERVATION

INTRODUCTION
The main purpose of the audit program is to resolve "systematic" or "generic" problems (i.e., obtain adequate preventive action). This requires audit reports, audit observations, etc., to be written in a manner such that overall assessments are presented and problems and their root causes can be addressed by appropriate management.

In order to maintain credibility and impact, AOs must be valid and demonstrate good judgment.

It is difficult to define precise criteria to apply in determining if an AO is necessary or warranted. However, two main considerations are the significance of individual deficiencies and the number of deficiencies.

General Examples:

1. If a minor deficiency is observed in a document and was not observed in other documents of that type, an AO is probably not warranted. (The deficiency could be corrected during the audit or marked for future correction at the next revision.)

2. If a large number of minor deficiencies are observed in several documents, an AO is probably warranted.

3. A single deficiency of relative significance, if observed in only one document, may warrant an AO, even if apparently isolated, in order to assure the deficiency is corrected. (Action to prevent recurrence may not be necessary, however, if the deficiency is of an isolated nature.)

Specific Examples:

1. Logic Diagrams and Logic Descriptions are audited. They are found to be clear, complete, consistent with FSKs and ESKs, and technically adequate. Some of the Logic Descriptions contain a few minor "typos". Should an AO be written? Probably not.

2. Several Power calculations are audited. The calculations are clear and complete, appropriate methods are used, and they are technically adequate. In one calculation, an input value was incorrect, apparently due to a transposition error; the results would not be affected. Another calculation was not marked with the QA Category (but was independently reviewed). Should an AO be written? Probably not.

3. Structural Calculations are audited. The calculations are found to be adequate except that in one calculation an input value is incorrect. The results are not affected. The incorrect value appears to result from the failure of another discipline to provide revised information. Time did not permit further investigation. Should an AO be written? Probably.

Page 2 of 3

NOTE: Review Plans must indicate all deficiencies observed regardless of significance or number. For any deficiency not included in an AO, it must be evident why an AO was not written (e.g., minor, isolated, or corrected during the audit).

If we decide that an AO is probably warranted, we now prepare it.

AUDIT OBSERVATION PREPARATION

An Audit Observation is usually presented in two basic parts: the "Description of Condition(s)" and the "Details". In nearly all cases, it is the "Description of Condition(s)" that we want addressed by the audited organizations in their response to the audit observation. Therefore, audit results must be evaluated, logically grouped, re-evaluated, and a conclusion or summary presented. The details or supporting evidence then follow.

Preparation of an audit observation is more of a thought process than a mechanical exercise. The following is an attempt to describe that process.

1. LIST ALL THE DEFICIENCIES.

2. Determine if there is a commonality among some or all of the items listed. Can the items be logically grouped or categorized? Possible groupings and categories:

o By element (procedures, control, review or approval, documentation, design consistency, technical adequacy).
o "Probable Cause". For example: lack of thorough review, misunderstanding of requirements, etc.
o Consequence. For example: various distribution problems could result in personnel working with out-of-date information.
o Other.

3. Prepare a rough draft AO (handwritten) using the attached outline.

4. Read the draft as objectively as possible. Is it logical? Can an overall conclusion be reached? Should this conclusion be stated in the Description of Condition(s)? Is the English, spelling, etc., correct?

AUDIT OBSERVATION OUTLINE

I. Description of Condition(s)

Categories need not necessarily be presented in the order shown below. In fact, it would be unusual for an AO to contain all categories.

A. Describe the basic failure of the system or activity, if applicable, or describe the overall conclusion (e.g., "the E&DCR system does not provide complete control of design changes").

Page 3 of 3

B. Summarize the deficient elements (or sub-elements). Since most people won't be familiar with element definitions, include a brief definition or examples, e.g., "...calculations are incompletely documented (methods and sources of input not identified, ...etc.)".

C. When there is strong supporting evidence, state what the observed deficiencies indicate; that is, what is the "probable cause". Sometimes the cause is implied and need not be stated. Example: "...the improper application of the analysis method indicates a lack of guidance to the preparer...".

D. Indicate the consequences of the deficiencies. (As stated above, this may be implied or obvious and need not necessarily be stated. Improper application of a method could, obviously, affect technical adequacy.) Example: "Failure to distribute results of revised calculations could lead to...".

E. The auditor may (in some cases) provide guidance on the boundaries for determination of the extent of conditions.

F. If any audit findings are recurrences of earlier findings on the activity being audited, this fact should be emphasized in the AO.

II. Details (Supporting Evidence)

A. Details should be grouped and sequenced to be consistent with the Summary where practicable.

B. Some type of quantitative comparison should be provided where appropriate (e.g., "fifteen of the twenty selected from the list were not included in...").

C. Provide detail, explanation, background, etc. Don't force people to "read between the lines". Take care to provide information, not just more words.

Avoid terms such as:

o in accordance with procedures...
o as required by...
o inadequate
o generally
o satisfactory

Avoid including nits. Avoid long, complicated sentences.

ATTACHMENT 4
Page 1 of 1

TENTATIVE SCHEDULE FOR PHASE I AND PHASE II

TENTATIVE SCHEDULE OF ACTIVITIES (activity: date, location)

o NRC Meeting to Review Audit Plan & Proposed Plan for the Evaluation of Technical Audit Results: 2/11/85, Bethesda
o NRC Approval of this Program as an Alternative to IDI or IDVP: 3/1/85
o Initiation of Data Summarization for the First Three Technical Audits: 3/1/85, Boston
o Select Audit Team Members: 4/22/85 - 4/30/85, Boston
o Initial Orientation of Audit Team Members: 5/6/85, Boston
o Status Meeting with Audit Team Members: 5/13/85, Boston (adjustments or additions to the audit plan as necessary, based on preliminary reviews of the results of the first three technical audits and review of the results of the NRC CAT Inspection)
o Status Meeting with Audit Team Members (submit review plans for approval to the Audit Team Leader): 5/21/85, Boston
o NRC Review of Detailed Scope and Approach (Review Plans): 5/21/85 - 5/24/85, Boston
o Pre-audit Meeting (auditors and auditees): 5/28/85, Site
o NRC Implementation Review: 6/10/85 - 6/14/85, Site
o Audit: 5/28/85 - 6/21/85, Site
o Summary Meeting (submit sections of draft report and Audit Observations): 6/28/85, Site
o Post Audit Conference: 7/10/85, Site
o Audit Report Issue: 7/22/85
o NRC Review of Audit Results: 7/29/85 - 8/1/85, Boston
o Submittal of Final Evaluation Report of Four Technical Audits to NRC by NUSCO: 9/9/85
