| ML18038A011 | |
| Person / Time | |
|---|---|
| Site: | Nine Mile Point |
| Issue date: | 04/03/1985 |
| From: | NIAGARA MOHAWK POWER CORP. |
| To: | |
| Shared Package | |
| ML17054B538 | List: |
| References | |
| EA-106, NUDOCS 8504090155 | |
EA-106

PROGRAM FOR COMPLETION OF THE ENGINEERING ASSURANCE IN-DEPTH TECHNICAL AUDITS
NINE MILE POINT 2 PROJECT
NIAGARA MOHAWK POWER CORPORATION

8504090155 850403 PDR ADOCK 05000410 A
OVERVIEW

This plan describes a two-phase program for completion of the Engineering Assurance In-depth Technical Audits as discussed with the NRC on January 28, 1985.
The program provides for the performance of three in-depth technical audits of the Nine Mile Point 2 Project engineering and design activities. Two audits have been completed to date. Completion of the audit activities involves the performance of the third audit (Phase I) and the evaluation and assessment of the results of the three audits in order to form a conclusion as to the adequacy of the design process as implemented for Nine Mile Point 2 (Phase II).
A detailed plan for completion of both phases of the program is described below.
The completion schedule provides for initiating Phase II for the two technical audits that have been completed, such that the preliminary results of the evaluation may be used to adjust or expand the scope of the third technical audit as necessary.
In addition, the results of the NRC CAT Inspection which relate to the design process and the results of the review performed by SWEC's New York Office will be used to adjust or expand the scope of the third technical audit as necessary.
PHASE I

AUDIT PLAN FOR PERFORMANCE OF THE THIRD IN-DEPTH TECHNICAL AUDIT OF NINE MILE POINT 2

PURPOSE

The purpose of the audit is to assess the adequacy of the design process by evaluating the design of the Reactor Core Isolation Cooling System (ICS), associated structures, and interfaces.
Adequacy of design changes generated by the Site Engineering Group (SEG) and other specific activities will also be evaluated.
OVERALL SCOPE AND APPROACH

The ICS will form the basis of the audit.
The design will be reviewed to determine if the following attributes are met:

o The design is consistent with and supports the FSAR commitments, including system function, compliance with documents committed to in the FSAR, and compliance with correct design practices.
o The design is in compliance with NSSS requirements and criteria.
o Technically adequate calculations are available to support the design.
o Diagrams, specifications, and drawings are technically complete and consistent with each other.
o Inter- and intra-discipline interfaces are adequate.
The adequacy of design changes generated by the SEG will also be evaluated.
Emphasis will be placed on changes to the ICS. However, changes to other systems will be evaluated as necessary to obtain a reasonable sample size.
In order to evaluate certain activities, it is anticipated that review of material other than that related to the ICS will be necessary.
These activities include:
o Structural load tracking
o As-built/stress reconciliation
o Environmental Qualification
o Cable tray and conduit supports
o Electrical Separation

The audit team will perform site walkdowns for the purpose of facilitating and expediting the system review and for the detailed review of specific programs.
AUDIT TEAM

The team will function under the direction of the SWEC Engineering Assurance Division (EA), Boston.
The team will be composed of Niagara Mohawk Power Corporation and SWEC personnel.
The SWEC team members will be off-project experienced technical personnel.
PREPARATION

General:

o Review applicable FSAR sections, General Electric (NSSS) documents, and related project procedures and technical criteria to become familiar with system function, design basis requirements, and project-specific considerations. Identify and assemble key documents necessary for the audit (e.g., FSKs, Design Criteria).
o Determine the status of design documents (for example, determine which pipe stress analysis problems have been stress reconciled).
o Discuss and establish means to evaluate interface between disciplines.
o Refine the scope to indicate the specific interfacing systems, structures, and components to be evaluated.
o Refine scope to account for preliminary results of the evaluation of the first two technical audits and the results of the NRC CAT Inspection that relate to the design process.
o Modify task sheets (see below) as necessary to address refined scope and any additional requests by team members or the QA audit team.
o Develop a set of marked-up documents (drawings, diagrams), a listing of key components and equipment, and any other requested material in order to provide appropriate information among team members and to the QA audit team. (The specific material required will be identified on task assignment sheets. Task sheets will be attachments to the Review Plan. The audit team leader will coordinate with the audit team in establishing needs.)
o Using the Review Plans provided by Engineering Assurance as guidance, each team member is to develop a Review Plan specific to the discipline and areas to be audited.
The Review Plan is to reflect the scope of the audit and the means for evaluating discipline interface and identify the detailed attributes that are to be pursued during the audit.
PERFORMANCE

The team members must annotate the review plans to specifically and completely identify the documents reviewed (including issue and revision identification) and to document, in detail, the results of the review for each attribute.
During the audit, and prior to start of the QA audit on-site effort, identify key requirements in specifications and drawings that should be addressed by the QA audit team and provide this information to the QA audit team.
During the performance of the audit, the Audit Team is to inform the Project of potential concerns or requests to provide needed information using an "ACTION ITEM" form.
The Project engineering staff will be expected to respond promptly to each Action Item, providing the information requested or a response to the potential concern.
A status log is to be maintained to track and account for all Action Items.
(See Attachment 2 for guidance in generating Action Items.) An attempt will be made to "bound" all valid concerns/deficiencies during the audit by the use of Action Items. In order to be bounded, the full extent of the concerns/deficiencies must be determined, corrective action taken, and preventive action, where appropriate, implemented. The team member must concur that the extent has been determined and verify that appropriate corrective action has been taken and preventive action implemented.
Concerns that relate to construction and QA will be communicated to the QA audit team for their investigation, reporting, and follow-up, as appropriate.
To facilitate the review of the system and evaluate activities, a site walkdown will be included.
(See Attachment 4 for the site walkdown description.) Periodic status meetings are to be held by the Team Leader and the SWEC Project Engineer.
The purpose of these meetings is to discuss the progress of the audit and the status of any open Action Items.
REPORTING

At the conclusion of the audit, and prior to issuance of the report, a meeting will be held by the Team Leader with the SWEC Project Engineer and appropriate SWEC and Niagara Mohawk management personnel to discuss the results of the audit.
The audit report will generally follow the outline below:

1. INTRODUCTION
2. PURPOSE
3. SCOPE
4. SUMMARY OF RESULTS AND OVERALL CONCLUSIONS
5. AUDIT OBSERVATIONS
6. SUMMARY BY DISCIPLINE
   6.1 Control Systems
   6.2 Electrical
   6.3 Engineering Mechanics
   6.4 Power
   6.5 Structural

Each team member is specifically responsible for preparing any needed Audit Observations and preparing the summary section for the discipline audited (Sections 6.1 through 6.5). See Attachment 3 for guidance in generating Audit Observations. The report will be reviewed by the Audit Team Leader and approved by the Chief Engineer, Engineering Assurance Division.
PHASE II

PLAN FOR THE EVALUATION OF IN-DEPTH TECHNICAL AUDIT RESULTS FOR NINE MILE POINT 2 PROJECT

PURPOSE

The purpose of this plan is to describe the method to be used to evaluate the combined results of the Nine Mile Point 2 In-depth Technical Audits in order to form a conclusion as to the adequacy of the design process as implemented on Nine Mile Point 2.
BACKGROUND

In order to put the evaluation plan for analyzing the Engineering Assurance Technical Audit results on the Nine Mile Point 2 Project in perspective, it is important to review some background information on how each Engineering Assurance Technical Audit is pursued and culminated.
The purpose of the in-depth technical audit is to evaluate the technical adequacy of engineering and design documents and to evaluate their degree of compliance with the FSAR, applicable codes, standards, and other licensing commitments. Findings from the individual technical audits are evaluated for the determination of root cause, extent of conditions, and corrective and preventive actions as part of the audit follow-up. SWEC Engineering Assurance verifies that these actions are appropriate and have been completed during the individual audit follow-up.
EVALUATION PLAN

The findings from each of the technical audits and the findings related to the design process from the NRC CAT Inspection and the SWEC New York Office Review will be summarized and grouped to determine the overall significance and impact, when viewed as a composite, that these findings have on the adequacy and implementation of the design process. The findings will first be categorized by type. These categories will be selected as representing specific activities or functions of the design process and will provide a framework for judging the adequacy of the design process and its implementation. Listed below are examples of finding types which will form the basis for the categorization. These finding types were established based on the results of SWEC in-depth technical audits and NRC IDI Inspections.
1. Design Process Implementation Deficiencies
2. Design Process or Method Inadequate
3. Inadequate Interface Control
4. SAR Related Deficiency
5. Design Change Deficiency
6. Document Control Deficiency
7. Test Requirement or Implementation Deficiency
8. Construction/Site QC Deficiency
9. Vendor or Site Contractor Deficiency
10. NSSS Deficiency

To facilitate the evaluation, the data base for these findings will also include, as applicable, the following:
o Responsible Discipline
o Document Type
o Cause
o Extent of Conditions
o Corrective Actions
o Preventive Actions

After the findings have been categorized and grouped by discipline, they will be reviewed and screened to determine if any findings can be eliminated from further consideration because they are minor, editorial, or administrative in nature and do not provide evidence of inadequacies in the design process or represent generic implementation concerns, and therefore do not warrant additional analysis. The rationale for eliminating findings from further consideration will be documented.
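The categorization and screening pass described above amounts to grouping findings by type and discipline and filtering out those that show no design-process significance, while recording the rationale for each elimination. The following is a minimal illustrative sketch only; the field names (`id`, `type`, `discipline`, `minor`, `generic`) and the screening rule are assumptions for illustration, and only the category names come from the plan's list.

```python
from collections import defaultdict

def screen_findings(findings):
    """Group findings by (finding type, discipline); screen out findings that
    are minor, editorial, or administrative and show no evidence of a design
    process inadequacy or generic concern, recording the rationale."""
    retained = defaultdict(list)
    eliminated = []
    for f in findings:
        if f["minor"] and not f["generic"]:
            # Eliminated findings are documented with a rationale, per the plan.
            eliminated.append((f["id"], "minor/editorial; no design process inadequacy"))
        else:
            retained[(f["type"], f["discipline"])].append(f)
    return retained, eliminated

# Hypothetical findings using categories from the plan's list.
findings = [
    {"id": 1, "type": "Design Change Deficiency", "discipline": "Electrical",
     "minor": False, "generic": False},
    {"id": 2, "type": "Document Control Deficiency", "discipline": "Structural",
     "minor": True, "generic": False},
    {"id": 3, "type": "Inadequate Interface Control", "discipline": "Structural",
     "minor": True, "generic": True},
]
retained, eliminated = screen_findings(findings)
```

Here finding 2 is screened out as minor and isolated, while findings 1 and 3 remain grouped for the design-process evaluation.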
The remaining findings will be reviewed to evaluate the adequacy and implementation of the overall design process.
Particular emphasis will be placed on the adequacy of the design to permit safe operation and shutdown of the facility.
This review will be accomplished by evaluating the findings within finding types (activity or function of the design process) and by evaluating the similarity of findings and the extent of corrective or preventive action as the means of determining either that further action (e.g., audit, design review, etc.) is necessary or that sufficient basis exists for establishing confidence in the adequacy of the design process and its implementation.
A summary report will be issued which will present the conclusions reached during the analysis of the combined audit results.
The report will address whether sufficient evidence exists from the technical audits to give additional confidence that the Nine Mile Point 2 facility as designed is in compliance with the FSAR commitments and NRC requirements and regulations.
This report will also present recommendations for areas that require additional actions to confirm the adequacy of the design, if the results so dictate.
The evaluation will be performed and a draft report prepared by SWEC, Engineering Assurance Division.
Niagara Mohawk Power Corporation will review and approve the evaluation report and submit the final report to the NRC.
TENTATIVE SCHEDULE OF PHASE I AND PHASE II ACTIVITIES

Attachment 5 lists the summary of activities for both the upcoming Engineering Assurance Technical Audit and the Evaluation of Technical Audit Results. This summary of activities also includes the anticipated NRC involvement in the process.
Page 1 of 1

ATTACHMENT 1
AUDIT TEAM

(Columns: DISCIPLINE, NAME AND ORGANIZATION, LOCATION, TITLE)

DISCIPLINE: Electrical; Power; Control Systems; Structural; Materials Engineering; Engineering Assurance
TITLE: Audit Team Leader; Audit Coordinator; QAAD Interface/Coordinator
Page 1 of 1

ATTACHMENT 2
GENERATION OF ACTION ITEMS

An Action Item can be generated to identify deficiencies or to request information. It is difficult to define precise criteria to apply in determining if an Action Item should be generated. Three considerations are: the significance of individual discrepancies, the number of discrepancies, and the urgency of information needed by the evaluation team member.
An Action Item is to be written when one or more of the following needs exist:

1. Need to identify a technical concern.
2. Need to identify a potential technical concern when there is no information readily available to substantiate or alleviate the concern.
3. Need to identify a significant program aspect or practice that is, or appears to be, incorrect or inadequate.
4. When it is deemed necessary for the project to investigate to determine the cause and extent of discrepancies.
5. When it is deemed appropriate to evaluate the Project's proposed actions to correct discrepancies and prevent recurrence.

It is generally not necessary to generate an Action Item if a minor discrepancy is observed and the discrepancy appears to be isolated or random. Several minor discrepancies, however, would generate an Action Item.
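The guidance above can be restated as a simple decision rule. This is an illustrative sketch only, not part of the plan; the function name, the `significant` field, and the "more than one minor discrepancy" threshold are assumptions made to mirror the wording of the criteria.

```python
def action_item_needed(discrepancies, info_needed=False):
    """Sketch of Attachment 2's guidance: an Action Item is warranted for any
    significant discrepancy, for several minor ones, or when needed
    information is not readily available. A single minor, apparently isolated
    discrepancy does not require one (though it must still be recorded in the
    Review Plan)."""
    if info_needed:
        return True
    significant = [d for d in discrepancies if d["significant"]]
    minor = [d for d in discrepancies if not d["significant"]]
    return bool(significant) or len(minor) > 1

# One minor, isolated discrepancy: no Action Item.
print(action_item_needed([{"significant": False}]))      # False
# Several minor discrepancies: Action Item warranted.
print(action_item_needed([{"significant": False}] * 3))  # True
```

As the Notes below make clear, the Audit Team Leader's judgment, not any mechanical rule, makes the final call.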
NOTES

1. Review Plans must indicate all discrepancies observed, regardless of significance or number, and even if an Action Item was not generated. The Audit Team Leader will make the final decision on when an Action Item is written. His decision will be based on the above written guidance, as well as objectivity and fairness to the issue in question at that time.
2. Generation of, and obtaining a response to, an Action Item does not necessarily negate the need for an Audit Observation.
ATTACHMENT 3
AN APPROACH TO DRAFTING AN AUDIT OBSERVATION

INTRODUCTION

The main purpose of the audit program is to resolve "systematic" or "generic" problems (i.e., obtain adequate preventive action). This requires audit reports, audit observations, etc., to be written in a manner such that overall assessments are presented and problems and their root causes can be addressed by appropriate management. In order to maintain credibility and impact, AOs must be valid and demonstrate good judgement. It is difficult to define precise criteria to apply in determining if an AO is necessary or warranted. However, two main considerations are the significance of individual deficiencies and the number of deficiencies.
General Examples:

1. If a minor deficiency is observed in a document and was not observed in other documents of that type, an AO is probably not warranted. (The deficiency could be corrected during the audit or marked for future correction at the next revision.)
2. If a large number of minor deficiencies are observed in several documents, an AO is probably warranted.
3. A single deficiency of relative significance, if observed in only one document, may warrant an AO, even if apparently isolated, in order to assure the deficiency is corrected. (Action to prevent recurrence may not be necessary, however, if the deficiency is of an isolated nature.)
Specific Examples:

1. Logic Diagrams and Logic Descriptions are audited. They are found to be clear, complete, consistent with FSKs and ESKs, and technically adequate. Some of the Logic Descriptions contain a few minor typos. Should an AO be written? Probably not.
2. Several Power calculations are audited. The calculations are clear and complete, appropriate methods are used, and they are technically adequate. In one calculation, an input value was incorrect, apparently due to a transposition error; the results would not be affected. Another calculation was not marked with the QA Category (but was independently reviewed). Should an AO be written? Probably not.
3. Structural calculations are audited. The calculations are found to be adequate except that in one calculation an input value is incorrect. The results are not affected. The reason for the incorrect value appears to be the failure of another discipline to provide revised information. Time did not permit further investigation. Should an AO be written? Probably.
Page 2 of 3

NOTE: Review Plans must indicate all deficiencies observed, regardless of significance or number. For any deficiency not included in an AO, it must be evident why an AO was not written (e.g., minor, isolated, or corrected during the audit).
If we decide that an AO is probably warranted, we now prepare it.
AUDIT OBSERVATION PREPARATION

An Audit Observation is usually presented in two basic parts: the "Description of Condition(s)" and the "Details". In nearly all cases, it is the "Description of Condition(s)" we want addressed by audited organizations in their response to the audit observation. Therefore, audit results must be evaluated, logically grouped, re-evaluated, and a conclusion or summary presented.
The details or supporting evidence then follows.
Preparation of an audit observation is more of a thought process than a mechanical exercise. The following is an attempt to describe that process.
1. List all the deficiencies.
2. Determine if there is a commonality among some or all of the items listed. Can the items be logically grouped or categorized? Possible groupings and categories:
   o By element (procedures, control, review or approval, documentation, design consistency, technical adequacy).
   o "Probable Cause". For example: lack of thorough review, misunderstanding of requirements, etc.
   o Consequence. For example: various distribution problems could result in personnel working with out-of-date information.
   o Other.
3. Prepare a rough draft AO (handwritten) using the attached outline.
4. Read the draft as objectively as possible. Is it logical? Can an overall conclusion be reached? Should this conclusion be stated in the Description of Condition(s)? Is the English, spelling, etc., correct?
AUDIT OBSERVATION OUTLINE

I. Description of Condition(s)

"Categories" need not necessarily be presented in the order shown below. In fact, it would be unusual for an AO to contain all categories.

A. Describe the basic failure of the system or activity, if applicable, or describe the overall conclusion (e.g., "the E&DCR system does not provide complete control of design changes").
Page 3 of 3

B. Summarize the deficient elements (or sub-elements). Since most people won't be familiar with element definitions, include a brief definition or examples, e.g., "calculations are incompletely documented (methods and sources of input not identified, ... etc.)".
C. When there is strong supporting evidence, state what the observed deficiencies indicate; that is, what is the "probable cause". Sometimes the cause is implied and need not be stated. Example: "the improper application of the analysis method indicates a lack of guidance to the preparer ...".
D. Indicate the consequences of the deficiencies. (As stated above, this may be implied or obvious and need not necessarily be stated. Improper application of a method could, obviously, affect technical adequacy.) Example: "Failure to distribute results of revised calculations could lead to ..."

E. The auditor may (in some cases) provide guidance on the boundaries for determination of the extent of conditions. If any audit findings are recurrences of earlier findings on the activity being audited, this fact should be emphasized in the AO.
II. Details (Supporting Evidence)

A. Details should be grouped and sequenced to be consistent with the Summary where practicable.

B. Some type of quantitative comparison should be provided where appropriate (e.g., "fifteen of the twenty selected from the list were not included in ...").
C. Provide detail, explanation, background, etc. Don't force people to "read between the lines". Take care to provide information, not just more words.

Avoid terms such as:
   in accordance with procedures
   as required by ...
   inadequate
   generally satisfactory

Avoid including nits. Avoid long, complicated sentences.
ATTACHMENT 4
GUIDANCE FOR SITE WALKDOWNS
Page 1 of 2

There are two basic site walkdowns involved in audits:

1. Detail walkdowns and investigations dealing with audits of programs such as environmental qualification and seismic qualification. Such walkdowns are performed to Review Plans that contain attributes that specifically require field checks of installed equipment and hardware. (Attributes such as determination of the identification, location, and orientation of specific, pre-selected items of equipment.)
2. Walkdowns associated with vertical design reviews for the purpose of facilitating and expediting the review.

The walkdown associated with a design review provides for:

o A familiarity review of the overall arrangement, location, and configuration of the design.
o Evaluating specific items that arose as a result of reviewing engineering documents.
o Evaluating specific design attributes that are easier to evaluate by seeing the installed hardware or equipment than by document review only.
o Evaluating the adequacy and clarity of engineering documents as evidenced by the implementation of the basic design criteria and technical requirements in the as-constructed condition.
A general familiarity tour can be conducted early in the audit. However, a walkdown to evaluate specific items and attributes should be conducted only after design documents have been reviewed in sufficient detail to prepare for the walkdown. Prior to performing the walkdown, an outline will be developed. Each discipline is to provide input to the outline by preparing a scoping document (approximately 1 to 2 pages) to identify the key items and attributes from Review Plans that are to be evaluated. (The intent of the outline is to ensure team members are adequately prepared for the walkdown, but it does not restrict the walkdown to only the areas identified on the outline. In addition to the walkdown, in-plant visits may be necessary to follow up on potential concerns or questions.)
The types of items or attributes that could be included in the outline are:

o Electrical Separation
o Separation of Redundant Equipment
o Maintenance access and ALARA considerations
o Sloping of Lines

Page 2 of 2

o Pipe Support Spacing and Function
o Pipe Restraint Locations

The results of the walkdown are to be documented in the Review Plans utilized for the design review. The Audit Team Leader will coordinate walkdowns with the Project to establish dates and times and to ensure appropriate personnel availability.
Page 1 of 1

ATTACHMENT 5
TENTATIVE SCHEDULE FOR PHASE I AND PHASE II

TENTATIVE SCHEDULE OF ACTIVITIES

ACTIVITY:
o NRC Meeting to Review Audit Plan & Proposed Plan for the Evaluation of Technical Audit Results (1/28/85, Bethesda)
o Initiation of Data Summarization for the First Two Technical Audits
o Select Audit Team Members
o Initial Orientation of Audit Team Members
o Status Meeting with Audit Team Members
o Adjustments or additions to the audit plan as necessary, based on preliminary reviews of the results of the first two technical audits and review of the results of the NRC CAT Inspection and the New York Office review
o Status Meeting with Audit Team Members (submit review plans for approval to the Audit Team Leader)
o NRC Review of Detailed Scope and Approach (Review Plans)
o Pre-audit Meeting (auditors and auditees)
o NRC Implementation Review
o Audit Start
o Audit Completion
o Summary Meeting
o Post Audit Conference
o Audit Report Issue
o NRC Review of Audit Results
o Submittal of Final Evaluation Report of the Three Technical Audits to NRC by NMPC

DATE: 3/1/85; 3/15/85-3/22/85; 4/1/85; 4/10/85; 4/22/85; 4/22/85; 4/25/85; 4/29/85; 5/20/85; 5/23/85; 4/29/85; 5/31/85; 5/31/85; 6/27/85; 7/19/85; 8/19/85; 8/23/85; 10/7/85

LOCATION: Boston; Boston; Boston; Boston; Boston; Boston; Boston; CHOC; CHOC; CHOC; CHOC; CHOC; CHOC