ML20236L786

| Person / Time | |
|---|---|
| Site: | Comanche Peak |
| Issue date: | 10/31/1987 |
| From: | Levin H., Turi P., Texas Utilities Electric Co. (TU Electric) |
| To: | |
| References: | CON-#487-4775 OL, NUDOCS 8711110064 |
| Download: | ML20236L786 (19) |
Text
Filed: October 31, 1987
(Docketed, USNRC, '87 NOV -6 P4:04, Office of the Secretary, Docketing & Service Branch)

UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION

Before the ATOMIC SAFETY AND LICENSING BOARD
In the Matter of                        )
                                        )   Docket Nos. 50-445-OL
TEXAS UTILITIES GENERATING              )               50-446-OL
COMPANY, et al.                         )
                                        )   (Application for an
(Comanche Peak Steam Electric           )    Operating License)
Station, Units 1 and 2)                 )
ANSWERS TO BOARD'S 14 QUESTIONS
(Memorandum: Proposed Memorandum and Order of April 14, 1986)
Regarding Action Plan Results Report VI.a

In accordance with the Board's Memorandum: Proposed Memorandum and Order of April 14, 1986, the Applicants submit the answers of the Comanche Peak Response Team ("CPRT") to the 14 questions posed by the Board with respect to the Results Report published by the CPRT for CPRT Action Plan VI.a, "Insulation/Shield Wall Gap."
Opening Request:
Produce copies of any CPRT-generated checklists that were used during the conduct of the action plan.
Response
Two checklists were developed and utilized during the implementation of ISAP VI.a. (See Attachment 1, Figures III.1 and III.2.)
Figure III.3 was used only as an aid in filling out the checklists. The checklists facilitated sample selection and review for the investigation of non-nuclear safety design changes.
Question No. 1:

1. Describe the problem areas addressed in the report. Prior to undertaking to address those areas through sampling, what did Applicants do to define the problem areas further? How did it believe the problems arose? What did it discover about the QA/QC documentation for those areas? How extensive did it believe the problems were?
Response
Three areas were addressed by the Action Plan Results Report. The first concerned the adequacy of cooling flow in the reactor cavity. This problem was identified by the Project during the Hot Functional Testing (HFT) Program, was documented on a Test Deficiency Report, and later was reported in accordance with 10 CFR 50.55(e) requirements. No sampling was undertaken by the Project or third party relative to this area. The problem was caused by the location of a steel support ring outside the boundary of the Reactor Pressure Vessel Reflective Insulation (RPVRI). This ring obstructed the cooling flow required between the RPVRI and the biological shield wall, thereby causing higher than allowable temperature within the reactor cavity area. The support ring and the RPVRI, which is considered a non-safety-related item, were furnished together. As discussed in the Results Report, the cause of the problem was a breakdown in communications between Westinghouse and Gibbs & Hill during development of the original design. Thus, QA/QC documentation was not explicitly at issue. The Project's corrective measures and the corresponding third-party review applied to both Units 1 and 2.
Secondly, the Action Plan Results Report addressed debris in critical spaces. No sampling techniques were applied in implementing the Action Plan tasks related to this area. Instead, the Action Plan sought to identify all types of critical spaces in the plant and to verify that they were free of debris, but inspections disclosed debris in many of the critical spaces. Investigation of the cause revealed that housekeeping and surveillance procedures did not include measures that would address cleanliness requirements for critical spaces. This, in turn, was attributed to the lack of cleanliness requirements in design specifications for equipment with associated critical spaces.
Thirdly, the Action Plan Results Report addressed the Non-Nuclear Safety (NNS) design change process, which utilized the checklists previously identified. The TRT had requested TU Electric to review procedures for the approval of design changes to non-nuclear safety-related equipment and to revise them as necessary to assure that such design changes do not adversely affect safety-related systems. Action Plan VI.a employed an investigatory sampling approach to test the process (procedures) for adequacy in this area. The third-party investigation concluded that the process for reviewing non-nuclear safety design changes was effective and that it is not considered a problem area.
Question No. 2:

2. Provide any procedures or other internal documents that are necessary to understand how the checklists should be interpreted or applied.
Response

Attachment 1 is the sampling plan developed to implement the ISAP investigatory sampling effort relative to non-nuclear safety design changes. The investigation focused on the population of Design Change Authorizations (DCAs) and Component Modification Cards (CMCs). For the reasons discussed in Section 5.4.1 of the Results Report, use of this population was considered to be the most effective test of implementation of the NNS design change process.

Question No. 3:
3. Explain any deviation of checklists from the inspection report documents initially used in inspecting the same attributes.
Response
The nearest equivalent to an inspection report for this ISAP is the Field Design Change - Change Verification Checklist (CVC) used by the Project to record Engineering, Design, and Quality Assurance review of a DCA or CMC change package. The Interdisciplinary Support Review section of the CVC delineates disciplines that are required to perform reviews, whereas the ISAP checklist is directed specifically at determining whether the Project adequately addressed Q/non-Q interactions. Therefore, the ISAP form is a more focused checklist used to assess the adequacy of a single item on the Project CVC form.
Question No. 4:

4. Explain the extent to which the checklists contain fewer attributes than are required for conformance to codes to which Applicants are committed to conform.
Response
The attributes of the two checklists do not correspond to code-related items to which the Project is committed to conform.
Question No. 5:

5. (Answer Question 5 only if the answer to Question 4 is that the checklists do contain fewer attributes.) Explain the engineering basis, if any, for believing that the safety margin for components (and the plant) has not been degraded by using checklists that contain fewer attributes than are required for conformance to codes.
Response
This question is not applicable by reason of the response to Question 4.
Question No. 6:

6. Set forth any changes in checklists while they were in use, including the dates of the changes.
Response
No changes were made to the two checklists during the investigation.
Question No. 7:

7. Set forth the duration of training in the use of checklists and a summary of the content of that training, including field training or other practical training. If the training has changed or retraining occurred, explain the reason for the changes or retraining and set forth changes in duration or content.
Response

No formal training was required for this review. Personnel involved in completing the checklists had sufficient qualifications, having been selected on the basis of their educational and experience backgrounds. These reviewers read the sampling plan containing the checklists that were used and were trained in the appropriate Design Adequacy Program (DAP) procedures. In the judgment of the Review Team Leader, no additional training was necessary.
Question No. 8:
8. Provide any information in Applicants' possession concerning the accuracy of use of the checklists (or the inter-observer reliability in using the checklists). Were there any time periods in which checklists were used with questionable training or QA/QC supervision? If applicable, are problems of inter-observer reliability addressed statistically?
Response
The results were reviewed by the Issue Coordinator and the CPRT Statistical Advisor. Training and supervision were therefore not in question. Inter-observer reliability was not applicable.
Question No. 9:

9. Summarize all audits or supervisory reviews (including reviews by employees or consultants) of training or of use of the checklists. Provide the factual basis for believing that the audit and review activity was adequate and that each concern of the audit and review teams has been resolved in a way that is consistent with the validity of conclusions.
Response

As previously stated, the Issue Coordinator reviewed the results of the investigations in accordance with the checklists, and the CPRT Statistical Advisor reviewed the sampling applications and associated working files. In addition, the overall implementation of the ISAP was audited.

The statistical review and overall audit were conducted in accordance with established procedures and guidance provided by the SRT in the Program Plan.

No audit findings or observations were issued concerning training or the use of checklists. The Statistical Advisor identified a small number of errors made during the sampling effort. These were corrected, and a determination was made that they had no impact on the sampling process or the results of the investigation. A more detailed discussion of this subject is presented in Section 5.4.1 of the Results Report.
Question No. 10:

10. Report any instances in which draft reports were modified in an important substantive way as the result of management action. Be sure to explain any change that was objected to (including by an employee, supervisor, or consultant) in writing or in a meeting in which at least one supervisory or management official or NRC employee was present. Explain what the earlier drafts said and why they were modified. Explain how dissenting views were resolved.
Response
No substantive modifications were made to the Results Report as a result of management action.
Question No. 11:

11. Set forth any unexpected difficulties that were encountered in completing the work of each task force and that would be helpful to the Board in understanding the process by which conclusions were reached. How were each of these unexpected difficulties resolved?
Response
No unexpected difficulties were encountered during implementation of the Action Plan.
Question No. 12:

12. Explain any ambiguities or open items in the Results Report.
Response
No ambiguities or open items are left in the Results Report. Ongoing activities by the Project are defined in Section 7.0 of the Results Report.
Question No. 13:

13. Explain the extent to which there are actual or apparent conflicts of interest, including whether a worker or supervisor was reviewing or evaluating his own work or supervising any aspect of the review or evaluation of his own work or the work of those he previously supervised.
Response
Activities not performed entirely by third-party personnel were closely monitored by third-party personnel to preclude potential bias resulting from possible conflicts of interest.
Question No. 14:

14. Examine the report to see that it adequately discloses the thinking and analysis used. If the language is ambiguous or the discussion gives rise to obvious questions, resolve the ambiguities and anticipate and resolve the questions.
Response

The Issue Coordinator and others who aided in the preparation and approval of the Results Report have reviewed and checked it for clarity and believe that no ambiguities exist.
Respectfully submitted,

Peter L. Turi
Action Plan VI.a Issue Coordinator

H. Levin
Review Team Leader

The CPRT Senior Review Team has reviewed the foregoing responses and concurs in them.
ATTACHMENT 1

TRT Issue VI.a
Page 1 of 8
06/25/85
Revision 0

ISSUE VI.a SAMPLING PLAN

I. PURPOSE

As called for by the TRT Issue VI.a Action Plan, an investigation into the potential for non-nuclear safety (NNS) design changes to give rise to significant interactions between Q and non-Q items is outlined. The NNS design change process will be tested by means of an investigative sampling program described in the following sections. Findings will be evaluated for their safety significance on an individual basis, as well as for any generic implications. Corrective measures will be established accordingly.
II. APPROACH

An investigative sampling approach will be employed to test the NNS design change process for adequately addressing potential adverse interactions between Q and non-Q items. The population to be sampled from is the collective set of all NNS design changes for CPSES. From this population, design change packages will be randomly selected. Each package will be examined as part of this selection process for conformance to the following screening criteria:

1. The design change package must involve some physical alteration of a non-nuclear safety item occurring in the field, as opposed to a change to design documentation only with no impact to the field configuration;

2. the design change package must involve an alteration within a Category I structure (see Table II.1 for a list of Category I structures);

3. the design change package must be reflective of the current plant configuration; and

4. the design change package must have the potential for interactions other than purely seismic alone.

The first two screening criteria will focus this investigation on the population of change packages where physical Q/non-Q interactions arise. The third criterion is used to assure the existence of the subject change, i.e., that the change has not been altered or eliminated by a subsequent revision. The final requirement will focus the investigation on the types of interactions that are not already being addressed as part of Issue II.d ("Seismic Design of Control Room Ceiling Elements" - Damage Study Verification). Flow obstructions, thermal expansions, mechanical interferences, etc., are examples of the non-seismic types of interactions of interest in the investigation.
The screening process will randomly identify 60 packages meeting the above requirements. These packages will then be "injected" into the existing CPSES design review process. As the Project completes its normal review cycle, the third party will review the design change package against the Project's review. Such an investigation will not only assess the extent of existing significant Q/non-Q interactions, but it will also assess the effectiveness of the current process by which the Project identifies these interactions.
TABLE II.1

Seismic Category I Structures
1. Containment Buildings, including internal structures
2. Safeguards Buildings, including diesel generator room and emergency switchgear room
3. Auxiliary Building
4. Electrical and Control Building
5. Fuel Building
6. Service Water Intake Structure
7. Safe Shutdown Impoundment Dam
8. Refueling Water Storage Tanks and Associated Piping Tunnels
9. Reactor Makeup Water Storage Tanks and Associated Piping Tunnels
10. Condensate Storage Tanks and Associated Piping Tunnels

Based on FSAR 3.2.1.1.1
III. SAMPLING PROCEDURE
The following defines the process and responsibilities for conducting the NNS design change investigation. This establishes the general order of activities to be performed to satisfy the Action Plan requirements; however, some tasks may be performed concurrently.
Step  Resp.    Activity

1     Project  Identify all NNS design changes.

2     Project  Number all identified NNS design change packages.

3     Project  Assign random numbers to NNS design change packages.

4     Project  Begin screening process utilizing the screening criteria. Reject packages that: do not involve field changes to NNS items; do not apply to Category I structures; do not represent existing plant configuration; or appear to involve only potential seismic interactions. A Selection Checklist (Figure III.1) should be completed for each package reviewed. Continue the random selection process until 60 NNS design change packages meeting the screening criteria are identified.

5     CPRT     Overview random selection process.

6     Project  "Inject" design change packages into CPSES design review process for re-review.

7     CPRT     Review results of the Project's design review process. Document reviews on the Review Checklist (Figure III.2). The Potential Interaction Checklist (Figure III.3) may be used as an aid.

8     CPRT     Evaluate outcome of sampling. Should results prove to be inconclusive, expand sample accordingly.

9     Project  Perform any corrective actions identified as a result of the investigation.

10    CPRT     Review implementation of corrective actions and summarize investigation findings.
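Steps 3 and 4 amount to a screened random draw: order the population randomly, test each package against all four screening criteria, and stop after 60 acceptances. A minimal sketch of that loop in Python follows; the package fields and criteria functions are hypothetical illustrations of the four Section II criteria, not names taken from the sampling plan.

```python
import random

def select_packages(population, criteria, target=60, seed=0):
    """Randomly order the population (Step 3), then screen each package
    against every criterion (Step 4) until `target` acceptances are found."""
    rng = random.Random(seed)
    order = rng.sample(range(len(population)), len(population))  # random numbering
    selected = []
    for idx in order:
        pkg = population[idx]
        if all(check(pkg) for check in criteria):  # package passes every screen
            selected.append(pkg)
            if len(selected) == target:
                break
    return selected

# Hypothetical screening criteria mirroring the four in Section II.
criteria = [
    lambda p: p["alters_field_config"],      # 1: physical field alteration
    lambda p: p["in_category_1_structure"],  # 2: within a Category I structure
    lambda p: p["reflects_current_config"],  # 3: change still in current configuration
    lambda p: p["non_seismic_potential"],    # 4: potential non-seismic interaction
]
```

If fewer than 60 packages survive the screens, the loop simply returns all that passed, which is where Step 8's provision for expanding the sample would come into play.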
FIGURE III.1
Selection Checklist

1.0 Change Package Sample Number (random number):
2.0 Change Package Identification Number (include rev. number):
3.0 Does change package involve alteration of field configuration? (Check one)
    3.1 No (Go to Step 7.0)
    3.2 Yes
4.0 Does change package involve an alteration made in a Category I structure? (See Table II.1 for list of Category I structures)
    4.1 No (Go to Step 7.0)
    4.2 Yes
5.0 Is change package reflective of current plant configuration?
    5.1 No (Go to Step 7.0)
    5.2 Yes
6.0 Does the change create the potential for non-seismic interactions?
    6.1 No
    6.2 Yes
    6.3 If "Yes," list potential interactions. (Go to Step 8.0)
7.0 Reject change package.
8.0 Sign and date checklist below.
Review Performed By:
Date:
FIGURE III.2
Review Checklist

1.0 Change Package Sample Number (random number):
2.0 Change Package Identification Number (include rev. number):
3.0 Conduct an evaluation, noting all potential Q/non-Q interactions (note potentially significant seismic interactions and bring them to the attention of the Issue II.d Issue Coordinator). Refer to Figure III.3, Potential Interactions Checklist, for guidance.
4.0 Does Project design change review package adequately address Q/non-Q interactions?
    4.1 Yes
    4.2 No
    4.3 If "No," attach evaluations for each potentially significant interaction identified.
5.0 Are identified interactions significant?
    5.1 Yes (Return change package to Project)
    5.2 No
    5.3 N/A (No interactions identified)
Reviewed By:
Date:
FIGURE III.3
Potential Interactions Checklist

1.0 Could the change adversely impact inputs or assumptions used in the design of a Q component/system/structure?
2.0 Could the change adversely impact the functional or operational characteristics of a Q component or system?
3.0 Could the change adversely impact the operating or limiting qualification environment for Q components (e.g., pressure, temperature, humidity, radiation)?
4.0 Could the change adversely impact the evaluation of damage to Q components/systems/structures due to: pipe whip? jet impingement? spray? room pressurization? flooding?
5.0 Could the change represent a potential source of missile damage to a Q component/system/structure due to mechanical failure not related to seismic displacement/damage (e.g., failure of rotating equipment, valve stem ejection, etc.)?
6.0 Could this change adversely impact the function/response of instrument or control elements in a Q system (pressure drop, radio frequency output, etc.)?
7.0 Could this change adversely impact the Appendix R evaluation (e.g., fire loading, fire barrier integrity, fire detection or suppression system performance, non-Q elements used in the safe shutdown analysis)?
8.0 Could this change adversely impact radiation shielding?
9.0 Could this change create materials compatibility problems in a Q system or component?
10.0 Could this change adversely impact the performance, or results, of surveillance tests?
CERTIFICATE OF SERVICE

I, R. K. Gad III, hereby certify that on October 31, 1987, I made service of "Answers to Board's 14 Questions (Memorandum: Proposed Memorandum and Order of April 14, 1986) Regarding Action Plan Results Report-VI.a" by mailing copies thereof, postage prepaid, to:
Peter B. Bloch, Esquire
Chairman, Administrative Judge
Atomic Safety and Licensing Board
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Asst. Director for Inspection Programs
Comanche Peak Project Division
U.S. Nuclear Regulatory Commission
P. O. Box 1029
Granbury, Texas 76048

Dr. Walter H. Jordan
Administrative Judge
881 W. Outer Drive
Oak Ridge, Tennessee 37830

Ms. Billie Pirner Garde
GAP-Midwest Office
104 E. Wisconsin Ave.-B
Appleton, WI 54911-4897

Chairman
Atomic Safety and Licensing Appeal Panel
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Chairman
Atomic Safety and Licensing Board Panel
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Janice E. Moore
Office of the General Counsel
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Mrs. Juanita Ellis
President, CASE
1426 S. Polk Street
Dallas, Texas 75224

Renea Hicks, Esquire
Assistant Attorney General
Environmental Protection Division
P. O. Box 12548, Capitol Station
Austin, Texas 78711

Ellen Ginsburg, Esquire
Atomic Safety and Licensing Board Panel
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Anthony Roisman, Esquire
Suite 600
1401 New York Avenue, N.W.
Washington, D.C. 20005

Mr. Lanny A. Sinkin
Christic Institute
1324 North Capitol Street
Washington, D.C. 20002

Dr. Kenneth A. McCollom
Administrative Judge
1107 West Knapp
Stillwater, Oklahoma 74075

Mr. Robert D. Martin
Regional Administrator, Region IV
U.S. Nuclear Regulatory Commission
Suite 1000, 611 Ryan Plaza Drive
Arlington, Texas 76011

Elizabeth B. Johnson
Administrative Judge
Oak Ridge National Laboratory
P. O. Box X, Building 3500
Oak Ridge, Tennessee 37830

Geary S. Mizuno, Esquire
Office of the Executive Legal Director
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

Nancy H. Williams
2121 N. California Blvd., Suite 390
Walnut Creek, CA 94596

R. K. Gad III