ML20217D707

Forwards Final Assessment Rept of PDI Program. Outstanding Issues Will Be Discussed at 960307 Meeting. Resulting Changes to Be Incorporated Into Future Revs of Program. Encl
Person / Time
Site: Fermi 
Issue date: 03/06/1996
From: Strosnider J
NRC (Affiliation Not Assigned)
To: Sheffel B
DETROIT EDISON CO.
References
NUDOCS 9710060075


Text

March 6, 1996

Bruce J. Sheffel, Chairman, PDI

Detroit Edison Company

Fermi-2 Power Plant
6400 North Dixie Highway
Newport, Michigan 48166

SUBJECT: NRC ASSESSMENT OF THE PDI PROGRAM

Dear Mr. Sheffel:

In a letter dated August 11, 1995, the NRC sent you a copy of the staff's preliminary assessment of the PDI program to review for any inaccuracies in the program description and for information PDI considered as proprietary. In a letter dated October 27, 1995, PDI submitted a response that identified inaccuracies, proprietary information, and editorial corrections.

The changes in your response were incorporated into the report. The final assessment report is provided in this letter as an enclosure.

Outstanding issues identified in the final assessment report will be discussed in a meeting on March 7, 1996, between the NRC staff and PDI.

Any changes to the PDI program resulting from the March 7, 1996, meeting will be incorporated into future revisions of the PDI program.

Sincerely,

Original signed by:

Jack Strosnider, Chief
Materials and Chemical Engineering Branch
Division of Engineering

Enclosure: As stated

cc: Ted Sullivan


UNITED STATES
NUCLEAR REGULATORY COMMISSION
Washington, D.C. 20555

August 11, 1995

Bruce J. Sheffel, Chairman, PDI
Detroit Edison Company
Fermi 2 Power Plant
6400 North Dixie Highway
Newport, Michigan 48166

SUBJECT: NRC ASSESSMENT OF THE PDI PROGRAM

Dear Mr. Sheffel:

The NRC performed an assessment of the nuclear industry's efforts in implementing Appendix VIII, "Performance Demonstration for Ultrasonic Examination Systems," to Section XI of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code). The nuclear industry created the Performance Demonstration Initiative (PDI) to manage the performance demonstration requirements contained in Appendix VIII. PDI designated the Electric Power Research Institute (EPRI) Nondestructive Examination (NDE) Center to be the Performance Demonstration Administrator (PDA) to implement the PDI program. Subsequently, PDI was expanded to include demonstration of intergranular stress corrosion cracking (IGSCC) based on the "IGSCC Coordination Plan" among the NRC, EPRI, and the Boiling-Water Reactor Owners' Group. Individual utilities and vendors may satisfy Appendix VIII and the IGSCC Coordination Plan performance requirements by participating in the PDI program.

Qualification procedures are either generic (available for broad application) or specific (established for unique applications). The generic procedures PDI-UT-1, PDI-UT-2, and PDI-UT-3 are a major improvement over the procedures currently used in the industry.

The assessment team reviewed PDI and PDA activities associated with performance demonstrations of the UT procedures, equipment, and personnel used to detect and size flaws. The current PDI testing program includes stainless steel and ferritic steel pipe welds (Appendix VIII, Supplements 2, 3, and 12), intergranular stress corrosion cracking (IGSCC) of stainless steel piping, and automated UT of reactor pressure vessel welds (Supplements 4 and 6). PDI is also doing preparatory work for bolting (Supplement 8) and the manual UT of reactor pressure vessel welds (Supplements 4 and 6).

The team observed PDI meetings and had discussions with PDI and PDA staff at EPRI in September and December 1994.

The onsite assessment was conducted between January 23 and February 3, 1995, with an exit meeting held at the NDE Center on February 3, 1995.

Overall, it was found that the PDI staff has established and implemented a process for selected portions of Appendix VIII and has effectively qualified NDE technicians in accordance with Appendix VIII, with the exception of certain requirements that PDI found to be impractical. The specific requirements are the subject of ongoing evaluations by the appropriate ASME committees and may be the subject of code cases. It is important that these issues be resolved to support implementation of Appendix VIII. Situations not specifically addressed by code are addressed by the PDI steering committee and subcommittees. Certain open items and areas for improvement are discussed in the assessment report. Please review the report and advise me within 30 days of any inaccuracies in the description of the PDI program or information that should be proprietary.

Also, advise me of any actions you plan to take relative to the open items identified in the report.

If you have any questions, please contact Donald Naujock at (301) 415-2767.


Sincerely,

ORIGINAL SIGNED BY

Jack R. Strosnider, Chief
Materials and Chemical Engineering Branch
Division of Engineering
Office of Nuclear Reactor Regulation

Enclosure:

As stated

cc w/ Encl.:

Brian Sheron
Gus Lainas
M. Modes, Region I
Jerry Blake, Region II
John Jacobson, Region III
Dale Powers, Region IV

Distribution:
Central Files
EMCB/RF

DOCUMENT NAME: G:\NAUJOCK\PDILET.TER
*See previous concurrence

To receive a copy of this document, indicate in the box: C = Copy without attachment/enclosure, E = Copy with attachment/enclosure, N = No copy

OFFICE  DRS:MDE ML*  E  DE:EMCB*  E  DE:EMCB*  C  DE:EMCB*  C
NAME    EHGray:jb       DNaujock:dcn:adl     TSullivan       JRStrosnider
DATE    6/09/95         6/08/95              6/12/95         08/11/95

OFFICIAL RECORD COPY

Enclosure

REPORT NO: 99901288/95-01
TAC NO: M90846

SUBJECT: The Ultrasonic Testing (UT) Performance Demonstration Initiative (PDI) of the US Nuclear Industry as Currently Implemented by the Electric Power Research Institute (EPRI) Nondestructive Examination (NDE) Center as the Performance Demonstration Administrator (PDA)

NRC TEAM MEMBERS:

Full Time:
Harold Gray, Chief, NRC Mobile Nondestructive Examination Lab (ML)
Steve Doctor, Pacific Northwest Laboratories (PNL)
Nick Economos, Inspector, Materials Section, Region II
Don Naujock, NRC/NRR Materials and Chemical Engineering Branch

Part Time:
Edmund Sullivan, NRC/NRR Materials and Chemical Engineering Branch
Joe Muscara, NRC/RES Electrical, Materials, and Mechanical Engineering Branch
Jim Coley, Inspector, Materials Section, Region II
Richard Harris, NDE ML UT Technician
Pat Peterson, NDE ML UT Technician
Pat Hessler, Statistician, PNL

APPROVED:

Edmund J. Sullivan, Jr., Chief, Inservice Inspection Section    Date
Materials & Chemical Engineering Branch
Office of Nuclear Reactor Regulation
U.S. Nuclear Regulatory Commission

TABLE OF CONTENTS

EXECUTIVE SUMMARY

ASSESSMENT REPORT

1.0 INTRODUCTION AND PURPOSE
    PDI Policy
    Program Description
    Open Items

2.0 AREAS ASSESSED
    2.1 Review of PDI Generic UT Procedures
        2.1.1 Procedures PDI-UT-1 and PDI-UT-2
        2.1.2 Procedure PDI-UT-3
        2.1.3 Essential Variables
        2.1.4 Controlling Generic Procedures
        2.1.5 Vendor Specific Procedures
    2.2 Test Blocks
        2.2.1 Test Block Materials
        2.2.2 Test Block Defect Processes
        2.2.3 Specimen Designs
        2.2.4 Validation of True State of Flaws
        2.2.5 Conclusions
    2.3 PDI Testing Process
        2.3.1 Data Acquisition
        2.3.2 Data Analysis
        2.3.3 Personnel Qualifications
        2.3.4 Procedure Qualifications
        2.3.5 Conclusions
    2.4 Non-UT Procedures and Documentation
        2.4.1 Security of Test Specimens and Information
            2.4.1.1 Identifying and Fabricating Specimens
            2.4.1.2 Handling, Storage, and Shipping of Specimens
            2.4.1.3 Handling Specimens During the Demonstration
            2.4.1.4 Handling, Storage, and Shipping of Data
            2.4.1.5 Conclusions
        2.4.2 Qualification Records
            2.4.2.1 Candidate and Equipment Performance Demonstration Sheets
            2.4.2.2 Performance Demonstration Qualification Summary
            2.4.2.3 Conclusions
    2.5 Performance Demonstration Testing of Piping Materials
        2.5.1 Test Set Design
        2.5.2 Test Execution
        2.5.3 Scoring Test Results
        2.5.4 Retesting and Expansion
        2.5.5 Feedback to Candidates that Fail a Test
        2.5.6 Reporting of Test Results
        2.5.7 Findings
        2.5.8 Conclusions
    2.6 Performance Demonstration Testing of RPV Materials
        2.6.1 Test Set Design
        2.6.2 Test Execution
        2.6.3 Scoring Results
        2.6.4 Retesting and Expansion
        2.6.5 Feedback Provided to Candidate
        2.6.6 Reporting of Test Results
        2.6.7 Conclusions
    2.7 Bolting Examination
    2.8 ASME Code
        2.8.1 PDI/PDA Exceptions to ASME Code
            2.8.1.1 Items Reviewed
            2.8.1.2 PDI Position No. 94-007
            2.8.1.3 PDI Position No. 94-009
            2.8.1.4 Deferring Using Appendix VII
            2.8.1.5 Changing RPV Flaw Location Tolerance
        2.8.2 Merging the IGSCC Program into the PDI Program
            Comparison of NUREG-0313, Revision 2, to the PDI Program

3.0 MANAGEMENT MEETINGS

PERSONS CONTACTED AT PDI BY NRC ASSESSMENT TEAM

TABLES
    Open Items
    Table 1 Least Number of Piping Flaws for Maximum Qualification
    Table 2 Summary of PDI's Alternative Positions

EXECUTIVE SUMMARY

BACKGROUND

Section 50.55a, "Codes and Standards," of Title 10 of the Code of Federal Regulations (CFR) gives the regulatory requirements for application of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (BPV Code) to nuclear power reactors licensed in the United States. The current regulations incorporate the 1989 Edition of Section XI, "Rules for Inservice Inspection of Nuclear Power Plant Components," for inservice inspection (ISI) and examination of the reactor coolant pressure boundary (ASME Code Class 1) components and certain other safety-related pressure vessels, piping, pumps, and valves which are classified as ASME Code Class 2 or 3. The 1989 Edition of Section XI included mandatory Appendix VII, "Qualification of Nondestructive Examination Personnel for Ultrasonic Examination," which specifies requirements for the training and qualification of ultrasonic nondestructive examination (NDE) personnel in preparation for employer certification to perform NDE. The 1989 Addenda to Section XI added mandatory Appendix VIII, "Performance Demonstration for Ultrasonic Examination Systems," to give requirements for performance demonstration for ultrasonic testing (UT) procedures, equipment, and personnel used to detect and size flaws. Appendix VIII references Appendix VII for personnel qualifications. The establishment of Appendices VII and VIII accelerated after a November 1984 public workshop in Rockville, Maryland, during which UT performance demonstration provisions prepared by Pacific Northwest Laboratories (PNL) under NRC Research funding were extensively discussed. Following the meeting, the Section XI Subgroup on Nondestructive Examination (SGNDE) agreed to undertake further development of the UT performance demonstration rules for publication in the ASME BPV Code in lieu of NRC developing and issuing a regulatory guide.

The U.S. nuclear industry created the Performance Demonstration Initiative (PDI) group to manage implementation of the performance demonstration requirements of Appendix VIII.

PDI contracted with the Electric Power Research Institute (EPRI) Nondestructive Examination (NDE) Center, located in Charlotte, North Carolina, to be the current Performance Demonstration Administrator (PDA) for implementation of the PDI program.

All nuclear power plant licensees participate in the PDI program.

In addition to managing the performance demonstration for Appendix VIII, PDI agreed to expand its purpose to include the intergranular stress corrosion cracking (IGSCC) performance demonstration testing program to fulfill the IGSCC Coordination Plan.'

In merging the two programs, PDI has instituted changes that have benefitted both programs.

Individual utilities and vendors participating in the PDI program may satisfy Appendix VIII

¹The IGSCC Coordination Plan originated from a tripartite agreement between NRC, EPRI, and the BWROG, and was described in "Coordination Plan for NRC/EPRI/BWROG Training and Qualification Activities of NDE Personnel," Richard C. DeYoung, Director, Office of Inspection and Enforcement, USNRC, to Dr. Gary Dau, Senior Program Manager, NDE, EPRI, July 3, 1984.

performance requirements and the intent of the IGSCC Coordination Plan. The NRC has agreed to allow the UT examiners for IGSCC, who were required to requalify every three years, to extend their qualification up to five years from the date of the last qualification examination before March 1, 1994 (i.e., to extend their requalification by up to 2 years). The IGSCC Coordination Plan extension is to avoid duplication of qualification requirements.

ASSESSMENT

The purpose of the NRC review of PDI was to assess the program for conforming to and implementing the requirements of Appendix VIII and merging the IGSCC Coordination Plan into the program. The team reviewed activities of PDI and PDA associated with the performance demonstrations of the UT procedures, equipment, and personnel used to detect and size flaws. When problems in meeting the requirements were identified, PDI developed alternatives. These alternatives were reviewed by the NRC team (see Section 2.8 for details).

The assessment team reviewed PDI policy and PDI programs and identified 13 items for potential improvement that will be tracked for resolution. The team found that PDI qualifies NDE technicians to a number of generic procedures (i.e., those procedures that have broad application for subscribing utilities) and works with vendors and utilities for qualification of technicians using specific procedures (i.e., those procedures which are for specific applications for the vendor or utility). PDI has developed and qualified generic procedures for manual examination of austenitic (including IGSCC) and ferritic piping. Generic procedures for bolting have been developed, while generic procedures for reactor pressure vessel examinations are under development. A review of the generic procedures indicated that the procedures could be enhanced if certain items were clarified and if additional details were added, though the generic procedures are more detailed than vendor or utility specific procedures.

TEST SPECIMENS

The team reviewed the test block designs and materials and the processes for manufacturing defects in the specimens and found conformance with the requirements of Appendix VIII.

Test specimen configurations are representative of field conditions with the flaws located where a component is most susceptible to cracking. Piping specimens containing IGSCC flaws were available from EPRI. Piping specimens containing non-IGSCC piping flaws were predominantly fabricated with thermal fatigue cracks. Reactor vessel specimens were predominantly fabricated with weld solidification cracks, with electrical discharge machining (EDM) notches being used sparingly. PDI has an effective security program to protect details of the types and locations of defects in particular specimens, including provisions to prevent compromise during testing and to protect test data. PDI exercises security oversight at vendor facilities and licensee plants whenever their specimens or procedures are being utilized. The specimens are considered appropriate for performance demonstration testing and allow for a significant challenge for the UT candidates. The associated security controls are appropriate for maintaining the security of the specimens. Maximizing use of real cracks in test specimens is considered a strong point of the PDI program.

TESTING AND QUALIFICATION PROCESS

Review of the testing process included quality assurance, administrative controls, conduct of the qualification, selection of specimens, documentation of the candidates' findings during the examination, requirements for retest, and data on candidate success rates. The testing process was well organized, administratively controlled, and comprehensive, with sufficient challenge to ensure that those candidates who are not adequately skilled will not pass. However, the team identified that PDI does not meet the requirements of Appendix VII in limiting the number of retests to two per year (see Item 95-01-07).

Candidates participating in the performance demonstration test program submit statements to PDI stating that they satisfy ASME Code Level II or III requirements. PDI accepts the statements as an indication of proficiency and defers the training and implementation of Appendix VII to candidates' sponsors. Candidates who are unsuccessful during their initial demonstration may take a retest. However, candidates are restricted by ASME Code to two retests per year. Neither PDI nor Appendix VII addresses periodic proficiency demonstrations, which is a departure from the American Society for Nondestructive Testing (ASNT) SNT-TC-1A requirements and the IGSCC Coordination Plan.

The performance demonstration tests for piping are being performed by candidates using manual UT equipment. The tests are tailored to the candidate's qualification request, which may include the non-Code conditions of IGSCC and inspection of a weld from the opposite side (single-side access). The addition of IGSCC to the performance demonstration test is an effort by PDI to satisfy the demonstration testing contained in NUREG-0313, Revision 2, though the testing is not consistent with the tripartite agreement. For the non-Code conditions, the number of specimens per test set and acceptance criteria were set by PDI (i.e., three IGSCC flaws in a specimen where detecting one correctly enables a candidate to pass the IGSCC portion for piping). Acceptance criteria established by PDI may allow a candidate to pass the performance demonstration test without adequately demonstrating proficiency for some of the non-Code conditions. See Item 95-01-04.

Performance demonstration tests for reactor pressure vessels are being performed by candidates using automatic UT data collection equipment and vendor-specific procedures. The automatic equipment presents an opportunity for PDI to expand the number of flaws in a specimen simply by changing the examination coordinates on the test specimens. Some companies are examining the entire specimen at one time and storing data for future performance demonstration tests and retests. The sensitivity of scanners being used for these examinations may not be representative of those used in the field (see Item 95-01-13).

RECORDS

Records are maintained on qualifications of equipment, procedures, and candidates participating in the PDI program. Performance demonstration qualification summary (PDQS) sheets summarize material categories, material thicknesses, and material diameters that have been successfully demonstrated. The compilation of the PDQS is a labor-intensive manual task that could be improved if an electronic data retrieval system is established (see Item 95-01-08).

CONCLUSIONS

The team concluded that the PDI program is acceptable for qualifying NDE technicians in accordance with Appendix VIII, with the exception of requirements in the areas of testing tolerances and test administration that PDI determined to be impractical. The team reviewed PDI's technical justifications for the impractical requirements. It is important that ASME Code activities to address the issues be completed on a timely schedule. Situations or conditions not specifically addressed by code are addressed in the PDI program as determined by the PDI steering committee and subcommittees.


ASSESSMENT REPORT

1.0 INTRODUCTION AND PURPOSE

Section 50.55a, "Codes and Standards," of Title 10 of the Code of Federal Regulations (CFR) gives the regulatory requirements for application of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (BPV Code) to nuclear power reactors licensed in the United States. The current regulations incorporate the 1989 Edition of Section XI, "Rules for Inservice Inspection of Nuclear Power Plant Components," for inservice inspection (ISI) and examination of the reactor coolant pressure boundary (ASME Code Class 1) components and certain other safety-related pressure vessels, piping, pumps, and valves which are classified as ASME Code Class 2 or 3. The 1989 Edition of Section XI included mandatory Appendix VII, "Qualification of Nondestructive Examination Personnel for Ultrasonic Examination," which specifies requirements for the training and qualification of ultrasonic nondestructive examination (NDE) personnel in preparation for employer certification to perform NDE. The 1989 Addenda added mandatory Appendix VIII, "Performance Demonstration for Ultrasonic Examination Systems," to give requirements for performance demonstration for ultrasonic testing (UT) procedures, equipment, and personnel used to detect and size flaws. Appendix VIII references Appendix VII for personnel qualifications. The establishment of Appendices VII and VIII accelerated after a November 1984 public workshop in Rockville, Maryland, during which draft UT performance demonstration provisions prepared by Pacific Northwest Laboratories (PNL) under NRC Research funding were extensively discussed. Following the meeting, the Section XI Subgroup on Nondestructive Examination (SGNDE) agreed to undertake further development of the UT performance demonstration rules for publication in the ASME BPV Code in lieu of NRC developing and issuing a regulatory guide.

Appendix VIII gives the performance demonstration requirements for specific types of components in Supplements 2 through 11* as follows:

Supplement  Title
2   Qualification Requirements for Wrought Austenitic Piping Welds
3   Qualification Requirements for Ferritic Piping Welds
4   Qualification Requirements for the Clad/Base Metal Interface of Reactor Vessel
5   Qualification Requirements for Nozzle Inside Radius Section
6   Qualification Requirements for Reactor Vessel Welds Other Than Clad/Base Metal Interface
7   Qualification Requirements for Nozzle-to-Vessel Weld
8   Qualification Requirements for Bolts and Studs
9   Qualification Requirements for Cast Austenitic Piping Welds (In the Course of Preparation)
10  Qualification Requirements for Dissimilar Metal Piping Welds
11  Qualification Requirements for Full Structural Overlaid Wrought Austenitic Piping Welds

*... characterization and its use is optional. Supplements 12 and 13 address implementation of selected aspects of Supplements 2, 3, 10, and 11, and Supplements 4, 5, 6, and ..., respectively. Reference the 1995 Edition of Section XI of the ASME BPV Code.

The U.S. nuclear industry created the Performance Demonstration Initiative (PDI) group to manage implementation of the performance demonstration requirements of Appendix VIII.

PDI contracted with the Electric Power Research Institute (EPRI) Nondestructive Examination (NDE) Center, located in Charlotte, North Carolina, to be the current Performance Demonstration Administrator (PDA) for implementation of the PDI program. All nuclear power plant licensees participate in the PDI program, though some utilities and NDE contractors have developed and qualified their own procedures within the current PDI program (e.g., Virginia Electric Power Company, Northeast Utilities, Duke Power Company).

In addition to managing the performance demonstration for Appendix VIII, PDI agreed to expand its purpose to include the intergranular stress corrosion cracking (IGSCC) performance demonstration testing program to fulfill the IGSCC Coordination Plan.² In merging the two programs, PDI has instituted changes that have benefitted both programs. Individual utilities and vendors participating in the PDI program may satisfy Appendix VIII performance requirements and the intent of the IGSCC Coordination Plan. The NRC agreed to allow the UT examiners for IGSCC, who were required to requalify every three years, to extend their qualification up to five years from the date of the last qualification examination before March

²The IGSCC Coordination Plan originated from a tripartite agreement between NRC, EPRI, and the BWROG, and was described in "Coordination Plan for NRC/EPRI/BWROG Training and Qualification Activities of NDE Personnel," Richard C. DeYoung, Director, Office of Inspection and Enforcement, USNRC, to Dr. Gary Dau, Senior Program Manager, NDE, EPRI, July 3, 1984.

1, 1994 (i.e., to extend their requalification by up to 2 years). The IGSCC Coordination Plan extension is to avoid duplication of qualification requirements.

The purpose of the NRC review of PDI was to assess the program for conforming to and implementing the requirements of Appendix VIII and merging the IGSCC Coordination Plan into the program. The team reviewed activities of PDI and PDA associated with the performance demonstrations of the UT procedures, equipment, and personnel used to detect and size flaws. When problems in meeting the requirements were identified, PDI developed alternatives. These alternatives were reviewed by the NRC team.

The assessment team developed an understanding of the PDI program by attending a number of PDI and PDA meetings, observing the testing process, and reviewing procedures and documentation. The assessment was performed during the period from September 15, 1994, to February 3, 1995, as outlined below:

Attendance at the PDI Steering Committee Meeting in Charlotte, North Carolina, September 15, 1994.

Meeting with PDA on September 16, 1994, to discuss the PDI program.

Three-day team planning meeting at Region I in November 1994.

Attendance at the December 7-8, 1994, PDI Meeting in New Orleans, Louisiana.

Preliminary assessment at EPRI, December 12-16, 1994, to review procedures, observe testing in progress, and meet PDA staff.

Observation of performance demonstration (PD) work on reactor vessel specimens in Chattanooga, Tennessee, January 23-24, 1995.

Assessment of PDA work at the EPRI NDE Center, January 25 through February 3, 1995, including ultrasonic examinations on selected piping and vessel test specimens performed by NRC NDE UT technicians.

PDI Policy

"Performance Demonstration Plan" delineates the PDI policies for accomplishing qualification in accordance with the requirements of Appendix VIII and the IGSCC Coordination Plan. The policies cover testing prerequisites, procedure review, dispute resolution, scheduling, specimen selection, security, test administration, documentation, test grading, and retesting. The policies are embodied in the implementing procedures.


Program Description

"Program Description" presents a paragraph-by-paragraph comparison of Appendix VIII requirements to the PDI program for implementing the requirements. The PDI program includes alternatives to the code requirements found to be impractical, as described in papers that justify the alternatives. The program description identifies three sets of implementing procedures:

"Quality Assurance of Specimen Fabrication"

"Quality Assurance for the Performance Demonstration Process"

"Quality Assurance Instruction Manual for the Performance Demonstration Process"

Within these three sets of procedures, there are approximately 50 implementing procedures, each covering a particular aspect of the PDI program.

Open Items

The following table lists the open items that were identified as areas for improvement. The items are discussed in detail in the referenced report section.

i item Number Report Section Description 95-01-01 2.1.1 Additional examination techniques developed i

during performance demonstrations and clarification of certain items should be included in examination procedures PDI UT-1 and PDI.

i UT-2.

95-01-02 2.1.2 For PDI-UT-3, integrate test performance and equipment requirements into a single section of the procedure and include more detailed

)

instructions.

95-01 03 2.1.3 t

An examiner who encounters equipment l

combinations not previously demonstrated as i

part of his qualification should receive training on the equipment before its use.

l I.

1 4

4 B'r--

w'

-T---

=

u*

Item Number Report Section Description 95 01-04 2.1.3_

When a performance demonstration does not include all of the techniques listed in the qualiacation procedure, a candidate should demonstrate proficiency in using the other

'echniques before qualification is complete.

95-01 05 2.1.4 While monitoring of the effectiveness of generic procedures appears to be adequate, written guidance for the oversight function should be develnaad.

95-01-06 2.1.5 All essential variables (i.e., important equipment parameters that have an impact on inspection performance), with specia.ed values, should be included in the vendor qualification proen.ures for reactor vessel examinations.

The automated equipment procedures for piping, non generic piping, bolting, dissimilar metal welds, overlays, and cast austenitic welds, and the procedures for reactor pressure vessela should be examined closely to ensure that the procedures that are being qualified meet the requirements of Appendix VIII, 95-01-07 2.3.3 Limit retests to two in 1 year to comply with 1

Appendix VII-4360(c requirements. Unlimited retesting may compro) mise specimen security and present a situation for incorrect reporting of the candidate's qualification.

95-01-08 2.4.2.3 Complete the electronic data retrieval system for qualification records. The format of the Qualification Data Summary Sheet (QDSS) could cause a reviewer to miss the exceptions to the test or misidentify a candidate's-lGSCC qualifications. The current method of finding each Qualification Calibration Data Sheet (QCDS) in the candidate's file and comparing it to the equipment files is cumbersome.

5

itse Number Report Sxtion Description 95-01-09 2.5.7 The Code does not address IGSCC or single-side access acceptance criteria in the su; plements. Therefore, PDI established the acceptance criteria which the team considered weak and in need of improvement.

95-01-10  2.5.7  PDI should ensure that adequate details are included in the qualification procedures for IGSCC mid-wall qualification.

95-01-11  2.6.3  The tolerance on the determination of the location of a flaw used by PDI is double the tolerance allowed in Appendix VIII, Supplement 4. PDI should initiate action with the Section XI Committee on the tolerance changes that may be needed.

95-01-12  2.6.7  PDI should develop a clear position on the use of software for the performance demonstration tests as compared to performance in the field.

95-01-13  2.6.7  PDI should develop a means to prove that the improvised scanners used in the performance demonstrations are not superior to the scanners used for the field ISI.

2.0 AREAS ASSESSED

The assessment of PDI activities is divided into Sections 2.1 through 2.8 as follows:

2.1 Review of PDI Generic UT Procedures
2.2 Test Blocks
2.3 PDI Testing Process
2.4 Non-UT Procedures and Documentation
2.5 Performance Demonstration Testing of Piping Materials
2.6 Performance Demonstration Testing of RPV Materials
2.7 Bolting Examination
2.8 ASME Code

At the time of the assessment, PDI used generic procedures that it developed. During the two-week on-site assessment, the team observed personnel participating in performance demonstrations on pipe welds and reactor vessel shell welds. The team limited its assessment to the in-process testing and completed work and avoided items under development so as not to influence PDI. The team reviewed in-process qualifications for meeting Appendix VIII, Supplements 2, 3, and 12. The team reviewed automated UT data collection portions for qualifications to the requirements of Supplements 4 and 6, and the preliminary procedures with typical test specimens for meeting the requirements of Supplement 8.

2.1 Review of PDI Generic UT Procedures

Generic procedures are those that PDI develops for various materials and types of equipment that will apply to many applications in the field. By contrast, vendor- or utility-specific procedures are developed for unique or specific applications of materials or equipment that may not be covered by generic procedures (e.g., UT of steel pipe with stainless steel cladding on the inside and outside diameters). PDI developed three generic procedures for UT examinations of piping:

- PDI UT detection and length sizing of flaws using manual UT on similar metal welds in ferritic piping.

- PDI UT detection and length sizing of flaws using manual UT on similar metal welds in wrought austenitic (stainless steel) piping.

- PDI UT depth sizing of flaws in ferritic and wrought austenitic (stainless steel) piping.

The procedures are criteria-based in that they specify values for the essential variables, including instruments and transducers, used in examining piping of a particular material type, thickness, and diameter. Criteria-based procedures are decision-assistance tools that guide the user in selecting the proper equipment and settings.

2.1.1 Procedures PDI UT-1 and PDI UT-2

PDI generic procedures for ultrasonic detection and length sizing of indications in ferritic and austenitic piping materials were reviewed for compliance with Appendix VIII and for technical adequacy. The procedures give guidance for the essential variables such as search unit bandwidth, applicable frequencies, type of sound wave, focal depth, and signal-to-noise ratio. The reference sensitivity levels are established using calibration blocks.

The procedures are better (i.e., have a higher success rate of detecting and properly sizing flaws) than those used in the nuclear industry that do not meet the requirements in Appendix VIII; however, PDI could improve the procedures by giving more detail in the data-interpretation steps. The insights and expertise developed by the candidate preparing for the performance demonstration test may be lost from the lack of use of these skills.


If procedural guidance were expanded, the candidate could rely on details in the procedure rather than having to memorize numerical values or technique specifics. For example, Section 8 of procedures PDI UT-1 and PDI UT-2 includes items that could be improved as follows:

1) PDI UT-2, Paragraph 8.2.2.3, states that a second angle is used in the same direction. It is unclear whether this means that two angles (e.g., 45 and 50 degrees or 40 and 45 degrees) are equally acceptable.

2) PDI UT-2, Paragraph 8.2.2.7.a, states "... when using a higher frequency..." It is unclear whether this means that going from 1.5 MHz to 2 MHz, or increasing by a factor of 2 from 1.5 to 3 MHz, are equally acceptable.

3) PDI UT-2, Paragraphs 8.2.2.9 and 8.2.3, state that an inside diameter creeping wave probe is used. No guidance for using these probes is given regarding the frequency of operation or the element size.

4) PDI UT-1, Paragraph 8.3, states that there are tests being conducted at different angles and frequencies, but provides no guidance on what other angles or frequencies should be used.

5) Specifics such as items 1 through 4 should be included in the flow charts (e.g., Figure 4) in PDI UT-2.

6) PDI UT-1 has similar types of items that could be better clarified.

Currently, UT technicians must learn the critical values necessary for successful performance demonstration through trial and error. When combinations are found that produce desired results, the UT technicians typically write notes or rely on remembering the combinations. By including such information in the qualification procedures, UT technicians will more quickly learn the correct techniques. Additional details in the procedures complement the mandatory requirements in Appendix VIII for all essential variables and variable ranges. The recommendations for improving procedures will be tracked as item 95-01-01.

2.1.2 Procedure PDI UT-3

Procedure PDI UT-3 for ultrasonic through-wall (depth) flaw sizing of ferritic and austenitic piping materials was reviewed for compliance with Appendix VIII and for technical adequacy. The procedure includes a well-written scope, a description of the techniques to be used, a listing of the equipment to be used, a detailed calibration procedure, and a description of the process for applying the technique to determine a final flaw depth estimate.


The procedure requires UT personnel to perform tests identified in Section 7 of the procedure; it then refers back to Section 5 of the procedure for determining the correct equipment for conducting the tests. Integrating these sections would simplify use of the procedure. Also, a lack of detail similar to that found in PDI UT-1 and PDI UT-2 was noted. These comments will be tracked as item 95-01-02.

2.1.3 Essential Variables

To develop the generic procedures, PDI had to identify the dependent variables. To identify the dependent variables, PDI first asked the equipment manufacturers to identify the equipment best suited for the piping applications found in the light water reactor industry. Using the equipment recommendations from manufacturers, PDI measured a variety of flaws in various materials and worked with the manufacturers to determine which controls affected equipment system performance. From these measurements, operating values that must be used with the equipment were identified. The operating values were ones which affect the center frequency and the bandwidth of the equipment. Pacific Northwest Laboratories (PNL) has conducted extensive modeling and experimental studies on equipment operating tolerance parameters and found that to ensure repeatability of inspection performance, the system center frequency and bandwidth must be controlled. Therefore, a table in each generic procedure lists all the controls that affect equipment center frequency and bandwidth for each instrument.

The flaw detector instruments (e.g., pulsers, transducers, receivers, and search units) specified in a procedure are the only ones that can be used with the procedure. New instruments can be added to procedures only after extensive evaluation and testing to determine the correct settings for all controls that affect center frequency and bandwidth.
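The criteria-based control of essential variables described above amounts to a table lookup. The sketch below is only an illustration of that concept; the instrument names, variable names, and ranges are hypothetical, not PDI's actual tables:

```python
# Illustrative sketch (not PDI software): an essential-variable table for
# a criteria-based procedure, and a check that a proposed equipment setup
# stays within the listed ranges. All names and values are hypothetical.

ESSENTIAL_VARIABLES = {
    "instrument_A": {
        "center_frequency_mhz": (1.35, 1.65),  # allowed (min, max) range
        "bandwidth_pct": (40, 70),
    },
}

def setup_is_qualified(instrument, settings):
    """Return True if every listed control falls within the procedure's range."""
    table = ESSENTIAL_VARIABLES.get(instrument)
    if table is None:
        # An instrument not listed in the procedure cannot be used with it.
        return False
    return all(
        lo <= settings.get(name, float("nan")) <= hi
        for name, (lo, hi) in table.items()
    )

print(setup_is_qualified("instrument_A",
                         {"center_frequency_mhz": 1.5, "bandwidth_pct": 55}))  # True
print(setup_is_qualified("instrument_B", {}))  # False: instrument not listed
```

A missing or out-of-range setting fails the check, mirroring the report's point that only listed instruments, at their tabulated settings, may be used with a procedure.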

Candidates using generic procedures use flaw detector instruments specified in the procedures and a variety of transducers made by various manufacturers. To qualify a transducer, it must be successfully used in a performance demonstration. When a non-qualified transducer is used, it must first be listed on the equipment data sheet. As each specimen is inspected and results are given to the test proctor, the candidate describes to the proctor the key UT data used to make the detection or sizing decision. The test proctor then notes whether that particular transducer was useful in achieving the desired results. If the candidate successfully passes the performance demonstration, the transducer is added to the equipment qualified for that procedure.

When a candidate successfully passes the performance demonstration, the candidate is qualified to use any of the equipment combinations listed in the procedure, as well as any future additions to the procedure. Such latitude with equipment selection introduces a variable not addressed in Appendix VIII. However, the equipment control settings listed in tables in a criteria-based procedure limit the amount of variation that the equipment could introduce into the system. As generic procedures are being applied, a successful candidate could be asked to use unfamiliar equipment in the field. If such a situation arises, the candidate's proficiency with unfamiliar equipment should be verified through some form of training. That a successful candidate may later encounter equipment combinations not previously demonstrated as part of his qualification will be tracked as item 95-01-03.

The sizing procedure describes different techniques that produce equivalent results. A candidate who has successfully passed the performance demonstration using one technique in the procedure is now qualified for all of the techniques listed. The team concluded that, because the qualification is performance-based, the candidate should demonstrate all of the techniques in the procedure. If the candidate did not demonstrate all of the techniques listed in the procedure to qualify, he should demonstrate to the test proctor proficiency with the other techniques, and the demonstration of ability should be documented in the test results. The completion of a performance demonstration without demonstrating all of the techniques of the UT procedure will be tracked as item 95-01-04.

2.1.4 Controlling Generic Procedures

Generic procedures are frequently revised; therefore, a method for controlling and evaluating the effectiveness of the generic procedures should be available. PDI has not yet developed written controls for the generic procedures, but was able to describe the controls currently in place. User-identified inadequacies in generic procedures are submitted to the PDI Technical Working Group (TWG) for resolution. The TWG screens equipment before it is included in a generic procedure to identify incompatible combinations of equipment and criteria. If testing with specific equipment or a specific procedure results in a high candidate failure rate, the TWG performs a detailed review of the test and resolves concerns. In screening equipment, the TWG considers whether the equipment is sufficiently different to require its own procedure, and the TWG is developing guidelines to identify when changes in equipment, techniques, or training are extensive enough to warrant a new procedure. The assessment team did not have concerns with the PDI method for monitoring the effectiveness of their generic procedures; however, the lack of written guidance for the oversight function will be tracked as item 95-01-05.

2.1.5 Vendor-Specific Procedures

The team reviewed data for several companies participating in qualification for the reactor pressure vessel (RPV) supplements in Appendix VIII. Procedures developed by these companies were much less detailed than the PDI generic procedures. The team recognizes that, for automated or computer-based systems, all of the listed essential variables may not apply while other variables not listed in the Code may be important.

The absence of essential variables in the supplements is a problem that the Section XI Committee should address. However, all essential variables (i.e., important equipment parameters that have an impact on inspection performance), with specified values, should be included in the qualification procedures. The automated equipment procedures for piping, non-generic piping, bolting, dissimilar metal welds, overlays, and cast austenitic welds, and the procedures for reactor pressure vessels should be examined closely to ensure that the procedures that are being qualified will meet the requirements of Appendix VIII. This issue will be tracked as Item 95-01-06.

2.2 Test Blocks

The fabrication process for making test specimens was reviewed beginning with the design of the specimens, acquisition of material, introduction of flaws, and validation of the true state of the flaws. Flaws were fabricated in wrought austenitic stainless steel and in ferritic materials for pipe and clad and unclad vessel materials. PDA procedures and test specimens for dissimilar metal, castings, and nozzle welds were in various stages of completion and not available for review.

The objectives of the review were to (1) verify compliance with the requirements of Appendix VIII and the IGSCC Coordination Plan and (2) determine the representativeness of the material configurations and the flaw selection. To assess the acceptability of test blocks, the team considered four factors: (1) materials, (2) defect processes, (3) specimen designs (including similarities to field conditions), and (4) validation of the true state of the flaws.

2.2.1 Test Block Materials

PDI obtained much of its material from discontinued nuclear plant sites. For ferritic piping specimens, PDI used ASME SA 106 Grade B or SA 516 Grade 70 steel. For austenitic piping specimens, PDI used ASME SA 312 Type 304 or 316, and SA 376 Type 304 or 316. The large-diameter pipe specimens were manufactured from rolled and welded plate. The plate material was SA 358 Type 316, welded using Type 308 weld material. The piping used for IGSCC demonstrations was Type 304 stainless steel. Reactor pressure vessel specimens were ASME SA 533 or ASME SA 508, Class 2. Cladding material was weld deposited with a Type 308 modified material using the submerged arc process. Some cladding was deposited using Type 309 weld material.

Although Inconel is an austenitic material, it has different UT characteristics than austenitic stainless steel and would require a separate performance demonstration. There were no Inconel specimens in the PDI inventory for a separate demonstration, and UT examiners will have to qualify once these specimens are available. Also, the cast elbows at PDI were not used nor were they intended for flaw evaluation. The scope of the PDI program does not address these materials at this time. If specimens are not available through PDI, licensees may have to use plant-specific specimens or perform an evaluation on a plant-specific basis of the qualification process for unique materials or configurations.


2.2.2 Test Block Defect Processes

Defects were fabricated to represent conditions that could exist in the field. For piping, the predominant flaw was the thermal fatigue implant. To produce the thermal fatigue flaw and implant, a bar extension the size of the designed flaw was welded to the weld-prep pipe end. Placing the bar under stress, the bar-pipe weld joint was subjected to cycles of heating and cooling until failure. The fractured surface was cut from the bar length, machined to the designed size, matched to the fractured surface on the pipe, and seal welded to the pipe. The pipe was butt welded to an adjoining pipe. The integrity of the finished weld was verified by penetrant testing (PT), radiographic testing (RT), and UT.

For IGSCC flaws in piping, PDI acquired specimens and inventoried material from the EPRI IGSCC testing program. The IGSCC detection specimens are primarily piping with service-induced cracks. Additional specimens were fabricated from the material obtained from inventory for the EPRI IGSCC Program. Specimens used for depth sizing were grown using a graphite wool process.

PDI secured piping specimens with mechanical fatigue flaws, thermal fatigue flaws, and IGSCC flaws. PDI considered flaws manufactured by the hot isostatic pressing (HIP) process (i.e., hot pressurized embedded flaws), but determined that the process produced unacceptable attenuation.

For reactor pressure vessel tests, the majority of the flaws were manufactured using a weld solidification process. PDI also had slag and lack-of-fusion flaws embedded in the test specimens. The slag flaws were large enough to be rejectable per ASME Section III, NB-5320.

Because electrical discharge machined (EDM) notches are not representative of flaws found in the field, PDI used them sparingly in the RPV and small nozzle specimens, even though Appendix VIII allows their use for up to 50% of the flaws in RPVs. PDI is using pipe test sets that do not contain EDM notches; they only contain cracks. EDM notches provide ultrasonic signals that are easier to identify than signals from tight cracks. The pressurized water reactor (PWR) vessel specimens have some very shallow EDM notches. These EDM notches were tapered, plugged with similar material, and welded shut. Solidification cracks could not be introduced into some of the small boiling water reactor (BWR) nozzles, and these nozzles used EDM notches that were plugged and welded shut. The limited use of EDM notches allowed for extensive use of real cracks in the test specimens and is considered a strong point of the PDI program. The team considered the flaws used in the test specimens acceptable.

2.2.3 Specimen Designs

In reviewing specimen designs, the team considered the configurations used, the condition of the weld crown (or top layer of weld metal), the conditions of the weld root, the locations of the cracks, and the counterbore regions.* The weld crown conditions (i.e., ground flush, ground flat, unground) are dependent on the intended use. The crowns of the specimens that are used for crack depth sizing have been ground smooth with the parent material of the pipe specimens. The pipe specimens used for detection have crowns that are either ground smooth with the parent material or the crowns are flat-topped (i.e., ground flat but not flush with the joined material). If the inspection procedure is to simulate crowns that are in the as-welded condition, then the crowns are covered with tape to preclude ultrasonic inspection through the weld crown. The assessment team agreed that this practice is acceptable for simulating as-welded crown conditions found in the field.

The counterbore and the weld root in pipe specimens were examined in two ways. First, they were examined visually to see the variety of conditions that were present. Counterbore conditions (i.e., conditions of the geometry of the joint due to preparation of the ends) included such conditions as very short lands, vertical transitions, and partial-circumferential extent. These conditions are comparable to conditions a candidate would see in the field. Using PDI generic UT procedures, the NRC staff from the NDE Mobile Laboratory did blind (i.e., without any knowledge of the specimens) UT inspections of some of the specimens and confirmed that the specimens were challenging to inspect and representative of field conditions that would make accurate detection difficult. The team concluded that since these conditions are representative of field conditions, the specimens are appropriate for performance demonstration.

The fabricated cracks were installed in a variety of locations within the pipe specimens, including the fusion line of the weld, the counterbore transition, and the counterbore flat land. These locations, coupled with the crack orientations, made the detection and the proper identification of the cracks a challenge. The variation in crack locations and orientations is considered to be appropriate for performance demonstration.

PDI had assembled extensive information on the range of piping and reactor pressure vessel conditions that exist in nuclear power plants. A condensed version of this information was reviewed and found to be consistent with the nuclear industry. The information contained insight into how the fabricated specimens covered the range of conditions that are found in the nuclear industry. There are, however, some configurations that are unique to a particular licensee (for instance, inside- and outside-clad pipe) that would still require licensee-specific demonstrations.

* The 'counterbore region' is the area that has been counterbored (i.e., machined to a larger diameter). A counterbore process is generally used when pipes of different wall thicknesses are joined by welding. The thicker-walled pipe is machined so that the eccentric inner pipe diameters will mate up at the joint. UT of the joint is often difficult because of the 'geometries' of the joint that result from the machining process.

The team examined the RPV specimens for geometry and cladding. The primary surface was applied with the shielded metal are welding (SMAW) process and was i

ground sufnciently smooth for inspection, the conditions of these specimens were j-similar to t5e conditions found in nuclear power plants. Because of the slae of these specimens, their complex geometry, and the lack of RPV inspection equipment, it w j

not possible to assess these specimens from the perspective of a realistic RPV insp However, from the limited specimen volumes that were evaluated, the team determined i

that the conditions and flaws renected a realistic range of field conditions and that these l

specimens are appropriate for performance demonstration.

l The NRC staff from the NDE Mobile Laboratory conducted UT examinations of a shell i

~

course plate. The examination consisted of looking for near surface, mid wall, and far.

surface flaws. The plate contains many segregated flaws that provide a challenge in t detection and slains of the cracks. The generic nrocedures provided useful guidance f length sizing, characterizing, and evaluating flaws.

2.2.4 Validstlos of True State of Flaws PDI required that suppliers of flaws demonstrate the processes used to make the flaws and identify Haw slae adjustments, if needed. The flaw slae adjustment for an im the change in flaw dimensions that occurs as a result of the seal welding pro When PD1 orders specific flaw sizes (height and length) for piping, the fabricator adds the appropriate flaw size adjustment to the flaw dimensions.

One supplier of thermal fatigue flaws fabricated the flaws by welding an extension of like metal to a weld prep end of a pipe. The extension la then fatigued to failure at the weld joint. The thermal fatigue area of the weld prep end of the piping is validated in the following manner:

1he center line of the thermal fatigue area is pictorially laid out to a aero datum mark on the pipe and photographed. The mating fractured piece is machined to the design dimension, measured, and then photographed in the presence of a scale. A true-slae tracing of 'he perimeter is made of the fractured piece. The en fractured surfaces are reassembled and sealed by welding the perimeter, The flaw-containing pipe surface is circumferentially welded to.a matching pipe. The weld integrity is verified using PT, RT, and UT. To establish the correlation between initial and final flaw slae, some of the weld joints were destructively tested in a way that minimizes dimensional change. The unwelded portion of the flaw is measured. The measurement

- differences between before and after welding are averaged. The average value 14

is the flaw size adjustment. The physical measurements provide highly accurate flaw siring and are independent of UT examinations.

Flaws that were grown for PDA were verified by the destructive examination of similar flaws. Plawed specimens secured from operating facilities were used only for detection.

The EPRI NDE Center created what was referred to as a " fingerprint" of each defect.

The fingerprint includes dye penetrant results, optical images of surface breaking cracks, and automated laboratory UT tests of the zones containing the cracks. The " fingerprint" supplements information given by the manufacturer of the flaws and verifies that there are no extraneous indications near the flaws.

2.2.5 Camelusloes The specimen configurations were found to be representative of field conditions. The flaws are located v.here they would be expected to be found in the field and also where they would be considered a challenge to detect and correctly classify. The specimens are consloered appropriate for performance demonstration and provided a significant challenge from the perspective of UT inspection. The test block specimens used in the performance demonstration program met the requirements found in the applicable supplements of the Appendix VI!!.

2.3 PDI Tendng Process The PD! tesdng process for qualification of personnel and procedures is implemented and controlled through procedures in the PDI approved Quality Assurance (QA) Manual and in the Operating Instruction Manual. A review for technical content and adequacy was performed for the following procedures:

PDP-Q-009, Rev. 2

" Performance Demonstration Control" PDP-Q-009.1, Rev. 4

" Test Administration" PDP-Q-018.3, Rev. I

" Surveillance

  • PDP 1009.3, Rev. 2

" Candidate Registration and Scheduling Instruction" PDP I-018, Rev. 3

" Piping Surveillance Instruction" In addition to the procedure and record review, the team held discussions with cognizant PD1 and PDA personnel and performed observations of the testing in progress. Candidates applying for performance demonstrations submit statements of qualification from their sponsors or employers certifying that they are qualified Level !! or III examiners in accord with the ASME Code. The requirements for qualification to I.evel II or III are contained in 15

the 1989 Edition of Section XI which references the American Society for Nonde

' Testing (ASNT) SNT TC 1 A,1984 Edition, or the equivalent. However App implementation of Appendix VII up to the candidate's sp implemented Appendix VIIin its performance demonstration program). The team express a position on the PDI approach to Appendix Vil; however, if Appendix Vill becomes a regulatory requirement, candidates who do not satisfy Appendix VII w i

to request NRC approval for any deviations from the requirements. Such requests wil i

reviewed on the basis of their individual merit.

l Administrative controls are used to ensure the security and confidentiality of the pe demonstration information, testing process, and results. Candidates suspected j

security requiretaents are not permitted to continue the demonstration. All information generated during the demonstration is submitted to PDA for review and evaluation.

Performance demonstrations for candidates are performed with previously qualified procedures or in conjunction with the qualification of a new procedure. Candidates p the PDA with the essential variable values (or ranges of valves) to be used for the j

demonstntion. Dese are verified before the demonstration begins. Special limitations to the qualification, such as the exclusion of IGSCC or through wall sizing, are permitted l

However, these exceptions become a part of the candidate's qualification record and res j

the candidate's qualification. Before starting the demonst ation, the candidate is give written description of the specimens to be examined and the applicable calibration bl data report forms, test sets of specimens, and sufficient detail on scan areas for p scan plans. Test specimens are examined in accordance with applicable procedure j

requirements, essential variable values, and limitations previously agreed upon.

4 i

Test specimen selection is controlled by the follo'ving instruction procedures:

PDP I 009.4

' General Test Specimen Selection Instruction" PDP 1009.4.1

" Test Specimen Selection Instruction. Manual or Semi Automated Austenitic Piping Examinations for Detection, length, nd Depth Sizing (Supplement 2)*

I PDP I 009.4.2

  • Test Specimen Selection Instruction. Manual or Semi Automated Ferritic Piping Examinations for Detection, Length, and Depth Sizing (Supplements 3 and 12)*

In general, test sets include at least four specimens having different nominal pipe diameters and thicknesses. Test sets have specimens with the minimum and maximum pipe diameters and pipe thicknesses for the applicable examination procedure, if IGSCC or far side qualifications are requested by the candidate, the test sets will contain specimens for IOSCC or far side examination. De IGSCC flaws are connected to the inside pin surface and extend to the depths specified in Appendix VIII.

16

i The test gecimens may contain both fabrication and service or laboratory induced flaws.

i l

t Flaws were installed near various inside diameter and outside diameter w conditions, geometric conditions (e.g., counterbores), and certain surface associated i

conditions. 'the service or laboratory induced Haws include thermal and mechanical cracks or IGSCC cracks (or both). Test specimen selection is directed towards provid uniform degree of difficulty. Test specimens ase divided into grading units, with each unit i

having at !rast three inches of weld length. For a candidate taking the performance i

demonstration for detection, a minimum number of flaws in a test set are identified in the j

appropriate supplement to Appendix Vill. For the performance demonstration to quali i

procedures with multiple essential variables. the minimum number of grading units can be i

used for each limit of each variable. Flawed grading units within test set specimens are required to meet specified criteria for naw depth, orientation, and type. Other demonstrationi conditions controlled by the procedures include single side access qualification, len i

through wall sizing, and IOSCC quall6 cation.

(

2.3.1 Data Acquisitloe Candidates examine specimens in accordance with applicable pipe procedures, essential variable values, and preestablished limitations. Areas to be examined on specified test specimens are identified by PDA, and examinations are restricted to those areas.

I Portions outside the area of interest are masked to prevent identification and examination.-

Administrative controls limit scanning time for detection to 2 hours2.314815e-5 days <br />5.555556e-4 hours <br />3.306878e-6 weeks <br />7.61e-7 months <br /> per flawed grading unit, although this time can be extended by the PDA following a conference with the 5

candidate. Additional time is allowed if the candidate appears to be performing acceptably.but needs more time to complete the qualification. For manual examination c

demonstration, on the average, up to 4 hours4.62963e-5 days <br />0.00111 hours <br />6.613757e-6 weeks <br />1.522e-6 months <br /> has been allowed per flawed grading unit for detection and analysis.

2.3.2 Data Analysis

Candidates performing manual UT demonstrations on pipe specimens are permitted to choose the analysis method of their preference, provided they finish within the allotted time for this activity. Radiographs are available to the candidates upon request. However, these radiographs do not contain any useful information on specific flaws within applicable grading units. The radiographs contain representative images of geometric or weld root conditions (or both).

Candidates document flaws detected during performance demonstrations on forms provided by PDA. The procedures include instructions for documenting indications detected in terms of position, length, and depth in piping, nozzles, vessel shell welds, and bolting. Performance demonstration results are graded on a pass-fail basis. Upon completion of the demonstration, the PDA prepares a Performance Demonstration Qualification Summary (PDQS). The team noted that, although the information on the PDQS showed the areas qualified, the sequence of events was difficult to follow without examining each document in the file (i.e., demonstrations performed, test sets used, retests taken, dates, and in some cases, signatures), in part because all records were still undergoing review. Discussions with cognizant PDI personnel disclosed that efforts were underway to review these records for completeness and accuracy and to computerize the data for easy access.

2.3.3 Personnel Qualifications

To determine the performance of the candidates, the team randomly selected 27 performance demonstration packages for review. The items reviewed included such things as flaw detection, length sizing, through-wall depth measurements, IGSCC detection, access restriction, retests, and pass-fail results. Based on the review, the team noted that 78% of the candidates met minimum detection requirements; 59% met minimum length sizing requirements; 33% met minimum through-wall depth measurement requirements; and 51% successfully met IGSCC detection requirements. Although these percentages are based on a sampling that included retests, they show that the candidate must be highly skilled in order to successfully pass the test.

Candidates who were unsuccessful during their initial demonstration may take a retest. If the candidate fails the retest, the candidate receives training for 7 days and then takes a second retest. After each unsuccessful attempt, the candidate is briefed on the problem areas that contributed to the failure. A candidate may take more than two retests if PDI has testing time available. Taking more than two retests in 1 year is contrary to Appendix VII-4360(c) requirements. PDI said that if more than two retests are taken in 1 year, it would state this as an exception on the PDQS. The unlimited retesting may compromise specimen security and present a situation for incorrect reporting of the candidate's qualification. This issue will be tracked as item 95-01-07.

2.3.4 Procedure Qualifications

Rules governing equipment and procedure qualifications are the same as those used for personnel qualification described above, with three exceptions:

(1) Results of several successful candidates may be combined for this purpose.

(2) Candidates must demonstrate that each transducer identified was used and that it was an integral part of the demonstration.

(3) Administrative limits used for the subject demonstrations and retests are optional.


2.3.5 Conclusions

The testing process was well organized, administratively controlled, and comprehensive. The nature of the indications in test specimens used for demonstrations presented a formidable challenge to well-qualified UT examiners and should cause those with limited capabilities to fail. PDI has bypassed implementation of Appendix VII, which is referenced in Appendix VIII. As noted above, when candidates take retests, they should be subjected to the two-per-year retest restriction contained in Appendix VII-4360(c).

2.4 Non-UT Procedures and Documentation

In reviewing the non-UT procedures and documents, the team concentrated on security and candidate qualification records. PDI incorporated the "conduct of performance demonstrations" from the supplements to Appendix VIII into the security procedures. PDI incorporated in the quality assurance (QA) program the Appendix VIII, Subsection 5000, "Record of Qualification," requirements for qualification records.

The team performed a limited review of the PDI and PDA organization plan, prerequisites of candidates, testing administration, internal qualification of PDI staff, calibration and measurement, nonconforming items, audit items, and quality assurance records not connected with the Performance Demonstration Qualification Summary (PDQS).

2.4.1 Security of Test Specimens and Information

PDI developed security procedures to ensure that the candidates do not possess knowledge of flaw identity prior to testing. The team reviewed the procedures for specimen design, controls during specimen fabrication, safeguarding of specimens, security during testing, and safeguarding of demonstration data. The objective of the review was to identify strengths and weaknesses with the confidential aspects of the program.

2.4.1.1 Identifying and Fabricating Specimens

To initiate the specimen fabrication process, PDI developed Procedure PSP-001 (Rev. 0, dated October 26, 1992), "General Design Specification for Piping Performance Demonstration Specimens." The procedure translates Appendix VIII requirements into terms that are suitable for designing full-scale mock-ups of the piping and RPVs found in the plants that have ASME Section XI ISI programs. PDA identified the general requirements for material selection, flaw size and frequency, flaw characteristics and placement, flaw implantation, nondestructive examination, mounting and handling, quality assurance, and confidentiality. PDI and PDA maintained the records generated during the course of specimen design, material procurement, special fabrication processes, in-process and final inspection, and other related activities. From these general requirements, PDI selected the flaw configurations to be included in the various fabricated test specimens. After flaw selection, the responsible PDI engineer developed a specific manufacturing specification, flaw specification, and any drawings appropriate for the fabrication of that specimen.

To control the specimen fabrication process, PDI developed Procedure PDI-G-001 (Rev. 2, dated June 1, 1993), "Preparation, Use and Control of Process Control Sheet." Process Control Sheets (PCSs) are used to document various examinations, measurements, or tests of materials, and processing as necessary (PCSs are the same as shop work orders). A PCS is prepared by the responsible engineer who is familiar with the processing. The PCS is a task comprised of sequential steps and work instructions necessary to accomplish the finished fabrication and may be described as a separate document, drawing, model, or a combination of these. Upon completion of each step, the individual performing the work is required to initial and date the PCS. There are multiple tasks for each fabrication. After completion of each task, there is a QA review and an approval by the responsible engineer. These PCSs are filed as part of the permanent manufacturing records for that particular fabrication.

To maintain the necessary documents for the specimen fabrication process, PDI developed Procedure PDI-Q-013 (Rev. 2, dated April 8, 1994), "PDI Specimen Manufacturing and Examination Records Indexing." The procedure gives the instructions for controlling the manufacturing and examination documents and for establishing the basis for quality control records and record retention. Using the PDI Specimen Quality Records Checklist, the responsible engineer identifies and verifies that the necessary quality assurance records have been completed. The verification is the last step in the specimen fabrication process.

After specimens are fabricated, the PDI and PDA program shifts to the performance demonstration process.

2.4.1.2 Handling, Storage, and Shipping of Specimens

To control the tracking of specimens, PDI developed Procedure PDP-Q-013 (Rev. 0, dated October 1, 1993), "Handling, Storage, and Shipping." The procedure describes the controls, handling, storage, and shipping of PDI test specimens. Although most of the performance demonstration is conducted at the EPRI NDE Center, the performance demonstration may be performed at other locations (i.e., the licensee or vendor facility). The test specimens are kept under the direct control of PDA or under the control of an organization that has a PDA-approved security plan.

When test specimens are transported to remote testing locations, PDA uses commercial trucking firms. Included in the shipping order is a PDA statement stipulating that the test specimens be shipped as a single load. The single-load requirement avoids delays and prevents breaking the door seals before the truck reaches the test site. Unless other arrangements are made, a designated supervisor, level III examiner, or a member of the PDA off-site team will meet the shipment of test specimens upon arrival at the ISI vendor's facility, the EPRI NDE Center, or the host utility. After delivery, the test specimens are stored in designated and controlled-access storage areas or the PDA secured laboratory. Large test specimens are covered and secured with locked chains or cables. The cover over the specimens prevents visual and nondestructive examination.

2.4.1.3 Handling Specimens During the Demonstration

To control test specimens during the demonstration, PDI developed Procedure PDP-Q-008.1, "Control and Security of Test Specimen," which describes the essential elements for maintaining the security of PDI test specimens during the performance demonstration process. At the EPRI NDE Center and off-site testing facilities, the specimens are continually monitored. Monitoring is predominantly performed by PDA staff and is supplemented with remote video recording and display equipment.

When the test specimens are not in use, they are located in designated storage areas that are secured and monitored. Routine surveillance is performed by PDA staff during handling, storage, and testing of the specimens and test information. The surveillance verifies that requirements are being met for the security and control of the test specimens and that confidentiality of the test information is maintained. The surveillance is implemented in accordance with written procedures and documented on checklists.

2.4.1.4 Handling, Storage, and Shipping of Data

To control the information generated during the performance demonstrations, PDI developed Procedure PDP-Q-008, "Security of Test Specimen Information." The procedure describes the essential elements for maintaining the security of confidential information during the performance demonstration process. Access to test data is controlled similar to access to test specimens (i.e., all access to confidential information is controlled by the PDI project manager). Once access is granted, a "Signature Authority/Authorized Access Log" is filled out and posted in the records room. Access is granted only on a need-to-know basis. When information is not in use, it is locked in cabinets, rooms, or key-controlled computers. PDI and PDA computers are protected with password access. At remote locations, data generation and information handling are controlled by the PDA-approved security plan. PDA requires that the security plan be signed by a company officer or representative.

Automated UT systems that are used for performance demonstrations are examined by the PDA staff for security and data-generating aspects. Only the files necessary for conducting the performance demonstration are allowed on the computer systems. The proctor verifies files on the computer by running a directory of the computer files necessary for performing performance demonstrations and recording the remaining file storage space. The data disks are kept under positive control of the PDA proctor who is assigned surveillance responsibility.

During an automatic UT examination of an RPV conducted per the PDI program, PDI demonstrated the site-specific computer security to the team. Data generated from the scanning transducer was collected on a removable hard drive. The hard drive was removed and placed into a second computer system that transferred the data to a compact disc (CD). After the accuracy of transferred data was verified for integrity, the hard drive was reformatted and placed back into the data-gathering equipment. The CD now contains all the raw data. PDI does not make copies of the CDs containing the raw data. By contractual agreement, PDI retains the control of and access to the CDs. If a licensee or contractor wanted data storage control by an entity other than PDI, PDI would stipulate that the licensee or contractor would not have access to the data without PDI involvement. PDI does not accept responsibility for the integrity of the data on the CDs.

The use of alternative methods for achieving the required level of test information security is permissible provided PDA approves the alternative. The security should be provided by an independent organization other than the organization of the candidate taking the examination. Alternatively, where special conditions exist that preclude meeting security requirements, the individual must provide a suitable method of achieving the level of security required by PDA.

2.4.1.5 Conclusions

PDI and PDA have devised and implemented an effective security program to maintain the integrity of the test specimens, to prevent compromise during testing, and to protect test data. The security program satisfies the requirements of Appendix VIII. No areas for improvement were identified for the security procedures that were reviewed.

2.4.2 Qualification Records

Appendix VIII, Subsection 5000, "Record of Qualification," requires that PDI and PDA maintain qualification records. The team reviewed the storage and retrievability of performance demonstration records. The review covered the performance test results for candidates, the equipment used by candidates, locating data in the files, and identifying potential problem areas. The objective was to examine the difficulties associated with a response to an external inquiry for a candidate's qualification.

2.4.2.1 Candidate and Equipment Performance Demonstration Sheets

To document the performance demonstration results, PDA developed Procedure PDP-Q-017 (Rev. 1, dated March 23, 1994), "Quality Assurance Records." This procedure describes the storage, identification, retention, and retrievability of performance demonstration QA records.

The performance demonstration Qualification Data Summary Sheet (QDSS) identifies the material categories, thicknesses, and diameters that have been successfully demonstrated by the candidate, including the appropriate ASME Code and equipment. The QDSS has an easy-to-read format except for IGSCC entries. IGSCC entries are identified by asterisks, and the meanings of these entries are explained as footnotes at the bottom of the summary sheet. Occasionally, PDA would take an exception to specific items in a candidate's procedure. Any exceptions are also explained at the bottom of the summary sheet.

The essential variables associated with the equipment used by the candidate during the performance demonstrations are listed on the Qualification Calibration Data Sheets (QCDS). The QCDS form is filled out by the candidate. The QCDS is combined with the performance demonstration Qualification Surveillance Checklist (QSC). The QSC is an aid for the PDA proctor to verify that the individual performing the demonstration is following the procedures and equipment settings and filling out the paperwork completely. Each entry is initialed by the PDA proctor overseeing the demonstration. The QCDSs are filed with the candidate's records that also contain the candidate's work sheets. As described in Section 2.3.2 above, the reconstruction of the test sequence and various qualifications is difficult.

A compilation of QCDSs is used as the input for the QCDS summary. The QCDS summary identifies the qualified range for each essential variable. The QCDS summaries are not in the equipment files, but are listed in the generic procedure files. The generic procedures are routinely revised as new equipment and essential variable settings are added.

2.4.2.2 Performance Demonstration Qualification Summary (PDQS)

By combining the data from the QDSS and QCDS, PDA creates a Performance Demonstration Qualification Summary (PDQS). Currently, these sheets have to be compiled from the individual's file and the equipment qualification file, which is a labor-intensive task. PDI is in the initial phase of creating electronic retrieval of the data. PDQSs are not available for candidates at this time. To date, PDA has issued to one organization four procedure PDQSs for equipment and UT procedures used in a performance demonstration. These qualifications were for a company outside the United States (U.S.) and not applicable to the U.S. nuclear industry.

Upon completion of the demonstration, PDA will verbally notify the candidate and the candidate's employer of the test results. PDA does not intend to issue PDQSs until after January 1996. A utility can verify a candidate's qualification by calling PDA. While PDA has committed to provide the information over the phone, the protocol has not been finalized. In the future, the PDQSs will be stored on an electronic database with backup files. The backup file will be sent to off-site storage areas. After the electronic database is created, the current paper file system would remain in place and be available for data retrieval.

PDI procedures state that lifetime records will be maintained for the useful life of the project as directed by the PDI Steering Committee; that is, the continued existence of lifetime records is dependent upon PDI's continued existence. Lifetime records consist of the PDQS for the equipment, individuals, and procedures. Nonpermanent records are kept for 7 years. Typical nonpermanent records are candidate and organizational use agreements, candidate registration forms, the supporting data with actual test results used to compile the PDQS records, other checklists, and QC examinations.

In the event that changes are made to the acceptance criteria that affect the test results, PDA would review the data and issue a revised PDQS, if appropriate. The ease of retrieving the data is discussed in Section 2.3.2. Any major change would require the same level of review as the original PDQS.

Besides obvious difficulties with the paper-intensive filing system, an additional burden will be placed on utilities when equipment or procedures are revised or an equivalent procedure evaluation is made. The utilities will have to perform an evaluation of the new revision against the revision that the candidate used for qualification; the same is true for equivalent procedures.

2.4.2.3 Conclusions

The performance qualification records satisfy the requirements of Appendix VIII. However, the team identified areas for improvement. The format of the Qualification Data Summary Sheet (QDSS) could cause a reviewer to miss the exceptions to the test or misidentify a candidate's IGSCC qualifications. Furthermore, the current method of finding each Qualification Calibration Data Sheet (QCDS) in the candidate's file and comparing it to the equipment files is cumbersome. These are currently considered areas for improvement that may be addressed when the electronic data retrieval system is in place. The areas will be tracked as item 95-01-08.

2.5 Performance Demonstration Testing of Piping Materials

The team reviewed the performance demonstration test sequence that is offered to candidates for piping material. The review included assembling the test sets, conducting the tests, scoring of results, retesting, discussing the results, and reporting the information. As part of the review, the team evaluated the performance demonstration against the requirements of Supplements 2, 3, and 12 to Appendix VIII.


2.5.1 Test Set Design

The test sets are designed with a computer program containing the requirements of Appendix VIII, manual input of the candidate's qualification request, and manual input of available specimens. The candidate's qualification is defined by the material type, the wall thickness, the pipe diameters, the flaw types, the access conditions, and the number of unflawed grading units needed. As specimens are selected, the contributions they make to the test set are noted and subtracted from a list of the requirements. Specimens are selected until the matrix is complete. The computer is a useful tool for ensuring that all of the Code requirements and candidate's needs are met. Separate computer runs are used to design the detection test set and the sizing test set.
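The report describes the selection logic only at this level of detail; a minimal sketch of such a greedy selection loop, with hypothetical condition names, specimen records, and counts (none of them from the actual PDI program), might look like:

```python
# Hypothetical sketch of the test-set assembly loop described above: each
# specimen contributes counts toward required conditions, and specimens are
# selected until every requirement in the matrix is satisfied.

def assemble_test_set(requirements, available_specimens):
    """Greedily pick specimens until every condition count reaches zero."""
    remaining = dict(requirements)      # e.g. {"flawed_units": 3, ...}
    selected = []
    for specimen in available_specimens:
        if all(n <= 0 for n in remaining.values()):
            break                       # matrix complete
        # Does this specimen contribute to any unmet requirement?
        if any(remaining.get(cond, 0) > 0 for cond in specimen["conditions"]):
            selected.append(specimen["id"])
            for cond, count in specimen["conditions"].items():
                remaining[cond] = remaining.get(cond, 0) - count
    unmet = {c: n for c, n in remaining.items() if n > 0}
    return selected, unmet

# Illustrative candidate request and specimen inventory (invented values):
requirements = {"flawed_units": 3, "unflawed_units": 2, "igscc": 1}
specimens = [
    {"id": "P-01", "conditions": {"flawed_units": 2, "igscc": 1}},
    {"id": "P-02", "conditions": {"unflawed_units": 2}},
    {"id": "P-03", "conditions": {"flawed_units": 1}},
]
selected, unmet = assemble_test_set(requirements, specimens)
print(selected, unmet)  # -> ['P-01', 'P-02', 'P-03'] {}
```

Running such a loop twice, once against the detection requirements and once against the sizing requirements, would correspond to the separate computer runs mentioned above.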

Not all the conditions being demonstrated by the candidates are explicitly required by the Code, for example, IGSCC and single-side access cracks (cracks located on one side of the weld that are inspected from the opposite side of the weld). The addition of IGSCC to the performance demonstration program is a PDI effort to satisfy the demonstration testing contained in NUREG-0313, Revision 2. These non-Code conditions were included in the baseline test set, with the number of specimens per condition being arbitrarily determined by PDI.

In the PDI program for Supplement 12 of Appendix VIII, the candidates are required to detect the minimum number of flaws as shown in the appropriate supplements. From this flaw selection, the candidate can qualify for detection in piping for the following conditions: minimum and maximum diameter, thickness range, ferritic, austenitic, far-side examination, and IGSCC. In addition, 50% of the flaws must be associated with geometry. Multiple conditions may exist within one test specimen. The lumping of conditions can have a negative compounding effect on the candidate's success rate.

A review of test sets showed that some candidates' test sets had more than one specimen with no flaws. The absence of flaws in some of the test specimens adds authenticity to the performance demonstration program.

2.5.2 Test Execution

The PDA conducts performance demonstrations in special laboratories at the EPRI NDE Center. When testing is in progress, the labs are maintained under continuous surveillance, with a PDA proctor and TV camera monitoring systems overseeing the testing process. To ensure the security of the tests, the candidates cannot remove paperwork from the testing laboratory and are not to disclose test information while participating in the performance demonstration. In addition, the specimens for a given test condition were made to appear similar with no visible permanent markings, and the large number of piping specimens ensures that candidates from the same company do not use the same test sets.


The candidates are restricted to a maximum of 4 hours per specimen to complete the detection requirements. After testing each specimen, the candidate submits paperwork generated during the test to the PDA proctor. The candidate explains to the PDA proctor the reasoning for declaring a detected flaw. To support the detection, the candidate uses the tests and associated signals. The PDA proctor uses data forms to enter comments and verify test equipment.

2.5.3 Scoring Test Results

Using a test-specific code sheet, PDA manually compares detection test results for the specimens used against the associated specimen's true-state drawings. The team reviewed results from a sampling of candidates and found that data interpretation and scoring were straightforward. The grading units were consistent with ASME Code requirements.

The scoring of the length sizing is determined by the number of observations (n) and the measured length of each crack (m) as compared to the true state (t) using the root mean square error (RMSE) statistic. Neither circumferential offset nor axial offset in the flaw location is considered in the RMSE statistic. In a similar manner, the depth sizing is evaluated using the RMSE statistic. The ASME Code has begun including the RMSE in the supplements of Appendix VIII. For example, Supplement 12 of Appendix VIII gives the RMSE as follows:

    RMSE = [ (1/n) * SUM(i = 1 to n) (m_i - t_i)^2 ]^(1/2)

where m_i is the measured value and t_i is the true-state value for the i-th observation.
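In code, the RMSE computation amounts to the following minimal sketch (the measurement values are invented for illustration, not demonstration data):

```python
import math

def rmse(measured, true_state):
    """Root mean square error: RMSE = sqrt((1/n) * sum((m_i - t_i)^2))."""
    if len(measured) != len(true_state):
        raise ValueError("need one measurement per true-state value")
    n = len(measured)
    return math.sqrt(sum((m - t) ** 2 for m, t in zip(measured, true_state)) / n)

# Hypothetical length-sizing calls (inches) against true-state lengths:
measured = [2.10, 3.45, 1.80, 4.05]
true_state = [2.00, 3.50, 2.00, 4.00]
print(round(rmse(measured, true_state), 3))  # -> 0.117
```

Because each error is squared before averaging, a single large miss dominates the statistic, which is why one significantly undersized crack can fail an otherwise accurate candidate.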

2.5.4 Retesting and Expansion

As the testing progresses, the candidate submits test results for evaluation to the test proctor. If a candidate demonstrates insufficient UT capabilities during the test, the testing may be stopped and the candidate may be directed to spend time on practice samples before starting a retest. This practice makes the testing more cost effective and efficient for candidates. The QA protocol that was developed (PDP-I-009.5 and PDP-I-009.6) identifies a set of conditions (e.g., failing a category of pipe wall thickness) in which a limited retest could be performed; however, limited retesting is seldom done. In general, candidates pass or fail with very few borderline cases.

Once a candidate successfully passes a set of conditions, the candidate can request to be tested on other conditions. For instance, after passing a dual-side access condition test, the candidate can request a single-side access test set. The requirements are spelled out in the relevant PDI documents (for instance, PDP-I-009.6.1 for Supplement 4 for single-side access identifies the minimum number of flaws as 5) and include the minimum and maximum flaw size. Other parts of the PDI documents cover other categories and Supplements.

2.5.5 Feedback to Candidates that Fail a Test

The feedback process was not observed for an actual candidate; however, it was simulated. In the simulation, an NRC NDE technician took an abbreviated performance demonstration. Following the testing, the PDA staff provided feedback about the detections, missed flaws, and false calls. The feedback focuses on the kinds of conditions represented by the specimens. The candidates are expected to use the feedback to identify their weaknesses. For retesting, PDA selects different test sets than what the candidate used before.

2.5.6 Reporting of Test Results

Ultimately, the performance of the candidate will be formally documented in the PDQS. The discussion on the PDQS is in Section 2.4.2.2 above. Currently, test results are documented in individual files for each candidate.

2.5.7 Findings

Substantial work has gone into developing the PDI/PDA program. PDI and PDA analyzed the IGSCC Coordination Plan testing program for BWR IGSCC qualification. The major team observation of the IGSCC Coordination Plan program was that candidates taking the demonstration tests could skew the results to increase the probability of guessing an acceptable answer (testmanship). To avoid the shortcomings of the IGSCC Coordination Plan program, the PDI program was designed to minimize testmanship. Some features of the PDI program are intended to cause candidates trying to use testmanship to fail the demonstration. Such features include use of multiple flaws in a single specimen, use of one or more entirely blank specimens, unknown number of total flaws in a test set, number of blank grading units being used, sizes of the grading units, length sizes of cracks, location and characteristics of the counterbore, and crack sizes in the specimen set as a function of the pipe wall thickness. These conditions are scattered throughout the test and represent a significant challenge to the candidate.

The Code has used a hierarchical approach for testing of piping. In this approach, a candidate first demonstrates acceptable performance on austenitic pipe and then can include ferritic pipe with the addition of 3 flawed grading units and 6 unflawed grading units. The candidate must classify all 9 grading units correctly to successfully pass the ferritic portion of the detection test. The addition of the 3 ferritic pipe flaws does not contribute to the flaw grading unit count shown in Appendix VIII, Table VIII-S2-1. However, the PDI program uses this approach only for ferritic piping.

The Code does not specifically address IGSCC or single-side access acceptance criteria in the supplements; therefore, PDI determined what constituted a successful performance demonstration. When the categories of IGSCC and single-sided access examinations are included, the PDA program integrates these conditions as part of the detection and length sizing test. Under PDI's present program, a test set for Supplement 2, which includes these categories, will consist of at least three IGSCC flaws and no less than three single-side access flaws. In addition to the normal Code acceptance criteria, in order to be considered qualified for IGSCC or single-sided access, the candidate is not allowed to have more than one missed detection in either the IGSCC or the single-side access category. If the candidate passes the Supplement 2 acceptance criteria but does not meet the IGSCC or single-side access requirements, a Supplement 2 qualification is gained in austenitic material only. The candidate is reevaluated in the failed category(s) by being tested on additional flaws in the applicable category(s). The ability to qualify with both of the allowed misses in one or the other of these conditions will be tracked as item 95-01-09.

The Code uses an acceptance tolerance for depth sizing dubbed "critical undersizing." PDI has developed a position paper (PDI Position 94-002) and is recommending changes to the Code that depth sizing of cracks in piping be evaluated using a 0.125-inch root mean square error (RMSE) statistic. The critical undersizing criteria in the Code for depth sizing would greatly reduce the number of candidates passing the performance demonstrations. At the time of the review, the RMSE criteria was met by 66 candidates; if PDA had used the critical undersizing criteria, 17 of the 66 candidates would have failed. In all cases, one crack was significantly undersized. Some of the reported values were close to the limit for passing the demonstration, and the remainder were progressively worse. Since the sizing error results in undersizing, it is nonconservative and may lead to inadequate corrective action. The use of the RMSE statistic provides a measure of performance in the field, whereas the use of critical undersizing requirements generally results in candidates trying to avoid an undersizing error by inflating the reported values (i.e., testmanship).

The cracks that have caused candidates the most difficulty are mid-wall IGSCC cracks. Sizing techniques for shallow and deep cracks are quite effective, but sizing the mid-wall cracks is more difficult and is based on finding the deepest crack tip. With the mid-wall IGSCC specimens having such a profound effect on the pass rate, PDI and PDA should reexamine the procedures for adequate detail. This issue will be tracked as item 95-01-10.
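The difference between the two acceptance approaches can be illustrated with a short sketch. The 0.125-inch limit is the RMSE value recommended in PDI Position 94-002 as cited above; the per-crack undersizing tolerance shown is a placeholder for illustration only, not the Code's actual critical undersizing criterion:

```python
import math

RMSE_LIMIT = 0.125           # inches, per PDI Position 94-002 (as cited above)
UNDERSIZING_TOLERANCE = 0.1  # inches -- placeholder, NOT the actual Code value

def passes_rmse(measured, true_state, limit=RMSE_LIMIT):
    """Accept when the aggregate depth-sizing RMSE is within the limit."""
    n = len(measured)
    value = math.sqrt(sum((m - t) ** 2 for m, t in zip(measured, true_state)) / n)
    return value <= limit

def passes_critical_undersizing(measured, true_state, tol=UNDERSIZING_TOLERANCE):
    """Fail if any single crack depth is undersized by more than tol."""
    return all((t - m) <= tol for m, t in zip(measured, true_state))

# Hypothetical depths (inches): one crack undersized by 0.20 in, the rest accurate.
measured = [0.50, 1.20, 0.55, 0.90]
true_state = [0.52, 1.18, 0.75, 0.88]
print(passes_rmse(measured, true_state))                  # -> True  (RMSE ~0.10)
print(passes_critical_undersizing(measured, true_state))  # -> False (0.20 in miss)
```

As the example shows, a candidate can satisfy an aggregate RMSE criterion while still significantly undersizing one crack, which is the nonconservative behavior the team describes.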

2.5.8 Coachseless Overall, the performance demonstratiori program for piping welds has a number of positive features. It has been designed to minimize testmanship, and the specimens require that candidates have a good procedure and extensive knowledge in UT to successfully pass the test. However, the manner in which PDA includes IGSCC and 28


single-sided access in the test set could allow candidates to pass who are not effective in these areas. The team concluded that this position should be reevaluated (see Item 95-01-09). The intent of the Code is that a candidate demonstrate effectiveness for the more difficult inspection conditions and that a small number of the easier conditions be added to confirm that the candidate is effective for the less difficult conditions (i.e., 3 flawed and 6 unflawed ferritic grading units added to the austenitic test), and can correctly evaluate all of the grading units.

The 1992 Edition of ASME Section XI, Appendix VIII, included RMSE depth sizing criteria for piping. The RMSE provides in a single statistic a good measure of sizing performance when based on multiple correct characterizations of flaws for each tested condition. However, if the error that a candidate makes is largely associated with a single measurement, substantial undersizing can occur. Candidates need to better understand the cause of this undersizing so that they do not make this type of mistake as inspectors in the field.

The team favored the RMSE method of determining acceptability of sizing results over the critical undersizing criteria that existed in the 1989 Edition of ASME Section XI, Appendix VIII. However, the final determination with the actual RMSE value remains with PDI and the Code.

2.6 Performance Demonstration Testing of RPV Materials

The team reviewed the performance demonstration testing for RPVs that is offered to the candidates. The review included designing the test sets, conducting the tests, scoring the results, retesting, explaining results, and reporting performance demonstration information.


As part of the review, the team evaluated the performance demonstration of RPVs against the requirements of Supplements 4 and 6 to Appendix VIII. This performance demonstration has been conducted only with automated UT equipment.

2.6.1 Test Set Design

The test specimens for the RPV are designed to meet the requirements of the Code. There are 9 specimens; however, the test grid locations can be redefined at will, thus creating a large number of test set combinations. Using a black pen, the test grids are predrawn on a specimen, then the candidate is instructed to inspect selected grids. The grids can be redrawn and different grids can be selected on a given specimen for retests. A grid can be subdivided into zones.

Some companies with automated equipment are examining the entire specimen, then storing the data for later performance demonstration examinations. The stored data is under the control of PDA. To provide a large number of test combinations, PDA selects the starting locations in the stored data.

2.6.2 Test Execution


The RPV specimens may be examined at the EPRI NDE Center or at an off-site location.

For off-site examinations, the inspection organization uses a security plan acceptable to PDA. During the inspections, PDA provides a proctor to observe the testing process and scrutinize the candidate's data.

While analyzing the data collected on a zone, candidates are expected to maintain a complete set of paperwork supporting any of their findings. The candidate presents in detail the data for each declared flaw to the test proctor. The test proctor confirms that the procedure has been followed and that detailed location, sizing, and classification information is given. Since most of the equipment used for RPV ISI is automated, the data packet is compiled for each zone analyzed. The candidate includes the images showing the flaws in the paperwork. Several members of the team observed a candidate completing an inspection analysis packet and being interviewed by the test proctor.

2.6.3 Scoring Results

The team evaluated scoring results by reviewing the data of some candidates that have gone through the PDA program. The grading unit concept used for piping test sets is not used in the RPV test sets. For the RPV tests, PDI defines the detection of a flaw as being within 1 inch of the flaw's true location. The tolerance used by PDI is double the tolerance of Appendix VIII, Supplement 4. PDI has not initiated any action with the Section XI Committee on the tolerance changes. This issue will be tracked as Item 95-01-11.

In addition, PDA has found systematic errors associated with equipment setup and measurements that produced an identifiable shift of all the flaws in a recognizable pattern.

If PDA determines this to be the case and the systematic shift is within an additional 1 inch of the true-flaw location plus tolerance, PDA may declare a detection. The 1-inch tolerance plus 1-inch systematic shift quadruples the tolerance of Appendix VIII. The use of the systematic shift can be thought of as an "engineering fudge factor" for variables not associated with the candidate's skills. As equipment, technique, and procedures incorporate the appropriate adjustments, the systematic shift should become nonexistent.
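The detection-scoring tolerance described above can be sketched as follows. The 1-inch base tolerance and the additional 1-inch credit for an identified systematic shift come from the text; the one-dimensional positions, the function, and its parameter names are invented for illustration.

```python
def scored_as_detected(reported, true, systematic_shift=0.0,
                       base_tolerance=1.0, shift_limit=1.0):
    """Sketch: does a reported flaw position count as a detection?

    A flaw counts as detected if the report falls within the base tolerance
    of the true location, after removing any identified systematic shift.
    The credited shift is itself limited to an additional 1 inch.
    """
    if abs(systematic_shift) > shift_limit:
        return False  # shift too large to be credited
    return abs((reported - systematic_shift) - true) <= base_tolerance

# Within the 1-inch base tolerance
print(scored_as_detected(12.8, 12.0))
# Outside the base tolerance, but credited once a 0.9-inch systematic shift is removed
print(scored_as_detected(13.8, 12.0, systematic_shift=0.9))
```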

2.6.4 Retesting and Expansion

The same comments apply as in the case of piping (see Section 2.5.4 above). However, since automated equipment is used in the data collection, and many of the grids on a specimen are examined for a given scanning operation, a new test set can be identified for the candidate to analyze without rescanning the specimen. Some zones between the two test sets may overlap, but most will not.


2.6.5 Feedback Provided to Candidates

The feedback process for RPV is similar to that which is used for candidates that take the piping test.

2.6.6 Reporting of Test Results

The same comments apply as in the case of the piping. Results will be reported on the PDI form; since this form has not been developed, it was not reviewed and evaluated.


2.6.7 Conclusions

In the past, UT procedures were very flexible with little detailed guidance. The procedures associated with the PDI program and the intent of Article VIII-2000 of Section XI require significant detail. The UT procedures being submitted to PDA for procedure qualification are deficient in detail, resulting in delays. One shortcoming is that some essential variables and essential variable ranges are omitted. Another shortcoming is the absence of a method or criteria for discriminating between indications.

As discussed in Section 2.1, for generic procedures, application-specific procedures should contain sufficient detail to minimize variations between the performance demonstration and the field implementation. In reviewing the final procedures that the companies used, the team found that they met the minimum requirements of Appendix VIII, but additional information would be necessary to enable confirmation that field implementation is consistent with the performance demonstration inspection (see Item 95-01-01).

Appendix VIII contains a list of the essential variables. The difficulty PDA has in meeting Appendix VIII requirements is that the essential variable list is too rigid. Some of the variables that are necessary for detectability, reproducibility, and repeatability when applying new technologies are not on the list. The software used with a system is a variable that should be verifiable. These necessary variables are considered essential to the performance of the procedure and should be addressed. This is considered to be a problem that the code committee should address.

One area that is unclear is how to document that software used in the performance demonstration test is correlated to software used in the field (i.e., is the software the same as that used during the performance demonstration). To address this concern, PDA decided that the candidate must provide a copy of the software being used so that, if questions arise in the future, the version maintained on file by PDA can be compared to the version being used in the field. This will not be an easy comparison, and there is no easy method to address this area unless the software is copyrighted and documented in PDA records. This area of improvement will be tracked as Item 95-01-12.


The team is aware of the tolerances used for scoring a detection in RPV performance demonstrations; however, in the absence of a PDI position paper, the team did not perform a review on the subject (see Item 95-01-11).

PDI has developed position papers for length sizing of cracks in RPVs (PDI Positions 94-005 and 94-008) and is recommending changes to the code committee that a tolerance of 0.75-inch root mean square error (RMSE) be used. The team does not take exception to PDI's position.

One area of concern is that the performance demonstration test is being conducted using scanners that have been modified to scan just for testing (i.e., these types of scanners are not used to perform UT in the field). The field scanners are radioactive from previous ISI and cannot be used on PDI specimens. The specific concern is that there should be a means to prove that the improvised scanners used in the performance demonstrations are not superior to the scanners used for the field ISI so as not to compromise qualifications. This issue will be tracked as Item 95-01-13.

2.7 Bolting Examination

Examination of bolting under the PDI program will be performed in accordance with instructions in Procedure PDP I-009.8 (Rev. 0), "Test Specimen Selection, Grading and Retest Instruction," and the PDI QA program. The procedure is in the final review process and as such lacks official document status. However, the team ascertained from PDI and PDA that approval of the procedure in its present form is expected shortly.

In the procedure, pre-established parameters for particular organizations are used for specimen selection. A specimen test set contains at least three different bolt or stud diameters and lengths with a minimum of five flaws per test set. Each test set would contain one flaw per location, including circumferential notches located at minimum and maximum metal paths of the specimen (i.e., thread root, shaft, and bore holes). EDM notches would be located on the outside thread surface locations and inner bore hole surface of studs with maximum depths and reflective areas as specified in the procedure. To ensure a "blind" qualification, notches and specimen identification would be obscured from the candidate, an examination time limit, as prescribed by the procedure, would be enforced, and no specimen pre-service data would be available to the candidate.

Acceptance criteria for a successful demonstration would consist of minimum detection and false call requirements. A successful detection demonstration would depend on the number of flaws within a test. For example, in a test set with five flaws, a minimum of five flaws would have to be detected with no false calls allowed. In a test set where the maximum ten flaws are included, a successful demonstration would require eight indications to be detected with two false calls permitted. Limits defining false calls are as follows:

• Notch axial location: within ± 1/2 inch or 5% of bolt or stud length, whichever is greater.


• Flaw circumferential position: detection is within ± 60° of flaw centerline location.

Retest following an unsuccessful demonstration is permitted following debriefing on the factors contributing to the failure. A retest would contain a test set with at least 75% new flaws. If unsuccessful during the second demonstration, a waiting period of 30 days is required before another attempt is made. At the time of this review, PDI was developing fixtures and the administrative program to handle this activity. The above procedure and the bolting specimens observed by the assessment team, along with the administrative controls as described by the PDA, appear to be satisfactory for the activity. The limits for the axial location of notches comply with Appendix VIII, Supplement 8, acceptance criteria.
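The two bolting acceptance points stated above (five flaws: all detected with no false calls; ten flaws: eight detected with up to two false calls) can be sketched as a lookup. Criteria for intermediate set sizes are not given in the text, so this hypothetical sketch covers only the two stated cases; all names are invented for illustration.

```python
# Hypothetical sketch of the bolting acceptance criteria described above;
# only the two test-set sizes stated in the procedure description are covered.
CRITERIA = {
    5:  {"min_detections": 5, "max_false_calls": 0},
    10: {"min_detections": 8, "max_false_calls": 2},
}

def bolting_demo_passes(set_size, detections, false_calls):
    """Return True if the demonstration meets the stated criteria."""
    try:
        c = CRITERIA[set_size]
    except KeyError:
        raise ValueError("criteria for this set size are not stated in the procedure")
    return detections >= c["min_detections"] and false_calls <= c["max_false_calls"]

print(bolting_demo_passes(10, 9, 1))  # meets the ten-flaw criteria
print(bolting_demo_passes(5, 4, 0))   # one miss fails the five-flaw criteria
```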

2.8 ASME Code

The PDI program was developed to meet Appendix VIII to the 1989 Edition with 1989 Addenda for the fabrication of specimens and to the 1992 Edition with 1993 Addenda of Section XI of the ASME Code with application of certain Code cases for the demonstration program.

2.8.1 PDI/PDA Exceptions to ASME Code

The team reviewed the exceptions to Appendix VIII where the requirements have been identified as impractical.

2.8.1.1 Items Reviewed

PDI is complying with Appendix VIII with the exception of impractical requirements identified during implementation of the appendix. The impractical requirements relate to specimen security, testing tolerances, and training, which are expected to be the subject of future Code cases. The team reviewed the technical justification and reasonableness of the alternatives developed for the impractical requirements.

Situations not specifically addressed by the Code were being addressed in the PDI program by the PDI steering committee and subcommittees. A summary of the impractical requirements with the team's conclusion for each position is contained in Table 2.


Table 2. Summary of PDI's Alternative Positions (team's conclusion in parentheses)

94-001: Added tolerance to qualification diameter and thickness ranges. (Does not take exception.)
94-002: Change tolerance for length to 0.75" RMSE and depth to 0.125" RMSE, piping. (Does not take exception.)
94-003: Change tolerance for notch width to a tip dimension. (Does not take exception.)
94-004: Added tolerance to qualification diameter and thickness ranges. (Does not take exception.)
94-005: Change length tolerance to 0.75" RMSE, RPV. (Does not take exception.)
94-006: Change from sizing a specific location on piping to a region. (Does not take exception.)
94-007: Masking is not required. (See 2.8.1.2 below.)
94-008: Change length tolerance to 0.75" RMSE, clad RPV. (Does not take exception.)
94-009: Change number of flaw depth ranges. (See 2.8.1.3 below.)
94-010: Change depth tolerance to 0.15" RMSE, RPV cladding/base metal interface; 0.25" RMSE for the remainder of the RPV. (Does not take exception.)

2.8.1.2 PDI Position No. 94-007

This PDI position addressed the requirements in Supplements 4 and 6 of Appendix VIII to have specimen identification and flaw locations obscured or masked. The technical explanation contained in the PDI position provides insufficient information to support it. The team does not take exception to the concept because the cladding on the surface of the RPV covers the flaws. However, the explanation should contain more detail and the suggested wording should be less ambiguous.


2.8.1.3 PDI Position No. 94-009

This position changed the flaw depth from three depth ranges to two depth ranges. The change was to minimize testmanship as a factor for success in performance demonstration. The team does not take exception to the change in ranges; however, as part of the suggested change, PDI should include flaws that would satisfy the acceptance criteria of Table IWB-3410-1. The test sets examined by the team had at least one flaw that satisfied IWB-3410-1.

2.8.1.4 Deferring Use of Appendix VII

Appendix VIII, Paragraph VIII-2200, requires candidates to meet the personnel qualification requirements contained in Appendix VII. PDI defers the candidates' prerequisites to the sponsoring agency or employer. PDI stated that in some areas, candidates who do not meet Appendix VII requirements are being tested. To comply with Appendix VIII, all candidates participating in the current PDI program may require a waiver or relief from the Appendix VII requirement. Further review of this issue may be necessary.

2.8.1.5 Changing RPV True Location Tolerance

Appendix VIII, Supplement 4, Paragraph 2.1(b), requires flaws to be reported within 1/2 inch of their true location. PDI is using a 1-inch tolerance from true location and in some cases up to 2 inches as being acceptable. PDI should develop a position paper and initiate a Code case.

2.8.2 Merging the IGSCC Program into the PDI Program

During the formative stages, PDI developed the agreement with the NRC that the appropriate parts of the "IGSCC Coordination Plan" could be included in the PDI program. In the merging of the two programs, PDI instituted changes that could influence the effectiveness of both programs. The team examined PDI's methods and test results associated with the incorporation of the IGSCC program into the PDI program.

Comparison of NUREG-0313, Revision 2, to the PDI Program

NUREG-0313 recommends that UT examiners (candidates) be given formal performance demonstration tests, such as those prescribed in IE Bulletins 82-03 and 83-02.* The bulletins prescribed that IGSCC examiners accurately detect and size eight out of ten IGSCC flaws.


* IE Bulletin 82-03, "Stress Corrosion Cracking in Thick-Wall, Large-Diameter, Stainless Steel, Recirculation System Piping at BWR Plants," October 14, 1982; IE Bulletin 83-02, "Stress Corrosion Cracking in Large-Diameter Stainless Steel Recirculation System Piping at BWR Plants," March 4, 1983.


PDI selected the number of IGSCC flaws for inclusion into the sample set on the basis of their understanding of the requirements of Appendix VIII, Supplement 12.

The method for adding IGSCC flaws to Supplement 2 or 12 is more tolerant than the method for adding ferritic flaws to the same supplements. In PDI's opinion, the number of IGSCC flaws contained in the test set is sufficient to ensure that inexperienced candidates will fail the examination.

PDI considers their testing of IGSCC more restrictive than NUREG-0313 because of (1) the different grading requirements and (2) the requirement that each flaw detected by the candidate be verbally explained to the PDA proctor using supporting paperwork. The verbal explanation reduces guessing and identifies poorly prepared candidates. The pass-fail data shows that only 2 out of 75 candidates passed the IGSCC portion of the examination by detecting one out of three IGSCC flaws while correctly detecting all the remaining carbon steel and stainless steel flaws. PDA stated that some candidates were able to pass the IGSCC portion of the performance demonstration without receiving IGSCC training, though most of these candidates were already IGSCC qualified. Additional discussion is contained in Section 2.5.7.

The team takes exception to the number of IGSCC samples that a candidate can detect and still be considered IGSCC qualified. A candidate can receive credit for demonstrating detection of IGSCC when they have detected only one of the three IGSCC defects in the test set (see Item 95-01-09).

3.0 MANAGEMENT MEETINGS

The PDI and EPRI PDA management were informed of the scope and purpose of the assessment at the initial entrance meeting on September 15, 1994. The findings of the assessment were discussed with the PDA representatives during the course of the assessment and presented to PDI and PDA at the exit meeting on February 3, 1995. This assessment involved extensive proprietary information; therefore, the assessment report will be provided to PDI and PDA for review to identify any proprietary items that should be deleted or withheld from the report prior to public distribution. The principal persons contacted during NRC's assessment of the PDI program are listed below.
