ML070790372

Site: Fermi
Issue date: 08/11/1995
From: J. Strosnider, Division of Engineering
To: B. Sheffel, Detroit Edison
- " -** Enclosure C. 0 UNITED STATES
- dNUCLEAR REGULATORY COMMISSION WASHINGTON, D.C. 20565&1 Os' *August 11, 1995 Bruce J. Sheffel, Chairman, POI Detroit Edison Company Fermi 2 Power Plant 6400 North Dixie Highway Newport, Michigan 48166
SUBJECT:
NRC ASSESSMENT OF THE PDI PROGRAM
Dear Mr. Sheffel:
The NRC performed an assessment of the nuclear industry's efforts in implementing the requirements of Appendix VIII, "Performance Demonstration for Ultrasonic Examination Systems," to Section XI of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code). The nuclear industry created the Performance Demonstration Initiative (PDI) to manage demonstration requirements contained in Appendix VIII. PDI designated the Electric Power Research Institute (EPRI) Nondestructive Examination (NDE)
Center to be the Performance Demonstration Administrator (PDA) to implement the PDI program. Subsequently, PDI was expanded to include demonstration of ultrasonic testing (UT) ability in the area of intergranular stress corrosion cracking (IGSCC) based on the "IGSCC Coordination Plan" agreement between NRC, EPRI, and the Boiling-Water Reactor Owners' Group. Individual utilities and vendors may satisfy Appendix VIII and the IGSCC Coordination Plan performance requirements by participating in the PDI program. Qualification procedures are either generic (available for broad application) or vendor- or utility-specific (established for unique application). The generic UT procedures PDI-UT-1, PDI-UT-2, and PDI-UT-3 are a major improvement over the procedures currently used in the industry.
The assessment team reviewed PDI and PDA activities associated with the performance demonstrations of the UT procedures, equipment, and personnel used to detect and size flaws. The current PDI testing program includes austenitic stainless steel and ferritic steel pipe welds (Appendix VIII, Supplements 2, 3, and 12), intergranular stress corrosion cracking (IGSCC) for stainless steel piping, and automated UT of reactor pressure vessel welds (Supplements 4 and 6). PDI is also doing preparatory work for bolting (Supplement 8) and the manual UT of reactor pressure vessel welds (Supplements 4 and 6).
The team observed PDI meetings and had discussions with PDI and PDA staff at EPRI in September and December 1994. The onsite assessment was conducted between January 23 and February 3, 1995, with an exit meeting held at the EPRI NDE Center on February 3, 1995.
Overall, it was found that the PDI staff has established and is in the process of executing a well-planned and effective program to test UT technicians on selected portions of Appendix VIII. The team concluded that the PDI program effectively qualified NDE technicians in accordance with Appendix VIII with the exception of certain requirements that PDI found to be impractical. The specific requirements are the subject of ongoing evaluations by the
appropriate ASME committees and may be the subject of code cases. It is important that these issues be resolved on a timely basis to allow practical implementation of Appendix VIII. Situations not specifically addressed by code are addressed by the PDI steering committee and subcommittee work.
Certain open items and areas where improvements could be made in the program are discussed in the assessment report. Please review the report and advise me within 30 days of any inaccuracies in the description of the PDI program or of information that should be proprietary. Also, please advise me of any actions you plan to take relative to the open items identified in the report.
If you have any questions, please contact Edmund Sullivan at (301) 415-3266 or Donald Naujock at (301) 415-2767.
Sincerely, ORIGINAL SIGNED BY:
Jack R. Strosnider, Chief Materials and Chemical Engineering Branch Division of Engineering Office of Nuclear Reactor Regulation
Enclosure:
As stated
Enclosure 2 REPORT NO: 99901288/95-01 TAC NO: M90846
SUBJECT:
The Ultrasonic Testing (UT) Performance Demonstration Initiative (PDI) of the US Nuclear Industry as Currently Implemented by the Electric Power Research Institute (EPRI) Nondestructive Examination (NDE) Center as the Performance Demonstration Administrator (PDA)
NRC TEAM MEMBERS:
Full Time:
Harold Gray, Chief, NRC Mobile Nondestructive Examination Lab (ML)
Steve Doctor, Pacific Northwest Laboratories (PNL)
Nick Economos, Inspector, Materials Section, Region II
Don Naujock, NRC/NRR Materials and Chemical Engineering Branch

Part Time:
Edmund Sullivan, NRC/NRR Materials and Chemical Engineering Branch
Joe Muscara, NRC/RES Electrical, Materials, and Mechanical Engineering Branch
Jim Coley, Inspector, Materials Section, Region II
Richard Harris, NDE ML UT Technician
Pat Peterson, NDE ML UT Technician
Pat Hessler, Statistician, PNL

APPROVED:
Edmund J. Sullivan, Jr., Chief, Inservice Inspection Section    Date
Materials & Chemical Engineering Branch
Office of Nuclear Reactor Regulation
U.S. Nuclear Regulatory Commission
TABLE OF CONTENTS

EXECUTIVE SUMMARY ............................................... iv

ASSESSMENT REPORT
1.0 INTRODUCTION AND PURPOSE ..................................... 1
    PDI Policy ................................................... 3
    Program Description .......................................... 4
    Open Items ................................................... 4
2.0 AREAS ASSESSED ............................................... 6
    2.1 Review of PDI Generic UT Procedures ...................... 7
        2.1.1 Procedures PDI-UT-1 and PDI-UT-2 ................... 7
        2.1.2 Procedure PDI-UT-3 .................................. 8
        2.1.3 Essential Variables ................................. 9
        2.1.4 Controlling Generic Procedures ..................... 10
        2.1.5 Vendor Specific Procedures ......................... 10
    2.2 Test Blocks ............................................. 11
        2.2.1 Test Block Materials ............................... 11
        2.2.2 Test Block Defect Processes ........................ 12
        2.2.3 Specimen Designs ................................... 13
        2.2.4 Validation of True State of Flaws .................. 14
        2.2.5 Conclusions ........................................ 15
    2.3 PDI Testing Process ..................................... 15
        2.3.1 Data Acquisition ................................... 17
        2.3.2 Data Analysis ...................................... 17
        2.3.3 Personnel Qualifications ........................... 18
        2.3.4 Procedure Qualifications ........................... 18
        2.3.5 Conclusions ........................................ 19
    2.4 Non-UT Procedures and Documentation ..................... 19
        2.4.1 Security of Test Specimens and Information ......... 19
            2.4.1.1 Identifying and Fabricating Specimens ........ 19
            2.4.1.2 Handling, Storage, and Shipping of Specimens . 20
            2.4.1.3 Handling Specimens During the Demonstration .. 21
            2.4.1.4 Handling, Storage, and Shipping of Data ...... 21
            2.4.1.5 Conclusions .................................. 22
        2.4.2 Qualification Records .............................. 22
            2.4.2.1 Candidate and Equipment Performance Demonstration Sheets ... 22
            2.4.2.2 Performance Demonstration Qualification Summary .. 23
            2.4.2.3 Conclusions .................................. 24
    2.5 Performance Demonstration Testing of Piping Materials ... 24
        2.5.1 Test Set Design .................................... 25
        2.5.2 Test Execution ..................................... 27
        2.5.3 Scoring Test Results ............................... 27
        2.5.4 Retesting and Expansion ............................ 28
        2.5.5 Feedback to Candidates that Fail a Test ............ 28
        2.5.6 Reporting of Test Results .......................... 28
        2.5.7 Findings ........................................... 28
        2.5.8 Conclusions ........................................ 30
    2.6 Performance Demonstration Testing of RPV Materials ...... 30
        2.6.1 Test Set Design .................................... 31
        2.6.2 Test Execution ..................................... 31
        2.6.3 Scoring Results .................................... 31
        2.6.4 Retesting and Expansion ............................ 32
        2.6.5 Feedback Provided to Candidate ..................... 32
        2.6.6 Reporting of Test Results .......................... 32
        2.6.7 Conclusions ........................................ 32
    2.7 Bolting Examination ..................................... 33
    2.8 ASME Code ............................................... 34
        2.8.1 PDI/PDA Exceptions to ASME Code .................... 34
            2.8.1.1 Items Reviewed ............................... 34
            2.8.1.2 PDI Position No. 94-007 ...................... 35
            2.8.1.3 PDI Position No. 94-009 ...................... 36
            2.8.1.4 Deferring Using Appendix VII ................. 36
            2.8.1.5 Changing RPV True Location Tolerance ......... 36
        2.8.2 Merging the IGSCC Program into the PDI Program ..... 36
            Comparison of NUREG-0313, Revision 2, to the PDI Program .. 36
3.0 MANAGEMENT MEETINGS ......................................... 37
PERSONS CONTACTED AT PDI BY NRC ASSESSMENT TEAM ................. 38

TABLES
Open Items ....................................................... 4
Table 1  Least Number of Piping Flaws for Maximum Qualification . 26
Table 2  Summary of PDI's Alternative Positions ................. 35
EXECUTIVE SUMMARY
BACKGROUND

Section 50.55a, "Codes and Standards," of Title 10 of the Code of Federal Regulations (CFR) gives the regulatory requirements for application of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (BPV Code) to nuclear power reactors licensed in the United States. The current regulations incorporate the 1989 Edition of Section XI, "Rules for Inservice Inspection of Nuclear Power Plant Components," for inservice inspection (ISI) and examination of the reactor coolant pressure boundary (ASME Code Class 1) components and certain other safety-related pressure vessels, piping, pumps, and valves which are classified as ASME Code Class 2 or 3. The 1989 Edition of Section XI included mandatory Appendix VII, "Qualification of Nondestructive Examination Personnel for Ultrasonic Examination," which specifies requirements for the training and qualification of ultrasonic nondestructive examination (NDE) personnel in preparation for employer certification to perform NDE. The 1989 Addenda to Section XI added mandatory Appendix VIII, "Performance Demonstration for Ultrasonic Examination Systems," to give requirements for performance demonstration for ultrasonic testing (UT) procedures, equipment, and personnel used to detect and size flaws. Appendix VIII references Appendix VII for personnel qualifications. The establishment of Appendices VII and VIII accelerated after a November 1984 public workshop in Rockville, Maryland, during which UT performance demonstration provisions prepared by Pacific Northwest Laboratories (PNL) under NRC Research funding were extensively discussed. Following the meeting, the Section XI Subgroup on Nondestructive Examination (SGNDE) agreed to undertake further development of the UT performance demonstration rules for publication in the ASME BPV Code in lieu of NRC developing and issuing a regulatory guide.
The U.S. nuclear industry created the Performance Demonstration Initiative (PDI) group to manage implementation of the performance demonstration requirements of Appendix VIII.
PDI contracted with the Electric Power Research Institute (EPRI) Nondestructive Examination (NDE) Center, located in Charlotte, North Carolina, to be the current Performance Demonstration Administrator (PDA) for implementation of the PDI program.
All nuclear power plant licensees participate in the PDI program.
In addition to managing the performance demonstration for Appendix VIII, PDI agreed to expand its purpose to include the intergranular stress corrosion cracking (IGSCC) performance demonstration testing program to fulfill the IGSCC Coordination Plan.¹ In merging the two programs, PDI has instituted changes that have benefitted both programs.
Individual utilities and vendors participating in the PDI program may satisfy Appendix VIII
¹ The IGSCC Coordination Plan originated from a tripartite agreement between NRC, EPRI, and the BWROG, and was described in "Coordination Plan for NRC/EPRI/BWROG Training and Qualification Activities of NDE Personnel," Richard C. DeYoung, Director, Office of Inspection and Enforcement, USNRC, to Dr. Gary Dau, Senior Program Manager, NDE, EPRI, July 3, 1984.
performance requirements and the intent of the IGSCC Coordination Plan. The NRC has agreed to allow the UT examiners for IGSCC, who were required to requalify every three years, to extend their qualification up to five years from the date of the last qualification examination before March 1, 1994 (i.e., to extend their requalification by up to 2 years). The IGSCC Coordination Plan extension is to avoid duplication of qualification requirements.
ASSESSMENT

The purpose of the NRC review of PDI was to assess the program for conforming to and implementing the requirements of Appendix VIII and merging the IGSCC Coordination Plan into the program. The team reviewed activities of PDI and PDA associated with the performance demonstrations of the UT procedures, equipment, and personnel used to detect and size flaws. When problems in meeting the requirements were identified, PDI developed alternatives. These alternatives were reviewed by the NRC team (see Section 2.8 for details).
The assessment team reviewed PDI policy and PDI programs and identified 13 items for potential improvement that will be tracked for resolution. The team found that PDI qualifies NDE technicians to a number of generic procedures (i.e., those procedures that have broad application for subscribing utilities) and works with vendors and utilities for qualification of technicians using specific procedures (i.e., those procedures which are for specific applications for the vendor or utility). PDI has developed and qualified generic procedures for manual examination of austenitic (including IGSCC) and ferritic piping. Generic procedures for bolting have been developed, while generic procedures for reactor pressure vessel examinations are under development. A review of the generic procedures indicated that the procedures could be enhanced if certain items were clarified and if additional details were added, though the generic procedures are more detailed than vendor or utility specific procedures.
TEST BLOCKS

The team reviewed the test block designs and materials and the processes for manufacturing defects in the specimens and found conformance with the requirements of Appendix VIII.
Test specimen configurations are representative of field conditions with the flaws located where a component is most susceptible to cracking. Piping specimens containing IGSCC flaws were available from EPRI. Piping specimens containing non-IGSCC piping flaws were predominantly fabricated with thermal fatigue cracks. Reactor vessel specimens were predominantly fabricated with weld solidification cracks, with electrical discharge machining (EDM) notches being used sparingly. PDI has an effective security program to protect details of the types and locations of defects in particular specimens, including provisions to prevent compromise during testing and to protect test data. PDI exercises security oversight at vendor facilities and licensee plants whenever their specimens or procedures are being utilized. The specimens are considered appropriate for performance demonstration testing and allow for a significant challenge for the UT candidates. The associated security controls are appropriate for maintaining the security of the specimens. Maximizing use of real cracks in test specimens is considered a strong point of the PDI program.
TESTING AND QUALIFICATION PROCESS

Review of the testing process included quality assurance, administrative controls, conduct of the qualification, selection of specimens, documentation of the candidate's findings during examination, requirements for retest, and data on candidate success rates. The testing process was well organized, administratively controlled, and comprehensive, with sufficient challenge to ensure that those candidates who are not adequately skilled will not pass.
However, the team identified that PDI does not meet the requirements of Appendix VII in limiting the number of retests to two per year (see Item 95-01-07).
Candidates participating in the performance demonstration test program submit statements to PDI stating that they satisfy ASME Code Level II or III requirements. PDI accepts statements as an indication of proficiency and defers the training and implementation of Appendix VII to candidates' sponsors. Candidates who are unsuccessful during their initial demonstration may take a retest. However, candidates are restricted by ASME Code to two retests per year. Neither PDI nor Appendix VII addresses periodic proficiency demonstrations, which is a departure from the American Society for Nondestructive Testing (ASNT) SNT-TC-1A requirements and the IGSCC Coordination Plan.
The performance demonstration tests for piping are being performed by candidates using manual UT equipment. The tests are tailored to the candidate's qualification request, which may include the non-Code conditions of IGSCC and inspection of a weld from the opposite side (single-side access). The addition of IGSCC to the performance demonstration test is an effort by PDI to satisfy the demonstration testing contained in NUREG-0313, Revision 2, though the testing is not consistent with the tripartite agreement. For the non-Code conditions, the number of specimens per test set and acceptance criteria were set by PDI (i.e., three IGSCC flaws in a specimen where detecting one correctly enables a candidate to pass the IGSCC portion for piping). Acceptance criteria established by PDI may allow a candidate to pass the performance demonstration test without adequately demonstrating proficiency for some of the non-Code conditions. See Item 95-01-09.
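To see why a detect-one-of-three criterion can be weak, the sketch below (an illustration added here, not material from the report) computes the chance of passing when each flaw is treated as an independent detection opportunity; the per-flaw probability-of-detection (POD) values are invented for illustration.

```python
# Illustrative only: chance of passing a "detect at least 1 of 3 IGSCC flaws"
# criterion, assuming each flaw is an independent detection opportunity.
from math import comb

def pass_probability(pod: float, flaws: int = 3, required: int = 1) -> float:
    """Probability of detecting at least `required` of `flaws` flaws."""
    return sum(
        comb(flaws, k) * pod**k * (1 - pod) ** (flaws - k)
        for k in range(required, flaws + 1)
    )

for pod in (0.3, 0.5, 0.8):
    print(f"per-flaw POD {pod:.0%} -> pass probability {pass_probability(pod):.1%}")
# per-flaw POD 30% -> pass probability 65.7%
# per-flaw POD 50% -> pass probability 87.5%
# per-flaw POD 80% -> pass probability 99.2%
```

Under these assumptions, even a candidate with only a 50 percent per-flaw POD would pass the IGSCC portion almost nine times out of ten, which is consistent with the team's judgment that the criterion needs improvement.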
Performance demonstration tests for reactor pressure vessels are being performed by candidates using automated UT data collection equipment and vendor-specific procedures.
The automatic equipment presents an opportunity for PDI to expand the number of flaws in a specimen simply by changing the examination coordinates on the test specimens. Some companies are examining the entire specimen at one time and storing data for future performance demonstration tests and retests. The sensitivity of scanners being used for these examinations may not be representative of those used in the field (see Item 95-01-13).
RECORDS
Records are maintained on qualifications of equipment, procedures, and candidates participating in the PDI program. Performance demonstration qualification summary (PDQS) sheets summarize material categories, material thicknesses, and material diameters that have been successfully demonstrated. The compilation of the PDQS is a labor-intensive manual task that could be improved if an electronic data retrieval system is established (see Item 95-01-08).
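As an illustration of the kind of electronic retrieval the report recommends (Item 95-01-08), the following sketch models qualification records and compiles a PDQS-style summary. All field names and example data are assumptions, not the actual PDQS format.

```python
# Hypothetical sketch of an electronic qualification-record store; the field
# names and example data are invented, not the actual PDQS layout.
from dataclasses import dataclass

@dataclass
class QualificationRecord:
    candidate: str
    material: str          # e.g., "wrought austenitic" or "ferritic"
    thickness_in: float    # demonstrated material thickness
    diameter_in: float     # demonstrated pipe diameter
    igscc: bool            # whether the IGSCC scope was included
    exceptions: tuple      # limitations taken during the demonstration

RECORDS = [
    QualificationRecord("A. Candidate", "wrought austenitic", 0.60, 12.0, True, ()),
    QualificationRecord("A. Candidate", "ferritic", 1.20, 24.0, False,
                        ("through-wall sizing excluded",)),
]

def pdqs_summary(candidate: str) -> None:
    """Print a PDQS-style summary for one candidate from the stored records."""
    for r in RECORDS:
        if r.candidate != candidate:
            continue
        scope = "IGSCC included" if r.igscc else "no IGSCC"
        notes = "; ".join(r.exceptions) or "none"
        print(f'{r.material}: {r.thickness_in}" thick, {r.diameter_in}" diameter, '
              f"{scope}; exceptions: {notes}")

pdqs_summary("A. Candidate")
```

A structure like this would let a reviewer query exceptions and IGSCC scope directly instead of pulling each QCDS from the candidate's file and comparing it against the equipment files by hand.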
CONCLUSIONS

The team concluded that the PDI program is acceptable for qualifying NDE technicians in accordance with Appendix VIII with the exception of requirements in the areas of testing tolerances and test administration that PDI determined to be impractical. The team reviewed PDI's technical justifications for the impractical requirements. It is important that ASME Code activities to address the issues be completed on a timely schedule. Situations or conditions not specifically addressed by code are addressed in the PDI program as determined by the PDI steering committee and subcommittees.
ASSESSMENT REPORT
1.0 INTRODUCTION AND PURPOSE

Section 50.55a, "Codes and Standards," of Title 10 of the Code of Federal Regulations (CFR) gives the regulatory requirements for application of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (BPV Code) to nuclear power reactors licensed in the United States. The current regulations incorporate the 1989 Edition of Section XI, "Rules for Inservice Inspection of Nuclear Power Plant Components,"
for inservice inspection (ISI) and examination of the reactor coolant pressure boundary (ASME Code Class 1) components and certain other safety-related pressure vessels, piping, pumps, and valves which are classified as ASME Code Class 2 or 3. The 1989 Edition of Section XI included mandatory Appendix VII, "Qualification of Nondestructive Examination Personnel for Ultrasonic Examination," which specifies requirements for the training and qualification of ultrasonic nondestructive examination (NDE) personnel in preparation for employer certification to perform NDE. The 1989 Addenda to Section XI added mandatory Appendix VIII, "Performance Demonstration for Ultrasonic Examination Systems," to give requirements for performance demonstration for ultrasonic testing (UT) procedures, equipment, and personnel used to detect and size flaws. Appendix VIII references Appendix VII for personnel qualifications. The establishment of Appendices VII and VIII accelerated after a November 1984 public workshop in Rockville, Maryland, during which draft UT performance demonstration provisions prepared by Pacific Northwest Laboratories (PNL) under NRC Research funding were extensively discussed. Following the meeting, the Section XI Subgroup on Nondestructive Examination (SGNDE) agreed to undertake further development of the UT performance demonstration rules for publication in the ASME BPV Code in lieu of NRC developing and issuing a regulatory guide.
Appendix VIII gives the performance demonstration requirements for specific types of components in Supplements 2 through 11¹ as follows:
Supplement 2: Qualification Requirements for Wrought Austenitic Piping Welds
Supplement 3: Qualification Requirements for Ferritic Piping Welds
Supplement 4: Qualification Requirements for the Clad/Base Metal Interface of Reactor Vessel
Supplement 5: Qualification Requirements for Nozzle Inside Radius Section
Supplement 6: Qualification Requirements for Reactor Vessel Welds Other Than Clad/Base Metal Interface
Supplement 7: Qualification Requirements for Nozzle-to-Vessel Weld
Supplement 8: Qualification Requirements for Bolts and Studs
Supplement 9: Qualification Requirements for Cast Austenitic Piping Welds (In the Course of Preparation)
Supplement 10: Qualification Requirements for Dissimilar Metal Piping Welds
Supplement 11: Qualification Requirements for Full Structural Overlaid Wrought Austenitic Piping Welds

¹ Supplement 1, "Evaluating Electronic Characteristics of Ultrasonic Instruments," describes UT instrument characterization and its use is optional. Supplements 12 and 13 give requirements for the coordinated implementation of selected aspects of Supplements 2, 3, 10, and 11, and Supplements 4, 5, 6, and 7, respectively. Reference the 1995 Edition of Section XI of the ASME BPV Code.

The U.S. nuclear industry created the Performance Demonstration Initiative (PDI) group to manage implementation of the performance demonstration requirements of Appendix VIII.
PDI contracted with the Electric Power Research Institute (EPRI) Nondestructive Examination (NDE) Center, located in Charlotte, North Carolina, to be the current Performance Demonstration Administrator (PDA) for implementation of the PDI program.
All nuclear power plant licensees participate in the PDI program, though some utilities and NDE contractors have developed and qualified their own procedures within the current PDI program (e.g., Virginia Electric Power Company, Northeast Utilities, Duke Power Company).
In addition to managing the performance demonstration for Appendix VIII, PDI agreed to expand its purpose to include the intergranular stress corrosion cracking (IGSCC) performance demonstration testing program to fulfill the IGSCC Coordination Plan.² In merging the two programs, PDI has instituted changes that have benefitted both programs. Individual utilities and vendors participating in the PDI program may satisfy Appendix VIII performance requirements and the intent of the IGSCC Coordination Plan. The NRC agreed to allow the UT examiners for IGSCC, who were required to requalify every three years, to extend their qualification up to five years from the date of the last qualification examination before March
""he IGSCC Coordination Plan originated from a tripartite agreement between NRC, EPRI, and the BWROG, and was described in "Coordination Plan for NRC/EPRI/BWROG Training and qualification Activities of NDE Personnel," Richard C. DeYoung, Director, Office of Inspection and Enforcement, USNRC, to Dr. Gary Dau, Senior Program Manager, NDE, EPRI July 3, 1984.
1, 1994 (i.e., to extend their requalification by up to 2 years). The IGSCC Coordination Plan extension is to avoid duplication of qualification requirements.
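A minimal sketch of the extension arithmetic just described, assuming example dates; the 3-year branch for post-cutoff examinations is an added simplification (under the merged program, such examiners would instead qualify under Appendix VIII).

```python
# Hypothetical illustration of the requalification extension: qualifications
# earned before March 1, 1994, may run up to 5 years from the last exam
# (previously 3 years). Dates here are invented examples.
from datetime import date

CUTOFF = date(1994, 3, 1)

def requalification_due(last_exam: date) -> date:
    """Due date under the extension policy (ignores Feb 29 edge cases)."""
    years = 5 if last_exam < CUTOFF else 3
    return last_exam.replace(year=last_exam.year + years)

print(requalification_due(date(1992, 6, 15)))  # 1997-06-15 (extended by 2 years)
print(requalification_due(date(1994, 4, 1)))   # 1997-04-01 (3-year cycle)
```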
The purpose of the NRC review of PDI was to assess the program for conforming to and implementing the requirements of Appendix VIII and merging the IGSCC Coordination Plan into the program. The team reviewed activities of PDI and PDA associated with the performance demonstrations of the UT procedures, equipment, and personnel used to detect and size flaws. When problems in meeting the requirements were identified, PDI developed alternatives. These alternatives were reviewed by the NRC team.
The assessment team developed an understanding of the PDI program by attending a number of PDI and PDA meetings, observing the testing process, and reviewing procedures and documentation. The assessment was performed during the period from September 15, 1994 to February 3, 1995, as outlined below:
- Attendance at PDI Steering Committee Meeting in Charlotte, North Carolina, September 15, 1994.
- Meeting with PDA on September 16, 1994, to discuss the PDI program.
- Three-day team planning meeting at Region I in November 1994.
- Attendance at the December 7-8, 1994, PDI Meeting in New Orleans, Louisiana.
- Preliminary assessment at EPRI, December 12-16, 1994, to review procedures, observe testing in progress, and meet PDA staff.
- Observation of performance demonstration (PD) work on reactor vessel specimens in Chattanooga, Tennessee, January 23-24, 1995.
- Assessment of PDA work at the EPRI NDE Center, January 25 through February 3, 1995, including ultrasonic examinations on selected piping and vessel test specimens performed by NRC NDE UT technicians.
PDI Policy

"Performance Demonstration Protocol" delineates the PDI policies for accomplishing qualification in accord with the requirements of Appendix VIII and the IGSCC Coordination Plan. The policies cover testing prerequisites, procedure review, dispute resolution, scheduling, specimen selection, security, test administration, documentation, test grading, and retesting. The policies are embodied in the implementing procedures.
Program Description

"Program Description" presents a paragraph-by-paragraph comparison of Appendix VIII requirements to the PDI program for implementing the requirements. The PDI program includes alternatives to the code requirements found to be impractical, as described in position papers that justify the alternatives. The program description identifies three sets of implementing procedures:

"Quality Assurance of Specimen Fabrication"
"Quality Assurance for the Performance Demonstration Process"
"Quality Assurance Instruction Manual for the Performance Demonstration Process"

Within these three sets of procedures, there are approximately 50 implementing procedures, each covering a particular aspect of the PDI program.
Open Items

The following table lists the open items that were identified as areas for improvement. The items are discussed in detail in the referenced report sections.
95-01-01 (Section 2.1.1): Additional examination techniques developed during performance demonstrations and clarification of certain items should be included in examination procedures PDI-UT-1 and PDI-UT-2.

95-01-02 (Section 2.1.2): For PDI-UT-3, integrate test performance and equipment requirements into a single section of the procedure and include more detailed instructions.

95-01-03 (Section 2.1.3): An examiner who encounters equipment combinations not previously demonstrated as part of his qualification should receive training on the equipment before its use.

95-01-04 (Section 2.1.3): When a performance demonstration does not include all of the techniques listed in the qualification procedure, a candidate should demonstrate proficiency in using the other techniques before qualification is complete.

95-01-05 (Section 2.1.4): While monitoring of the effectiveness of generic procedures appears to be adequate, written guidance for the oversight function should be developed.

95-01-06 (Section 2.1.5): All essential variables (i.e., important equipment parameters that have an impact on inspection performance), with specified values, should be included in the vendor qualification procedures for reactor vessel examinations. The automated equipment procedures for piping, non-generic piping, bolting, dissimilar metal welds, overlays, and cast austenitic welds, and the procedures for reactor pressure vessels should be examined closely to ensure that the procedures being qualified meet the requirements of Appendix VIII.

95-01-07 (Section 2.3.3): Limit retests to two in 1 year to comply with Appendix VII-4360(c) requirements. Unlimited retesting may compromise specimen security and present a situation for incorrect reporting of the candidate's qualification.

95-01-08 (Section 2.4.2.3): Complete the electronic data retrieval system for qualification records. The format of the Qualification Data Summary Sheet (QDSS) could cause a reviewer to miss the exceptions to the test or misidentify a candidate's IGSCC qualifications. The current method of finding each Qualification Calibration Data Sheet (QCDS) in the candidate's file and comparing it to the equipment files is cumbersome.

95-01-09 (Section 2.5.7): The Code does not address IGSCC or single-side access acceptance criteria in the supplements. Therefore, PDI established the acceptance criteria, which the team considered weak and in need of improvement.

95-01-10 (Section 2.5.7): PDI should ensure that adequate details are included in the qualification procedures for IGSCC mid-wall qualification.

95-01-11 (Section 2.6.3): The tolerance on the determination of the location of a flaw used by PDI is double the tolerance allowed in Appendix VIII, Supplement 4. PDI should initiate action with the Section XI Committee on the tolerance changes that may be needed.

95-01-12 (Section 2.6.7): PDI should develop a clear position on the use of software for the performance demonstration tests as compared to performance in the field.

95-01-13 (Section 2.6.7): PDI should develop a means to prove that the improvised scanners used in the performance demonstrations are not superior to the scanners used for the field ISI.
2.0 AREAS ASSESSED

The assessment of PDI activities is divided into Sections 2.1 through 2.8 as follows:

2.1 Review of PDI Generic UT Procedures
2.2 Test Blocks
2.3 PDI Testing Process
2.4 Non-UT Procedures and Documentation
2.5 Performance Demonstration Testing of Piping Materials
2.6 Performance Demonstration Testing of RPV Materials
2.7 Bolting Examination
2.8 ASME Code

At the time of the assessment, PDI used generic procedures that it developed. During the two-week on-site assessment, the team observed personnel participating in performance
demonstrations on pipe welds and reactor vessel shell welds. The team limited its assessment to the in-process testing and completed work and avoided items under development so as not to influence PDI. The team reviewed in-process qualifications for meeting Appendix VIII, Supplements 2, 3, and 12. The team reviewed automated UT data collection portions for qualifications to the requirements of Supplements 4 and 6, and the preliminary procedures with typical test specimens for meeting the requirements of Supplement 8.
2.1 Review of PDI Generic UT Procedures

Generic procedures are those that PDI develops for various materials and types of equipment that will apply to many applications in the field. By contrast, vendor- or utility-specific procedures are developed for unique or specific applications of materials or equipment that may not be covered by generic procedures (e.g., UT of steel pipe with stainless steel cladding on the inside and outside diameters). PDI developed three generic procedures for UT examinations of piping:
PDI-UT-1: detection and length sizing of flaws using manual UT on similar metal welds in ferritic piping.
PDI-UT-2: detection and length sizing of flaws using manual UT on similar metal welds in wrought austenitic (stainless steel) piping.
PDI-UT-3: depth sizing of flaws in ferritic and wrought austenitic (stainless steel) piping.
The procedures are criteria-based in that they specify values for the essential variables, including instruments and transducers, used in examining piping of a particular material type, thickness, and diameter. Criteria-based procedures are decision-assistance tools that guide the user in selecting the proper equipment and settings.
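As a loose illustration of what "criteria-based" means in practice, the sketch below maps material and thickness to specified essential-variable values; every value in the table is an invented placeholder, not a figure from PDI-UT-1, -2, or -3.

```python
# Invented placeholder values; illustrates the lookup structure of a
# criteria-based procedure, not actual PDI essential variables.
SETTINGS = {
    # (material, (min thickness, max thickness) in inches): specified values
    ("austenitic", (0.2, 0.5)): {"angle_deg": 45, "freq_MHz": 2.25, "wave": "shear"},
    ("austenitic", (0.5, 1.0)): {"angle_deg": 60, "freq_MHz": 1.5, "wave": "longitudinal"},
    ("ferritic", (0.2, 1.0)): {"angle_deg": 45, "freq_MHz": 2.25, "wave": "shear"},
}

def select_settings(material: str, thickness_in: float) -> dict:
    """Return the essential-variable values specified for this piping."""
    for (mat, (lo, hi)), values in SETTINGS.items():
        if mat == material and lo <= thickness_in < hi:
            return values
    raise ValueError("combination not covered by the generic procedure")

print(select_settings("austenitic", 0.75))
# {'angle_deg': 60, 'freq_MHz': 1.5, 'wave': 'longitudinal'}
```

The point of the structure is that the examiner chooses from pre-qualified values rather than improvising settings, which is what makes the qualification transferable.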
2.1.1 Procedures PDI-UT-1 and PDI-UT-2

PDI generic procedures for ultrasonic detection and length sizing of indications in ferritic and austenitic piping materials were reviewed for compliance with Appendix VIII and for technical adequacy. The procedures give guidance for the essential variables such as search unit bandwidth, applicable frequencies, type of sound wave, focal depth, and signal-to-noise ratio. The reference sensitivity levels are established using calibration blocks.
The procedures are better (i.e., have a higher success rate of detecting and properly sizing flaws) than those used in the nuclear industry that do not meet the requirements in Appendix VIII; however, PDI could improve the procedures by giving more detail in the data interpretation steps. The insights and expertise developed by the candidate preparing for the performance demonstration test may be lost through lack of use of these skills.
7
If procedural guidance were expanded, the candidate could rely on details in the procedure rather than having to memorize numerical values or technique specifics. For example, Section 8 of procedures PDI-UT-1 and PDI-UT-2 includes items that could be improved as follows:
- 1) PDI-UT-2, Paragraph 8.2.2.3, states that a second angle is used in the same direction. It is unclear whether this means that two angles (e.g., 45 and 50 degrees, or 40 and 45 degrees) are equally acceptable.
- 2) PDI-UT-2, Paragraph 8.2.2.7.a, states "... when using a higher frequency ...". It is unclear whether going from 1.5 MHz to 2 MHz and increasing by a factor of 2 (from 1.5 MHz to 3 MHz) are equally acceptable.
- 3) PDI-UT-2, Paragraphs 8.2.2.9 and 8.2.3, state that an inside diameter creeping wave probe is used. No guidance for using these probes is given regarding the frequency of operation or the element size.
- 4) PDI-UT-2, Paragraph 8.3, states that there are tests being conducted at different angles and frequencies, but provides no guidance on which other angles or frequencies should be used.
- 5) Specifics such as items 1 through 4 should be included in the flow charts (e.g., Figure 4) in PDI-UT-2.
- 6) PDI-UT-1 has similar types of items that could be clarified.
Currently, UT technicians must learn the critical values necessary for successful performance demonstration through trial and error. When combinations are found that produce desired results, the UT technicians typically write notes or rely on remembering the combinations. By including such information in the qualification procedures, UT technicians will more quickly learn the correct techniques. Additional details in the procedures complement the mandatory requirements in Appendix VIII for all essential variables and variable ranges. The recommendations for improving procedures will be tracked as Item 95-01-01.
2.1.2 Procedure PDI-UT-3

Procedure PDI-UT-3 for ultrasonic through-wall (depth) flaw sizing of ferritic and austenitic piping materials was reviewed for compliance with Appendix VIII and for technical adequacy. The procedure includes a well-written scope, a description of the techniques to be used, a listing of the equipment to be used, a detailed calibration procedure, and a description of the process for applying the techniques to determine a final flaw depth estimate.
The procedure requires UT personnel to perform tests identified in Section 7 of the procedure. It then refers back to Section 5 of the procedure for determining the correct equipment for conducting the tests. Integrating these sections would simplify use of the procedure. Also, a lack of detail similar to that found in PDI-UT-1 and PDI-UT-2 was noted. These comments will be tracked as Item 95-01-02.
2.1.3 Essential Variables

To develop the generic procedures, PDI had to identify the dependent variables. To do so, PDI first asked the equipment manufacturers to specify the equipment best suited for the piping applications found in the light water reactor industry. Using the equipment recommendations from manufacturers, PDI measured a variety of flaws in various materials and worked with the manufacturers to determine which controls affected equipment system performance. From these measurements, operating values that must be used with the equipment were identified. The operating values were ones which affect the center frequency and the bandwidth of the equipment.
Pacific Northwest Laboratories (PNL) has conducted extensive modeling and experimental studies on equipment operating tolerance parameters and found that, to ensure repeatability of inspection performance, the system center frequency and bandwidth must be controlled. Therefore, a table in each generic procedure lists all the controls that affect equipment center frequency and bandwidth for each instrument. The flaw detector instruments (e.g., pulsers, transducers, receivers, and search units) specified in a procedure are the only ones that can be used with the procedure. New instruments can be added to procedures only after extensive evaluation and testing to determine the correct setting for all controls that affect center frequency and bandwidth.
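The two controlled quantities can be made concrete with a short sketch. This is an added illustration, assuming the conventional -6 dB (half-amplitude) definitions of center frequency and percent bandwidth, which the report itself does not spell out; the spectrum below is synthetic.

```python
# Added illustration: -6 dB center frequency and percent bandwidth of a
# (synthetic) transducer spectrum. Definitions assumed, not from the report.
from math import exp

def center_freq_and_bandwidth(freqs_MHz, amps):
    """Return (-6 dB center frequency in MHz, percent bandwidth)."""
    peak = max(amps)
    above = [f for f, a in zip(freqs_MHz, amps) if a >= peak / 2.0]  # -6 dB = half amplitude
    f_lo, f_hi = min(above), max(above)
    f_c = (f_lo + f_hi) / 2.0
    return f_c, 100.0 * (f_hi - f_lo) / f_c

freqs = [0.5 + 3.5 * i / 199 for i in range(200)]          # 0.5 to 4.0 MHz
amps = [exp(-(((f - 2.25) / 0.6) ** 2)) for f in freqs]    # Gaussian spectrum
print(center_freq_and_bandwidth(freqs, amps))              # ~(2.25 MHz, ~44%)
```

Since control settings (pulser, receiver, filters) shift exactly these two numbers, the per-instrument tables in the generic procedures effectively pin down the inputs to a calculation like this one.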
Candidates using generic procedures use flaw detector instruments specified in the procedures and a variety of transducers made by various manufacturers. To qualify a transducer, it must be successfully used in a performance demonstration. When a non-qualified transducer is used, it must first be listed on the equipment data sheet. As each specimen is inspected and results are given to the test proctor, the candidate describes to the proctor the key UT data used to make the detection or sizing decision. The test proctor then notes whether that particular transducer was useful in achieving the desired results. If the candidate successfully passes the performance demonstration, the transducer is added to the equipment qualified for that procedure.
When a candidate successfully passes the performance demonstration, the candidate is qualified to use any of the equipment combinations listed in the procedure, as well as any future additions to the procedure. Such latitude with equipment selection introduces a variable not addressed in Appendix VIII. However, the equipment control settings listed in tables in a criteria-based procedure limit the amount of variation that the equipment could introduce into the system. As generic procedures are being applied, a successful candidate could be asked to use unfamiliar equipment in the field. If such a situation
arises, the candidate's proficiency with unfamiliar equipment should be verified through some form of training. That a successful candidate may later encounter equipment combinations not previously demonstrated as part of his qualification will be tracked as Item 95-01-03.
The sizing procedure describes different techniques that produce equivalent results. A candidate who has successfully passed the performance demonstration using one technique in the procedure is now qualified for all of the techniques listed. The team concluded that, because the qualification is performance based, the candidate should demonstrate all of the techniques in the procedure. If the candidate did not demonstrate all of the techniques listed in the procedure to qualify, he should demonstrate to the test proctor proficiency with the other techniques and the demonstration of ability should be documented in the test results. The completion of a performance demonstration without demonstrating all of the techniques of the UT procedure will be tracked as Item 95-01-04.
2.1.4 Controlling Generic Procedures

Generic procedures are frequently revised; therefore, a method for controlling and evaluating the effectiveness of the generic procedures should be available. PDI has not yet developed written controls for the generic procedures, but was able to describe the controls currently in place. User-identified inadequacies in generic procedures are submitted to the PDI Technical Working Group (TWG) for resolution. The TWG screens equipment before it is included in a generic procedure to identify incompatible combinations of equipment and criteria. If testing with specific equipment or a specific procedure results in a high candidate failure rate, the TWG performs a detailed review of the test and resolves concerns. In screening equipment, the TWG considers whether the equipment is sufficiently different to require its own procedure, and the TWG is developing guidelines to identify when changes in equipment, techniques, or training are extensive enough to warrant a new procedure. The assessment team did not have concerns with the PDI method for monitoring the effectiveness of their generic procedures; however, the lack of written guidance for the oversight function will be tracked as Item 95-01-05.
2.1.5 Vendor Specific Procedures

The team reviewed data for several companies participating in qualification for the reactor pressure vessel (RPV) supplements in Appendix VIII. Procedures developed by these companies were much less detailed than the PDI generic procedures. The team recognizes that, for automated or computer-based systems, all of the listed essential variables may not apply while other variables not listed in the code may be important.
The absence of essential variables in the supplements is a problem that the Section XI Committee should address. However, all essential variables (i.e., important equipment parameters that have an impact on inspection performance), with specified values, should be included in the qualification procedures. The automated equipment procedures for
piping, non-generic piping, bolting, dissimilar metal welds, overlays, and cast austenitic welds, and the procedures for reactor pressure vessels should be examined closely to ensure that the procedures that are being qualified will meet the requirements of Appendix VIII. This issue will be tracked as Item 95-01-06.
2.2 Test Blocks

The fabrication process for making test specimens was reviewed beginning with the design of specimens, acquisition of material, introduction of flaws, and validation of the true state of the flaws. Flaws were fabricated in wrought austenitic stainless steel and in ferritic materials for pipe and clad and unclad vessel materials. PDA procedures and test specimens for dissimilar metal, castings, and nozzle welds were in various stages of completion and not available for review.
The objectives of the review were to (1) verify compliance with the requirements of Appendix VIII and the IGSCC Coordination Plan and (2) determine the representativeness of the material configurations and the flaw selection. To assess the acceptability of test blocks, the team considered four factors: (1) materials, (2) defect processes, (3) specimen designs (including similarities to field conditions), and (4) validation of the true state of the flaws.
2.2.1 Test Block Materials

PDI obtained much of its material from discontinued nuclear plant sites. For ferritic piping specimens, PDI used ASME SA-106 Grade B or SA-516 Grade 70 steel. For austenitic piping specimens, PDI used ASME SA-312 Type 304 or 316, and SA-376 Type 304 or 316. The large-diameter pipe specimens were manufactured from rolled and welded plate. The plate material was SA-358 Type 316, welded using Type 308 weld material. The piping used for IGSCC demonstrations was Type 304 stainless steel.
Reactor pressure vessel specimens were ASME SA-533 or ASME SA-508, Class 2.
Cladding material was weld deposited with a Type 308 modified material using the submerged arc process. Some cladding was deposited using Type 309 weld material.
Although Inconel is an austenitic material, it has different UT characteristics than austenitic stainless steel and would require a separate performance demonstration. There were no Inconel specimens in the PDI inventory for a separate demonstration, and UT examiners will have to qualify once these specimens are available. Also, the cast elbows at PDI were not used, nor were they intended, for flaw evaluation. The scope of the PDI program does not address these materials at this time. If specimens are not available through PDI, licensees may have to use plant-specific specimens or perform an evaluation on a plant-specific basis of the qualification process for unique materials or configurations.
2.2.2 Test Block Defect Processes

Defects were fabricated to represent conditions that could exist in the field. For piping, the predominant flaw was the thermal fatigue implant. To produce a thermal fatigue flaw for implanting, a bar extension the size of the designed flaw was welded to the weld-prep pipe end. With the bar under stress, the bar-pipe weld joint was subjected to cycles of heating and cooling until failure. The fractured surface was cut from the bar length, machined to the designed size, matched to the fractured surface on the pipe, and seal-welded to the pipe. The pipe was then butt-welded to an adjoining pipe. The integrity of the finished weld was verified by penetrant testing (PT), radiographic testing (RT), and UT.
For IGSCC flaws in piping, PDI acquired specimens and inventoried material from the EPRI IGSCC testing program. The IGSCC detection specimens are primarily piping with service-induced cracks. Additional specimens were fabricated from the material obtained from inventory for the EPRI IGSCC Program. Specimens used for depth sizing were grown using a graphite wool process.
PDI secured piping specimens with mechanical fatigue flaws, thermal fatigue flaws, and IGSCC flaws. PDI considered flaws manufactured by the hot isostatic pressing (HIP) process (i.e., hot pressurized embedded flaws), but determined that the process produced unacceptable attenuation.
For reactor pressure vessel tests, the majority of the flaws were manufactured using a weld-solidification process. PDI also had slag and lack-of-fusion flaws embedded in the test specimens. The slag flaws were large enough to be rejectable per ASME Section III, NB-5320.
Because electrical discharge machined (EDM) notches are not representative of flaws found in the field, PDI used them sparingly in the RPV and small nozzle specimens, even though Appendix VIII allows their use for up to 50% of the flaws in RPVs. PDI is using pipe test sets that do not contain EDM notches; they contain only cracks. EDM notches provide ultrasonic signals that are easier to identify than signals from tight cracks. The pressurized-water reactor (PWR) vessel specimens have some very shallow EDM notches. These EDM notches were tapered, plugged with similar material, and welded shut. Solidification cracks could not be introduced into some of the small boiling-water reactor (BWR) nozzles, and these nozzles used EDM notches that were plugged and welded shut. The limited use of EDM notches allowed for extensive use of real cracks in the test specimens and is considered a strong point of the PDI program. The team considered the flaws used in the test specimens acceptable.
2.2.3 Specimen Designs

In reviewing specimen designs, the team considered the configurations used, the condition of the weld crown (or top layer of weld metal), the conditions of the weld root, the locations of the cracks, and the counterbore region.³ The weld crown conditions (i.e., ground flush, ground flat, unground) are dependent on the intended use. The crowns of the specimens that are used for crack depth sizing have been ground smooth with the parent material of the pipe specimens. The pipe specimens used for detection have crowns that are either ground smooth with the parent material or flat topped (i.e., ground flat but not flush with the joined material). If the inspection procedure is to simulate crowns that are in the as-welded condition, then the crowns are covered with tape to preclude ultrasonic inspection through the weld crown. The assessment team agreed that this practice is acceptable for simulating as-welded crown conditions found in the field.
The counterbore and the weld root in pipe specimens were examined in two ways. First, they were examined visually to see the variety of conditions that were present.
Counterbore conditions (i.e., the geometry of the joint that results from preparation of the pipe ends) included very short lands, vertical transitions, and partial-circumferential extent. These conditions are comparable to conditions a candidate would see in the field. Using PDI generic UT procedures, the NRC staff from the NDE Mobile Laboratory performed blind (i.e., without any knowledge of the specimen) UT inspections of some of the specimens and confirmed that the specimens were challenging to inspect and representative of field conditions that would make accurate detection difficult. The team concluded that since these conditions are representative of field conditions, the specimens are appropriate for performance demonstration.
The fabricated cracks were installed in a variety of locations within the pipe specimens, including the fusion line of the weld, the counterbore transition, and the counterbore flat land. These locations, coupled with the crack orientations, made the detection and the proper identification of the cracks a challenge. The variation in crack locations and orientations is considered appropriate for performance demonstration.
PDI had assembled extensive information on the range of piping and reactor pressure vessel conditions that exist in nuclear power plants. A condensed version of this information was reviewed and found to be consistent with the nuclear industry. The information contained insight into how the fabricated specimens covered the range of conditions that are found in the nuclear industry. There are, however, some
³ The "counterbore region" is the area that has been counterbored (i.e., machined to enlarge the opening to a larger diameter). A counterbore process is generally used when pipes of different wall thicknesses are mated by welding. The thicker-walled pipe is machined so that the eccentric inner-pipe diameters will mate up at the joint. UT of the joint is often difficult because of the "geometrics" of the joint that result from the mating process.
configurations that are unique to a particular licensee (for instance, inside and outside clad pipe) that would still require licensee-specific demonstrations.
The team examined the RPV specimens for geometry and cladding. The primary clad surface was applied with the shielded metal arc welding (SMAW) process and was ground sufficiently smooth for inspection. The conditions of these specimens were similar to the conditions found in nuclear power plants. Because of the size of these specimens, their complex geometry, and the lack of RPV inspection equipment, it was not possible to assess these specimens from the perspective of a realistic RPV inspection.
However, from the limited specimen volumes that were evaluated, the team determined that the conditions and flaws reflected a realistic range of field conditions and that these specimens are appropriate for performance demonstration.
The NRC staff from the NDE Mobile Laboratory conducted UT examinations of a shell course plate. The examination consisted of looking for near-surface, mid-wall, and far-surface flaws. The plate contains many segregated flaws that provide a challenge in the detection and sizing of the cracks. The generic procedures provided useful guidance for length sizing, characterizing, and evaluating flaws.
2.2.4 Validation of True State of Flaws

PDI required that suppliers of flaws demonstrate the processes used to make the flaws and identify flaw-size adjustments, if needed. The flaw-size adjustment for an implant is the change in flaw dimensions that occurs as a result of the seal welding process. When PDI orders specific flaw sizes (height and length) for piping, the fabricator adds the appropriate flaw-size adjustment to the flaw dimensions.
One supplier of thermal fatigue flaws fabricated the flaws by welding an extension of like metal to a weld-prep end of a pipe. The extension is then fatigued to failure at the weld joint. The thermal fatigue area of the weld prep end of the piping is validated in the following manner:
1. The center line of the thermal fatigue area is pictorially laid out to a zero datum mark on the pipe and photographed.
2. The mating fractured piece is machined to the design dimension, measured, and then photographed in the presence of a scale. A true-size tracing is made of the perimeter of the fractured piece.
3. The two fractured surfaces are reassembled and sealed by welding the perimeter. The flaw-containing pipe surface is circumferentially welded to a matching pipe. The weld integrity is verified using PT, RT, and UT.
4. To establish the correlation between initial and final flaw size, some of the weld joints were destructively tested in a way that minimizes dimensional change. The unwelded portion of the flaw is measured, the measurement differences between before and after welding are averaged, and the average value is the flaw-size adjustment.

The physical measurements provide highly accurate flaw sizing and are independent of UT examinations.
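The averaging step lends itself to a brief illustration. The sketch below is added here, with invented measurements, and computes a flaw-size adjustment the way the text describes:

```python
# Invented example measurements: flaw heights before seal welding and after
# destructive testing of sacrificial joints.
before_mm = [12.0, 9.5, 15.2]
after_mm = [11.2, 8.9, 14.3]

# The flaw-size adjustment is the average before-vs-after difference; the
# fabricator adds it to ordered dimensions so finished flaws come out at
# the requested size.
adjustment = sum(b - a for b, a in zip(before_mm, after_mm)) / len(before_mm)
print(f"flaw-size adjustment: {adjustment:.2f} mm")  # 0.77 mm
```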
Flaws that were grown for PDA were verified by the destructive examination of similar flaws. Flawed specimens secured from operating facilities were used only for detection.
The EPRI NDE Center created what was referred to as a "fingerprint" of each defect.
The fingerprint includes dye penetrant results, optical images of surface-breaking cracks, and automated laboratory UT tests of the zones containing the cracks. The "fingerprint" supplements information given by the manufacturer of the flaws and verifies that there are no extraneous indications near the flaws.
2.2.5 Conclusions

The specimen configurations were found to be representative of field conditions. The flaws are located where they would be expected to be found in the field and also where they would be considered a challenge to detect and correctly classify. The specimens are considered appropriate for performance demonstration and provided a significant challenge from the perspective of UT inspection. The test block specimens used in the performance demonstration program met the requirements found in the applicable supplements of Appendix VIII.
2.3 PDI Testing Process

The PDI testing process for qualification of personnel and procedures is implemented and controlled through procedures in the PDI-approved Quality Assurance (QA) Manual and in the Operating Instruction Manual. A review for technical content and adequacy was performed for the following procedures:
PDP-Q-009, Rev. 2, "Performance Demonstration Control"
PDP-Q-009.1, Rev. 4, "Test Administration"
PDP-Q-018.3, Rev. 1, "Surveillance"
PDP-I-009.3, Rev. 2, "Candidate Registration and Scheduling Instruction"
PDP-I-018, Rev. 3, "Piping Surveillance Instruction"

In addition to the procedure and record review, the team held discussions with cognizant PDI and PDA personnel and observed testing in progress. Candidates applying for performance demonstrations submit statements of qualification from their sponsors or employers certifying that they are qualified Level II or III examiners in accord with the ASME Code. The requirements for qualification to Level II or III are contained in
the 1989 Edition of Section XI, which references the American Society for Nondestructive Testing (ASNT) SNT-TC-1A, 1984 Edition, or the equivalent. However, Appendix VIII states that personnel shall meet the requirements of Appendix VII. PDI deferred training and implementation of Appendix VII to the candidate's sponsor (i.e., PDI has not implemented Appendix VII in its performance demonstration program). The team did not express a position on the PDI approach to Appendix VII; however, if Appendix VIII becomes a regulatory requirement, candidates who do not satisfy Appendix VII would have to request NRC approval for any deviations from the requirements. Such requests will be reviewed on the basis of their individual merit.
Administrative controls are used to ensure the security and confidentiality of the performance demonstration information, testing process, and results. Candidates suspected of violating security requirements are not permitted to continue the demonstration. All information generated during the demonstration is submitted to PDA for review and evaluation.
Performance demonstrations for candidates are performed with previously qualified procedures or in conjunction with the qualification of a new procedure. Candidates provide the PDA with the essential variable values (or ranges of values) to be used for the demonstration. These are verified before the demonstration begins. Special limitations to the qualification, such as the exclusion of IGSCC or through-wall sizing, are permitted.
However, these exceptions become a part of the candidate's qualification record and restrict the candidate's qualification. Before starting the demonstration, the candidate is given a written description of the specimens to be examined and the applicable calibration blocks, data report forms, test sets of specimens, and sufficient detail on scan areas for preparing scan plans. Test specimens are examined in accordance with applicable procedure requirements, essential variable values, and limitations previously agreed upon.
Test specimen selection is controlled by the following instruction procedures:
PDP-I-009.4 "General Test Specimen Selection Instruction" PDP-I-009.4.1 "Test Specimen Selection Instruction. Manual or Semi-Automated Austenitic Piping Examinations for Detection, Length, and Depth Sizing (Supplement 2)"
PDP-I-009.4.2 "Test Specimen Selection Instruction. Manual or Semi-Automated Ferritic Piping Examinations for Detection, Length, and Depth Sizing (Supplements 3 and 12)"
In general, test sets include at least four specimens having different nominal pipe diameters and thicknesses. Test sets have specimens with the minimum and maximum pipe diameters and pipe thicknesses for the applicable examination procedure. If IGSCC or far-side qualifications are requested by the candidate, the test sets will contain specimens for IGSCC or far-side examination. The IGSCC flaws are connected to the inside pipe surface and extend to the depths specified in Appendix VIII.
The test specimens may contain both fabrication and service- or laboratory-induced flaws.
Flaws were installed near various inside diameter and outside diameter weld reinforcement conditions, geometric conditions (e.g., counterbores), and certain surface-associated conditions. The service- or laboratory-induced flaws include thermal and mechanical fatigue cracks or IGSCC cracks (or both). Test specimen selection is directed towards providing a uniform degree of difficulty. Test specimens are divided into grading units, with each unit having at least three inches of weld length. For a candidate taking the performance demonstration for detection, the minimum number of flaws in a test set is identified in the appropriate supplement to Appendix VIII. For the performance demonstration to qualify procedures with multiple essential variables, the minimum number of grading units can be used for each limit of each variable. Flawed grading units within test set specimens are required to meet specified criteria for flaw depth, orientation, and type. Other demonstration conditions controlled by the procedures include single-side access qualification, length sizing, through-wall sizing, and IGSCC qualification.
2.3.1 Data Acquisition

Candidates examine specimens in accordance with applicable pipe procedures, essential variable values, and preestablished limitations. Areas to be examined on specified test specimens are identified by PDA, and examinations are restricted to those areas.
Portions outside the area of interest are masked to prevent identification and examination.
Administrative controls limit scanning time for detection to 2 hours per flawed grading unit, although this time can be extended by the PDA following a conference with the candidate. Additional time is allowed if the candidate appears to be performing acceptably but needs more time to complete the qualification. For manual examination demonstration, on the average, up to 4 hours has been allowed per flawed grading unit for detection and analysis.
2.3.2 Data Analysis

Candidates performing manual UT demonstrations on pipe specimens are permitted to choose the analysis method of their preference provided they finish within the allotted time for this activity. Radiographs are available to the candidates upon request.
However, these radiographs do not contain any useful information on specific flaws within applicable grading units. The radiographs contain representative images of geometric or weld root conditions (or both).
Candidates document flaws detected during performance demonstrations on forms provided by PDA. The procedures include instructions for documenting indications detected in terms of position, length, and depth in piping, nozzles, vessel shell welds, and bolting. Performance demonstration results are graded on a pass-fail basis. Upon completion of the demonstration, the PDA prepares a Performance Demonstration Qualification Summary (PDQS). The team noted that, although the information on the PDQS showed the areas qualified, the sequence of events was difficult to follow without
examining each document in the file (i.e., demonstrations performed, test sets used, retests taken, dates, and in some cases, signatures), in part, because all records were still undergoing review. Discussions with cognizant PDI personnel disclosed that efforts were underway to review these records for completeness and accuracy and to computerize the data for easy access.
2.3.3 Personnel Qualifications

To determine the performance of the candidates, the team randomly selected 27 performance demonstration packages for review. The items reviewed included such things as flaw detection, length sizing, through-wall depth measurements, IGSCC detection, access restriction, retests, and pass-fail results. Based on the review, the team noted that 78% of the candidates met minimum detection requirements; 59% met minimum length-sizing requirements; 33% met minimum through-wall depth measurement requirements; and 51% successfully met IGSCC detection requirements.
Although these percentages are based on a sampling that included retests, they show that the candidate must be highly skilled in order to successfully pass the test.
Candidates who were unsuccessful during their initial demonstration may take a retest. If the candidate fails the retest, the candidate receives training for 7 days and then takes a second retest. After each unsuccessful attempt, the candidate is briefed on the problem areas that contributed to the failure. A candidate may take more than two retests if PDI has testing time available. Taking more than two retests in 1 year is contrary to Appendix VII-4360(c) requirements. PDI said that if more than two retests are taken in 1 year, it would state this as an exception on the PDQS. The unlimited retesting may compromise specimen security and present a situation for incorrect reporting of the candidate's qualification. This issue will be tracked as Item 95-01-07.
2.3.4 Procedure Qualifications

Rules governing equipment and procedure qualifications are the same as those used for personnel qualification described above, with three exceptions:
(1) Results of several successful candidates may be combined for this purpose.
(2) Candidates must demonstrate that each transducer identified was used and that it was an integral part of the demonstration.
(3) Administrative limits used for the subject demonstrations and retests are optional.
2.3.5 Conclusions

The testing process was well organized, administratively controlled, and comprehensive.
The nature of the indications in test specimens used for demonstrations presented a formidable challenge to well-qualified UT examiners and should cause those with limited capabilities to fail. PDI has bypassed implementation of Appendix VII, which is referenced in Appendix VIII. As noted above, when candidates take retests, they should be subject to the two-retest-per-year restriction contained in Appendix VII-4360(c).
2.4 Non-UT Procedures and Documentation

In reviewing the non-UT procedures and documents, the team concentrated on security and candidate qualification records. PDI incorporated the "conduct of performance demonstrations" from the supplements to Appendix VIII into the security procedures. PDI incorporated the qualification record requirements of Appendix VIII, Subsection 5000, "Record of Qualification," into the quality assurance (QA) program.
The team performed a limited review of the PDI and PDA organization plan, prerequisites of candidates, testing administration, internal qualification of PDI staff, calibration and measurement, nonconforming items, audit items, and quality assurance records not connected with the Performance Demonstration Qualification Summary (PDQS).
2.4.1 Security of Test Specimens and Information

PDI developed security procedures to ensure that the candidates do not possess knowledge of flaw identity prior to testing. The team reviewed the procedures for specimen design, controls during specimen fabrication, safeguarding of specimens, security during testing, and safeguarding of demonstration data. The objective of the review was to identify strengths and weaknesses with the confidential aspects of the program.
2.4.1.1 Identifying and Fabricating Specimens

To initiate the specimen fabrication process, PDI developed Procedure PSP-001 (Rev. 0, dated October 26, 1992), "General Design Specification for Piping Performance Demonstration Specimens." The procedure translates Appendix VIII requirements into terms that are suitable for designing full-scale mock-ups of the piping and RPVs found in the plants that have ASME Section XI ISI programs. PDA identified the general requirements for material selection, flaw size and frequency, flaw characteristic and placement, flaw implantation, nondestructive examination, mounting and handling, quality assurance, and confidentiality. PDI and PDA maintained the records generated during the course of specimen design, material procurement, special fabrication processes, in-process and final inspection, and other related activities. From these general requirements, PDI selected the flaw
configurations to be included in the various fabricated test specimens. After flaw selection, the responsible PDI engineer developed a specific manufacturing specification, flaw specification, and any drawings appropriate for the fabrication of that specimen.
To control the specimen fabrication process, PDI developed Procedure PDI-G-001 (Rev. 2, dated June 1, 1993), "Preparation, Use and Control of Process Control Sheet." Process Control Sheets (PCSs) are used to document various examinations, measurements, or tests of materials, and processing as necessary (PCSs are the same as shop work orders). A PCS is prepared by a responsible engineer who is familiar with the processing. A PCS comprises the sequential steps and work instructions necessary to accomplish the finished fabrication and may be described in a separate document, drawing, model, or a combination of these. Upon completion of each step, the individual performing the work is required to initial and date the PCS. There are multiple tasks for each fabrication. After completion of each task, there is a QA review and an approval by the responsible engineer. These PCSs are filed as part of the permanent manufacturing records for that particular fabrication.
To maintain the necessary documents for the specimen fabrication process, PDI developed Procedure PDI-Q-013 (Rev. 2, dated April 8, 1994), "PDI Specimen Manufacturing and Examination Records Indexing." The procedure gives the instructions for controlling the manufacturing and examination documents and for establishing the basis for quality control records and record retention. Using the PDI Specimen Quality Records Checklist, the responsible engineer identifies and verifies that the necessary quality assurance records have been completed. The verification is the last step in the specimen fabrication process.
After specimens are fabricated, the PDI and PDA program shifts to the performance demonstration process.
2.4.1.2 Handling, Storage, and Shipping of Specimens

To control the tracking of specimens, PDI developed Procedure PDP-Q-013 (Rev. 0, dated October 1, 1993), "Handling, Storage, and Shipping." The procedure describes the controls, handling, storage, and shipping of PDI test specimens. Although most of the performance demonstration is conducted at the EPRI NDE Center, the performance demonstration may be performed at other locations (i.e., the licensee or vendor facility). The test specimens are kept under the direct control of PDA or under the control of an organization that has a PDA-approved security plan.
When test specimens are transported to remote testing locations, PDA uses commercial trucking firms. Included in the shipping order is a PDA statement stipulating that the test specimens be shipped as a single load. The single-load requirement avoids delays and prevents breaking the door seals before the truck
reaches the test site. Unless other arrangements are made, a designated supervisor, Level III examiner, or a member of the PDA off-site team will meet the shipment of test specimens upon arrival at the ISI vendor's facility, the EPRI NDE Center, or the host utility. After delivery, the test specimens are stored in designated and controlled-access storage areas or the PDA-secured laboratory. Large test specimens are covered and secured with locked chains or cables. The cover over the specimens prevents visual and nondestructive examination.
2.4.1.3 Handling Specimens During the Demonstration

To control test specimens during the demonstration, PDI developed Procedure PDP-Q-008.1, "Control and Security of Test Specimen," which describes the essential elements for maintaining the security of PDI test specimens during the performance demonstration process. At the EPRI NDE Center and off-site testing facilities, the specimens are continually monitored. Monitoring is predominantly performed by PDA staff and is supplemented with remote video recording and display equipment.
When the test specimens are not in use, they are located in designated storage areas that are secured and monitored. Routine surveillance is performed by PDA staff during handling, storage, and testing of the specimens and test information. The surveillance verifies that requirements are being met for the security and control of the test specimens and that confidentiality of the test information is maintained. The surveillance is implemented in accordance with written procedures and documented on checklists.
2.4.1.4 Handling, Storage, and Shipping of Data

To control the information generated during the performance demonstrations, PDI developed Procedure PDP-Q-008, "Security of Test Specimen Information." The procedure describes the essential elements for maintaining the security of confidential information during the performance demonstration process. Access to test data is controlled similarly to access to test specimens (i.e., all access to confidential information is controlled by the PDI project manager). Once access is granted, a "Signature Authority/Authorized Access Log" is filled out and posted in the records room. Access is granted only on a need-to-know basis. When information is not in use, it is locked in cabinets, rooms, or key-controlled computers. PDI and PDA computers are protected with password access. At remote locations, data generation and information handling are controlled by the PDA-approved security plan. PDA requires that the security plan be signed by a company officer or representative.
Automated UT systems that are used for performance demonstrations are examined by the PDA staff for security and data-generating aspects. Only the files necessary for conducting the performance demonstration are allowed on the computer systems. The proctor verifies files on the computer by running a directory of the computer files necessary for performing performance demonstrations and recording the remaining
file storage space. The data disks are kept under positive control of the PDA proctor who is assigned surveillance responsibility.
During an automated UT examination of an RPV conducted per the PDI program, PDI demonstrated the site-specific computer security to the team. Data generated from the scanning transducer was collected on a removable hard drive. The hard drive was removed and placed into a second computer system that transferred the data to a compact disc (CD). After the integrity of the transferred data was verified, the hard drive was reformatted and placed back into the data-gathering equipment.
The CD now contains all the raw data. PDI does not make copies of the CDs containing the raw data. By contractual agreement, PDI retains the control and access to the CDs. If a licensee or contractor wanted data storage control by an entity other than PDI, PDI would stipulate that the licensee or contractor would not have access to the data without PDI involvement. PDI does not accept responsibility for the quality of the data on the CDs.
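The report does not state how the transferred data was verified; one plausible mechanism is a checksum comparison between the hard drive and the CD copy before the drive is reformatted. A minimal sketch, assuming a file-level checksum check (the file names and the use of SHA-256 are illustrative assumptions, not the actual mechanism):

    import hashlib

    def file_digest(path, chunk_size=1 << 20):
        # Hash the file in chunks so large raw UT data files do not
        # have to fit in memory at once.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Reformat the acquisition drive only if the CD copy matches.
    ok = file_digest("harddrive/raw_scan.dat") == file_digest("cd/raw_scan.dat")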
The use of alternative methods for achieving the required level of test information security is permissible provided PDA approves the alternative. The security should be provided by an independent organization other than the organization of the person taking the examination. Alternatively, where special conditions exist that preclude meeting security requirements, the individual must provide a suitable method of achieving the level of security required by PDA.
2.4.1.5 Conclusions

PDI and PDA have devised and implemented an effective security program to protect the integrity of the test specimens, to prevent compromise during testing, and to protect test data. The security program satisfies the requirements of Appendix VIII.
No areas for improvement were identified for the security procedures that were reviewed.
2.4.2 Qualification Records

Appendix VIII, Subsection 5000, "Record of Qualification," requires that PDI and PDA maintain qualification records. The team reviewed the storage and retrievability of performance demonstration records. The review covered the performance test results for candidates, the equipment used by candidates, locating data in the files, and identifying potential problem areas. The objective was to examine the difficulties associated with a response to an external inquiry for a candidate's qualification.
2.4.2.1 Candidate and Equipment Performance Demonstration Sheets

To document the performance demonstration results, PDA developed Procedure PDP-Q-017 (Rev. 1, dated March 23, 1994), "Quality Assurance Records." This
procedure describes the storage, identification, retention, and retrievability of performance demonstration QA records.
The performance demonstration Qualification Data Summary Sheet (QDSS) identifies the material categories, thicknesses, and diameters that have been successfully demonstrated by the candidate, including the appropriate ASME Code and equipment.
The QDSS has an easy-to-read format except for IGSCC entries. IGSCC entries are identified by asterisks and the meanings of these entries are explained as footnotes at the bottom of the summary sheet. Occasionally, PDA would take an exception to specific items in a candidate's procedure. Any exceptions are also explained at the bottom of the summary sheet.
The essential variables associated with the equipment used by the candidate during the performance demonstrations are listed on the Qualification Calibration Data Sheets (QCDS). The QCDS form is filled out by the candidate. The QCDS is combined with the performance demonstration Qualification Surveillance Checklist (QSC). The QSC is an aid for the PDA proctor to verify that the individual performing the demonstration is following the procedures, using the correct equipment settings, and filling out the paperwork completely. Each entry is initialed by the PDA proctor overseeing the demonstration. The QCDSs are filed with the candidate's records, which also contain the candidate's work sheets. As described in Section 2.3.2 above, the reconstruction of the test sequence and various qualifications is difficult.
A compilation of QCDSs is used as the input for the QCDS summary. The QCDS summary identifies the qualified range for each essential variable. The QCDS summaries are not in the equipment files, but are listed in the generic procedure files.
The generic procedures are routinely revised as new equipment and essential variable settings are added.
2.4.2.2 Performance Demonstration Qualification Summary (PDQS)
By combining the data from the QDSS and QCDS, PDA creates a Performance Demonstration Qualification Summary (PDQS). Currently, these sheets have to be compiled from the individual's file and the equipment qualification file, which is a labor-intensive task. PDI is in the initial phase of creating electronic retrieval of the data. PDQSs are not available for candidates at this time. To date, PDA has issued four procedure PDQSs to one organization for equipment and UT procedures used in a performance demonstration. These qualifications were for a company outside the United States (U.S.) and are not applicable to the U.S. nuclear industry.
Upon completion of the demonstration, PDA will verbally notify the candidate and the candidate's employer of the test results. PDA does not intend to issue PDQSs until after January 1996. A utility can verify a candidate's qualification by calling PDA.
While PDA has committed to provide the information over the phone, the protocol
has not been finalized. In the future, the PDQSs will be stored on an electronic database with backup files. The backup file will be sent to off-site storage areas.
After the electronic database is created, the current paper file system would be kept in place and be available for data retrieval.
PDI procedures state that lifetime records will be maintained for the useful life of the project as directed by the PDI Steering Committee; that is, the continued existence of lifetime records is dependent upon PDI's continued existence. Lifetime records consist of the PDQS for the equipment, individuals, and procedures. Nonpermanent records are kept for 7 years. Typical nonpermanent records are candidate and organizational use agreements, candidate registration forms, the supporting data with actual test results used to compile the PDQS records, other checklists, and QC examinations.
In the event that changes are made to the acceptance criteria that affect the test results, PDA would review the data and issue a revised PDQS, if appropriate. The ease of retrieving the data is discussed in Section 2.3.2. Any major change would require the same level of review as the original PDQS.
Besides obvious difficulties with the paper-intensive filing system, an additional burden will be placed on utilities when equipment or procedures are revised or an equivalent procedure evaluation is made. The utilities will have to perform an evaluation of the new revision against the revision that the candidate used for qualification; the same is true for equivalent procedures.
2.4.2.3 Conclusions

The performance qualification records satisfy the requirements of Appendix VIII.
However, the team identified areas for improvement. The format of the Qualification Data Summary Sheet (QDSS) could cause a reviewer to miss the exceptions to the test or misidentify a candidate's IGSCC qualifications. Furthermore, the current method of finding each Qualification Calibration Data Sheet (QCDS) in the candidate's file and comparing it to the equipment files is cumbersome. These are currently considered areas for improvement that may be addressed when the electronic data retrieval system is in place. The areas will be tracked as Item 95-01-08.
2.5 Performance Demonstration Testing of Piping Materials

The team reviewed the performance demonstration test sequence that is offered to candidates for piping material. The review included assembling the test sets, conducting the tests, scoring of results, retesting, discussing the results, and reporting the information. As part of the review, the team evaluated the performance demonstration against the requirements of Supplements 2, 3, and 12 to Appendix VIII.
2.5.1 Test Set Design

The test sets are designed with a computer program containing the requirements of Appendix VIII, manual input of the candidate's qualification request, and manual input of available specimens. The candidate's qualification is defined by the material types, the wall thickness, the pipe diameters, the flaw types, the access conditions, and the number of unflawed grading units needed. As specimens are selected, the contributions that they make to the test set are noted and subtracted from a list of the requirements. Specimens are selected until the matrix is complete. The computer is a useful tool for ensuring that all of the Code requirements and candidate's needs are met. Separate computer programs are used to design the detection test set and the sizing test set.
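The selection logic described above can be illustrated with a minimal sketch (the PDA program itself was not reviewed at the code level, so the names and data structures here are hypothetical):

    from collections import Counter

    def assemble_test_set(specimens, requirements):
        # Greedy sketch: each specimen contributes counts toward the
        # requirements matrix (e.g., flawed grading units per material,
        # diameter, and access condition); selection stops when every
        # requirement has been driven to zero.
        remaining = Counter(requirements)
        selected = []
        for spec in specimens:
            if not +remaining:                 # matrix complete
                break
            contribution = Counter(spec["contributions"])
            if contribution & +remaining:      # specimen fills a need
                selected.append(spec["id"])
                remaining -= contribution      # subtract its contribution
        if +remaining:
            raise ValueError("available specimens cannot complete the matrix")
        return selected

    # Illustrative inputs only; real requirements come from Appendix VIII
    # and the candidate's qualification request.
    requirements = {"austenitic_flawed": 5, "unflawed": 10, "IGSCC": 3}
    specimens = [
        {"id": "P-101", "contributions": {"austenitic_flawed": 2, "unflawed": 1}},
        {"id": "P-102", "contributions": {"IGSCC": 2, "unflawed": 2}},
    ]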
Not all the conditions being demonstrated by the candidates are explicitly required by Code, for example, IGSCC and single-side-access cracks (cracks located on one side of the weld that are inspected from the opposite side of the weld). The addition of IGSCC to the performance demonstration program is a PDI effort to satisfy the demonstration testing contained in NUREG-0313, Revision 2. These non-Code conditions were included in the baseline test set with the number of specimens per condition being arbitrarily determined by PDI.
In the PDI program for Supplement 12 of Appendix VIII, the candidates are required to detect the minimum number of flaws as shown in the appropriate supplements. From this flaw selection, the candidate can qualify for detection in piping for the following conditions: minimum and maximum diameter, thickness range, ferritic, austenitic, far-side examination, and IGSCC. In addition, 50% of the flaws must be associated with geometry. Multiple conditions may exist within one test specimen. The lumping of conditions can have a negative compounding effect on the candidate's success rate.
A review of test sets showed that some candidates' test sets had more than one specimen with no flaws. The absence of flaws in some of the test specimens adds authenticity to the performance demonstration program.
2.5.2 Test Execution

The PDA conducts performance demonstrations in special laboratories at the EPRI NDE Center. When testing is in progress, the labs are maintained under continuous surveillance with a PDA proctor and TV-camera monitoring systems overseeing the testing process. To ensure the security of the tests, the candidates cannot remove paperwork from the testing laboratory and are not to disclose any test information while participating in the performance demonstration. In addition, the specimens for a given test condition were made to appear similar with no visible permanent markings, and the large number of piping specimens ensures that candidates from the same company do not use the same test sets.
The candidates are restricted to a maximum of 4 hours per specimen to complete the detection requirements. After testing each specimen, the candidate submits paperwork generated during the test to the PDA proctor. The candidate explains to the PDA proctor the reasoning for declaring a detected flaw. To support the detection, the candidate uses the tests and associated signals. The PDA proctor uses data forms to enter comments and verify test equipment.
2.5.3 Scoring Test Results

Using a test-specific code sheet, PDA manually compares detection test results for the specimens used against the associated specimen's true-state drawings. The team reviewed results from a sampling of candidates and found that data interpretation and scoring were straightforward. The grading units were consistent with ASME Code requirements.
The scoring of the length sizing is determined by the number of observations (n) and the measured length of each crack (m_i) as compared to the true state (t_i) using the root mean square error (RMSE) statistic. Neither circumferential offset nor axial offset in the flaw location is considered in the RMSE statistic. In a similar manner, the depth sizing is evaluated using the RMSE statistic. The ASME Code has begun including the RMSE in the supplements of Appendix VIII. For example, Supplement 12 of Appendix VIII gives the RMSE as follows:
$$ \mathrm{RMSE} = \sqrt{ \frac{ \sum_{i=1}^{n} \left( m_i - t_i \right)^2 }{ n } } $$
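As an illustration of how a sizing result is graded against this statistic, the following minimal Python sketch (hypothetical data; the 0.125-inch value is the depth-sizing tolerance recommended in PDI Position 94-002, discussed in Section 2.5.7) computes the RMSE for a set of paired measurements:

    import math

    def rmse(measured, true_state):
        # Root mean square error over n paired observations (inches).
        n = len(measured)
        return math.sqrt(sum((m - t) ** 2 for m, t in zip(measured, true_state)) / n)

    reported_depths = [0.20, 0.45, 0.33, 0.51]   # candidate's values, inches
    true_depths     = [0.25, 0.40, 0.30, 0.60]   # validated true state, inches
    passes_depth_sizing = rmse(reported_depths, true_depths) <= 0.125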
2.5.4 Retesting and Expansion

As the testing progresses, the candidate submits test results for evaluation to the test proctor. If a candidate demonstrates insufficient UT capabilities during the test, the testing may be stopped and the candidate may be directed to spend time on practice samples before starting a retest. This practice makes the testing more cost effective and efficient for candidates. The QA protocol that was developed (PDP-I-009.5 and PDP-I-009.6) identifies a set of conditions (e.g., failing a category of pipe wall thickness) in which a limited retest could be performed; however, limited retesting is seldom done. In general, candidates pass or fail with very few borderline cases.
Once a candidate successfully passes a set of conditions, the candidate can request to be tested on other conditions. For instance, after passing a dual-side access condition test, the candidate can request a single-side access test set. The requirements are spelled out in the relevant PDI documents (for instance, PDP-I-009.6.1 for Supplement 4 single-side access identifies the minimum number of flaws as 5) and include the minimum
and maximum flaw size. Other parts of the PDI documents cover other categories and supplements.
2.5.5 Feedback to Candidates that Fail a Test

The feedback process was not observed for an actual candidate; however, it was simulated. In the simulation, an NRC NDE technician took an abbreviated performance demonstration. Following the testing, the PDA staff provided feedback about the detections, missed flaws, and false calls. The feedback focuses on the kinds of conditions represented by the specimens. The candidates are expected to use the feedback to identify their weaknesses. For retesting, PDA selects different test sets than the candidate used before.
2.5.6 Reporting of Test Results

Ultimately, the performance of the candidate will be formally documented in the PDQS.
The discussion on the PDQS is in Section 2.4.2.2 above. Currently, test results are documented in individual files for each candidate.
2.5.7 Findings

Substantial work has gone into developing the PDI/PDA program. PDI and PDA analyzed the IGSCC Coordination Plan testing program for BWR IGSCC qualification.
The major team observation of the IGSCC Coordination Plan program was that candidates taking the demonstration tests could skew the results to increase the probability of guessing an acceptable answer (testmanship). To avoid the shortcomings of the IGSCC Coordination Plan program, the PDI program was designed to minimize testmanship. Some features of the PDI program are intended to cause candidates trying to use testmanship to fail the demonstration. Such features include use of multiple flaws in a single specimen, use of one or more entirely blank specimens, unknown number of total flaws in a test set, number of blank grading units being used, sizes of the grading units, length sizes of cracks, location and characteristics of the counterbore, and crack sizes in the specimen set as a function of the pipe wall thickness. These conditions are scattered throughout the test and represent a significant challenge to the candidate.
The Code has used a hierarchical approach for testing of piping. In this approach, a candidate first demonstrates acceptable performance on austenitic pipe and then can include ferritic pipe with the addition of 3 flawed grading units and 6 unflawed grading units. The candidate must classify all 9 grading units correctly to successfully pass the ferritic portion of the detection test. The addition of the 3 ferritic pipe flaws does not contribute to the flaw grading unit count shown in Appendix VIII, Table VIII-S2-1.
However, the PDI program uses this approach only for ferritic piping.
The Code does not specifically address IGSCC or single-side access acceptance criteria in the supplements; therefore, PDI determined what constituted a successful performance demonstration. When the categories of IGSCC and single-side access examinations are included, the PDA program integrates these conditions as part of the detection and length sizing test. Under PDI's present program, a test set for Supplement 2 that includes these categories will consist of at least three IGSCC flaws and no less than three single-side access flaws. In addition to the normal Code acceptance criteria, in order to be considered qualified for IGSCC or single-side access, the candidate is not allowed more than one missed detection in either the IGSCC or single-side access category. If the candidate passes the Supplement 2 acceptance criteria but does not meet the IGSCC or single-side access requirements, a Supplement 2 qualification is gained in austenitic material only. The candidate is reevaluated in the failed category(ies) by being tested on additional flaws in the applicable category(ies). The ability to qualify with both of the allowed misses in one or the other of these conditions will be tracked as Item 95-01-09.
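A minimal sketch of the per-category rule as described above (the names are hypothetical; PDI's actual grading is performed manually against code sheets):

    def supplement2_categories(missed_by_category):
        # missed_by_category maps category name -> missed detections,
        # e.g., {"IGSCC": 1, "single_side_access": 2}; a category is
        # qualified only if no more than one detection was missed.
        return {cat: missed <= 1
                for cat, missed in missed_by_category.items()}

    result = supplement2_categories({"IGSCC": 1, "single_side_access": 2})
    # -> {"IGSCC": True, "single_side_access": False}; this candidate
    # would retain a Supplement 2 austenitic qualification but be
    # reevaluated on additional single-side access flaws.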
The Code uses an acceptance tolerance for depth sizing dubbed "critical undersizing."
PDI has developed a position paper (PDI Position 94-002) and is recommending changes to the Code that depth sizing of cracks in piping be evaluated using a 0.125-inch root mean square error (RMSE) statistic. The critical undersizing criteria in the Code for depth sizing would greatly reduce the number of candidates passing the performance demonstrations. At the time of the review, the RMSE criterion was met by 66 candidates.
If PDA used the critical undersizing criteria, 17 of the 66 candidates would have failed.
In all cases, one crack was significantly undersized. Some of the reported values were close to the limit for passing the demonstration, and the remainder were progressively worse. Since the sizing error results in undersizing, it is non-conservative and may lead to inadequate corrective action. The use of the RMSE statistic provides a measure of performance in the field, whereas the use of critical undersizing requirements generally results in candidates trying to avoid an undersizing error by inflating the reported values (i.e., testmanship).
The cracks that have caused candidates the most difficulty are mid-wall IGSCC cracks. Sizing techniques for shallow and deep cracks are quite effective, but sizing the mid-wall cracks is more difficult and is based on finding the deepest crack tip.
With the mid-wall IGSCC specimens having such a profound effect on the pass rate, PDI and PDA should reexamine the procedures for adequate detail. This issue will be tracked as Item 95-01-10.
2.5.8 Conclusions

Overall, the performance demonstration program for piping welds has a number of positive features. It has been designed to minimize testmanship, and the specimens require that candidates have a good procedure and extensive knowledge in UT to successfully pass the test. However, the manner in which PDA includes IGSCC and
single-side access in the test set could allow candidates to pass who are not effective in these areas. The team concluded that this position should be reevaluated (see Item 95-01-09). The intent of the Code is that a candidate demonstrate effectiveness on the difficult inspection conditions and that a small number of the easier conditions be added to confirm that the candidate is effective for the less difficult conditions (i.e., 3 flawed and 6 unflawed ferritic grading units added to the austenitic test), and can correctly classify all of the grading units.
The 1992 Edition of ASME Section XI, Appendix VIII, included RMSE depth sizing for piping. The RMSE provides in a single statistic a good measure of sizing performance when based on multiple correct characterizations of flaws for each tested condition.
However, if the error that a candidate makes is largely associated with a single measurement, substantial undersizing can occur. Candidates need to better understand the cause of this undersizing so that they do not make this type of mistake as inspectors in the field. The team favored the RMSE method of determining acceptability of sizing results over the critical undersizing criteria that existed in the 1989 Edition of ASME Section XI, Appendix VIII. However, the final determination of the actual RMSE value remains with PDI and the Code.
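To illustrate the single-measurement concern with hypothetical numbers: in a depth-sizing test with n = 10 observations, a candidate who sizes nine flaws exactly but undersizes one flaw by 0.39 inch would still meet the recommended criterion, since

$$ \mathrm{RMSE} = \sqrt{ \frac{ (0.39)^2 }{ 10 } } \approx 0.123 < 0.125, $$

even though a 0.39-inch undersizing error on a single flaw could be significant in the field.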
2.6 Performance Demonstration Testing of RPV Materials

The team reviewed the performance demonstration testing for RPVs that is offered to the candidates. The review included designing the test sets, conducting the tests, scoring the results, retesting, explaining results, and reporting performance demonstration information.
As part of the review, the team evaluated the performance demonstration of RPVs against the requirements of Supplements 4 and 6 to Appendix VIII. This performance demonstration has been conducted only with automated UT equipment.
2.6.1 Test Set Design

The test specimens for the RPV are designed to meet the requirements of the Code.
There are 9 specimens; however, the test grid locations can be redefined at will, thus creating a large number of test set combinations. Using a black pen, the test grids are predrawn on a specimen; then the candidate is instructed to inspect selected grids. The grids can be redrawn, and different grids can be selected on a given specimen for retests.
A grid can be subdivided into zones.
Some companies with automated equipment are examining the entire specimen, then storing the data for later performance demonstration examinations. The stored data is under the control of PDA. To provide a large number of test combinations, PDA selects the starting locations in the stored data.
2.6.2 Test Execution
The RPV specimens may be examined at the EPRI NDE Center or at an off-site location.
For off-site examinations, the inspection organization uses a security plan acceptable to PDA. During the inspections, PDA provides a proctor to observe the testing process and scrutinize the candidate's data.
While analyzing the data collected on a zone, candidates are expected to maintain a complete set of paperwork supporting any of their findings. The candidate presents in detail the data for each declared flaw to the test proctor. The test proctor confirms that the procedure has been followed and that detailed location, sizing, and classification information is given. Since most of the equipment used for RPV ISI is automated, the data packet is compiled for each zone analyzed. The candidate includes the images showing the flaws in the paperwork. Several members of the team observed a candidate completing an inspection analysis packet and being interviewed by the test proctor.
2.6.3 Scoring Results

The team evaluated scoring results by reviewing the data of some candidates who have gone through the PDA program. The grading unit concept used for piping test sets is not used in the RPV test sets. For the RPV tests, PDI defines the detection of a flaw as being within 1" of the flaw's true location. The tolerance used by PDI is double the tolerance of Appendix VIII, Supplement 4. PDI has not initiated any action with the Section XI Committee on the tolerance changes. This issue will be tracked as Item 95-01-11.
In addition, PDA has found systematic errors associated with equipment setup and measurements that produced an identifiable shift of all the flaws in a recognizable pattern.
If PDA determines this to be the case and the systematic shift is within an additional 1" of the true-flaw location plus tolerance, PDA may declare a detection. The 1-inch tolerance plus 1-inch systematic shift quadruples the tolerance of Appendix VIII. The use of the systematic shift can be thought of as an "engineering fudge factor" for variables not associated with the candidate's skills. As equipment, technique, and procedures incorporate the appropriate adjustments, the systematic shift should become nonexistent.
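A minimal sketch of the detection rule as described (the names are hypothetical, and the report does not state whether the tolerance is applied per axis or radially, so a per-axis check is assumed):

    def rpv_flaw_detected(reported, true_loc, systematic_shift=(0.0, 0.0),
                          tolerance=1.0):
        # All coordinates in inches on the vessel surface. PDA may first
        # remove an identified systematic shift (itself limited to 1 inch)
        # before applying the 1-inch detection tolerance.
        dx = abs((reported[0] - systematic_shift[0]) - true_loc[0])
        dy = abs((reported[1] - systematic_shift[1]) - true_loc[1])
        return dx <= tolerance and dy <= tolerance

    # Example: a 0.8-inch systematic indexing error plus a 0.9-inch
    # residual offset would still be scored as a detection.
    hit = rpv_flaw_detected((3.7, 0.0), (2.0, 0.0), systematic_shift=(0.8, 0.0))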
2.6.4 Retesting and Expansion

The same comments apply as in the case of piping (see Section 2.5.4 above). However, since automated equipment is used in the data collection, and many of the grids on a specimen are examined for a given scanning operation, a new test set can be identified for the candidate to analyze without rescanning the specimen. Some zones between the two test sets may overlap, but most will not.
2.6.5 Feedback Provided to Candidate

The feedback process for RPV is similar to that used for candidates who take the piping test.
2.6.6 Reporting of Test Results

The same comments apply as in the case of the piping. Results will be on the PDQS, but since this form has not been developed, it was not reviewed and evaluated.
2.6.7 Conclusions

In the past, UT procedures were very flexible with little detailed guidance. The procedures associated with the PDI program and the intent of Article VIII-2000 of Section XI require significant detail. The UT procedures being submitted to PDA for procedure qualification are deficient in detail, resulting in delays. One shortcoming is that some essential variables and essential variable ranges are omitted. Another shortcoming is the absence of a method or criteria for discriminating between indications.
As discussed in Section 2.1 for generic procedures, application-specific procedures should contain sufficient detail to minimize variations between the performance demonstration and the field implementation. In reviewing the final procedures that the companies used, the team found that they met the minimum requirements of Appendix VIII, but additional information would be necessary to enable confirmation that field implementation is consistent with the performance demonstration inspection (see Item 95-01-01).
Appendix VIII contains a list of the essential variables. The difficulty PDA has in meeting Appendix VIII requirements is that the essential variable list is too rigid. Some of the variables that are necessary for detectability, reproducibility, and repeatability in applying new technologies are not on the list. The software used with a system is a necessary variable that should be verifiable. These necessary variables are considered essential to the performance of the procedure and should be addressed. This is considered to be a problem that the code committee should address.
One area that is unclear is how to document that the software used in the performance demonstration test is correlated to the software used in the field (i.e., whether the software is the same as that used during the performance demonstration). To address this concern, PDA decided that the candidate must provide a copy of the software being used so that, if questions arise in the future, the version maintained on file by PDA can be compared to the version being used in the field. This will not be an easy comparison, and there is no easy method to address this area unless the software is copyrighted and documented in PDA records. This area of improvement will be tracked as Item 95-01-12.
The team is aware of the tolerances used for scoring a detection in RPV performance demonstrations; however, in the absence of a PDI position paper, the team did not perform a review on the subject (see Item 95-01-11).
PDI has developed position papers for length sizing of cracks in RPVs (PDI Positions 94-005 and 94-008) and is recommending to the code committee that a tolerance of 0.75-inch root mean square error (RMSE) be used. The team does not take exception to PDI's position.
One area of concern is that the performance demonstration test is being conducted using scanners that have been modified to scan just for testing (i.e., these types of scanners are not used to perform UT in the field). The field scanners are radioactive from previous ISI and cannot be used on PDI specimens. The specific concern is that there should be a means to prove that the improvised scanners used in the performance demonstrations are not superior to the scanners used for the field ISI so as not to compromise qualifications.
This issue will be tracked as Item 95-01-13.
2.7 Bolting Examination

Examination of bolting under the PDI program will be performed in accordance with instructions in Procedure PDP-I-009.8 (Rev. 0), "Test Specimen Selection, Grading and Retest Instruction," and the PDI QA program. The procedure is in the final review process and as such lacks official document status. However, the team ascertained from PDI and PDA that approval of the procedure in its present form is expected shortly.
In the procedure, pre-established parameters for particular organizations are used for specimen selection. A specimen test set contains at least three different bolt or stud diameters and lengths with a minimum of five flaws per test set. Each test set would contain one flaw per location, including circumferential notches located at minimum and maximum metal paths of the specimen (i.e., thread root, shaft, and bore holes). EDM notches would be located on the outside thread surface locations and inner bore hole surface of studs with maximum depths and reflective areas as specified in the procedure. To ensure a "blind" qualification, notches and specimen identification would be obscured from the candidate, an examination time limit, as prescribed by the procedure, would be enforced, and no specimen pre-service data would be available to the candidate.
Acceptance criteria for a successful demonstration would consist of minimum detection and false call requirements. A successful detection demonstration would depend on the number of flaws within a test. For example, in a test set with five flaws, a minimum of five flaws would have to be detected with no false calls allowed. In a test set where the maximum ten flaws are included, a successful demonstration would require eight indications to be detected with two false calls permitted. Limits defining false calls are as follows:
- Notch axial location: within ±1/4 inch or 5% of bolt or stud length, whichever is greater.
- Flaw circumferential position: detection is within ±60° of the flaw centerline location.
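A minimal sketch of the location checks implied by these limits (hypothetical names; dimensions in inches and degrees):

    def within_axial_limit(reported, true_loc, bolt_length):
        # Tolerance is 1/4 inch or 5% of bolt/stud length,
        # whichever is greater.
        limit = max(0.25, 0.05 * bolt_length)
        return abs(reported - true_loc) <= limit

    def within_circumferential_limit(reported_deg, true_deg):
        # Detection must fall within 60 degrees of the flaw
        # centerline, measured the short way around.
        diff = abs(reported_deg - true_deg) % 360
        return min(diff, 360 - diff) <= 60

    # An indication that matches no flaw within both limits would be
    # scored as a false call.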
Retest following an unsuccessful demonstration is permitted following debriefing on the areas contributing to the failure. A retest would contain a test set with at least 75% new flaws. If unsuccessful during the second demonstration, a waiting period of 30 days is required before another attempt is made. At the time of this review, PDI was developing fixtures and the administrative program to handle this activity. The above procedure and the bolting specimens observed by the assessment team, along with the administrative controls as described by the PDA, appear to be satisfactory for the activity. The limits for the axial location of notches comply with Appendix VIII, Supplement 8, acceptance criteria.
2.8 ASME Code

The PDI program was developed to meet Appendix VIII to the 1989 Edition with 1989 Addenda for the fabrication of specimens and to the 1992 Edition with 1993 Addenda of Section XI of the ASME Code, with application of certain Code cases for the demonstration program.
2.8.1 PDI/PDA Exceptions to ASME Code

The team reviewed the exceptions to Appendix VIII where the requirements have been identified as impractical.
2.8.1.1 Items Reviewed

PDI is complying with Appendix VIII with the exception of impractical requirements identified during implementation of the appendix. The impractical requirements relate to specimen security, testing tolerances, and training, which are expected to be the subject of future code cases. The team reviewed the technical justification and reasonableness of the alternatives developed for the impractical requirements.
Situations not specifically addressed by code were being addressed in the PDI program by the PDI steering committee and subcommittees. A summary of the impractical requirements with the team's conclusion for each position is contained in Table 2.
Table 2  Summary of PDI's Alternative Positions

PDI Position   Description                                                Team's Conclusion
94-001         Added tolerance to qualification diameter and              Does not take exception
               thickness ranges
94-002         Change tolerance for length to 0.75" RMSE and              Does not take exception
               depth to 0.125" RMSE, piping
94-003         Change tolerance for notch width to a tip dimension        Does not take exception
94-004         Added tolerance to qualification diameter and              Does not take exception
               thickness ranges
94-005         Change length tolerance to 0.75" RMSE, RPV                 Does not take exception
94-006         Change from sizing a specific location on piping           Does not take exception
               to a region
94-007         Masking is not required                                    See 2.8.1.2 below
94-008         Change length tolerance to 0.75" RMSE, clad RPV            Does not take exception
94-009         Change number of flaw depth ranges                         See 2.8.1.3 below
94-010         Change depth tolerance to 0.15" RMSE, RPV                  Does not take exception
               cladding/base metal interface; 0.25" RMSE for the
               remainder of the RPV

2.8.1.2 PDI Position No. 94-007

This PDI position addressed the requirements in Supplements 4 and 6 of Appendix VIII to have specimen identification and flaw locations obscured or masked. The technical explanation contained in the PDI position provides insufficient information to support it. The team does not take exception to the concept because the cladding on the surface of the RPV covers the flaws. However, the explanation should contain more detail, and the suggested wording should be less ambiguous.
2.8.1.3 PDI Position No. 94-009

This position changed the flaw depth from three depth ranges to two depth ranges.
The change was to minimize testmanship as a factor for success in performance demonstration. The team does not take exception to the change in ranges; however, as part of the suggested change, PDI should include flaws that would satisfy the acceptance criteria of Table IWB-3410-1. The test sets examined by the team had at least one flaw that satisfied IWB-3410-1.
2.8.1.4 Deferring Use of Appendix VII

Appendix VIII, Paragraph VIII-2200, requires candidates to meet the personnel qualification requirements contained in Appendix VII. PDI defers the candidates' prerequisites to the sponsoring agency or employer. PDI stated that in some areas candidates who do not meet Appendix VII requirements are being tested. To comply with Appendix VIII, all candidates participating in the current PDI program may require a waiver or relief from the Appendix VII requirement. Further review of this issue may be necessary.
2.8.1.5 Changing RPV True Location Tolerance

Appendix VIII, Supplement 4, Paragraph 2.1(b), requires flaws to be reported within 1/2 inch of their true location. PDI is using a 1-inch tolerance from true location and in some cases up to 2 inches as being acceptable. PDI should develop a position paper and initiate a Code case.
2.8.2 Merging the IGSCC Program into the PDI Program

During the formative stages, PDI developed the agreement with the NRC that the appropriate parts of the "IGSCC Coordination Plan" could be included in the PDI program. In the merging of the two programs, PDI instituted changes that could influence the effectiveness of both programs. The team examined PDI's methods and test results associated with the incorporation of the IGSCC program into the PDI program.
Comparison of NUREG-0313, Revision 2, to the PDI Program

NUREG-0313 recommends that UT examiners (candidates) be given formal performance demonstration tests, such as those prescribed in IE Bulletins 82-03 and 83-02.¹ The bulletins prescribed that IGSCC examiners accurately detect and size eight out of ten IGSCC flaws.
¹IE Bulletin 82-03, "Stress Corrosion Cracking in Thick-Wall, Large-Diameter, Stainless Steel, Recirculation System Piping at BWR Plants," October 14, 1982. IE Bulletin 83-02, "Stress Corrosion Cracking Large-Diameter Stainless Steel Recirculation System Piping at BWR Plants," March 4, 1983.
PDI selected the number of IGSCC flaws for inclusion into the sample set on the basis of their understanding of the requirements of Appendix VIII, Supplement 12.
The method for adding IGSCC flaws to Supplement 2 or 12 is more tolerant than the method for adding ferritic flaws to the same supplements. In PDI's opinion, the number of IGSCC flaws contained in the test set is sufficient to ensure that inexperienced candidates will fail the examination.
PDI considers its testing of IGSCC more restrictive than NUREG-0313 because of (1) the different grading requirements and (2) the requirement that each flaw detected by the candidate be verbally explained to the PDA proctor using supporting paperwork. The verbal explanation reduces guessing and identifies poorly prepared candidates. The pass-fail data shows that only 2 out of 75 candidates passed the IGSCC portion of the examination by detecting one out of three IGSCC flaws while correctly detecting all the remaining carbon steel and stainless steel flaws. PDA stated that some candidates were able to pass the IGSCC portion of the performance demonstration without receiving IGSCC training, though most of these candidates were already IGSCC qualified. Additional discussion is contained in Section 2.5.7.
The team takes exception to the number of IGSCC samples that a candidate can detect and still be considered IGSCC qualified. A candidate can receive credit for demonstrating detection of IGSCC when they have detected only one of the three IGSCC defects in the test set (see Item 95-01-09).
3.0 MANAGEMENT MEETINGS

The PDI and EPRI PDA management were informed of the scope and purpose of the assessment at the initial entrance meeting on September 15, 1994. The findings of the assessment were discussed with the PDA representatives during the course of the assessment and presented to PDI and PDA at the exit meeting on February 3, 1995. This assessment involved extensive proprietary information; therefore, the assessment report will be provided to PDI and PDA for review to identify any proprietary items that should be deleted or withheld from the report prior to public distribution. The principal persons contacted during NRC's assessment of the PDI program are listed below.