ML20058P685

Provides Commission W/Info on Status of NPRDS & Progress During 1989
Person / Time
Issue date: 07/11/1990
From: Taylor J
NRC OFFICE OF THE EXECUTIVE DIRECTOR FOR OPERATIONS (EDO)
To:
References
TASK-PII, TASK-SE SECY-90-239, SECY-90-239-01, SECY-90-239-1, NUDOCS 9008170274



POLICY ISSUE.

July 11, 1990 (Information)

SECY-90-239

For:

The Commissioners

From:

James M. Taylor Executive Director for Operations

Subject:

NUCLEAR PLANT RELIABILITY DATA SYSTEM (NPRDS)

Purpose:

To provide the Commissioners with information on the status of the NPRDS and its progress during 1989.

Background:

In August 1981, the staff proposed in SECY-81-494 deleting component failure reporting requirements from the operational data reporting rulemaking then in progress.

At that time, INPO had announced that it would assume management of the NPRDS.

The staff believed that INPO management would correct the two principal deficiencies that had previously made NPRDS an inadequate source of reliability data.

Those deficiencies were the inability of the existing committee management structure to provide the necessary technical direction and a low level of participation by the utilities.

Based on the staff's proposal, the Commission eliminated component failure reporting from the scope of the new reporting requirements (10 CFR 50.73). They directed the staff to periodically report on the effectiveness of INPO management of NPRDS and the responsiveness of NPRDS to NRC needs.

The enclosed report is the staff's 13th evaluation and covers primarily calendar year 1989.

In this evaluation, the staff addressed the areas of concern outlined in SECY-81-494, industry participation and INPO management of the system, as well as its responsiveness to NRC's needs.

Contact:
John Crooks, AEOD
49-24425

NOTE: PROPRIETARY INFORMATION - LIMITED TO NRC UNLESS THE COMMISSION DETERMINES OTHERWISE

SECY NOTE: This document contains information that should not be released outside the NRC. Appendix C of the Enclosure should be removed before release is made.


Discussion:

Industry Participation

In 1981, the principal staff concern was the low level of industry NPRDS participation.

The NPRDS is the only system specifically dedicated to the collection of component failure data.

Subsequently, industry participation improved considerably, such that all currently operating plants (except for two atypical plants that are excluded) have committed to INPO to participate in NPRDS.

The staff now focuses on the completeness, timeliness, and quality of reporting.

When compared to 1988, the staff noted that in 1989: (1) average completeness remained about the same (65% to 70% of the NPRDS reportable failures are being reported); (2) the median time for a failure report to be available on the data base after discovery of the failure remained between three and four months (slower than NPRDS reporting guidelines of the period); and (3) overall quality did not change noticeably. Individual plant reporting continues to vary widely.

The industry has recognized the need to upgrade NPRDS.1 Areas highlighted for attention were more NPRDS use by utilities, scope expansion to monitor additional balance of plant (BOP) equipment, improved completeness and timeliness of failure reporting, and enhancements to increase NPRDS effectiveness.

Management of the System

The staff has reviewed INPO's activities to upgrade the automated data processing (ADP) system, enhance data quality, provide user support, and encourage member participation.

Information for this review was obtained directly through use, as a member of the NPRDS Users' Group (NUG), and through discussions and visits with INPO at the staff level.

During 1989, INPO completed major improvements in the ADP capability of the system. Release 3 of NPRDS-Plus, issued in April 1989, was the final transition from the Prime to the IBM computer. NPRDS-Plus now provides users a menu-driven, user-friendly data entry and retrieval system. The system supports on-line data entry, enhances computer quality assurance features, and provides new canned information reports and the ability to perform some selected statistical analysis of the data.

1 "Industry Action Plan for Continued Improvements in Maintenance of Nuclear Power Plants" (updated through January 25, 1990; forwarded to James M. Taylor, Executive Director for Operations, NRC, in a letter from Joseph F. Colvin, Executive Vice President and Chief Operating Officer, NUMARC, dated February 16, 1990).

INPO is enhancing the quality of the data through audits of input data, the addition of new data fields in 1989, and an upgrade of the component engineering data (i.e., instrumentation) for the reactor protection system (RPS) and the engineered safety features actuation system (ESFAS) throughout 1990 and 1991. Over a period of time, standard model identification codes were added to 285,000 engineering records (currently available to users) and failure mode codes are being added to component failure records (to be available in mid-1990).

INPO approved an expansion of the scope of NPRDS to include approximately 60 to 80 additional BOP components, which is scheduled to be completed in December 1990; failure reporting for these components is scheduled to start in January 1991.

INPO continued user-support activities, along with providing new reporting guidance documentation and reporter training following Release 3. INPO continued monitoring reporting patterns and initiated a program of specific site visits (27 in 1989) to assess reporting factors such as completeness and timeliness. In March 1990, INPO issued new guidance on timeliness that lengthened the time interval previously specified to report failures, to set more realistic time requirements.

Responsiveness to NRC's Needs

In 1989, staff use of the failure data in NPRDS increased. NPRDS continues to be a source of operational data to supplement other sources for some NRC programs. For qualitative uses and for quantitative uses which accommodate limitations of NPRDS, such as the staff's work in the area of maintenance indicators, NPRDS has provided useful information.

However, the NPRDS is not ideally suited to provide component reliability or overall component performance data. In general, engineering analysis is necessary to correctly apply the failure data. When the NPRDS is used in an automated manner to calculate the failure rate of a specific component, problems may arise from the data quality, i.e., the applicability of the failure to the desired failure rate calculation, either because of the nature of the failure or the failure mode. The current efforts to add failure mode codes to the system may offset some of these difficulties. The absence of component unavailability data for causes other than component failures presents additional limitations on system usage.


RES is exploring the degree to which NPRDS data can be used to develop a performance indicator to flag occurrences of high unavailability of selected safety systems at operating nuclear power plants. Further, NPRDS is also used in RES in a scoping way to help identify failure rates and root causes of failures for NRC-sponsored risk assessments, plant aging studies, and in regulatory analyses under the Backfit Rule. Where sensitivity studies indicate a quantitative value is significant to the estimation of plant risks, a more detailed data analysis is generally required, for instance, using plant operating and maintenance logs as a source of information.

Summary

The staff's enclosed evaluation for 1989 found that overall the industry participation in NPRDS remained about the same as in 1988; that INPO met several major milestones in making the NPRDS more useful; and that NPRDS continues to provide useful information for some NRC programs.

The staff's next evaluation will address the results of NPRDS-related activities for 1990.

James M. Taylor
Executive Director for Operations

Enclosure:

NPRDS Evaluation Report

DISTRIBUTION:

Commissioners
OGC
OIG
GPA
EDO
ACRS
ASLBP
ASLAP
SECY

Enclosure

STAFF EVALUATION OF THE STATUS OF THE NUCLEAR PLANT RELIABILITY DATA SYSTEM (NPRDS) IN 1989

June 1990

Prepared by:
Office for Analysis and Evaluation of Operational Data

NOTE: The NRC has determined that plant-specific NPRDS data are proprietary. Thus, removal of Appendix C is necessary prior to release of this document outside the NRC.


CONTENTS

I. BACKGROUND
II. INDUSTRY PARTICIPATION
   II.1 Industry Reporting to NPRDS
      II.1.1 Completeness of Failure Reporting
         II.1.1.1 Measures of Completeness and Analysis
         II.1.1.2 Conclusion
      II.1.2 Distribution of Failure Reporting
      II.1.3 Timeliness of Failure Reporting
      II.1.4 Quality of the Data
         II.1.4.1 1989 Evaluation of Quality
         II.1.4.2 Improvements in the Quality of New Records
         II.1.4.3 Improvements in the Quality of Old Records
         II.1.4.4 Conclusion
   II.2 Industry Use of the System
III. TECHNICAL MANAGEMENT AND DIRECTION OF THE SYSTEM
   III.1 Organization
   III.2 Oversight of NPRDS Activities
   III.3 ADP System, NPRDS Plus Release 3
   III.4 Enhancements to Data and Reports
      III.4.1 New Codes
      III.4.2 Component Failure Analysis Report (CFAR)
      III.4.3 Scope Expansion
   III.5 User Support and Quality Control
   III.6 Major Objectives for 1990
   III.7 Conclusion
IV. RESPONSIVENESS TO NRC'S NEEDS
   IV.1 Maintenance Effectiveness Indicators (MEIs)
   IV.2 Component Studies
   IV.3 Preparation for Site Visits
   IV.4 Revisions in Technical Specifications Requirements
   IV.5 Probabilistic Risk Assessment
   IV.6 Aging Studies
   IV.7 Generic Issues
   IV.8 Conclusion
V. SUMMARY OF THE STATUS AND EVALUATION OF NPRDS
Appendix A Observations from Maintenance Inspection Team Reports of Plants' Usage and Reporting of NPRDS Data
Appendix B NPRDS Component Failures 1985-1989
Appendix C NPRDS Failures 1985-1989 (Proprietary Information)

NPRDS EVALUATION REPORT

The Nuclear Plant Reliability Data System (NPRDS), an INPO-managed data base, is NRC's primary industry-wide source of data on component failures.

For several years, it has been a tool for identifying and analyzing operational safety concerns.

During the past two years, its importance to the NRC has increased with NPRDS' role in the evaluation of maintenance.

The first section of this report provides a brief history of the NPRDS and the past staff evaluations of the system.

The second section addresses industry's current participation in the system.

It discusses industry's reporting in terms of the completeness, timeliness, and quality of records submitted to the data base and industry's use of the NPRDS data base.

Section III provides a review of INPO's main activities related to the NPRDS during 1989.

Section IV addresses the question of the usefulness of the data for NRC's current uses.

Section V summarizes the status of the system and the results of this evaluation.

I. BACKGROUND

In 1981, INPO announced that it would assume management of the NPRDS. At that time, the NRC staff was considering rules for reporting operational data.

In recognition of INPO's decision, the staff proposed to the Commission that component failure reporting be deleted from the operational data rule.

The staff relied on INPO to correct two of the major deficiencies of the system at that time: a lack of close management of the system and the low level of plant participation.

The Commission directed the staff to proceed with the LER rule (10 CFR 50.73), without component-level failure reporting, but to monitor closely the status and management of the system and to report to the Commission on the effectiveness of INPO management and the ability of the system to meet NRC's diverse needs.

For the period from 1982 through 1987, the staff conducted semi-annual evaluations of the system and prepared 11 reports to the Commission on the progress of the system. They reported that since 1982 INPO had made a significant commitment in resources and management to the NPRDS. The NRC staff found that the average number of failure transactions (new failure reports and revisions to old reports) submitted per plant each quarter increased by a factor of five (from nine in 1982 to 46 in 1987); the total number of NPRDS engineering records on the data base more than tripled; and the median time to submit failure reports was reduced significantly.

During the time period, the plant population grew from 68 commercial units to 100.

Although these gains were impressive, in January 1988 the staff report noted that the average rate of failure reporting per plant per year had reached an apparent plateau. Thus, they proposed that the evaluation shift from a statistical analysis of the data base to evaluations based on plant visits, more use of the data, and the inclusion of NPRDS in the evaluation of industry maintenance initiatives. In view of the status of NPRDS at that time and the time required for INPO planned enhancements to be implemented, the staff recommended a change to an annual evaluation and report to the Commission on NPRDS.


The first such evaluation report, provided in February 1989, found that NPRDS remained a usable source for component failure data. However, the staff continued to be concerned with: 1) the variation in reporting among similar plants and components, 2) the timeliness of failure reporting, and 3) the low quality of some data, particularly some of the engineering data.

l In that evaluation report, the staff proposed that the next evaluation be more focused on. evaluation through usage.

The staff would monitor NPROS progress through: 1) reviews of industry use of the NPRDS data for maintenance l

assessment, 2) development of failuce based maintenance effectiveness indicators using NPRDS, 3) review of reporting pat',0 1s at selected operating plants, and 4) the continued use of NPRDS for NRC 5,; grams (events assessment, l

component performance studies, Nuclear Plant Aging Research (NPAR), reviews of frequencies of Technical Specification surveillances, and limited i

probabilistic risk assessments efforts).

Thus, as part of the 1989 evaluation, in addition to the previous evaluation activities, AEOD used input from Maintenance Team Inspections for a more in-depth review of NPRDS use and participation. The staff also used information obtained from visits to INPO to discuss their NPRDS-related activities (i.e., their initiatives and progress) and from participation in NPRDS Users Group activities. This is the NRC's second annual evaluation of the NPRDS.

II. INDUSTRY PARTICIPATION

Industry participation in the system has two components: industry reporting to the system and industry use of the system. Thus, in this evaluation, the staff examined not only inputting to the data base but also industry usage of the NPRDS; it reviewed and assessed the observations on NPRDS usage noted by the Maintenance Team Inspections on their visits.

II.1 Industry Reporting to NPRDS

The staff's 1989 evaluation of NPRDS reporting addressed the three qualities of the failure reports that impact their usefulness in meeting NRC's needs for component failure data: the completeness and timeliness of failure reporting and the quality of the data.

II.1.1 Completeness of Failure Reporting

The completeness of failure reporting to NPRDS involves the reporters' interpretation of the NPRDS reporting guidance, the information available to complete the required record, and a variety of other factors related to resources and priority.

Reportable failures may not be reported because of misinterpretation of the reporting guidance, a lack of personnel assigned to reporting, or because of oversight due to inadequate source material (e.g., maintenance work requests or other information).

The completeness of failure reporting to the data base was previously measured by AEOD by comparing the number of NPRDS reportable failures identified in a sample of LERs to the number from the sample found on the data file at the end of each quarter. The staff had estimated that on the average plants were reporting roughly 65% of their NPRDS component failures. In 1989, INPO began a similar matching process based on both maintenance work requests (MWRs) and LERs.

For this evaluation, the staff reviewed the results of INPO's completeness evaluations and analyzed data on failure reporting and component failures in 1989. The staff believes that the average completeness of failure reporting in 1989 remained about the same as in 1988 (65% to 70%). They also noted a downward trend in component failures within the industry.

II.1.1.1 Measures of Completeness and Analysis

In 1989, INPO began conducting site visits to measure completeness of failure reporting using both comparisons with maintenance work requests and LER-reported failures (see Section III.5). In a meeting with NRC staff on February 2, 1990, INPO discussed the completeness evaluations conducted at 27 plants during 1989. It was stated that these 27 units were selected as a representative sample of industry to enable them to estimate industry-wide completeness. They use an algorithm to assign plants as low, medium, or high reporters. From these groups they then select the plants to be visited each year. From the 27 units evaluated in 1989, INPO reported that they estimated that the total plant population is reporting 74% of their failures and that 90% of the plants are reporting between 64% and 84% of their failures. INPO staff provided NRC staff the data on the distribution of these plant evaluations for the 27 units reviewed in 1989 and two in 1990, which is shown in Figure 1.

[Figure 1: INPO Estimate of NPRDS Completeness - number of plants in each range of estimated percent complete (0-20%, 21-40%, 41-60%, 61-80%, 81-100%), with the industry average indicated. Based on INPO information.]

Over the years, the NRC staff has repeatedly compared its interpretation of the NPRDS component failure reporting requirements with that of other users, including INPO. In general, the interpretations agree within 10%, i.e., with the staff estimate being 10% lower than INPO's. Thus, INPO's estimate of overall completeness is not too different from the NRC estimate, which is based on the number of NPRDS reportable component failures identified in LERs.

As a further check of INPO's average completeness figures, the staff determined the change in the number of NPRDS failures between 1988 and 1989, and then used trends in LER data to estimate the change in the occurrences of component failures in the industry between 1988 and 1989. Component failures involved in certain events are required to be reported in LERs by 10 CFR 50.73. A comparison of these two trends would then permit an estimate of the completeness for 1989 from the earlier LER and NPRDS data comparison estimates.

To estimate the number of actual NPRDS reportable component failures that occurred in 1989, a search was made of the Licensee Event Report (LER) data base in the Sequence Coding and Search System (SCSS). Sequence coding requires the coder first to identify the individual occurrences that make up the events described in the LERs and then to encode each piece of information into the data base. To identify component failures that occurred during reportable events, a "repair" code was included in the SCSS fields. Each component failure requires a "repair" code to be included in the coding.

Figure 2 shows the number of SCSS component failures (using the "repair" code) and the number of NPRDS failures for 1985 through 1989.

[Figure 2: Component Failures in NPRDS and SCSS, 1985-1989, by discovery date (thousands). Data as of May 1990.]

It should be recognized that component failures reported in LERs have a less restrictive definition than those used in NPRDS, but one would expect that they would show the same trends over time since many NPRDS reportable components are addressed in LERs.

These two series did have a very high correlation (correlation coefficient of 0.94). This high correlation, and the fact that the number of SCSS repair codes declined about 18% from 1988 to 1989, provides evidence that component failures in the industry have experienced a general decline and, as a result, the number of failures reported to NPRDS declined. The staff found that the number of NPRDS failures with discovery dates in 1989 declined 21% from the number in 1988. Thus, the decline observed in SCSS repair codes was about the same as the decline in NPRDS component failures from 1988 to 1989. This also suggests that the average completeness of failure reporting in 1989 was about the same as in 1988.

This also suggests that the average completeness of failure reporting in 1989 was about the same as in 1988.

In response to a question from a Commissioner, AE00 selected a sample of 100 Preliminary Notifications.(PNs) in the first quarter of 1989.

They found that 18 of the 100 PNs involved equipment problems that would be reportable to NPPDS. NPRDS records were found for 15 of the 18.

After discussing the'three missing events with the licensee, AEOD concluded that one of these did not involve an equipment failure that would be reportable to NPRDS.

Although this was a small sample, the results indicate 88% completeness.

!!.l.1.2 Conclusion From this information, along with qualitative information from Maintenance Team inspections and other discussions with plant staffs, the staff concludes that average completeness' of f ailure reporting in 1989 remained at the same level as in 1988 (65% to 70% complete).

!!.l.2 Distribution of Failure Reporting During 1989, the data base grew to around 98,000 component failure records and around 527,000 engineering records.

The number of reports submitted by individual plants varies greatly for a variety of reasons other than simply not reporting (e.g., different designs, actual failure rates, etc.).

Thus, low reporting is not necessarily incomplete reporting.

Figure 2 shows how the distribution of the number of failure reports submitted to the system has changed over the years.

The data

' The staff also reviewed other factors that might have affected the volume of failure reporting such as the effects of the number of refueling outages, the addition of new plants, and changes in reporting guidance, all during the evaluation period. The impact of these factors on the trend in reporting between 1988 and 1989 was found to be minor.

2 Completeness and timeliness of failure _ reporting are interrelated.

As there is little incremental increase in completeness after nine months, completeness was defined as the percentage on the data base nin months after th9 discovery date.

5

u O

and Figure 3 also_show that about two thirds of the plants reported over 100 failures discovered in both 1987 and 1988; DISTRIBUTION OF NUMBER OF NPRDS FAILURES FIGURE 3 Number of Plante 40-g i

G 30$

h'-

qjl si 20 - -

b l

-50 O

-l s{<s' 6s5 3d

~

i (5

SS: ' as to - -

r

== 1 354-

'5G QY h:

. 2 <:

g!E

g

p@ij

'g;2' - @

. g-g,,

O o 60 51-100 101 160 151 200 201 300 301 400 401 460 Number of NPROS Component Failures M ises O tose 553 iss7 C isse The particular plants in the lower ranges of reporting also vary from year to year.

Plants noted by the NRC as possibly under reporting # 3 NPRDS in one year may appear as strong participants the following year and vice versa due to the effects of refueling outages and plants taking corrective actions in their NPRDS programs.

Staff analysis did not identify any grouping such as NSSS vendor, age, power level, or type at tR b'g significant in. determining the likelihood of a plant being in the upper, middle, or lower third of NPRDS reporters.

Some plants move between ranges of reporting from year to year, others remain in the same range year after year.

To obtain additional information on failure reporting, the staff held telephone interviews with the NPRDS contacts at nine utilities that had reported few failures in 1989.

Three of these sites had basically discontinued NPRDS reporting for extended periods of time; in the last eight months of 1989, they had submitted only a couple of new failure reports.

However, all three now have major programs underway to backfit data and to become current.

Two sites indicated that there were few failures to report as each plant's operating experience had improved (both hao near record on-line performance in 1989).

The remainder indicated that there were backlogs of failure reporting ranging from t. bout twenty to sixty reports.

Reasons cited for the backlog were low priority assignment, lack of resources, administrative delays, etc.

Several sites were beginning to work off the backlog. The discussions confirmed the analysis above that a decrease in the number of failures was a factor involved in the reduced reporting volume in 1921 The sizable incompleteness at a number of plants was evidence that the 6

i

average estimate of one fourth to one third incompleteness can be exceeded at i

individual plants and that continued monitoring is needed, i

The distribution of failure reports by type of component is shown in Appendix B (based on data as of May 1990) for 1985-1989. The data in proprietary 1

apperdix C also show the failure reporting patterns for ech utility for 1985 through 1989 based on data as of May 1990.

!!.l.3 Timeliness of Failure Reporting j

The second quality of industry failure reporting that has been important to i

NRC uses has been timely reporting of new failures.

The timeliness of NPRDS failure reporting still involves considerable initial time delays but the median time from the date the failure is discovered to the date it is available on the data base was about the same as in 1988.

]

For 1989, the staff has estimat.d independently the timeliness of ner failure report ing.

Three intervals were measured: 1) the time in days from the failure discovery date to the input date (the date the plant submitted the report to NPRDS), 2) the time in days from the failure end date (the date the failure was reported to be repaired) to the input date for each failure record, and 3) the time in days from the data input date to the data acceptance date. The last interval is the time it takes for INPO to review ar.d audit failure records and make them available to users of the data base.

The median times for the three intervals for 1988 and 1989 are shown in Table 1.

Table 1.

TIMELINESS ESTIMATES IN DAYS' MEDIAN MEDIAN MEDIAN TOTAL FROM DISCOVERY END DATE INPUT DATE DISCOVERY DATE DATE TO TO TO TO YEAR INPUT DATE INPUT DATE ACCEPTANCE DATE ACCEPTANCE DATE 1988 90 64 19 109 1989 99 69 15 114

  • Based on failures with one year er less for input as measured from the failure discovery date.

INPO data on timeliness for the first half of 1988, which was provided to the NRC, showed.the median time from discovery date to acceptance date to be 109

days, The staff's 1988 evaluation report to the Commission reported that the median time from discovery date to acceptance date for 1988 was between three and four months.

The agreement of these figures with those in the above table provides some confidence in the method used to develop the estimates of 1989 time intervals.

7

From these results, the staff concludes that, during 1989, the timeliness of failure reporting, as measured from the date of failure discovery to the acceptance date, remained about the same as in 1988.

Once a month, INPO reviews the timeliness of f&ilure reporting and comoutes the time from the failure end date to the failure inout date for each record.

Their estimates for the last half of 1989 show that:

1, 24% of the records were input by 30 days.

2.

48% of the records were input by 60 days.

3.

56% of the records were input by 90 days.

4, 73% of the records were input by 150 days.

INPO's estimates that the median time from failure end date to input date was betwcen 60 and 90 days is in agreement with the staff's estimate of 69 days, as shown in Table 1.

Timeliness varier among the plants from short delays (averaging less than 30 days from failure end date) to long delays (averaging five months or more).

Delays also occur in the INPO audit r,rocess (the time from the input dat? to the data acceptance date).

This time varies from days to weeks depending e the balance between auditor availability, the backlog, and the time for the utility to upgrade the records rejecttd by the audit.

INPO has changed the goals for the industry average reporting time twice in the past two years.

Initially, the reporting time guidance stated that failure reports should be submitted within 30 days of the end of the month in which the failure occurred.

In March 1989, that guidance was changed to report within 30 days of the failure end date (the failure end date averages about 30 days after discovery date).' If repairs were expected to take longer than 60 days from the discovery date, a preliminary report was requested.

In March 1990, INP0 issued new guidance that changed the industry average goal to reporting 60 days from the failure end date.

If repairs were expected to take longer than 90 days, a preliminary report was requested.' Table 1 shows that the actual median time from the failure discovery date to the reporting (input) date was about 99 days.

The staff does not expect major changes in reporting timeliness to result from the lengthening of these goals even though the new goal is more achievable, in the mid 1980s, the Nuclear Utility Task Action Committee's Vendor Equipment Technical Information Program also recommended accelerated NPRDS failure reporting (including partial reports), as part of an industry response to item 2.2.2 in Gencric letter 83 28, " Vendor Interface Program" (a staff action related to the Salem ATWS events in 1983).

However, that response was never implemented.

3 NPRDS Reoortina Guidance Manual, February 1989.

' NpRDS Reoortina Guidance Manual, March 1990.

8

MW O

. t.

To be more effective in reducing delays, the utilities must commit dedicated support to meeting the reporting time goals and INP0 must continue to provide the necessary resources to avoid lengthy delays in the acceptance process.

11.1.4 Quality of the Data Although the staff did not find noticeable improvements in the quality of the data in 1989, it did note a number of programs and activities at the plants that may gradually improve the data, 11.1.4.1 1989 Evaluation of Quality In previous evaluations, a qualitative assessment of the report quality was made based on reviews of the narrative for the failure description and from verification of engineering data, addressed in studies, by site visits, or discussions with licensees and vendors.

These two methods of report quality assessment were continued in 1989.

The reviews of narratives in 1989 were done primarily as part of the maintenance effectiveness indicator (MEI) development, while the verification of a sample of NPRDS engineering data was done in a reactor trip breaker study.

During the maintenance indicator development activities, the industry raised several issues about NPRDS data. The main issue was the inconsistency in NPRDS in coding a failure as "degraded" or "incipient." Also, the maintenance work requests (MWRs) lacked sufficient detail to permit good root cause analysis for NPRDS reports.

Consequently, many NPRDS failures are coded as cause "unknown" or "wearout."

While noting that the quality of the failure narratives had improved significantly over the years, the staff agreed that further improvement was needed.

Any improvement appears dependent on improving the quality of the maintenance work order documentation and closeout process at the plant sites.

During an AE00 study on reactor trip breakers (RTB), the RTB data were found to be deficient in many fields such that extensive efforts were needed to correct parameters pertinent to the study.

Calls to the utilities verified inaccuracies, particularly in the "actuations," "current rating," "voltage rating capacity," and "manufacturer model number" fields.

Several component records were missing application codes.

The results of these assessments were similar to those of previous assessments, and these types of deficiencies are fairly well known.

These results were provided to INPO by the staff in May 1989 for corrective action.

II.1.4.2 Improvements in the Quality of New Records

INPO staff has informed NRC staff of several programs to improve the quality of new records, e.g., the auditing of incoming failure records and the additional computer edits initiated with Release 3 of NPRDS Plus (see Section III.3).

In April 1989, INPO also began auditing revisions to failure reports.

In another area, INPO is attempting to improve the accuracy of NPRDS data for new plants by reviewing the engineering records, adding application codes (which identify the components' key function and system and are used to retrieve data about components with similar functions or applications at different units even though designs, models, and other engineering data may be different) to key component records, and taking follow-up action to resolve deficiencies.

The application codes have been added to all units except for four new units that were not commercial in 1989.

INPO resolved date conflicts on some records when Release 3 was issued and continues to pursue corrections to user-identified errors.

In 1990, INPO is improving the RPS and ESFAS instrumentation engineering data that they had previously identified as particularly needing attention.

The latter is part of Revision 3 to the NPRDS reportable scope that will be a major effort for about half the plants in 1990 and 1991.

INPO indicated that the addition of standard manufacturer model ID codes and application codes has improved the quality of engineering records. One objective was to permit users to retrieve data based on manufacturer's model even if inconsistently reported originally.

In addition, with the knowledge that the model ID code is correct, users can obtain from the manufacturer the correct engineering data.

INP0 also suggested that the use of standard application codes, mentioned above, will assist users in overcoming inconsistencies in data.

However, to the staff's knowledge, this possibility had not been conveyed to users and the staff questions whether this will provide users improved quality in the data.

II.1.4.3 Improvements in the Quality of Old Records

Although INPO has implemented programs such as the addition of standard model ID codes and key component application codes in 1988, the quality of the pre-1984 engineering records remains low.

More than one fourth of the engineering records were submitted prior to 1984, before the current Scope Manuals and Reporting Guidance Manual (RGM) reporting guidance were issued.

Only recently has INPO implemented a review, upon submittal, of the engineering records.

Consequently, there are errors in some engineering data fields and inconsistencies between plants in the way they report certain data.

INPO does resolve and correct errors that are called to its attention by users but has not implemented an organized program to correct the existing engineering records.

The staff has measured in its past evaluations the number of "failure transactions," i.e., the number of new failure records added plus the number of revisions to old failure records.

Revisions to earlier submitted reports may be an indication of a plant's dedication to improving the quality of old records.

In 1989, there were about 10,000 failure records revised and about 53,000 engineering records revised.

These revisions were associated with general upgrades or changes of the failure and engineering data, often related to the standard model ID effort initiated by INPO, as this affected the component ID on both records in some cases.

Figure 4 shows the average number of failure transactions submitted per plant each quarter since 1981.

Failure transactions increased in the second and following quarters of 1989 as compared to recent years.

This increase may be attributed to the completion of Release 3 in April 1989 that simplified on-line updates and modifications of the failure data. INPO staff has informed us in a telephone conference that some plants delayed revising old records before April in anticipation of the easier data entry with Release 3.

[Figure 4: Average Failure Transactions per Quarter, by quarter from 1981 through early 1990, based on previous NPRDS data reviews]

With the availability of Release 3, plants then revised and submitted large amounts of data, mostly associated with the model ID program.

The increase observed around the start of 1985 coincides with INPO's upgrade to Revision 10 of the data base.

During 1989, the 111 plants reporting to NPRDS submitted around 24,000 failure transactions and 75,000 engineering transactions (some by plants not yet commercial).

The median number of failure transactions for all plants in 1989 was 164 and the mean number was 227.

The difference between the mean and the median is due to a large number of transactions (over 450) being submitted by about a dozen plants. The number of transactions submitted by individual plants ranged from 12 to over 1,100.⁵ Staff has noted that the quality of NPRDS reports submitted by the plants varies among the plants.
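The skew described above, a mean (227) well above the median (164) because a dozen plants submitted over 450 transactions each, can be illustrated with a small sketch; the counts below are hypothetical stand-ins chosen to reproduce the effect, not actual NPRDS figures.

```python
from statistics import mean, median

# Hypothetical illustration (not actual NPRDS data): most plants submit a
# moderate number of failure transactions, while a dozen heavy reporters
# submit several hundred each, pulling the mean well above the median.
typical_plants = [150, 160, 164, 170, 180] * 20        # 100 plants near the median
heavy_reporters = [500, 600, 700, 800, 900, 1100] * 2  # 12 high-volume plants

transactions = typical_plants + heavy_reporters
print(median(transactions))  # stays at a typical plant's count
print(mean(transactions))    # skewed upward by the heavy reporters
```

Because the median depends only on the middle of the sorted counts, the handful of high-volume plants moves the mean but leaves the median at the typical plant's level.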

Typically, the plants devote one man-year per unit to NPRDS reporting and related activities. INPO staff indicated that the qualifications of the NPRDS reporters range from degreed engineers to plant personnel with limited technical background.

⁵ This particular plant has confirmed that most of these transactions were revisions made following Release 3 and were related to upgrades associated with the model ID program.

Although the quality of NPRDS reporting may vary somewhat with the individual reporter's knowledge of the component, INPO staff indicates that a strong factor in ensuring quality is the NPRDS reporter's access to other individuals and plant operators having experience and knowledge of the failures.

INPO staff suggests also that, although the quality of the initial input records varies, the quality of what actually reaches the data base is improved because INPO's audits encourage quality and consistency.

II.1.4.4 Conclusion

The staff has concluded that there was no change in the overall quality of the data during 1989.

II.2 Industry Use of the System

As previously mentioned, the other side of industry participation in the system is industry use of the system for such purposes as trending failures, identifying components with high failure rates, locating spare parts, etc.

The use of NPRDS by the plants is reviewed as part of the review of NPRDS participation in the staff's maintenance team inspections.

The teams' reports provided some insight into the extent to which plants are using NPRDS and a characterization of the plants that are active users.

Appendix A provides some of the observations from these inspections, conducted between July 1988 and October 1989, on the plants' usage of NPRDS.

Based on these observations of plant usage, we divided the plants into "users" and "minimal users."

From the 12 sites evaluated by maintenance team inspections for NPRDS activities, only five could be described as active NPRDS "users." Examples of observations noted for these plants were that they (1) used NPRDS data for trending and for determining the root cause of failures and components with significantly high failure rates, (2) had formal procedures for determining NPRDS-reportable failures, and (3) devoted higher-than-average resources and management support to NPRDS. On the other hand, plants classified as "minimal users" were only using NPRDS for such purposes as purchasing parts and job scheduling.

They did not use NPRDS data for failure trending or for management decisions.

Other observations on some of these particular plants were that they had poor methods of determining NPRDS-reportable failures and little coordination between those reporting to NPRDS, those reporting LERs, and other plant operations staff who identify component failures.

Some reports noted that failures were not being reported to NPRDS and that there were deficiencies in engineering records, timeliness of reporting, and failure cause determinations.

It is our view that many plants are only making minimal use of NPRDS. More recent data from other sources (e.g., discussion with some utilities) tend to confirm this assessment.

INPO staff compute three indicators of plant usage.

They found that at the end of 1989 the "number of reports run" was up 5%, "connect time" was up 72%, and "CPU usage" was up 27% from 1988.

However, only the first measure reflects usage of the data base by plants; connect time and CPU usage are both affected by on-line data entry.

In summary, INPO's assessment is that usage is not going down and that plants are adding more resources for NPRDS entry and usage.

Also, INPO staff indicated there is some evidence that the plants that were above-average reporters were also above-average users.

To improve plant participation, INPO is actively encouraging the plants to use NPRDS.

As discussed in Section III.4.2, they will be requiring all plants to conduct a quarterly analysis of plant performance using NPRDS.

Also, in September 1989, the President of INPO wrote to all utility CEOs describing enhancements to the system, citing examples of recent data base usage and its benefits to plants, and asking the CEOs to share these examples and encourage full and frequent use of NPRDS.

III. TECHNICAL MANAGEMENT AND DIRECTION OF THE SYSTEM

Some of the elements that management usually provides to maintain a good data base system are: an organizational structure that provides technical direction and closely monitors and oversees operational activities; a user-friendly automated data processing (ADP) system; data that meet users' needs; and ample user support and quality control.

The staff reviewed INPO's activities to plan and implement actions in these areas in 1989.

For this evaluation of the NPRDS, the NRC staff met with INPO staff several times and discussed technical aspects of NPRDS and measures of data base activity during 1989. Much of Section III was developed from these discussions, feedback from the NPRDS Users Group meetings, and other materials provided to NPRDS users.⁶

Staff found that INPO's management activities are directed at bringing the needed improvements in the data base, although additional effort and focus may be needed.

III.1 Organization

In a meeting with NRC staff at INPO on February 2, 1990, INPO representatives explained a reorganization of the NPRDS Department in 1989.

In February 1989, INPO reorganized to consolidate all events analysis functions (i.e., human factors performance, event analysis, and component failure analysis) in one division.

The NPRDS Department was renamed the Equipment Performance Department and moved into the Analysis Division.

They were given two additional staff members along with the additional responsibility of identifying the generic implications of component performance industry-wide.

The Equipment Performance Department now has 18 full-time staff positions (15 professionals, two technicians, and one secretary) working on NPRDS, as well as support contracts for three full-time failure report auditors. Up to four additional contract staff worked on data and system upgrades in CY 1989 but, with the completion of the planned efforts, they will not be assisting in CY 1990.

⁶ The existing protocols between the NRC and INPO, however, limited the staff's ability to assess specific site NPRDS evaluation documentation. Thus, the staff's evaluation is more qualitative than quantitative in some areas.

Recognizing the expanded responsibilities, INPO indicated that 6 to 8 staff-years of INPO resources that were devoted to the IBM conversion and other one-time ADP-associated efforts were now available for routine NPRDS support.

III.2 Oversight of NPRDS Activities

In the meeting mentioned above, the INPO representatives also explained the review and oversight of NPRDS.

They feel that the NPRDS is receiving high-level review within INPO. The highest-level decisions on INPO's policies and programs for NPRDS are made by INPO's Board of Directors and INPO's President.

The INPO Advisory Council advises INPO and the Board on broad issues.

An Industry Review Group (IRG) also oversees and provides feedback on the proposed plan and goals for NPRDS and evaluates the workload being placed on the industry.

Review on a more detailed level comes from the NPRDS Users Group (NUG), an industry user group chartered to provide feedback into the development and operation of NPRDS.

This body consists of rotating assignees from utilities and representatives from DOE, NRC, EPRI, NSSSs, and AEs, and has in the past supported task groups on selected issues.

Recently, INPO discontinued the NPRDS Data Needs Working Group, as well as other working groups, of the NPRDS Users Group.

INPO felt that this group had achieved its objective and had no new items on its agenda.

As the NUG charter does not permit standing committees, it was decided to discontinue the Working Group.

If at some later time the group is needed, it can be reestablished.

III.3 ADP System, NPRDS Plus Release 3

The introduction of NPRDS Plus Release 3 in April 1989 was a major milestone in the history of the NPRDS.

This marked the completion of the transition from the Prime computer to the IBM.

INPO documents (provided to NRC as users of the system and in meetings of the NPRDS Users Group) have discussed the development of the system and the features it provides.

The first two stages of this conversion process, Releases 1 and 2 of NPRDS Plus, provided a menu-driven, user-friendly retrieval system, new canned information reports, and the ability to perform statistical analysis of the data.

Release 3 provided plant reporters a new and easier way of entering data into and maintaining data already in the NPRDS data base.

Plant users can now add, update, correct, and delete data quickly and easily on line.

The new electronic batch image also permits failure narratives up to 400 characters, up from 168 characters, to match the on-line capabilities.

For NRC users, the new Release 3 provides easier downloading of data.

NRC users have already noted that a number of the earlier problems in accessing the system and retrieving data have now been eliminated.

INPO is currently coordinating with NRC staff (IRM) on resolving data transmission limitations (e.g., high speed transmission).


A new NPRDS Reporting Guidance Manual replaced the NPRDS Reporting Procedures in 1989 and describes the changes in NPRDS reporting guidance implemented with NPRDS Plus.

In conjunction with the completion of Release 3, INPO staff developed and began a training course for NPRDS reporters in data entry and on new guidance for reporting NPRDS engineering and failure data.

Release 3 also provides sophisticated computer edits to improve the quality of the data.
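As an illustration of what such computer edits might look like, the sketch below applies a few field-level checks to a failure report. The field names, cause codes, and rules are assumptions for illustration, not INPO's actual edit criteria; only the 400-character narrative limit comes from the text above.

```python
from datetime import date

MAX_NARRATIVE = 400  # Release 3 raised the narrative limit to 400 characters

def edit_failure_report(report):
    """Return a list of edit-check findings for one failure report.

    The checks are illustrative: required narrative, narrative length,
    date consistency, and a recognized cause code.
    """
    problems = []
    if not report.get("narrative"):
        problems.append("missing failure narrative")
    elif len(report["narrative"]) > MAX_NARRATIVE:
        problems.append("narrative exceeds 400 characters")
    if report.get("discovery_date") and report.get("failure_end_date"):
        if report["failure_end_date"] < report["discovery_date"]:
            problems.append("failure end date precedes discovery date")
    if report.get("cause_code") not in {"unknown", "wearout", "other"}:
        problems.append("unrecognized cause code")
    return problems

report = {
    "narrative": "Pump seal leak found during surveillance; seal replaced.",
    "discovery_date": date(1989, 5, 1),
    "failure_end_date": date(1989, 6, 2),
    "cause_code": "wearout",
}
print(edit_failure_report(report))  # [] -> no edit-check findings
```

Edits of this kind catch internally inconsistent records at entry time, before they reach the data base, rather than relying on later audits.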

INPO staff also discussed its efforts to make needed improvements in the quality of reported data on instruments in NPRDS.

INPO in 1989 prepared Revision 3 to the reportable scope and issued it in early 1990.

Revision 3 provides improved guidance for the reactor protection system (RPS) and engineered safety features actuation system (ESFAS).

Plants will be reviewing this guidance, making changes in their records, and in 1990 adding function codes to component engineering records for RPS and ESFAS instrumentation components. Completion of this major upgrade is scheduled for September 1991.

For some other components, engineering records were required to be submitted only when the component failed. Most of these components have been deleted from the NPRDS scope with Revision 3.

Failures of piping and relays, however, are still reportable to NPRDS on failure.

III.4 Enhancements to Data and Reports

Documents from the NPRDS Users Group meetings provided staff with information on INPO's objectives and programs to enhance the usefulness of NPRDS through the addition of new data fields, new report capabilities, and expansion of the NPRDS scope.

III.4.1 New Codes

In 1989, INPO completed the addition of failure mode codes to all failure reports since January 1, 1984, and the addition of standardized manufacturer model identification numbers to about 285,000 components. The latter are currently available to users; the former will be made available in mid-1990.

III.4.2 Component Failure Analysis Report (CFAR)

To foster increased industry usage, INPO staff said that the capability to run CFAR was made available to the utilities.

Currently, CFAR is not available to the NRC staff.

CFAR allows utilities to use NPRDS to identify plant components with high failure rates when compared to the industry averages.

The CFAR methodology is a two-step process that does a systematic statistical screening of NPRDS data, followed by an engineering analysis of the results.

CFAR covers 88 component types, 240 application codes for BWRs, and 140 application codes for PWRs.
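The screening half of such a two-step process can be sketched as a simple statistical test: flag a component type when its observed failure count is improbably high under the industry average rate, then hand the flagged items to engineers. The Poisson model, the threshold, the function names, and the counts below are illustrative assumptions, not CFAR's actual algorithm.

```python
from math import exp

def poisson_tail(k, mu):
    """P(X >= k) for X ~ Poisson(mu): the probability of observing k or
    more failures if the component truly matched the industry average."""
    term = exp(-mu)  # P(X = 0)
    cdf = 0.0
    for i in range(k):
        cdf += term
        term *= mu / (i + 1)
    return 1.0 - cdf

def screen_component(plant_failures, component_years, industry_rate, alpha=0.05):
    """Flag a component type for engineering review when its observed
    failure count is statistically high relative to the industry rate."""
    expected = industry_rate * component_years
    return poisson_tail(plant_failures, expected) < alpha

# Illustrative numbers only: 9 failures where the industry rate predicts
# about 3 is statistically high, so the component would be flagged for
# the second, engineering-analysis step.
print(screen_component(plant_failures=9, component_years=30, industry_rate=0.1))
```

The statistical screen only narrows the field; as the text notes, an engineering review of the flagged results is still required before any conclusion is drawn.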

In 1989, INPO staff developed a program to have the utilities conduct quarterly reviews of plant performance using the CFAR system. In November 1989, INPO initiated this program by sending a letter to the utilities requesting them to conduct this quarterly analysis of their plants' equipment using CFAR and other plant data to identify recurring equipment problems, determine root causes, and initiate cost-beneficial corrective action.

This analysis by the utilities is to begin the first quarter of 1990.

INPO staff says that the plant evaluation teams will be reviewing the implementation of the program and the corrective actions taken.

Currently the INPO evaluation teams run CFAR independently and review it with utility staff while on site.

III.4.3 Scope Expansion

Both the IRG and a NUG Working Group recommended an expansion of the scope of NPRDS to include additional balance-of-plant (BOP) components.

INPO reported that they reviewed several years of industry data to determine components that are significant contributors to plant transients or lost power generation.

This review has been completed, and the additional components (such as the main generator exciter and voltage regulator, main condenser, turbine, and moisture separator reheaters) have been approved for inclusion.

In April 1990, INPO staff issued reporting guidance for the added components and a schedule for utility implementation.

On January 1, 1991, utilities are scheduled to begin reporting failures for the additional components.

III.5 User Support and Quality Control

In the February 1990 meeting, INPO staff also discussed their ongoing NPRDS activities and their programs to improve data quality.

Organizationally, the Equipment Performance Department staff is evenly divided between the User Support Section and the User Enhancement Section. User Support provides assistance to users; audits incoming records; processes the data; mails reports; conducts training courses, workshops, and assistance visits; and controls user access and mailing lists.

The User Enhancement Section, as the name implies, is responsible for enhancements and changes in software, coding, and reporting guidance; and updates to documents and manuals.

In addition, they perform the review functions of the department, conducting the on-site NPRDS completeness reviews and the department's own internal reviews.

They conducted four training courses for new reporters in 1989 and plan to conduct three in 1990.

In addition to providing user support, the Equipment Performance Department has programs aimed at improving the data through various types of reviews of the quality, timeliness, and completeness of NPRDS data.

Three full-time auditors are reviewing all new and revised data inputs to ensure conformance to the reporting scope and guidance. According to INPO staff, while this effort results in some time delay before the data are added to the data base, their review results in revisions (many minor) to one-third to one-half of the records.

Before each INPO plant evaluation, the INPO staff reviews the quality of all failure records and engineering records submitted since the last evaluation.

The results of this review are provided to INPO evaluators for their follow-up at the site.

The second type of review is a monthly review of the number and timeliness of each plant's failure reporting.

The other review activity, begun in 1989, is on-site evaluation of the completeness of NPRDS reporting.

Each year about 20 to 25 plants will be reviewed so that all plants will be covered over a four to five year period.

In preparation for an NPRDS completeness evaluation, INPO staff will obtain from the plant a sample of maintenance work requests (MWRs) covering eight systems.

Staff observed in an INPO letter that at one plant they reviewed 850 MWRs.

Other systems are added to the list if a review of the plant's operating experience indicates a possible problem in these areas.

The INPO reviewer reviews these MWRs and compiles a list of MWRs judged reportable which have not been submitted to the NPRDS.

During the on-site NPRDS evaluation, they review this list and resolve questionable events.

In addition to the MWR review, the plant's failure and engineering data history and LERs are also examined considering the actual operating history.

The results are communicated back to a utility executive.

For plants with poor reporting records, INPO follows up with a formal letter stating findings and recommendations for improved action.

For plants with an INPO plant evaluation either ongoing or in the near future, the NPRDS completeness evaluation results are included as part of the INPO evaluation report.

During 1989, they conducted 27 of these detailed on-site reviews to assess the completeness of component failure reporting.

III.6 Major Objectives for 1990

Having addressed needed improvements such as NPRDS Plus Release 3, the model ID upgrade, additions of failure mode codes, better documentation, reporter training, etc., INPO staff is undertaking additional planned tasks.

INPO staff in the February meeting outlined their major objectives for 1990:

1. Continue stressing completeness through routine monitoring and the four-year program of assessing the completeness of failure reporting at all plants through NPRDS site evaluation trips.

2. Provide users the ability to search on the new component failure mode codes that were added in 1989.

3. Assist utilities in improving instrumentation data as stated in Revision 3 to the reportable scope.

4. Implement the expanded scope of BOP components per Revision 4.

5. Increase usage of NPRDS in the Equipment Performance Department, including the use of NPRDS for the identification of the generic implications of component failures and a study on feedwater component failures with recommendations for improvement (scheduled for release in September 1990), including feedback of data problems in NPRDS.

6. Actively assist utilities in increasing NPRDS usage.

7. Improve data quality through ongoing reporter training and the correction of problems with impact on users.

III.7 Conclusion

The staff concludes that INPO's management of the NPRDS remains focused on gradually improving the system to meet user needs in the quality of the data (particularly some of the engineering data), completeness of failure reporting, and timeliness.

IV. RESPONSIVENESS TO NRC'S NEEDS

Completeness, timeliness, and quality are parts of the question asked by the Commission in 1981 concerning the responsiveness of the NPRDS to meet NRC's needs for component failure data.

The staff made a further review of the usefulness of the NPRDS data for NRC through discussions with users and reviews of reports using NPRDS data.

All NRC staff members having user access were interviewed in late 1989. We surveyed their experience in using the system and data and their assessment of its usefulness and limitations for their purposes.

The results are described in the following sections.

IV.1 Maintenance Effectiveness Indicators (MEIs)

As part of the Commission's directive to monitor industry initiatives and progress in implementing effective maintenance programs, AEOD developed an indicator, called the maintenance effectiveness indicator (MEI), using NPRDS (described in AEOD S/804A, October 1988; S/804B, February 1989; and S/804C draft, May 1990).

This work has provided an additional view of industry trends in this important area.

IV.2 Component Studies AE00 has used NPRDS both as a source of " qualitative" information to identify examples of component failures for more in-depth case study analysis and

" quantitative" information for statistical analyses of component failures.

NPRDS, as a supplement to information from LERs and other sources, continues to provide useful information for case studies to identify relevant component failures and to help identify root causes.

However, superficial use of NPRDS to calculate the failure rate for a specific component is not without problems.

Early on, AEOD noted that statistical use of the data to calculate the failure rate of a specific component, without engineering analysis to confirm the data's applicability and quality, could likely lead to invalid results.

IV.3 Preparation for Site Visits

Before NRC site visits, NPRDS, supplemented with information from other sources, has been useful in selecting candidate systems and components for examination during Maintenance Team Inspections and other NRC inspections such as diagnostic inspections. At the plant, the inspectors again reviewed the data to further identify equipment failures and operating problems and to assess licensee root cause analysis and corrections in response to events.

IV.4 Revisions in Technical Specifications Requirements

NPRDS serves as an additional source of information in developing technical bases for proposed changes in surveillance testing frequencies.

NRR continues to use NPRDS data, along with LERs and other sources, to assess the impact of such requirements on plant operations.

However, NPRDS was not designed to study the failure rates associated with particular component configurations and does not include some necessary data, such as outages due to testing and preventive maintenance and component unavailability due to human error. Thus, NPRDS cannot and does not serve as a sole basis for evaluating such changes.

IV.5 Probabilistic Risk Assessment

Although, as mentioned, NPRDS was originally envisioned to provide basic reliability statistics and failure rates, there are limitations for the calculation of failure rates.

Overall performance monitoring should reflect the component's availability to function (availability) and the probability that it will start on demand and run for the time necessary to fulfill its intended mission (reliability). Since the NPRDS does not capture all sources of component unavailability or demand rates, it must be supplemented with other information to provide overall component performance information (e.g., unavailability due to errors such as a component being valved out or switched off, and planned or preventive maintenance activities, which are not entered into NPRDS).

INPO staff has also noted some limitations: reporting of incipient conditions is no longer encouraged; for certain components, engineering data are provided only on failure; and failures prior to 1984 were not regularly reported.

Because of these limitations and concern about quantitative accuracy, major NRC PRA programs such as the risk rebaselining for NUREG-1150 (Severe Accident Risks: An Assessment for Five U.S. Nuclear Power Plants) have made only qualitative use of NPRDS (e.g., review for failure modes). As the technical analysis for this program has been completed, NPRDS is no longer used as a source of qualitative information in this program.

In 1989, the Office of Nuclear Regulatory Research began the Low Power Shutdown Risk studies, which have made limited use of NPRDS data.

NPRDS' usefulness for these studies is limited by the lack of data on demands and the lack of completeness in failure and engineering reports.

In other areas requiring risk and reliability statistics, such as for risk-based performance indicators, the Office of Nuclear Regulatory Research has found that NPRDS is difficult to use in estimating the availability of standby safety systems because the data base is not reactor-systems oriented. It does not include: data on equipment unavailability due to events other than equipment failures, e.g., human error and maintenance downtime; clear information on the impact of equipment failures on the equipment's safety function; and the plant mode (a) when the failure was discovered and (b) when the system was repaired.

Similarly, NPRDS does not collect sufficient information to estimate the reliability of components.

It does not include the number of demands or the number of operating hours on a given type of component.

Therefore, it is not possible to calculate the number of failures per demand or per hour.
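This limitation can be made concrete with the standard point estimates, which divide a failure count by a demand count or by operating hours; NPRDS supplies only the numerator. The function names and the pump counts below are hypothetical illustrations, not data from the report.

```python
def per_demand_failure_probability(failures, demands):
    """Point estimate p = failures / demands; the demand count must come
    from a source other than NPRDS (e.g., test and actuation logs)."""
    return failures / demands

def per_hour_failure_rate(failures, operating_hours):
    """Point estimate lambda = failures / operating hours; run hours are
    likewise not captured by NPRDS and must be supplied externally."""
    return failures / operating_hours

# Hypothetical pump data: NPRDS could supply the 2 reported failures, but
# the 400 test demands and 1,500 run hours must come from plant records.
print(per_demand_failure_probability(2, 400))  # 0.005 failures per demand
print(per_hour_failure_rate(2, 1500))          # ~0.0013 failures per hour
```

Without the externally supplied denominators, neither estimate can be formed, which is the point the section makes about NPRDS and reliability statistics.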

IV.6 Aging Studies

The NPRDS has been useful to NRC's Office of Nuclear Regulatory Research in the Nuclear Plant Aging Research (NPAR) Program to identify components and systems most subject to aging in connection with plant life extension.

INPO has noted some of the limitations of NPRDS for aging studies: NPRDS is not designed to capture the age of a component.

Age or other data on a component is not based on the component's manufacturer serial number.

Component predictive and preventive maintenance activities are not systematically reported to NPRDS.

The like-for-like component replacement history resulting from predictive and preventive maintenance is not reportable to NPRDS. Therefore, component life extracted from the NPRDS may be misleading.

Reports from the NPAR Program issued in 1989 noted, as in earlier reports, that NPRDS provided relative information regarding which systems and components have been significantly affected by aging and the underlying cause of that aging. They continued to note its strengths and limitations that reflect on the quality of the data and its applicability to aging studies.⁷

IV.7 Generic Issues

To determine if a particular problem is a plant-specific problem or represents an industry-wide generic problem, the staff will search NPRDS to locate similar failures and to determine what industry experience has been with a particular component.

If NPRDS indicates that similar experiences are occurring, the staff will conduct further investigation to determine generic applicability.

* Among the strengths are: 1) it is a large, computerized data base; 2) many of the equipment failures reported to NPRDS are not in LERs; 3) component engineering data and inservice dates are provided; 4) the failure records provide a categorization of the failure by the utility personnel and a failure description; and 5) it provides sufficient information to permit a reasonable determination of the relative number of equipment failures attributable to various mechanisms. Some of the limitations for this program (in addition to those mentioned above by INPO) are: 1) incomplete equipment failure reporting; 2) incipient failures are not required to be reported; 3) approximately 50% of the NPRDS data are placed in the unknown or other devices failure cause categories; 4) in many cases the cause description codes do not reflect the mechanism causing the failure but are related to the effect of the failure; 5) many of the failure narratives do not provide sufficient information to verify the cause category or cause description; and 6) narratives contain ambiguous words, which adds to confusion as to whether a failure is aging related or whether the piecepart was incorrect.


In some applications, NPRDS is adequate to distinguish between an isolated plant-specific problem and a widespread problem.

However, its usefulness can be limited for this purpose due to data quality problems.

For example, three units at one site had identified the manufacturer of the handwheel for manual valve operation as the manufacturer of the pneumatic valve operator.

This made identification of plants with a specific valve and valve operator combination more difficult.

IV.8 Conclusion

The staff concludes that NPRDS, when supplemented with information from other sources, remains a useful source of information to support various NRC and industry uses. However, when NPRDS is used to calculate failure rates for a specific component, without engineering analysis, problems may arise.

The utility of the data is directly related to the completeness and quality of the plant's reporting practices.

V. SUMMARY OF THE STATUS AND EVALUATION OF NPRDS

Since receiving new focus in 1982, NPRDS has become a large, mature data base. It is a dynamic system; it is continually undergoing planned changes (e.g., scope changes to meet user needs, revisions to enhance data input quality, revision and deletion of incorrect records, etc.). The plants reporting to NPRDS are reporting in varying degrees of completeness, timeliness, and quality, with such factors varying over time at specific sites.

Data are routinely being added, revised, and deleted according to reporting guidelines issued by INPO.

The NPRDS data base at the end of 1989 contained about 527,000 engineering records and around 98,000 component failure records.

The data base, in its current state of maturity, is meeting, to varying degrees, a number of the NRC's diverse needs (identification of generic component performance concerns, maintenance monitoring, aging studies, preparation for site visits, etc).

More staff use is being made of the NPRDS than in the past because of growing interest in component-level data. There presently is no other comparable industry-wide source for such data.

During 1989, INPO completed several of its major goals to make the data base a more easily accessible, usable, and useful system.

Industry participation in the system continued at about the same level in 1989 as in 1988.

Based on reviews of NPRDS data trends and other relevant information noted in the report, the staff judges that the overall completeness of the NPRDS is near its previous estimate of 65% to 70% and that the median timeliness from date of failure discovery to acceptance date remained about the same as in 1988 (a median of between three and four months).

In 1990, INPO lengthened its goal for timeliness from 30 days to 60 days from the failure end date to the input date, presumably to make the goal more achievable and thus improve timeliness.

The staff has not noted significant improvements to the quality of the data; it remains generally fair to good, but varies from low to high among plants, among types of components, and over time, with the more current data expected to be of somewhat higher quality due to recent efforts to improve the incoming data. At the end of 1989, the system is more user-friendly. However, the data in the system still need improvement.

INPO's near-term efforts are devoted to improving the data and to encouraging better industry participation. The major catalyst for this improvement in industry participation is increased awareness of the uses and benefits of the system for plant operations. Plants need to ensure good documentation on component failures for NPRDS review, a timely review, a technically knowledgeable staff, and adequate resource application, especially through major outages.


APPENDIX A

Observations from Maintenance Team Inspection Reports of Plants' Usage and Reporting of NPRDS Data

USERS:

Grand Gulf

Observations on use:
a) Plant performs trend analysis of NPRDS components.
b) They use trend analysis to make recommendations to plant engineering and maintenance.
c) They also trend Maintenance Work Orders associated with NPRDS components.
Other observations:
a) Operations analysis group reviews all MWOs for NPRDS reporting.

Harris

Observations on use:
a) Licensee reports failures to NPRDS and makes effective use of data.
b) Historical records and NPRDS are easily accessed and are used by planners and other site personnel.
c) Plant trends failures of equipment covered by NPRDS.
Other observations:
a) A systems engineer makes a determination of what needs to be reported to NPRDS.

Nine Mile Point⁷

Observations on use:
a) Licensee's engineering group in their corporate office uses NPRDS data for root cause analysis.
b) Maintenance personnel use NPRDS to determine root cause of failures.
Other observations:
a) They routinely enter plant equipment failure data into NPRDS.
b) Team found that they actively use NPRDS.

⁷ Subsequent to the Maintenance Team Inspection Visit in November and December 1988, AEOD found that NPRDS reporting at NMP1 had deteriorated. The utility is working to effect improvements.



Quad Cities⁸

Observations on use:
a) The site is making good use of NPRDS information for plant-specific issues.
b) Quad Cities is actively using CFAR.
c) The results of NPRDS searches were noted in LERs.
Other observations:
a) Coordinator spends 70% of her time on NPRDS.
b) All MWRs are reviewed by coordinator.

San Onofre

Observations on use:
a) Use of NPRDS was considered a strength by the team.
b) One trending program compares plant's failures to NPRDS component failures.

MINIMAL USERS:

Arkansas

Observations on use:
a) NPRDS data are used only for job planning summaries.
Other observations:
a) Records were inadequate for trending because of inadequate technical description of the cause of failure.

Duane Arnold

Observations on use:
a) NPRDS data were not utilized to resolve preventive maintenance concerns on an RHR pump.
b) Evaluation of NPRDS data for negative trends or other signs of deteriorating performance was not performed.


⁸ AEOD also found that timeliness of reporting at Quad Cities is very good compared to other plants. Quad Cities was one of the examples highlighted by INPO as using NPRDS.


Maine Yankee

Observations on use:
a) The use of the system has essentially been for identification of parts for purchase.
b) Recently Maine Yankee started retrieving data from NPRDS to assist in their analyses.

There is an effort within engineering to make more use of NPRDS data.

Millstone

Observations on use:
a) Unit 3 Maintenance and I&C Departments are not typically using NPRDS data (no observations on Units 1 and 2).

Palo Verde

Observations on use:
a) The licensee's participation in providing failure data to the NPRDS data base and in using this operating experience information source had been poor prior to the spring of 1989; in November 1989, improvement was evident.
Other observations:
a) The team found that, after the initial engineering records had been developed prior to plant operation, little input was made to the NPRDS based on plant failure records for a period of up to three years.
b) Following a reassignment of NPRDS reporting responsibility, the licensee has made significant progress in correcting NPRDS reporting problems.
c) A weakness noted by the team was the inaccuracy of NPRDS engineering data. Reportable components had improper inclusions and exclusions; for example, the NPRDS engineering data reportable components did not include I&C and electrical components because of lack of specific data.


Trojan

Observations on use:
a) The team identified the limited access and minimal utilization of the NPRDS as a weakness.
b) Systems engineers and event reviewers did not have training or facilities to access NPRDS.
c) Lack of staff and management knowledge of new NPRDS Plus software was considered a root cause.
Other observations:
a) Management support for NPRDS was minimal until recently (Fall 1988). At the time, engineering or clerical resources were not available for maintenance of NPRDS. However, a contract engineer was assigned to focus on entering the backlog of applicable event reports into the NPRDS data base.

Zion

Observations on use:
a) Trending programs were uncoordinated.


APPENDIX B

NPRDS COMPONENT FAILURES, 1985-1989*

COMPONENT   1985   1986   1987   1988   1989
ACCUMU        27     41     42     41     42
AIRDRY         7     31     26     39     15
ANNUNC         9      9      2      4      1
BATTRY       132    125    132    127    120
BLOWER       212    322    262    208    160
CKTBRK       601    731    780    829    591
CONROD        10      8      2     15      2
CRDRVE       453    325    143    107    147
DEMIN         30     25     32     35     29
ELECON        54     70     37     41     34
ENGINE       177    175    144     94     70
FILTER       134    135    125     85     59
GENERA       180    169    153    156    127
HEATER        26     31     47     42     22
HTEXCH       235    199    249    251    197
IBISSW      1006   1005    893    920    716
ICNTRL       359    385    415    484    347
INOREC       593    675    699    625    485
INTCPM       852    921    989   1021    795
IPWSUP       242    245    244    261    245
ISODEV        63     86     53     64     83
IXMITR      1546   1504   1611   1369   1095
MECFUN       131    102    116    112     69
MOTOR        192    173    219    145    164
PENETR       120    140    137    113    117
PIPE          71     56     32     33     48
PUMP         903    889    699    618    509
RECOMB        15     49     15     10      4
RELAY        323    273    294    303    226
SUPORT       487    601    452    291    117
TRANSF        30     33     32     26     18
TURBIN        69     79     70     52     25
VALVE       3277   3311   3406   3470   2709
VALVOP      1756   2069   2052   1758   1533
VESSEL        13     21     18     10      9
TOTAL      14335  15016  14621  13759  10930

* Based on discovery date; excludes incipient records; data as of May 1990.
