ML20140H617

Forwards Assessment of LERs, as Input to SALP Rept for Sept 1984 - Feb 1986. Licensee LERs of Above Average Quality per 10 CFR 50.73 & Comparison w/Other Facilities

Site: Duane Arnold
Issue date: 04/01/1986
From: Norelius C., NRC Office of Inspection & Enforcement (IE Region III)
To: Lee Liu, IES Utilities Inc. (formerly Iowa Electric Light and Power Co.)
References: NUDOCS 8604040149


Text


Docket No. 50-331

Iowa Electric Light and Power Company
ATTN: Mr. Lee Liu
      President and Chief Executive Officer
IE Towers
P. O. Box 351
Cedar Rapids, IA 52406

Gentlemen:

The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has recently completed an assessment of Licensee Event Reports (LERs) submitted by you as part of the NRC's Systematic Assessment of Licensee Performance (SALP).

In general, your submittals were found to be of above average quality based on the requirements of 10 CFR 50.73 and in comparison to other facilities that have been evaluated using the same methodology.

A need to commit to supplemental reports (Item 14 in the coded fields) became apparent when selecting the sample LERs. For example, several LERs (i.e., 85-003-00, 85-016-00, and 85-030-00) discuss a latching mechanism problem on airlock doors which resulted in numerous containment violations. These same LERs indicate that an engineering review will be done to determine proper actions to be taken, but none commit to a supplemental report to report the results of the review. The need for further review almost always implies the need for a supplemental report. In fact, an LER that states "there is a need for further evaluation" without committing to a supplemental report is usually considered to be incomplete.

We are providing you a copy of AEOD's assessment prior to the issuance of the SALP 5 Board Report so that you might be aware of its findings and to provide you a basis on which future submittals should be patterned.

We appreciate your cooperation with us. Please let us know if you have any questions.

Sincerely,

" original signed by E. c. Greenman" for g40gog 86g0h PDR t

0 Charles E. Norelius, Director Division of Reactor Projects

Enclosure: AEOD Assessment

cc w/enclosure: See Attached Distribution


Distribution cc w/enclosure:
D. Mineck, Plant Superintendent, Nuclear
W. Miller, Assistant Plant Superintendent, Technical Support
DCS/RSS (RIDS)
Licensing Fee Management Branch
Resident Inspector, RIII
Thomas Houvenagle, Iowa State Commerce Commission

AEOD INPUT TO SALP REVIEW FOR DUANE ARNOLD

Introduction

In order to evaluate the overall quality of the contents of the Licensee Event Reports (LERs) submitted by Duane Arnold during the September 1, 1984 to February 28, 1986 Systematic Assessment of Licensee Performance (SALP) assessment period, a representative sample of the licensee's LERs was evaluated using a refinement of the basic methodology presented in NUREG/CR-4178 (Reference 1). The sample consists of 28 LERs, which is half of the LERs that were on file at the time the evaluation was started.

See Appendix A for a list of the LER numbers in the sample.

It was necessary to start the evaluation before the end of the SALP assessment period because the input was due such a short time after the end of the SALP period. Therefore, not all of the LERs prepared during the SALP assessment period were available for review.

Methodology

The evaluation consists of a detailed review of each selected LER to determine how well the content of its text, abstract, and coded fields meets the requirements of NUREG-1022 (Reference 2) and Supplements 1 and 2 to NUREG-1022 (References 3 and 4).

The evaluation process for each LER is divided into two parts. The first part of the evaluation consists of documenting comments specific to the content and presentation of each LER. The second part consists of determining a score (0 to 10 points) for the text, abstract, and coded fields of each LER.

The LER specific comments serve two purposes: (1) they point out what the analysts considered to be the specific deficiencies or observations concerning the information pertaining to the event, and (2) they provide a basis for a count of general deficiencies for the overall sample of LERs that was reviewed. Likewise, the scores serve two purposes: (1) they serve to illustrate in numerical terms how the analysts perceived the content of the information that was presented, and (2) they provide a basis for the overall score determined for each LER. The overall score for each LER is the result of combining the scores for the text, abstract, and coded fields (i.e., 0.6 x text score + 0.3 x abstract score + 0.1 x coded fields score = overall LER score).
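For illustration only, the weighting described above can be expressed as a short calculation; this is a sketch, not part of the AEOD report, and the function name and sample scores below are hypothetical.

```python
def overall_ler_score(text, abstract, coded_fields):
    """Combine the three component scores (each on a 0-10 scale) into an
    overall LER score using the report's stated weights: 60% text,
    30% abstract, 10% coded fields."""
    return 0.6 * text + 0.3 * abstract + 0.1 * coded_fields

# Hypothetical LER scored 8.0 (text), 8.5 (abstract), and 8.7 (coded fields).
print(round(overall_ler_score(8.0, 8.5, 8.7), 2))  # 8.22
```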

The results of the LER quality evaluation are divided into two categories: (1) detailed information and (2) summary information. The detailed information, presented in Appendices A through D, consists of LER sample information (Appendix A), a table of the scores for each sample LER (Appendix B), tables of the number of deficiencies and observations for the text, abstract, and coded fields (Appendix C), and comment sheets containing narrative statements concerning the contents of each LER (Appendix D).

When referring to these appendices, the reader is cautioned not to try to directly correlate the number of comments on a comment sheet with the LER scores, as the analyst has flexibility to consider the magnitude of a deficiency when assigning scores.

Although the purpose of this evaluation was to assess the content of the individual LERs selected for review, the analysts often make other observations which they believe should be brought to the attention of the licensee. The following discussion addresses a general observation that was noted during the evaluation.

General Observation

A need to commit to supplemental reports (Item 14 in the coded fields) became apparent when selecting the sample LERs. For example, several LERs (i.e., 85-003-00, 85-016-00, and 85-030-00) discuss a latching mechanism problem on airlock doors which resulted in numerous containment violations. These same LERs indicate that an engineering review will be done to determine proper actions to be taken, but none commit to a supplemental report to report the results of the review. The need for further review almost always implies the need for a supplemental report. In fact, an LER that states "there is a need for further evaluation" without committing to a supplemental report is usually considered to be incomplete.

Discussion of Results

A discussion of the analysts' conclusions concerning LER quality is presented below. These conclusions are based solely on the results of the evaluation of the contents of the LERs selected for review and as such represent the analysts' assessment of each unit's performance (on a scale of 0 to 10) in submitting LERs that meet the requirements of 10 CFR 50.73(b).

Table 1 presents the average scores for the sample of LERs evaluated for Duane Arnold. The reader is cautioned that the scores resulting from the methodology used for this evaluation are not directly comparable to the scores contained in NUREG/CR-4178 due to refinements in the methodology.

In order to place the scores provided in Table 1 in perspective, the scores from other units that have been evaluated using the current methodology are provided in Table 2. Additional units are added to Table 2 as they are evaluated. Table 3 and Appendix Table F-1 provide a summary of the information that is the basis for the average scores in Table 1. For example, Duane Arnold's average score for the text of the LERs that were evaluated was 8.0 out of a possible 10 points. From Table 3 it can be seen that the text score actually resulted from the review and evaluation of 17 different requirements, ranging from the discussion of plant operating conditions before the event (10 CFR 50.73(b)(2)(ii)(A)) to text presentation. The percentage scores in the text summary section of Table 3 provide an indication of how well each text requirement was addressed by the licensee for the 28 LERs that were evaluated.


TABLE 1. SUMMARY OF SCORES FOR DUANE ARNOLD (a)

               Average    High    Low
Text             8.0       9.6    5.4
Abstract         8.5      10.0    5.6
Coded Fields     8.7       9.5    7.4
Overall (b)      8.2       9.5    6.4

a. See Appendix B for a summary of scores for each LER that was evaluated.
b. Overall Average = 60% Text Average + 30% Abstract Average + 10% Coded Fields Average.

TABLE 2. AVERAGE SCORE COMPARISON

[Bar chart comparing the overall average LER scores of the units evaluated to date with the current methodology; the horizontal axis shows the GRADE from 9.5 down to 6.0, and the bar for Duane Arnold is labeled. The chart is not reproducible from the scanned image.]

TABLE 3. LER REQUIREMENT PERCENTAGE SCORES FOR DUANE ARNOLD

TEXT
                                                                 Percentage
Requirements [50.73(b)] - Descriptions                           Scores (a)

(2)(ii)(A) - -  Plant condition prior to event                    95 (28)
(2)(ii)(B) - -  Inoperable equipment that contributed               (b)
(2)(ii)(C) - -  Date(s) and approximate times                     96 (28)
(2)(ii)(D) - -  Root cause and intermediate cause(s)              88 (28)
(2)(ii)(E) - -  Mode, mechanism, and effect                      100 (12)
(2)(ii)(F) - -  EIIS codes                                        66 (28)
(2)(ii)(G) - -  Secondary function affected                         (b)
(2)(ii)(H) - -  Estimate of unavailability                        92 (13)
(2)(ii)(I) - -  Method of discovery                              100 (28)
(2)(ii)(J)(1) - Operator actions affecting course                100 (8)
(2)(ii)(J)(2) - Personnel error (procedural deficiency)           92 (11)
(2)(ii)(K) - -  Safety system responses                           95 (11)
(2)(ii)(L) - -  Manufacturer and model no. information            54 (12)
(3) - - - - -   Assessment of safety consequences                 60 (28)
(4) - - - - -   Corrective actions                                83 (28)
(5) - - - - -   Previous similar event information                36 (28)
(2)(i) - - - -  Text presentation                                 76 (28)

ABSTRACT
                                                                 Percentage
Requirements [50.73(b)(1)] - Descriptions                        Scores (a)

- Major occurrences (immediate cause and effect information)      99 (28)
- Description of plant, system, component, and/or
  personnel responses                                             96 (7)
- Root cause information                                          81 (28)
- Corrective action information                                   76 (28)
- Abstract presentation                                           79 (28)

CODED FIELDS
                                                                 Percentage
Item Number(s) - Description                                     Scores (a)

1, 2, and 3 -  Facility name (unit no.), docket no., and
               page number(s)                                     99 (28)
4 - - - - - -  Title                                              58 (28)
5, 6, and 7 -  Event date, LER no., and report date               99 (28)
8 - - - - - -  Other facilities involved                         100 (28)
9 and 10 - -   Operating mode and power level                    100 (28)
11 - - - - -   Reporting requirements                             95 (28)
12 - - - - -   Licensee contact information                      100 (28)
13 - - - - -   Coded component failure information                83 (28)
14 and 15 - -  Supplemental report information                    93 (28)

a. Percentage scores are the result of dividing the total points for a requirement by the number of points possible for that requirement. (Note: Some requirements are not applicable to all LERs; therefore, the number of points possible was adjusted accordingly.) The number in parentheses is the number of LERs for which the requirement was considered applicable.

b. A percentage score for this requirement is meaningless as it is not possible to determine from the information available to the analyst whether this requirement is applicable to a specific LER. It is always given 100% if it is provided and is always considered "not applicable" when it is not.
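To make note (a) concrete, the sketch below (illustrative only; the function and data are hypothetical and not taken from the report) computes a requirement's percentage score by summing points only over the LERs to which the requirement applies.

```python
def requirement_percentage(scores):
    """Compute a requirement's percentage score per note (a).

    `scores` holds one (points_earned, points_possible) pair per LER;
    a pair of (None, None) marks an LER where the requirement is not
    applicable and is excluded from the calculation."""
    applicable = [(e, p) for e, p in scores if p is not None]
    earned = sum(e for e, _ in applicable)
    possible = sum(p for _, p in applicable)
    return 100.0 * earned / possible, len(applicable)

# Hypothetical data for four LERs; the requirement did not apply to the third.
pct, n = requirement_percentage([(3, 4), (4, 4), (None, None), (2, 4)])
print(f"{pct:.0f}% ({n})")  # 75% (3)
```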


Discussion of Specific Deficiencies

A review of the percentage scores presented in Table 3 will quickly point out where the licensee is experiencing the most difficulty in preparing LERs. For example, requirement percentage scores of less than 75 indicate that the licensee probably needs additional guidance concerning those requirements. Scores of 75 or above, but less than 100, indicate that the licensee probably understands the basic requirement but has either: (1) excluded certain less significant information from a large number of the discussions concerning that requirement, or (2) totally failed to address the requirement in one or two of the selected LERs. The licensee should review the LER specific comments presented in Appendix D in order to determine why he received less than a perfect score for certain requirements. The text requirements with a score of less than 75 are discussed below in their order of importance. In addition, the primary deficiencies in the abstract and coded fields are discussed.

The first requirement to be discussed is the safety assessment [Requirement 50.73(b)(3)]. Twenty of the safety assessments were found to have some deficiency, which resulted in a score of 60 percent. Six of the LERs were found not to include a safety assessment. A safety assessment is required in all LERs and is supposed to include three items as follows:

1. An assessment of the event, including specifics as to why there was no safety problem. It is inadequate to state "this event had no safety consequences or implications" without an explanation as to why.

2. A safety assessment should indicate whether or not other systems were available to perform the function of the system which was lost.

3. Finally, a safety assessment should consider whether the event could have occurred under a different set of conditions where the safety implications would have been more severe. If the conditions during the event are considered the worst probable, the LER should state so.

Duane Arnold did a reasonable job on the first two items required in a safety assessment, but fourteen LERs were found to be lacking the third item.

Five of the twelve LERs involving component failures failed to adequately identify the failed component in the text [Requirement 50.73(b)(2)(ii)(L)]. Adequate identification is usually considered to be manufacturer name and model number. This information is important for the identification of possible generic problems in the nuclear industry and should be included when a component fails or when a component malfunction contributes to the cause of the event.

Twelve of the twenty-eight LERs reviewed had no discussion of previous similar events and therefore failed to satisfy this requirement [Requirement 50.73(b)(5)]. Previous similar events should be referenced appropriately (LER number if possible), and if there are none, the text should state this.

Finally, 13 of the LERs reviewed failed to include the Energy Industry Identification System (EIIS) codes. Requirement 50.73(b)(2)(ii)(F) requires inclusion of the appropriate EIIS code for each system and component referred to in the text.

The text presentation had a marginally acceptable score of 76 percent. An outline format, as suggested in Reference 1, would probably result in improved presentations, as well as improved consistency in meeting the text requirements.

In the abstract, the root cause summary, with an adequate score of 81%, and the corrective actions summary, with a marginally acceptable score of 76%, could both be improved. While the abstract is not supposed to be as detailed as the text, enough detail about root cause and corrective actions should be included to outline these items to the reader. While the abstract was acceptable, the two areas mentioned above could be improved by including more of the major points discussed in the text.

The main deficiency in the area of coded fields involves the title, Item (4). Twenty-six of the titles did not indicate root cause, four failed to include the link (i.e., the circumstances or conditions which tie the root cause to the result), and six failed to provide information concerning the result of the event (i.e., why the event was required to be reported).

While the result is considered the most important part of the title, cause and link must be included to make a title complete. An example of a title that only addresses the result might be "Reactor Scram". This is inadequate in that the cause and link are not provided. A more appropriate title might be "Inadvertent Relay Actuation During Surveillance Test LOP-1 Causes Reactor Scram". From this title, the reader knows the cause was either personnel or procedural error and that testing contributed to the event.

Table 4 provides a summary of the areas that need improvement for Duane Arnold LERs. For more specific information concerning deficiencies, the reader should refer to the information presented in Appendices C and D. General guidance concerning these requirements can be found in NUREG-1022, Supplement No. 2 (Reference 4).

TABLE 4. AREAS MOST NEEDING IMPROVEMENT FOR DUANE ARNOLD LERs

Areas and Comments:

Safety assessment information: Be sure to include a complete safety assessment in all LERs. The text should discuss whether or not the event could have been worse under different circumstances.

Manufacturer and model number information: Component identification information should be included in the text for each component failure or whenever a component is suspected of contributing to the event because of its design.

Previous similar events: Previous similar events should be referenced (LER number) or the text should state there are none.

EIIS codes: Be sure to include the EIIS codes for all systems and components which are referred to in the text.

Abstracts: Root cause and corrective action are basically good, but could be improved by including more details from the text.

Coded fields (a. Titles): Titles should be written such that they more accurately describe the event. In particular, include the root cause of the event in all titles.


REFERENCES

1. B. S. Anderson, C. F. Miller, B. M. Valentine, An Evaluation of Selected Licensee Event Reports Prepared Pursuant to 10 CFR 50.73 (DRAFT), NUREG/CR-4178, March 1985.
2. Office for Analysis and Evaluation of Operational Data, Licensee Event Report System, NUREG-1022, U.S. Nuclear Regulatory Commission, September 1983.
3. Office for Analysis and Evaluation of Operational Data, Licensee Event Report System, NUREG-1022, Supplement No. 1, U.S. Nuclear Regulatory Commission, February 1984.
4. Office for Analysis and Evaluation of Operational Data, Licensee Event Report System, NUREG-1022, Supplement No. 2, U.S. Nuclear Regulatory Commission, September 1985.