ML20203G522

Forwards AEOD Input to SALP Review Based on Assessment of LERs for Nov 1984 - Mar 1986. Info Provided Prior to Issuance of SALP 5 Board Rept as Basis by Which Future Submittals Should Be Patterned
Person / Time
Site: Big Rock Point
Issue date: 04/24/1986
From: Norelius C
NRC OFFICE OF INSPECTION & ENFORCEMENT (IE REGION III)
To: Buckman F
CONSUMERS ENERGY CO. (FORMERLY CONSUMERS POWER CO.)
References
NUDOCS 8604290146
Download: ML20203G522 (14)


Text

APR 24 1986

Docket No. 50-155

Consumers Power Company
ATTN: Dr. F. W. Buckman
Vice President, Nuclear Operations
212 West Michigan Avenue
Jackson, MI 49201

Gentlemen:

The NRC's Office for Analysis and Evaluation of Operational Data (AEOD) has recently completed an assessment of your Licensee Event Reports (LERs) from Big Rock Point as part of the NRC's Systematic Assessment of Licensee Performance (SALP).

In general, your submittals were found to be of average quality based on the requirements of 10 CFR 50.73 and in comparison to other facilities that have been evaluated using the same methodology.

We are providing you a copy of AEOD's assessment prior to the issuance of the SALP 5 Board Report so that you are aware of its findings and have a basis by which future submittals can be patterned.

We appreciate your cooperation with us. Please let us know if you have any questions.

Sincerely,

"Original Signed by E. G. Greenman" for
Charles E. Norelius, Director
Division of Reactor Projects

Enclosure: AEOD Assessment

cc w/enclosure:
Mr. Kenneth W. Berry, Director, Nuclear Licensing
D. P. Hoffman, Plant Superintendent
DCS/RSB (RIDS)
Licensing Fee Management Branch
Resident Inspector, RIII
Ronald Callen, Michigan Public Service Commission, Nuclear Facilities and Environmental Monitoring Section


AEOD INPUT TO SALP REVIEW FOR BIG ROCK POINT

Introduction

In order to evaluate the overall quality of the contents of the Licensee Event Reports (LERs) submitted by Big Rock Point during the November 1, 1984 to March 31, 1986 Systematic Assessment of Licensee Performance (SALP) assessment period, a representative sample of the licensee's LERs was evaluated using a refinement of the basic methodology presented in NUREG/CR-4178 [1]. The sample consists of 10 LERs, which is all of the LERs that were on file at the time the evaluation was started. See Appendix A for a list of the LER numbers in the sample.

It was necessary to start the evaluation before the end of the SALP assessment period because the input was due such a short time after the end of the SALP period. Therefore, not all of the LERs prepared during the SALP assessment period were available for review.

Methodology

The evaluation consists of a detailed review of each selected LER to determine how well the content of its text, abstract, and coded fields meets the requirements of NUREG-1022, and Supplements 1 and 2 to NUREG-1022.

The evaluation process for each LER is divided into two parts. The first part of the evaluation consists of documenting comments specific to the content and presentation of each LER. The second part consists of determining a score (0-10 points) for the text, abstract, and coded fields of each LER.

The LER-specific comments serve two purposes: (1) they point out what the analysts considered to be the specific deficiencies or observations concerning the information pertaining to the event, and (2) they provide a basis for a count of general deficiencies for the overall sample of LERs that was reviewed. Likewise, the scores serve two purposes: (1) they serve to illustrate in numerical terms how the analysts perceived the content of the information that was presented, and (2) they provide a basis for the overall score determined for each LER. The overall score for each LER is the result of combining the scores for the text, abstract, and coded fields (i.e., overall LER score = 0.6 x text score + 0.3 x abstract score + 0.1 x coded fields score).
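As a worked check of this weighting, using the Big Rock Point averages reported in Table 1 below: 0.6 x 7.1 + 0.3 x 6.7 + 0.1 x 8.4 = 4.26 + 2.01 + 0.84 = 7.11, which rounds to the overall average of 7.1 shown in the table.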

The results of the LER quality evaluation are divided into two categories: (1) detailed information and (2) summary information. The detailed information, presented in Appendices A through D, consists of LER sample information (Appendix A), a table of the scores for each sample LER (Appendix B), tables of the number of deficiencies and observations for the text, abstract, and coded fields (Appendix C), and comment sheets containing narrative statements concerning the contents of each LER (Appendix D). When referring to these appendices, the reader is cautioned not to try to directly correlate the number of comments on a comment sheet with the LER scores, as the analyst has flexibility to consider the magnitude of a deficiency when assigning scores.

Discussion of Results

A discussion of the analysts' conclusions concerning LER quality is presented below. These conclusions are based solely on the results of the evaluation of the contents of the LERs selected for review and as such represent the analysts' assessment of each unit's performance (on a scale of 0 to 10) in submitting LERs that meet the requirements of 10 CFR 50.73(b).

Table 1 presents the average scores for the sample of LERs evaluated for Big Rock Point.

The reader is cautioned that the scores resulting from the methodology used for this evaluation are not directly comparable to the scores contained in NUREG/CR-4178 due to refinements in the methodology.

In order to place the scores provided in Table 1 in perspective, the distribution of the overall scores for all licensees that have been evaluated using the current methodology is provided in Figure 1. Additional scores are added to Figure 1 each month as other licensees are evaluated. Table 2 and Appendix Table B-1 provide a summary of the information that is the basis for the average scores in Table 1.

For example, Big Rock Point's average score for the text of the LERs that were evaluated was 7.1 out of a possible 10 points. From Table 2 it can be seen that the text score actually resulted from the review and evaluation of 17 different requirements, ranging from the discussion of plant operating conditions before the event [10 CFR 50.73(b)(2)(ii)(A)] to text presentation. The percentage scores in the text summary section of Table 2 provide an indication of how well each text requirement was addressed by the licensee for the 10 LERs that were evaluated.

Discussion of Specific Deficiencies

A review of the percentage scores presented in Table 2 will quickly point out where the licensee is experiencing the most difficulty in preparing LERs. For example, requirement percentage scores of less than 75 indicate that the licensee probably needs additional guidance concerning these requirements. Scores of 75 or above, but less than 100, indicate that the licensee probably understands the basic requirement but has either: (1) excluded certain less significant information from a large number of the discussions concerning that requirement, or (2) totally failed to address the requirement in one or two of the selected LERs.

The licensee should review the LER-specific comments presented in Appendix D in order to determine why he received less than a perfect score for certain requirements. The text requirements with a score of less than 75 are discussed below in their order of importance. In addition, the primary deficiencies in the abstract and coded fields are discussed.

The first requirement to be discussed is the safety assessment [Requirement 50.73(b)(3)].

TABLE 1. SUMMARY OF SCORES FOR BIG ROCK POINT(a)

              Average   High   Low
Text            7.1      9.3   5.5
Abstract        6.7      8.5   5.4
Coded Fields    8.4      9.0   8.0
Overall         7.1(b)   8.7   5.7

a. See Appendix B for a summary of scores for each LER that was evaluated.

b. Overall Average = 60% Text Average + 30% Abstract Average + 10% Coded Fields Average.

Figure 1. Distribution of overall average LER scores. [Figure: distribution of overall average scores (x-axis: overall average scores, 9.5 to 6.0) for all licensees evaluated to date, with Big Rock Point's overall average marked.]

TABLE 2. LER REQUIREMENT PERCENTAGE SCORES FOR BIG ROCK POINT

TEXT

Requirements [50.73(b)] - Description                    Percentage Score (No. of LERs)(a)

(2)(ii)(A)    Plant condition prior to event                    95 (10)
(2)(ii)(B)    Inoperable equipment that contributed               (b)
(2)(ii)(C)    Date(s) and approximate times                     45 (10)
(2)(ii)(D)    Root cause and intermediate cause(s)              79 (10)
(2)(ii)(E)    Mode, mechanism, and effect                      100 (1)
(2)(ii)(F)    EIIS codes                                         0 (10)
(2)(ii)(G)    Secondary function affected                         (b)
(2)(ii)(H)    Estimate of unavailability                       100 (1)
(2)(ii)(I)    Method of discovery                              100 (10)
(2)(ii)(J)(1) Operator actions affecting course                100 (4)
(2)(ii)(J)(2) Personnel error (procedural deficiency)           79 (6)
(2)(ii)(K)    Safety system responses                           92 (6)
(2)(ii)(L)    Manufacturer and model no. information           100 (1)
(3)           Assessment of safety consequences                 60 (10)
(4)           Corrective actions                                53 (10)
(5)           Previous similar event information                60 (10)
(2)(i)        Text presentation                                 81 (10)

ABSTRACT

Requirements [50.73(b)(1)] - Description                 Percentage Score (No. of LERs)(a)

Major occurrences (immediate cause and effect information)      86 (10)
Description of plant, system, component, and/or
  personnel responses                                           96 (6)
Root cause information                                          61 (10)
Corrective action information                                   30 (10)
Abstract presentation                                           70 (10)

TABLE 2. (continued)

CODED FIELDS

Item Number(s) - Description                             Percentage Score (No. of LERs)(a)

1, 2, and 3   Facility name (unit no.), docket no., and
              page number(s)                                   100 (10)
4             Title                                             56 (10)
5, 6, and 7   Event date, LER no., and report date             100 (10)
8             Other facilities involved                         90 (10)
9 and 10      Operating mode and power level                    83 (10)
11            Reporting requirements                            95 (10)
12            Licensee contact information                     100 (10)
13            Coded component failure information               74 (10)
14 and 15     Supplemental report information                  100 (10)

a. Percentage scores are the result of dividing the total points for a requirement by the number of points possible for that requirement. (Note: Some requirements are not applicable to all LERs; therefore, the number of points possible was adjusted accordingly.) The number in parentheses is the number of LERs for which the requirement was considered applicable.

b. A percentage score for this requirement is meaningless, as it is not possible to determine from the information available to the analyst whether this requirement is applicable to a specific LER. It is always given 100% if it is provided and is always considered "not applicable" when it is not.
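As a hypothetical illustration of footnote (a): if a requirement were applicable to 6 of the 10 LERs and, say, 10 points were possible for each applicable LER, the points possible would be 60; awarding a total of 55 points would then yield a percentage score of 55/60, or about 92, shown as "92 (6)". (The per-LER point allotment is assumed here for illustration only; the report itself specifies only the division of total points by points possible.)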


The safety assessments for seven of the LERs were found to be deficient, and one of the LERs did not include a safety assessment. A detailed safety assessment discussion is required in all LERs and should include three items, as follows:

1. An assessment of the consequences and implications of the event, including specifics as to why it was concluded that there was no safety problem, if applicable. It is inadequate to state "this event had no safety consequences or implications" without explaining how that conclusion was reached.

2. A safety assessment should discuss whether the event could have occurred under a different set of conditions where the safety implications would have been more severe. If the conditions during the event are considered the worst probable, the LER should state so.

3. Finally, a safety assessment should name other systems that were available to perform the function of the safety system that was unavailable during the event.

Big Rock Point did a reasonable job on the first and third items required in a safety assessment, but six LERs were found to be lacking the second.

Five of the LERs failed to provide an adequate discussion of the corrective actions, Requirement 50.73(b)(4). None of these five LERs provided information concerning what was planned or done to prevent recurrence of the event. In addition, four of these five failed to provide information concerning what was done to correct the immediate problem. It should be noted that four of these five LERs involve essentially the same event, electrical noise in a picoammeter that results in an upscale/downscale trip.

Dates and approximate times for major occurrences discussed within the text were not provided in six of the LERs, Requirement 50.73(b)(2)(ii)(C). The date, and time if appropriate, should be provided for occurrences such as the start or discovery of the initiating problem, reactor scram, when the plant is placed in a safe and stable condition, the start and end of major evolutions or transients, and removing and returning equipment from/to service. The nature of the event should dictate how many dates and times should be provided. In general, sufficient dates and times must be provided so that the reader has a good understanding of the time history of the entire event. Events spanning weeks or months (e.g., recurring events) usually require many dates so that the reader can understand the proper sequence for the occurrences discussed within the text.

Four of the 10 LERs reviewed failed to mention previous similar events or state that there were none. Previous similar events should be referenced appropriately (LER number if possible), and if there are none, the text should so state.

None of the LERs included the Energy Industry Identification System (EIIS) codes. Requirement 50.73(b)(2)(ii)(F) requires inclusion of the appropriate EIIS code for each system and component referred to in the text.

The text presentation had a marginally acceptable score of 81 percent. A standardized outline format, as recommended in NUREG-1022, Supplement No. 2, would probably result in an improved presentation, as well as improved consistency in meeting the overall text requirements.

The abstracts are deficient in two major areas, namely, the summarization of cause and corrective action information. Had the cause and corrective action information, which was presented in the text, been adequately summarized in the abstracts, these areas of the abstract would have received scores that better reflected the text scores (i.e., 79% and 53%, respectively). Over half of the abstracts failed to adequately summarize this information, however, and the remainder failed to mention cause or corrective action information at all. These are two of the text discussions that must be summarized in every abstract in order for the abstract to be considered complete.

The presentation of the abstracts is generally poor in that the abstracts are too short. Six of the ten abstracts failed to utilize even half of the space available (i.e., the 1400 spaces). In addition, seven of the abstracts contained information that was not presented in the text discussion. It is good to provide all necessary information, but if this information is deemed necessary in the abstract, it should always be included in the text as well. It is understandable that an author may think of an additional fact or two while writing the abstract; however, where this occurs, the text must be revised such that these facts are included in the text discussion.

The main deficiency in the area of coded fields involves the title, Item (4). Nine of the titles did not indicate root cause, five failed to include the link (i.e., circumstances or conditions which tie the root cause to the result), and three failed to provide information concerning the result of the event (i.e., why the event was required to be reported). While result is considered the most important part of the title, cause and link must be included to make a title complete. An example of a title that only addresses the result might be "Reactor Scram". This is inadequate in that the cause and link are not provided. A more appropriate title might be "Inadvertent Relay Actuation During Surveillance Test LOP-1 Causes Reactor Scram". From this title, the reader knows the cause was either personnel or procedural and that testing contributed to the event.

Finally, in the coded fields, three LERs failed to include the letter "N" in Item (9), Operating Mode, and five LERs failed to complete Item (13) properly. Four LERs contained information in Item (13) even though no component failure occurred, and one failed to provide information in Item (13) even though a component failure had occurred. [Note: Faulted components need not be included in Item (13).]

Table 3 provides a summary of the areas that need improvement for the Big Rock Point LERs. For additional and more specific information concerning deficiencies, the reader should refer to the information presented in Appendices C and D. General guidance concerning these requirements can be found in NUREG-1022, Supplement No. 2.

TABLE 3. AREAS MOST NEEDING IMPROVEMENT FOR BIG ROCK POINT LERs

Areas - Comments

Safety assessment: A discussion of safety implications of the event should be included in all LERs. The discussion should include the effect of the event on the plant, as well as the availability of backup systems and the consequences of the event had it occurred under a more severe set of initial conditions.

Corrective actions: The discussion of corrective actions should always include information concerning what was done to fix the immediate problem and what was done or planned to prevent the recurrence of the event.

Dates and times: Include sufficient dates and times to allow a reader to visualize the complete time history of the occurrences discussed in the event.

Previous similar events: Previous similar events should be referenced (LER number) or the text should state there are none.

EIIS codes: Codes for each component and system referred to in the text should be provided.

Text presentation: A standardized outline format is recommended, such as the one discussed in NUREG-1022, Supplement No. 2. The text should always include any information contained in the abstract.

Abstracts: Cause and corrective action information from the text is not being adequately summarized and included in the abstract. The abstracts need to be more complete.


TABLE 3. (continued)

Coded fields:

a. Titles: Titles should be written such that they more accurately describe the event. In particular, include the root cause of the event in all titles.

b. Failed component information: Include information in Item (13) only for those events involving failed components (not faulted components).


REFERENCES

1. B. S. Anderson, C. F. Miller, B. M. Valentine, An Evaluation of Selected Licensee Event Reports Prepared Pursuant to 10 CFR 50.73 (DRAFT), NUREG/CR-4178, March 1985.

2. Office for Analysis and Evaluation of Operational Data, Licensee Event Report System, NUREG-1022, U.S. Nuclear Regulatory Commission, September 1983.

3. Office for Analysis and Evaluation of Operational Data, Licensee Event Report System, NUREG-1022, Supplement No. 1, U.S. Nuclear Regulatory Commission, February 1984.

4. Office for Analysis and Evaluation of Operational Data, Licensee Event Report System, NUREG-1022, Supplement No. 2, U.S. Nuclear Regulatory Commission, September 1985.
