ML20127C274
| ML20127C274 | |
| Person / Time | |
|---|---|
| Issue date: | 03/12/1992 |
| From: | Taylor J NRC OFFICE OF THE EXECUTIVE DIRECTOR FOR OPERATIONS (EDO) |
| To: | The Chairman NRC COMMISSION (OCM) |
| Shared Package | |
| ML20126L238 | List: |
| References | |
| FOIA-92-272 NUDOCS 9203300300 | |
| Download: ML20127C274 (20) | |
UNITED STATES
NUCLEAR REGULATORY COMMISSION
WASHINGTON, D.C. 20555

March 12, 1992

MEMORANDUM FOR:
The Chairman
Commissioner Rogers
Commissioner Curtiss
Commissioner Remick
Commissioner de Planque

FROM:

James H. Taylor
Executive Director for Operations
SUBJECT:
SYSTEMATIC ASSESSMENT OF LICENSEE PERFORMANCE (SALP) PROGRAM

The Commission has directed the staff to undertake a comprehensive review of the Systematic Assessment of Licensee Performance (SALP) program (SRM dated December 20, 1991).
The staff's actions in response to the SRM, including results of the review to date, actions currently planned, plans for further analysis and review, and the timetable for completion of the project are summarized below.
Results of Staff Review To Date

The Commission directed the staff to undertake a comprehensive review of SALP results to determine whether appropriate QA controls are in place to ensure consistent and reliable evaluations.
In November 1991, as part of its ongoing review of the SALP program and in response to the Commission's direction, the staff completed a study of recent SALP results to determine what conclusions could be drawn concerning uniform application of SALP Manual Chapter (MC) 0516.
The report of the study is Enclosure 1.
The study and the staff's ongoing oversight of the SALP program indicate that the regions are performing accurate assessments of licensee performance, and that there is consistent implementation of MC 0516. However, the staff is working to improve the SALP process as outlined below.
Actions Currently Planned

The staff will increase senior management involvement and oversight of the SALP process to ensure that agency policy and procedures are appropriately implemented and that agency objectives for the SALP program are met.
Special attention will be given to ensuring that an individual inspector is not able to improperly influence, or appear to influence, the SALP results.
Each NRR manager at the division director level and above will attend at least one SALP Board meeting a year. In addition, each regional Division Director and Deputy Division Director who serves as a SALP Board Chairman will attend at least one SALP Board meeting a year in a region other than their own.
The staff will solicit information from licensees regarding their views on the SALP process and their recommendations for program improvements. The staff will consider and implement, as appropriate, those recommendations which are consistent with agency policy and objectives for the SALP program. Recommendations which go beyond current agency policy will be forwarded to the Commission with a staff recommendation.
The staff expects to periodically (e.g., 18-24 months) solicit such industry feedback as a part of the staff's assessment of the effectiveness of the SALP process.
Further Staff Analysis and Review

The staff desires to improve the application of the SALP process and provide for a more consistent understanding of the SALP results by the industry and the public.
Therefore, the staff will undertake an overall reexamination of the SALP Program with the intent of providing a clear statement of the program objectives as well as the parameters and measures that will be used to monitor the effectiveness of the program. Among other things, the staff will (1) consider reducing the current number of functional areas (seven) to a smaller number, each with approximately equal significance; (2) review the attributes to ensure consistency with the functional areas; (3) develop standard language for the SALP cover letter to ensure that the overall assessment is properly and consistently expressed; and (4) review the SALP Board voting membership.
The results of the above efforts and recommendations for any changes to the SALP Program will be submitted to the Commission in September 1992.
A summary of the staff recommendations and proposed schedule is shown in Enclosure 2.
Original Signed By
James H. Taylor
Executive Director for Operations

Enclosures:
1. Review of Uniformity in the SALP Process
2. Staff Recommendations and Proposed Schedule

cc: SECY
    OGC
    ACRS
REVIEW OF UNIFORMITY IN THE SALP PROCESS AMONG THE REGIONS

Contents
                                                                      Page

1.  BACKGROUND ....................................................... 1

2.  DISCUSSION ....................................................... 1
    2.1  Program Oversight ........................................... 1
    2.2  Review of SALP Numerical Results ............................ 2
         2.2.1  Functional Area SALP Scores .......................... 3
         2.2.2  Distribution of SALP Scores .......................... 6
         2.2.3  Regional Average of All Functional Areas ............. 7
         2.2.4  Assignment of Trends in the SALP Process ............. 8

3.  CONCLUSIONS ...................................................... 9

Figures: 1 to 7  Histograms of SALP Scores by Functional Area

November 1991
1. BACKGROUND

In the SRM responding to SECY-90-189, "Reevaluation of the Systematic Assessment of Licensee Performance (SALP) Program," dated August 10, 1990, the Commission approved the implementation of program changes to the SALP Manual Chapter (MC 0516) and directed the staff to pursue further efforts to increase uniformity among the regions with regard to interpretation and application of the SALP program criteria and attributes.
This paper details the results of the staff's review of the SALP program, which included a study of recent SALP scores to determine if the SALP program is uniformly implemented.
2. DISCUSSION

MC 0516 describes the basic structure and overall procedures for implementing the NRC program for assessing licensee performance. The SALP Board assesses licensee performance in the following seven functional areas: Operations, Radiation Protection, Maintenance/Surveillance, Emergency Preparedness, Security, Engineering/Technical Support, and Safety Assessment/Quality Verification (SA/QV). The following six criteria are used to evaluate the seven functional areas: (1) assurance of quality, including management involvement and control; (2) approach to the identification and resolution of technical issues from a safety standpoint; (3) enforcement history; (4) operational events; (5) staffing; and (6) effectiveness of training.
Regional staff have significant experience implementing the SALP program.
2.1 Program Oversight

Additional oversight of the SALP function was initiated following issuance of the most recent changes to the SALP MC on September 28, 1990. A SALP program manager (PM) was appointed within the Division of Licensee Performance and Quality Evaluation (DLPQ).
The SALP PM oversees the SALP program by reviewing SALP reports, attending SALP meetings in the five regional offices, and coordinating resolution of SALP issues with the regions.
The SALP PM or his section chief attended resident counterpart meetings in each of the five regional offices to discuss the September 1990 changes to the SALP program.
The SALP PM attends selected SALP Board meetings in each of the five regions to monitor the implementation of the SALP program and to interact directly with the regional management and staff involved in SALP implementation. Additionally, plant evaluators from the Performance and Quality Evaluation Branch (LPEB) routinely attend SALP Board meetings.
The evaluators are knowledgeable of the SALP process since some of them participated in the SALP process as resident inspectors or project managers.
The evaluators and the SALP PM compare observations on the regional implementation of the SALP program. LPEB staff have attended 29 SALP Board meetings in the past year.
Observation of this large sample of SALP Board meetings, review of SALP reports, and discussions with regional management and staff directly involved in SALP implementation provide confidence that the SALP program is being uniformly implemented in accordance with the MC.
LPEB sponsored a meeting in June 1991 with the five regional SALP coordinators to discuss the SALP program and exchange information on the techniques that work best within each region. LPEB staff used the knowledge gained from review of SALP reports and attendance at SALP Board meetings to formulate the meeting agenda. During the meeting LPEB staff also discussed some minor problems with the implementation of the SALP MC, such as format changes to the SALP report and Board voting procedures. Regional coordinators exchanged ideas about (1) the mechanics of drafting a SALP report, issuing it for comment, and incorporating the comments into a useful report for the Board; (2) problems associated with using trends; and (3) the benefits of making further changes to the attribute, trend, and functional area definitions. The meeting was successful because ideas were exchanged and some minor administrative differences among the regions were identified and resolved.
On July 18, 1991, at the Director, Division of Reactor Projects (DRP) counterpart meeting in Region IV, the SALP PM and the Section Chief discussed some of the issues addressed at the coordinators' meeting to understand the perspective of regional management with regard to the SALP process and to reinforce SALP program goals.
2.2 Review of SALP Numerical Results

Because of the importance of numerical SALP scores to the NRC and to the licensees, the numerical results of the SALP program were reviewed. The staff used scores for the last two SALP assessment periods for each plant in this study.
The most recent SALP assessment periods, which began in mid-1989, were designated as Group A.
The second most recent SALP periods, which began in late 1987 and early 1988, were designated as Group B.
To facilitate the numerical analysis the staff used the assumption that performance of all plants is similar and therefore the distribution of SALP scores was expected to be consistent from one region to the next.
2.2.1 Functional Area SALP Scores

To obtain an average of SALP scores for each functional area by region, each SALP score was multiplied by the number of plants receiving that score, the results were totalled, and that total was then divided by the number of plants.
The data was obtained from NUREG-1214, Revision 8, "Historical Data Summary of the Systematic Assessment of Licensee Performance," which contains data that was current as of June 30, 1991.
To obtain the national average score, the regional scores for each functional area were added and divided by 5.
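The averaging method described above can be sketched in a few lines. The score counts below are hypothetical (the actual per-plant scores are tabulated in NUREG-1214); the arithmetic is the one the staff describes: weight each score by the number of plants receiving it, divide by the number of plants, then average the five regional results.

```python
# Sketch of the averaging described above. The score counts used here are
# illustrative; the real per-plant data comes from NUREG-1214, Revision 8.

def regional_average(score_counts):
    """Average SALP score for one functional area in one region.

    score_counts maps a SALP score (1, 2, or 3) to the number of plants
    in the region that received that score.
    """
    total_points = sum(score * n for score, n in score_counts.items())
    total_plants = sum(score_counts.values())
    return total_points / total_plants

def national_average(regional_averages):
    """Sum the regional averages and divide by the number of regions (5)."""
    return sum(regional_averages) / len(regional_averages)

# Hypothetical region: five Category 1 plants, four Category 2 plants,
# one Category 3 plant.
print(regional_average({1: 5, 2: 4, 3: 1}))                   # 1.6

# The Group A Operations row of Table 1 averages to the 1.5 of Table 2.
print(round(national_average([1.5, 1.6, 1.5, 1.5, 1.4]), 2))  # 1.5
```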
(1) Group A SALP Scores (mid-1989 to early 1991)
Table 1 gives the average scores within all functional areas for all regions.
Of the 511 SALP scores reviewed, 31 percent were assigned by Region I, 25 percent by Region II, 26 percent by Region III, 11 percent by Region IV, and 7 percent by Region V.
Table 1  Group A Functional Area Average SALP Scores

                                          Region
Functional Area                    I     II    III    IV     V
Operations                        1.5   1.6   1.5   1.5   1.4
Radiation Protection              1.8   1.3   1.7   1.6   1.6
Maintenance/Surveillance          1.6   1.9   1.6   1.9   2.0
Emergency Preparedness            1.3   1.5   1.3   1.4   1.4
Security                          1.3   1.6   1.5   1.4   1.8
Engineering/Technical Support     1.8   1.7   2.1   2.0   2.0
SA/QV                             1.7   1.7   1.9   1.8   1.8

Table 2 gives the national average for each functional area calculated from the results given in Table 1.
Table 2  National Average for Group A SALP Scores

Functional Area                    Average Score
Operations                              1.5
Radiation Protection                    1.6
Maintenance/Surveillance                1.8
Emergency Preparedness                  1.4
Security                                1.5
Engineering/Technical Support           1.9
SA/QV                                   1.8
The largest difference between a functional area average and its corresponding national average was 0.3: the Region II functional area of Radiation Protection had a lower score (higher rating) than the national average by 0.3, and the Region V functional area of Security had a higher score (lower rating) than the national average by 0.3.
In the larger regions (Regions I, II, and III), a change of one SALP grade for one licensee would result in a difference of about 0.05 in the regional functional area average. For Region IV one SALP category change would result in a 0.12 change, and for Region V one SALP category change would result in a 0.2 change.
In order for Region II to have its Radiation Protection functional area average agree within 0.3 of the national average, only one licensee would need to achieve a lower grade in that one functional area, for example, a change from a SALP Category 1 to a SALP Category 2.
Similarly, if one grade change were made in the Security functional area for Region V, the regional average would agree within 0.3 of the national average.
Therefore, no real statistical abnormality exists between each region's implementation of the SALP program.
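The sensitivity figures above follow directly from the averaging arithmetic: moving one plant up or down one category shifts a regional functional-area average by 1/n, where n is the number of plants scored in that region. The plant counts below are illustrative assumptions chosen only to reproduce the approximate sensitivities quoted in the text, not the actual 1991 plant counts.

```python
# One licensee changing one SALP category moves a regional functional-area
# average by 1/n, where n is the number of plants scored in that region.
# The plant counts below are illustrative, not the actual regional counts.

def grade_change_sensitivity(n_plants):
    """Shift in the regional average when one plant moves one category."""
    return 1.0 / n_plants

for label, n in [("large region (about 20 plants)", 20),
                 ("Region IV-sized (about 8 plants)", 8),
                 ("Region V-sized (about 5 plants)", 5)]:
    print(f"{label}: {grade_change_sensitivity(n):.2f}")
```

This is why the same 0.3 acceptance band is much easier for a small region to breach: a single plant's grade change moves Region V's average four times as far as it moves Region I's.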
(2) Group B SALP Scores (late 1987 to mid-1989)

The same calculations performed above were repeated using the Group B SALP scores.
Tables 3 and 4 give the results. Of the 483 SALP scores reviewed, 32 percent were assigned by Region I, 23 percent by Region II, 28 percent by Region III, 10 percent by Region IV, and 7 percent by Region V.
Table 3  Group B Functional Area Average SALP Scores

                                          Region
Functional Area                    I     II    III    IV     V
Operations                        1.7   1.6   1.6   1.7   1.8
Radiation Protection              1.7   1.6   1.7   1.7   1.6
Maintenance/Surveillance          1.7   1.7   1.7   2.1   2.2
Emergency Preparedness            1.5   1.6   1.2   1.4   1.2
Security                          1.2   1.7   1.6   1.7   2.0
Engineering/Technical Support     1.6   2.1   2.0   2.0   2.4
SA/QV                             1.8   1.8   1.9   2.1   2.6
Table 4  National Average for Group B SALP Scores

Functional Area                    Average Score
Operations                              1.7
Radiation Protection                    1.7
Maintenance/Surveillance                1.9
Emergency Preparedness                  1.4
Security                                1.6
Engineering/Technical Support           2.1
SA/QV                                   2.0

Two regions had six functional areas that differed from the national average by 0.3 or greater: Region I had two functional areas with average scores lower (higher rating) than the national average (Security by 0.4 and Engineering/Technical Support by 0.3), and Region V had four functional areas with average scores higher than the national average by at least 0.3 (Maintenance/Surveillance by 0.4, Security by 0.4, Engineering/Technical Support by 0.3, and Safety Assessment/Quality Verification (SA/QV) by 0.6).
A change of one functional area category for one licensee would result in agreement between the individual regional functional area average and the national average (within 0.3) for all but two of the above results. It would take category changes of five licensees for the Region I Security area, and category changes of two licensees in the SA/QV area for Region V, to agree with the national average.
These results indicated that Region V was rating licensee performance lower in the Group B SALP period than other regions. There are a number of factors that may explain this difference between Region V functional area averages and the averages of the other regions. The first is that Region V has fewer plants. One grade change in one functional area for a Region V plant would change the regional average in that functional area by 0.2.
A similar change in Region I would result in a change of 0.05 to its regional average.
This example illustrates the problems inherent in using the same acceptance criteria to compare a large data base to a small data base. The second factor that may have affected the Region V average was the SALP program changes in June 1988 and again in August 1989. As with any change, there is a learning curve associated with it. While these changes also were implemented by the other four regions, Region V uses the SALP Manual Chapter less frequently than the other regions and therefore had less opportunity to gain experience working with the manual chapter. The third factor may be that Region V managers spend more time at their assigned sites than managers at a similar level in the larger regions. As a result, there are more voting members at the SALP Board with first-hand knowledge of the site in Region V than in other regions. These managers may be more critical in their review because they have first-hand knowledge.
The results also suggested that Region I may have graded licensees lower in the Security functional area than the other regions. Some factors for this difference may be the old age of many of the plants in Region I and the fact that all plant security plans are custom made to the facility.

The above ratings may have resulted in an industry perception that SALP ratings vary from region to region. However, Group A SALP scores reveal that presently there is a more uniform rating of licensee performance than in Group B SALP scores.
2.2.2 Distribution of SALP Scores

Computations of average scores for SALP do not give a complete description of the SALP process since the distribution of SALP scores can vary without a significant change in the average score.
Therefore, graphic displays of the SALP results by percent for each region were developed. Figures 1 through 7 graphically display these results for Group A and Group B SALP scores by functional areas. When utilizing these figures for comparisons, it is important to keep in mind that one licensee changing one category rating in a functional area represents a 5-percent change for Regions I, II, and III, a 12-percent change for Region IV, and a 20-percent change for Region V.
The figures show that the functional areas of Emergency Preparedness and Security had a larger percentage of Category 1 ratings than the other functional areas. Also, Engineering/Technical Support and Safety Assessment/Quality Verification both had significantly more Category 2 ratings than the other functional areas. These results are similar to those for the functional area averages.
The two Group A functional areas (low Region II Radiation Protection scores and high Region V Security scores) that differed from the national functional area average by 0.3 also are apparent in the figures.
Two other Group A rating distributions that stand out in the figures are the larger number of Category 2 ratings in Maintenance/Surveillance in Region II and the larger number of Category 1 ratings in Engineering/Technical Support in Region III. For the Group B results, the figures indicate a distribution difference in the following areas: Region I had a larger number of Category 3 ratings in Operations, Region III had a larger number of Category 1 ratings in Emergency Preparedness, and Region II had a larger number of Category 2 ratings in Emergency Preparedness.
One assumption of this comparison was that licensee performance is similar among the regions. This assumption was not always true. For example, a number of licensees from Region I were on the problem-plant list during the period when Region I experienced a larger than normal number of plants with a Category 3 rating in Operations. While the distribution suggests this is an abnormality, actual plant performance confirms the SALP results.
In summary, the figures indicate that the regions are fairly uniform in their assignment of SALP ratings. Although the older Group B SALP data suggests that the Region V plants had higher average scores, the more current Group A SALP results show Region V ratings consistent with the other regional ratings. These conclusions agree with the conclusions drawn from the functional area averaging.
2.2.3 Regional Average of All Functional Areas

A computation of a regional average for all functional areas was conducted.
This calculation crosses functional areas and is not considered a good representation of SALP performance. Even though it is true that the same criteria apply to all functional areas, these criteria may not receive the same weight in each functional area. For example, enforcement history may be more important in one functional area than in another functional area. Some functional areas, such as Emergency Preparedness and Security, receive far fewer inspection hours than other functional areas, such as Operations.
Inspection hours for a given SALP are typically less than 3 percent for Emergency Preparedness and less than 5 percent for Security, while Operations inspection hours could approach 40 percent of the inspection effort. Additionally, a computation of a regional average for all functional areas assumes that licensee performance is consistent from one region to the next.
Historically, plant performance among the regions was not always consistent. For these reasons, the comparisons of SALP scores should be limited to within a functional area. Too many variables exist in plant performance and inspection activities to support evaluation of SALP scores across functional areas for each region. However, licensees have been known to calculate SALP score averages, and this calculation may indicate why some licensees believe there is a disparity among the regional SALP results.
Table 5 gives the results of this review.
Table 5  Average SALP Scores for the Regions

Region    Group A    Group B
I           1.6        1.6
II          1.6        1.7
III         1.7        1.7
IV          1.7        1.8
V           1.7        2.0
This table suggests that Region V had higher scores (lower ratings) in the earlier averages and scores that more closely follow the other regions in the more recent (Group A) averages. This data is also consistent with the previous data for functional area averages and national averages of SALP scores by functional area.
2.2.4 Assignment of Trends in the SALP Process

DLPQ had indicated in previous observations of the SALP process that the determination of a trend was a potential problem area. A trend, in some cases, had been used to indicate a high or a low rating for a functional area. This erroneous use of trends resulted from the SALP Board's recognition of licensee performance that was above or below the norm for a given functional area. The Group A and Group B SALP results were compared to determine how well the predictive nature of the trend determination was functioning in the SALP process. Table 6 gives the results.
Table 6  Trend Results in Percent

          Times       Predictions   Improved     Remained   Declined
          trend       became        beyond one   the        from last
Region    was used    true*         category     same       score
I            22           46             5          36         14
II           15           47            20          20         13
III          13           54            15          15         15
IV            2            0            50          50          0
V             3          100             0           0          0

* Performance improved or declined as shown by a change in rating.
Nationally, 49 percent of the trend designations resulted in the anticipated change in the functional area category.
Only one of the declining trends resulted in a lower SALP score in the next assessment. In the other six cases of declining trends, licensee action was sufficient and the SALP Board concluded that performance had improved. It could be concluded that in almost all cases the assignment of a declining trend was effective in stimulating licensee action to correct the performance in the functional area.
The use of the improving trend had not been as effective as the declining trend and was subject to misuse as an indication of a "high" category rating. The use of trends in the Group A SALP results has increased approximately 52 percent over the Group B SALP results. For improving trends only, the predictive ability of the improving trend was 56 percent accurate.
The issue of SALP trends was discussed with the regions during the SALP coordinators' meeting and again at the Division of Reactor Projects Director's meeting. The regions use trends in planning for inspections.
The regions and the program office are continuing to review and discuss trending to determine if any changes to the SALP program are necessary. To date, no changes have been identified, but these discussions have heightened the awareness of the regions to potential misuses of trending. The program office has observed more deliberate discussions at SALP Board meetings with regard to trending determinations.
3. CONCLUSIONS

As a result of the oversight functions and the study discussed above, the staff concludes that the regions are performing accurate assessments of licensee performance and there is relatively consistent implementation of the SALP program. Although there were some minor differences identified between some regional functional area averages and national averages, these differences were small, and minor changes in a functional area rating would normally result in agreement between regional and national functional area averages.
Use of the declining trend is an effective SALP tool, as shown by the improvement in performance in six of seven cases where declining trends were assigned.
However, the use of improving trends resulted in an improved SALP category in only 50 percent of the cases in which it was used. Although the regions utilize trends to assign priorities and inspection resources, the poor accuracy of the improving trend and the large increase in the use of trends indicate a need for further monitoring of the use of improving trends.
[Figures 1 through 7: histograms, by functional area (Operations, Radiation Protection, Maintenance/Surveillance, Emergency Preparedness, Security, Engineering/Technical Support, and SA/QV), of the percentage of plants receiving SALP Category 1, 2, and 3 ratings. Each figure shows Group A and Group B results side by side, with bars for the national distribution and for Regions 1 through 5. Figures not reproduced in this text version.]
i STAFF REC 0KMEADATIONS AhD PROPOSED SCHEDULE The schedule for staff actions in response to SRM 91-172 dated Decemb 1991, is as follows:
   Action                                                   Completion Date

1. Increase senior NRR and regional management              May 1, 1992
   oversight of the SALP program.

2. Complete further staff analysis and review.              June 30, 1992

3. Obtain industry input on the SALP process and            July 31, 1992
   recommendations for program improvements.

4. Provide a Commission paper on results of staff           September 30, 1992
   review with recommendations for changes, as
   appropriate.