ML102500655: Difference between revisions
StriderTol (talk | contribs) (Created page by program invented by StriderTol)
| number = ML102500655
| issue date = 03/23/2009
| title = NRC000053-Manual Chapter 0307 Reactor Oversight Process Self-Assessment Program Appendix A
| author name =
| author affiliation = NRC/OI
=Text=
{{#Wiki_filter:NRC000053 APPENDIX A REACTOR OVERSIGHT PROCESS SELF-ASSESSMENT METRICS I. PERFORMANCE INDICATOR PROGRAM METRICS PI-1 Consistent Results Given Same Guidance Definition: Independently verify PIs using Inspection Procedure (IP) 71151, "PI Verification." Count all PIs that either (a) result in a crossed threshold based on a data correction by the licensee (as noted in the resultant inspection report), or (b) have been determined to be discrepant by the staff in accordance with IP 71150, "Discrepant or Unreported Performance Indicator Data."
Criteria: Expect few occurrences, with a stable or declining trend. | |||
Lead: Regions, NRR/DIRS Goals Supported: Objective, Predictable PI-2 Questions Regarding Interpretation of PI Guidance Definition: Quarterly, count the number of frequently asked questions (FAQs). | |||
Criteria: Expect low numbers, with a stable or declining trend. | |||
Lead: NRR/DIRS Goals Supported: Understandable, Risk-Informed, Predictable PI-3 Timely Indication of Declining Plant Performance Definition: Quarterly, track PIs that cross multiple thresholds (e.g., green to yellow or white to red). Evaluate and characterize these results to allow timely indication of declining performance. | |||
Criteria: Expect few occurrences, with a stable or declining trend. | |||
Lead: NRR/DIRS Goals Supported: Risk-Informed, Effective PI-4 PI Program Provides Insights to Help Ensure Plant Safety and/or Security
Definition: Survey external and internal stakeholders asking whether the PI Program provides useful insights, particularly when combined with the inspection program, to help ensure plant safety and/or security.
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Effective, Risk-Informed, Open PI-5 Timely PI Data Reporting and Dissemination Definition: Within 5 weeks of the end of each calendar quarter, track (count) late PI postings on the NRC=s external Web site. Also note the number of late submittals from licensees that did not meet the 21-day timeliness goal. | |||
Criteria: Expect few occurrences, with a stable or declining trend. | |||
Lead: NRR/DIRS Goals Supported: Effective, Open, Predictable PI-6 Stakeholders Perceive Appropriate Overlap Between the PI Program and Inspection Program Definition: Survey external and internal stakeholders asking if appropriate overlap exists between the PI program and the inspection program. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Effective, Open PI-7 Clarity of Performance Indicator Guidance Definition: Survey external and internal stakeholders asking if NEI 99-02, "Regulatory Assessment Performance Indicator Guideline," provides clear guidance regarding performance indicators.
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Understandable, Open, Objective
PI-8 PI Program Contributes to the Identification of Performance Outliers In an Objective and Predictable Manner Definition: Survey external and internal stakeholders asking if the PI program effectively contributes to the identification of performance outliers based on risk-informed, objective, and predictable indicators.
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Risk-Informed, Objective, Predictable, Open II. INSPECTION PROGRAM METRICS IP-1 Inspection Findings Documented In Accordance With Requirements Definition: Audit inspection reports in relation to program requirements (IMC 0612, "Power Reactor Inspection Reports") for documenting green findings, greater-than-green findings, and violations. Report the percentage of findings that meet the program requirements.
Criteria: Expect a stable or improving trend in the percentage of findings documented in accordance with program requirements. | |||
Lead: NRR/DIRS Goals Supported: Objective, Risk-Informed, Predictable IP-2 Completion of Baseline Inspection Program Definition: Annual completion of baseline inspection program. | |||
Criteria: Defined as per IMC 2515, "Light-Water Reactor Inspection Program - Operations Phase."
Lead: NRR/DIRS, Regions Goals Supported: Predictable, Effective IP-3 Inspection Reports Are Timely Definition: Obtain RPS data on the total number of reports issued and the number issued within timeliness goals as stipulated in IMC 0612, "Power Reactor Inspection Reports."
Criteria: Expect 90 percent of inspection reports to be issued within the program's timeliness goals.
NOTE: For inspections not conducted by a resident inspector, inspection completion is normally defined as the day of the exit meeting. For resident inspector and integrated inspection reports, inspection completion is normally defined as the last day covered by the inspection report. | |||
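The IP-3 computation described above can be sketched as follows. The record fields are hypothetical (not the official RPS schema), and the 45-day standard / 30-day reactive goals are taken from the 01/14/04 note in the attached revision history, used here purely for illustration:

```python
from datetime import date

# Illustrative sketch of the IP-3 timeliness metric. Field names and the
# goal values are assumptions, not the official RPS data layout.
GOALS = {"reactive": 30, "standard": 45}  # calendar days from completion to issuance

def completion_date(report):
    """Per the NOTE above: resident/integrated reports complete on the last
    day covered by the report; other inspections on the exit meeting day."""
    if report["kind"] in ("resident", "integrated"):
        return report["last_day_covered"]
    return report["exit_meeting"]

def percent_timely(reports):
    """Percentage of reports issued within their applicable timeliness goal."""
    timely = sum(
        (r["issued"] - completion_date(r)).days <= GOALS[r["category"]]
        for r in reports
    )
    return 100.0 * timely / len(reports)

reports = [
    {"kind": "resident", "category": "standard",
     "last_day_covered": date(2009, 3, 31), "issued": date(2009, 5, 1)},  # 31 days
    {"kind": "team", "category": "reactive",
     "exit_meeting": date(2009, 2, 1), "issued": date(2009, 3, 20)},      # 47 days
]
print(percent_timely(reports))  # 50.0, compared against the 90 percent criterion
```

The percentage would then be trended quarter over quarter against the 90 percent expectation stated in the criteria.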
Lead: NRR/DIRS, NSIR, Regions Goals Supported: Effective, Open, Predictable IP-4 Temporary Instructions (TIs) Are Completed Timely Definition: Audit the time to complete TIs by region or Office. Compare the completion status in RPS to TI requirements. Report by region or Office the number of TIs closed within goals. | |||
Criteria: Expect all TIs to be completed within TI requirements. | |||
Lead: NRR/DIRS, NSIR, Regions Goals Supported: Effective, Predictable IP-5 Inspection Reports Are Relevant, Useful, and Written in Plain Language Definition: Survey external and internal stakeholders asking whether the information contained in inspection reports is relevant, useful, and written in plain English. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Effective, Understandable, Open IP-6 Inspection Program Effectiveness and Adequacy in Covering Areas Important to Plant Safety and/or Security Definition: Survey external and internal stakeholders asking whether the inspection program adequately covers areas that are important to plant safety and/or security and is effective in identifying and ensuring the prompt correction of performance deficiencies. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS
Goals Supported: Effective, Risk-Informed, Open IP-7 Analysis of Baseline Inspection Procedures Definition: Annually, review each baseline inspection procedure to determine its effectiveness and contribution to the overall effectiveness of the baseline inspection program. The objectives of the review are: (1) to determine if changes in scope, frequency, or level of effort are needed based on recent experience, (2) to determine if a change to the estimated hours for completion is needed, (3) to define or change what constitutes minimum completion of each inspectable area, if needed, and (4) to critically evaluate all of the inspectable areas together along with the PI program to ensure that the inspectable areas are adequately monitored for safety performance. In addition, a more detailed review and realignment of inspection resources will be performed at least biennially in accordance with Appendix B to this Chapter. The focus of this effort is to adjust existing inspection resources to improve the effectiveness of the inspection program in identifying significant licensee performance deficiencies.
Criteria: None; trend only. Summarize and evaluate the individual inspection procedure reviews and propose program adjustments as necessary to address noted inefficiencies. Provide basis for any meaningful increase or decrease in procedure scope, frequency, or level of effort as a result of the review. | |||
Lead: NRR/DIRS, NSIR, Regions Goals Supported: Effective, Risk-Informed III. SIGNIFICANCE DETERMINATION PROCESS METRICS SDP-1 The SDP Results Are Predictable and Repeatable and Focus Stakeholder Attention on Significant Safety Issues Definition: Annually, audit a representative sample (up to four per region) of inspection findings against the standard criteria set forth in IMC 0609, "Significance Determination Process," and its appendices. To the extent available, samples should include potentially greater-than-green findings that were presented to the Significance Determination Process/Enforcement Review Panel (SERP). Findings should contain adequate detail to enable an independent auditor to trace through the available documentation and reach the same significance color characterization.
Criteria: The target goal is that at least 90% are determined to be predictable and repeatable. Any SDP outcomes determined to be non-conservative will be evaluated and appropriate programmatic changes will be implemented. | |||
Lead: NRR/DRA Goals Supported: Risk-Informed, Predictable SDP-2 SDP Outcomes Are Risk-Informed and Accepted by Stakeholders Definition: Track the total number of appeals of final SDP results.
Criteria: Expect zero appeals of SDP significance that result in a final determination being overturned across all regions. All successful appeals will be assessed to determine causal factors and to recommend process improvements. | |||
Lead: Regions, NRR/DIRS Goals Supported: Risk-Informed, Objective, Predictable SDP-3 Inspection Staff Is Proficient and Find Value in Using the SDP Definition: Survey internal stakeholders using specific quantitative survey questions that focus on training, effectiveness, and efficiency. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Effective, Understandable, Risk-Informed SDP-4 The SDP Results in an Appropriate Regulatory Response to Performance Issues Definition: Survey external and internal stakeholders asking if the SDP results in an appropriate regulatory response to performance issues. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Understandable, Objective, Predictable, Open SDP-5 The Resources (Direct Charges and Support Activities) Expended Are Appropriate Definition: Track the percentage of total resource expenditures attributed to SDP activities to determine the effort expended by the regions in completing SDP evaluations as a percentage of the total regional direct inspection effort. | |||
Criteria: Total SDP expenditures should not exceed 10 percent of the total regional direct inspection effort (DIE) with a stable or declining trend.
Lead: NRR/DIRS Goals Supported: Effective, Predictable SDP-6 Final Significance Determinations Are Timely Definition: Conduct a quarterly audit of RPS data to identify the total number of inspection items finalized as greater than green that were under review for more than 90 days since: | |||
(1) the date of initial licensee notification of the preliminary significance in an inspection report, or (2) the item was otherwise documented in an inspection report as an apparent violation pending completion of a significance determination and not counted in the above category. | |||
Criteria: At least 90% of all SDP results that are counted per the criteria above should be finalized within 90 days. All issues open for more than 90 days will be assessed to determine causal factors and to recommend process improvements.
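As a minimal sketch, the SDP-6 criterion can be computed as below; the field names are hypothetical stand-ins for whatever the RPS audit extract actually provides:

```python
from datetime import date

# Hypothetical sketch of the SDP-6 check: at least 90% of counted
# greater-than-green items should be finalized within 90 days of the
# clock start (licensee notification of the preliminary significance,
# or first documentation as an apparent violation).
def percent_finalized_timely(items):
    timely = sum((i["finalized"] - i["clock_start"]).days <= 90 for i in items)
    return 100.0 * timely / len(items)

items = [
    {"clock_start": date(2009, 1, 5), "finalized": date(2009, 3, 1)},   # 55 days
    {"clock_start": date(2008, 10, 1), "finalized": date(2009, 2, 1)},  # 123 days
]
pct = percent_finalized_timely(items)
print(pct, pct >= 90.0)  # 50.0 False -> the 90 percent target is not met
```

Items exceeding 90 days would then be flagged for the causal-factor assessment the criteria call for.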
Lead: NRR/DIRS Goals Supported: Effective, Open, Predictable IV. ASSESSMENT PROGRAM METRICS AS-1 Actions Are Determined by Quantifiable Assessment Inputs (i.e., PIs and SDP Results) and are Commensurate With the Risk of the Issue and Overall Plant Risk Definition: Audit all assessment-related letters and count the number of deviations from the Action Matrix. Evaluate the causes for these deviations and identify changes to the ROP, if any, to improve the guidance documents. | |||
Criteria: Expect few deviations, with a stable or declining trend. | |||
Lead: NRR/DIRS Goals Supported: Objective, Risk-Informed, Open AS-2 The Number And Scope of Additional Actions Recommended as a Result of the Agency Action Review Meeting (AARM) Beyond Those Actions Already Taken Are Limited
Definition: Review the results of the Agency Action Review Meeting (AARM).
Criteria: Expect few additional actions, with a stable or declining trend. | |||
Lead: NRR/DIRS Goals Supported: Understandable, Predictable, Objective AS-3 Assessment Program Results (Assessment Reviews, Assessment Letters and Public Meetings) Are Completed in a Timely Manner Definition: Track the number of instances in which the timeliness goals stipulated in IMC 0305, "Operating Reactor Assessment Program," were not met for: (1) the conduct of quarterly, mid-cycle, and end-of-cycle reviews; (2) the issuance of assessment letters; and (3) the conduct of public meetings.
Criteria: Expect few instances in which timeliness goals were not met, with a stable or declining trend. | |||
Lead: Regions, NRR/DIRS Goals Supported: Effective, Open, Predictable AS-4 The NRC's Response to Performance Issues Is Timely Definition: Count the number of days between issuance of an assessment letter discussing an issue of more than very low safety significance and completion of the supplemental inspection (by exit meeting date, not issuance of the inspection report). | |||
Criteria: Expect a stable or declining trend. | |||
Lead: Regions, NRR/DIRS Goals Supported: Effective, Predictable AS-5 NRC Takes Appropriate Actions To Address Performance Issues Definition: Survey external and internal stakeholders asking whether the NRC takes appropriate actions to address performance issues for those plants outside the Licensee Response Column of the Action Matrix. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Effective, Understandable, Open
AS-6 Assessment Reports Are Relevant, Useful, and Written in Plain Language Definition: Survey external and internal stakeholders asking whether the information contained in assessment reports is relevant, useful, and written in plain English.
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Understandable, Effective, Open AS-7 Degradations in Plant Performance Are Gradual and Allow Adequate Agency Engagement of the Licensees Definition: Track the number of instances each quarter in which plants move more than one column to the right in the Action Matrix (as indicated on the Action Matrix Summary). | |||
Criteria: Expect few instances in which plant performance causes a plant to move more than one column to the right in the Action Matrix. Provide a qualitative explanation of each instance in which this occurs. Expect a stable or declining trend. | |||
Lead: NRR/DIRS Goals Supported: Risk-Informed, Predictable AS-8 Perceived Effectiveness of Safety Culture Enhancements to ROP Definition: Survey external and internal stakeholders asking whether the ROP safety culture enhancements help in identifying licensee safety culture weaknesses and focusing licensee and NRC attention appropriately. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Effective, Open V. OVERALL ROP METRICS O-1 Stakeholders Perceive the ROP To Be Predictable and Objective
Definition: Survey external and internal stakeholders asking if ROP oversight activities are predictable (i.e., controlled by the process) and reasonably objective (i.e., based on supported facts, rather than relying on subjective judgment).
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS Goals Supported: Objective, Predictable, Effective, Open O-2 Stakeholders Perceive the ROP To Be Risk-informed Definition: Survey external and internal stakeholders asking if the ROP is risk-informed, in that actions and outcomes are appropriately graduated on the basis of increased significance. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Risk-Informed, Effective, Open O-3 Stakeholders Perceive the ROP To Be Understandable Definition: Survey external and internal stakeholders asking if the ROP is understandable and if the processes, procedures, and products are clear and written in plain English. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Understandable, Effective, Open O-4 Stakeholders Perceive That the ROP Provides Adequate Regulatory Assurance That Plants Are Operated and Maintained Safely and Securely Definition: Survey external and internal stakeholders asking if the ROP provides adequate regulatory assurance, when combined with other NRC regulatory processes, that plants are being operated and maintained safely and securely. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS
Goals Supported: Effective, Open O-5 Stakeholders Perceive the ROP To Be Effective (e.g., High Quality, Efficient, Realistic, and Timely)
Definition: Survey external and internal stakeholders asking whether NRC actions related to the ROP are high quality, efficient, realistic, and timely. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Effective, Open O-6 Stakeholders Perceive That the ROP Ensures Openness Definition: Survey external and internal stakeholders asking if the ROP ensures openness in the regulatory process. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Open, Effective O-7 Opportunities for Public Participation in the Process Definition: Survey external and internal stakeholders asking if there are sufficient opportunities for the public to participate in the process. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Open, Effective O-8 Stakeholders Perceive the NRC To Be Responsive to its Inputs and Comments Definition: Survey external and internal stakeholders asking if the NRC is responsive to the public's inputs and comments on the ROP. | |||
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS
Goals Supported: Open, Effective O-9 Stakeholders Perceive That the ROP Is Implemented as Defined Definition: Survey external and internal stakeholders asking if the ROP has been implemented as defined by program documents.
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Predictable, Understandable, Open O-10 Stakeholders Perceive That the ROP Does Not Result in Unintended Consequences Definition: Survey external and internal stakeholders asking if the ROP results in unintended consequences.
Criteria: Expect stable or increasingly positive perception over time. | |||
Lead: NRR/DIRS Goals Supported: Effective, Open O-11 Analysis of NRC's Responses to Significant Events Definition: Review reports from incident investigation teams (IITs) and augmented inspection teams (AITs) to collect lessons learned regarding ROP programmatic deficiencies (i.e., did the baseline inspection program inspect this area? did the SDP accurately characterize resultant findings?). IITs already have the provision to determine NRC program deficiencies. AITs will be reviewed by NRR/DIRS to identify any weaknesses.
Criteria: Expect no major programmatic voids. | |||
Lead: NRR/DIRS Goals Supported: Effective, Predictable O-12 Analysis of Inspection Hours and Resource Expenditures Definition: Annually, collect and analyze resource data (e.g., direct inspection effort, preparation/documentation, plant status hours) for Baseline, Supplemental/Plant-Specific, and Safety Issues Inspections, and other ROP activities. | |||
Criteria: (1) Significant deviations are not expected on an annual basis. Explore reasons for any deviations that may be evident.
(2) Track and trend resource usage for the baseline inspection program and supplemental/plant-specific inspections. Analyze causes for any significant departure from established trend. | |||
(3) Track and trend resource usage for preparation, documentation, and other ROP activities, and assess the effects on budgeted resources. | |||
NOTE: This metric is intended primarily for tracking and trending resource usage for the ROP. The results are used to improve the efficiency and effectiveness of the ROP and to make management and budget decisions. A detailed ROP resource analysis is included in the annual ROP self-assessment Commission paper. | |||
Lead: NRR/DIRS Goals Supported: Effective, Predictable O-13 Analysis of Resident Inspector Demographics and Experience Definition: Annually, collect and analyze data in order to determine the relevant inspection experience of the resident inspector (RI) and senior resident inspector (SRI) population. The following four parameters will be measured and analyzed for both RIs and SRIs to ensure that the NRC maintains a highly qualified resident inspection staff: | |||
(1) NRC time - the total time the individual has accumulated as an NRC employee. | |||
(2) Total resident time - the total time the individual has accumulated as an RI or SRI. | |||
(3) Current site time - the total time the individual has spent as an RI or SRI at the current site. | |||
(4) Relevant non-NRC experience - the total time the individual has gained relevant nuclear power experience outside of the NRC. | |||
Examples of relevant non-NRC experience are operation, engineering, maintenance, or construction experience with commercial nuclear power plants, naval shipyards, Department of Energy facilities, and/or the U.S. Navy nuclear power program. | |||
Criteria: None; trend only. Provide reasons for any meaningful increase or decrease in these resident demographic metrics. | |||
NOTE: This metric is intended primarily for tracking and trending resident inspection experience. The results are used to make any necessary modifications to the RI and/or SRI programs in order to attract and retain highly qualified inspectors to the respective programs. A detailed resident demographic and staffing analysis is included in the annual ROP self-assessment Commission paper.
Lead: NRR/DIRS with assistance from HQ and regional HR staff Goals Supported: Effective, Predictable O-14 Analysis of Site Staffing Definition: Annually, collect and analyze data to measure the permanent inspector staffing levels at each of the reactor sites for both RIs and SRIs in order to evaluate the agency's ability to provide continuity of regulatory oversight.
Criteria: The criterion is set at 90% program-wide. Any single site that falls below 90% will be individually evaluated. Provide reasons for any meaningful increase or decrease in the inspector staffing level at reactor sites.
NOTE: Inspectors assigned to the site permanently or through a rotation with a minimum duration of 6 weeks shall be counted. Inspectors on 6-week or longer rotational assignments will be identified as such. Inspectors assigned to the site for less than 6 weeks will not be counted, but should be indicated as such.
Additionally, the regions shall indicate sites where permanently assigned resident or senior resident inspectors are away from the site for an extended period of time (one continuous period greater than 6 weeks). Only inspectors who have attained at least basic inspector certification status, as defined by Appendix A to Inspection Manual Chapter 1245, shall be counted.
Data will indicate the number of days a qualified resident and senior resident inspector are permanently assigned to the site during the year divided by the number of days in the year. Days spent on training, meetings away from the site, participation in team inspections, leave, or other temporary duties (e.g., acting for a branch chief in his/her absence) will not be counted against the metric unless the absence exceeds 6 continuous weeks.
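One reading of the counting rule in this NOTE can be sketched as follows. Treating any single continuous absence longer than 6 weeks as counting fully against the metric is an assumption about the rule's intent, and the data shape is invented for illustration:

```python
# Hypothetical sketch of the O-14 site staffing metric: days a qualified
# resident/senior resident inspector is permanently assigned to the site,
# divided by days in the year. Absences (training, team inspections,
# leave, acting duty) count against the metric only when one continuous
# absence exceeds 6 weeks (42 days) -- one reading of the rule above.
DAYS_IN_YEAR = 365
SIX_WEEKS_DAYS = 42

def site_coverage_percent(absence_lengths):
    """absence_lengths: length in days of each continuous absence
    during the year for the inspector assigned to the site."""
    counted = sum(a for a in absence_lengths if a > SIX_WEEKS_DAYS)
    return 100.0 * (DAYS_IN_YEAR - counted) / DAYS_IN_YEAR

# A 10-day training trip is not counted; a 60-day continuous gap is.
coverage = site_coverage_percent([10, 60])
print(round(coverage, 1), coverage >= 90.0)  # 83.6 False -> site evaluated individually
```

Sites whose coverage falls below the 90% criterion would then get the individual evaluation described above.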
Lead: Regions, NRR/DIRS Goals Supported: Effective, Predictable O-15 Analysis of ROP Training and Qualifications Definition: Annually, evaluate the implementation of IMC 1245, "Qualification Program for the Office of Nuclear Reactor Regulation Programs," particularly as it pertains to ROP implementation.
Criteria: None; trend only. Summarize and evaluate the training accomplished over the previous year and propose program improvements as necessary to address noted concerns.
NOTE: This metric is intended primarily for tracking and trending the effectiveness of the ROP training and qualifications programs. An evaluation of training effectiveness is included in the annual ROP self-assessment Commission paper. | |||
Lead: NRR/DIRS with assistance from regional staff Goals Supported: Effective, Predictable, Understandable O-16 Analysis of Regulatory Impact Definition: Annually, collect and analyze licensee feedback and develop a summary of regulatory impact forms that are critical of the ROP. | |||
Criteria: None; trend only. Summarize and evaluate the feedback received and propose program improvements as necessary to address common concerns. | |||
NOTE: This metric is intended primarily for tracking and trending regulatory impact. A detailed regulatory impact summary is included in the annual ROP self-assessment Commission paper. | |||
Lead: NRR/DIRS with assistance from regional staff Goals Supported: Effective, Open, Understandable
ATTACHMENT 1 Revision History For IMC 0307, Appendix A

| Commitment Tracking Number | Issue Date | Description of Change | Training Needed | Training Completion Date | Comment Resolution Accession Number |
|---|---|---|---|---|---|
| N/A | 12/12/02 | Revised significantly to include a more detailed discussion of the role of inspectable and program area leads, the annual review of the baseline inspection program, and other aspects of the self-assessment program. The specific metrics for these roles were added to Appendix A. | None | N/A | N/A |
| N/A | 12/12/03 | Revised to provide greater detail for documenting the results of the annual inspection procedure reviews; some metrics in Appendix A were modified to better align with the operating plan metrics and other program commitments. | None | N/A | N/A |
| N/A | 01/14/04 | Based on a decision at the DRP/DRS counterpart meeting held on December 17-18, 2003, metric IP-5 was revised to change the inspection report timeliness goal to 45 calendar days for all inspection reports, with the exception of reactive inspection reports, which stay at 30 days. | None | N/A | N/A |
| N/A | 02/20/06 | Revised to support the new safety performance measures of the NRC's Strategic Plan, to better define the ROP goals and intended outcomes, and to consolidate and clarify several of the performance metrics. Completed 4-year historical CN search. | None | N/A | ML060110235 |
| N/A | 11/28/06 | Revised to measure the effectiveness of the safety culture enhancements to the ROP, to clarify expectations regarding the resident demographics and staffing metrics, and to include a discussion of the consolidated response to external survey questions. | None | N/A | N/A |
| N/A | 01/10/08 | Revised to eliminate and consolidate several metrics, to separate Appendix A from the base IMC to serve as a stand-alone document, and to summarize and link to Appendix B on the ROP realignment process. | None | N/A | ML073510410 (CN 08-002) |
| N/A | 03/23/09 | Revised to address the Commission SRM dated June 30, 2008, to reflect the recently issued Strategic Plan for FY 2008-2013, and to reincorporate the security cornerstone in the ROP self-assessment process; some metrics were revised for clarification while others were removed to eliminate redundancy or unnecessary burden. | None | N/A | ML090300620 (W200800299, CN 09-010) |
}}
Latest revision as of 17:03, 6 December 2019
ML102500655
| Person / Time | |
|---|---|
| Site: | Prairie Island |
| Issue date: | 03/23/2009 |
| From: | NRC/OI |
| To: | Atomic Safety and Licensing Board Panel, SECY RAS |
| Shared Package | ML102500654 |
| References | 50-282-LR, 50-306-LR, ASLBP 08-871-01-LR-BD01, RAS 18574 |
| Download: | ML102500655 (18) |
Text
NRC000053 APPENDIX A REACTOR OVERSIGHT PROCESS SELF-ASSESSMENT METRICS I. PERFORMANCE INDICATOR PROGRAM METRICS PI-1 Consistent Results Given Same Guidance Definition: Independently verify PIs using Inspection Procedure (IP) 71151, API Verification.@ Count all PIs that either (a) result in a crossed threshold based on a data correction by the licensee (as noted in the resultant inspection report), or (b) have been determined to be discrepant by the staff in accordance with IP 71150, ADiscrepant or Unreported Performance Indicator Data.@
Criteria: Expect few occurrences, with a stable or declining trend.
Lead: Regions, NRR/DIRS Goals Supported: Objective, Predictable PI-2 Questions Regarding Interpretation of PI Guidance Definition: Quarterly, count the number of frequently asked questions (FAQs).
Criteria: Expect low numbers, with a stable or declining trend.
Lead: NRR/DIRS Goals Supported: Understandable, Risk-Informed, Predictable PI-3 Timely Indication of Declining Plant Performance Definition: Quarterly, track PIs that cross multiple thresholds (e.g., green to yellow or white to red). Evaluate and characterize these results to allow timely indication of declining performance.
Criteria: Expect few occurrences, with a stable or declining trend.
Lead: NRR/DIRS Goals Supported: Risk-Informed, Effective PI-4 PI Program Provides Insights to Help Ensure Plant Safety and/or Security Issue Date: 03/23/09 A-1 0307
NRC000053 Definition: Survey external and internal stakeholders asking whether the PI Program provides useful insights, particularly when combined with the inspection program, to help ensure plant safety and/or security.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Effective, Risk-Informed, Open

PI-5 Timely PI Data Reporting and Dissemination

Definition: Within 5 weeks of the end of each calendar quarter, track (count) late PI postings on the NRC's external Web site. Also note the number of late submittals from licensees that did not meet the 21-day timeliness goal.
Criteria: Expect few occurrences, with a stable or declining trend.
Lead: NRR/DIRS
Goals Supported: Effective, Open, Predictable

PI-6 Stakeholders Perceive Appropriate Overlap Between the PI Program and Inspection Program

Definition: Survey external and internal stakeholders asking if appropriate overlap exists between the PI program and the inspection program.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Effective, Open

PI-7 Clarity of Performance Indicator Guidance

Definition: Survey external and internal stakeholders asking if NEI 99-02, "Regulatory Assessment Performance Indicator Guideline," provides clear guidance regarding performance indicators.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Understandable, Open, Objective

PI-8 PI Program Contributes to the Identification of Performance Outliers in an Objective and Predictable Manner

Definition: Survey external and internal stakeholders asking if the PI program effectively contributes to the identification of performance outliers based on risk-informed, objective, and predictable indicators.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Risk-Informed, Objective, Predictable, Open

II. INSPECTION PROGRAM METRICS

IP-1 Inspection Findings Documented in Accordance With Requirements

Definition: Audit inspection reports in relation to program requirements (IMC 0612, "Power Reactor Inspection Reports") for documenting green findings, greater-than-green findings, and violations. Report the percentage of findings that meet the program requirements.
Criteria: Expect a stable or improving trend in the percentage of findings documented in accordance with program requirements.
Lead: NRR/DIRS
Goals Supported: Objective, Risk-Informed, Predictable

IP-2 Completion of Baseline Inspection Program

Definition: Annual completion of baseline inspection program.

Criteria: Defined as per IMC 2515, "Light-Water Reactor Inspection Program - Operations Phase."
Lead: NRR/DIRS, Regions
Goals Supported: Predictable, Effective

IP-3 Inspection Reports Are Timely

Definition: Obtain RPS data on the total number of reports issued and the number issued within timeliness goals as stipulated in IMC 0612, "Power Reactor Inspection Reports."

Criteria: Expect 90 percent of inspection reports to be issued within the program's timeliness goals.
NOTE: For inspections not conducted by a resident inspector, inspection completion is normally defined as the day of the exit meeting. For resident inspector and integrated inspection reports, inspection completion is normally defined as the last day covered by the inspection report.
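The IP-3 criterion reduces to a simple percentage check over report issuance dates. A minimal sketch follows; the 45-calendar-day goal and the sample dates are illustrative assumptions only (IMC 0612 stipulates the actual goals per report type):

```python
from datetime import date

# Illustrative timeliness goal in calendar days -- an assumption, not the
# authoritative value for every report type.
TIMELINESS_GOAL_DAYS = 45

def timeliness_rate(reports):
    """Fraction of reports issued within the goal.

    Each record is (inspection completion date, report issue date).
    """
    if not reports:
        return 1.0
    met = sum(
        1 for completed, issued in reports
        if (issued - completed).days <= TIMELINESS_GOAL_DAYS
    )
    return met / len(reports)

# Hypothetical sample records, not actual RPS data.
reports = [
    (date(2009, 1, 5), date(2009, 2, 10)),  # 36 days: timely
    (date(2009, 1, 5), date(2009, 3, 10)),  # 64 days: late
    (date(2009, 2, 1), date(2009, 3, 1)),   # 28 days: timely
]
rate = timeliness_rate(reports)
meets_criterion = rate >= 0.90  # IP-3 expects 90 percent timely
```

With this made-up sample, only two of three reports meet the goal, so the 90 percent criterion is not met.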
Lead: NRR/DIRS, NSIR, Regions
Goals Supported: Effective, Open, Predictable

IP-4 Temporary Instructions (TIs) Are Completed Timely

Definition: Audit the time to complete TIs by region or Office. Compare the completion status in RPS to TI requirements. Report by region or Office the number of TIs closed within goals.
Criteria: Expect all TIs to be completed within TI requirements.
Lead: NRR/DIRS, NSIR, Regions
Goals Supported: Effective, Predictable

IP-5 Inspection Reports Are Relevant, Useful, and Written in Plain Language

Definition: Survey external and internal stakeholders asking whether the information contained in inspection reports is relevant, useful, and written in plain English.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Effective, Understandable, Open

IP-6 Inspection Program Effectiveness and Adequacy in Covering Areas Important to Plant Safety and/or Security

Definition: Survey external and internal stakeholders asking whether the inspection program adequately covers areas that are important to plant safety and/or security and is effective in identifying and ensuring the prompt correction of performance deficiencies.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Effective, Risk-Informed, Open

IP-7 Analysis of Baseline Inspection Procedures

Definition: Annually, review each baseline inspection procedure to determine its effectiveness and contribution to the overall effectiveness of the baseline inspection program. The objectives of the review are: (1) to determine if changes in scope, frequency, or level of effort are needed based on recent experience; (2) to determine if a change to the estimated hours for completion is needed; (3) to define or change what constitutes minimum completion of each inspectable area, if needed; and (4) to critically evaluate all of the inspectable areas together, along with the PI program, to ensure that the inspectable areas are adequately monitored for safety performance. In addition, a more detailed review and realignment of inspection resources will be performed at least biennially in accordance with Appendix B to this chapter. The focus of this effort is to adjust existing inspection resources to improve the effectiveness of the inspection program in identifying significant licensee performance deficiencies.
Criteria: None; trend only. Summarize and evaluate the individual inspection procedure reviews and propose program adjustments as necessary to address noted inefficiencies. Provide basis for any meaningful increase or decrease in procedure scope, frequency, or level of effort as a result of the review.
Lead: NRR/DIRS, NSIR, Regions
Goals Supported: Effective, Risk-Informed

III. SIGNIFICANCE DETERMINATION PROCESS METRICS

SDP-1 The SDP Results Are Predictable and Repeatable and Focus Stakeholder Attention on Significant Safety Issues

Definition: Annually, audit a representative sample (up to four per region) of inspection findings against the standard criteria set forth in IMC 0609, "Significance Determination Process," and its appendices. To the extent available, samples should include potentially greater-than-green findings that were presented to the Significance Determination Process/Enforcement Review Panel (SERP). Findings should contain adequate detail to enable an independent auditor to trace through the available documentation and reach the same significance color characterization.
Criteria: The target goal is that at least 90% are determined to be predictable and repeatable. Any SDP outcomes determined to be non-conservative will be evaluated and appropriate programmatic changes will be implemented.
Lead: NRR/DRA
Goals Supported: Risk-Informed, Predictable

SDP-2 SDP Outcomes Are Risk-Informed and Accepted by Stakeholders

Definition: Track the total number of appeals of final SDP results.
Criteria: Expect zero appeals of SDP significance that result in a final determination being overturned across all regions. All successful appeals will be assessed to determine causal factors and to recommend process improvements.
Lead: Regions, NRR/DIRS
Goals Supported: Risk-Informed, Objective, Predictable

SDP-3 Inspection Staff Is Proficient and Finds Value in Using the SDP

Definition: Survey internal stakeholders using specific quantitative survey questions that focus on training, effectiveness, and efficiency.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Effective, Understandable, Risk-Informed

SDP-4 The SDP Results in an Appropriate Regulatory Response to Performance Issues

Definition: Survey external and internal stakeholders asking if the SDP results in an appropriate regulatory response to performance issues.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Understandable, Objective, Predictable, Open

SDP-5 The Resources (Direct Charges and Support Activities) Expended Are Appropriate

Definition: Track the percentage of total resource expenditures attributed to SDP activities to determine the effort expended by the regions in completing SDP evaluations as a percentage of the total regional direct inspection effort.
Criteria: Total SDP expenditures should not exceed 10 percent of the total regional direct inspection effort (DIE), with a stable or declining trend.
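As a sketch, the SDP-5 criterion is a ratio check per region. The region labels and hour figures below are hypothetical, not actual expenditure data:

```python
# Hypothetical (sdp_hours, total_die_hours) per region -- illustrative only.
regions = {
    "RI": (800, 10000),
    "RII": (1200, 11000),
    "RIII": (900, 9500),
    "RIV": (700, 9000),
}

# SDP effort as a fraction of total regional direct inspection effort (DIE).
shares = {name: sdp / die for name, (sdp, die) in regions.items()}

# Regions whose SDP share exceeds the 10 percent criterion.
exceeds = sorted(name for name, share in shares.items() if share > 0.10)
```

In this made-up sample, one region's share (about 10.9 percent) exceeds the criterion and would warrant a look at its trend.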
Lead: NRR/DIRS
Goals Supported: Effective, Predictable

SDP-6 Final Significance Determinations Are Timely

Definition: Conduct a quarterly audit of RPS data to identify the total number of inspection items finalized as greater than green that were under review for more than 90 days since:
(1) the date of initial licensee notification of the preliminary significance in an inspection report, or
(2) the item was otherwise documented in an inspection report as an apparent violation pending completion of a significance determination and not counted in the above category.
Criteria: At least 90% of all SDP results that are counted per the criteria above should be finalized within 90 days. All issues greater than 90 days will be assessed to determine causal factors and to recommend process improvements.
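Expressed as a sketch, the SDP-6 audit counts review durations against the 90-day window. The durations below are made-up sample data, not RPS output:

```python
def sdp_timeliness(review_days, limit=90):
    """Return (fraction finalized within limit, items over the limit)."""
    if not review_days:
        return 1.0, []
    overdue = [d for d in review_days if d > limit]
    fraction = 1 - len(overdue) / len(review_days)
    return fraction, overdue

# Days each greater-than-green item was under review (hypothetical sample).
fraction, overdue = sdp_timeliness([45, 60, 88, 91, 120, 30, 75, 89, 50, 62])
meets_goal = fraction >= 0.90  # SDP-6 target: at least 90% within 90 days
# Each overdue item (here the 91- and 120-day items) would be assessed
# to determine causal factors and recommend process improvements.
```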
Lead: NRR/DIRS
Goals Supported: Effective, Open, Predictable

IV. ASSESSMENT PROGRAM METRICS

AS-1 Actions Are Determined by Quantifiable Assessment Inputs (i.e., PIs and SDP Results) and Are Commensurate With the Risk of the Issue and Overall Plant Risk

Definition: Audit all assessment-related letters and count the number of deviations from the Action Matrix. Evaluate the causes for these deviations and identify changes to the ROP, if any, to improve the guidance documents.
Criteria: Expect few deviations, with a stable or declining trend.
Lead: NRR/DIRS
Goals Supported: Objective, Risk-Informed, Open

AS-2 The Number and Scope of Additional Actions Recommended as a Result of the Agency Action Review Meeting (AARM) Beyond Those Actions Already Taken Are Limited

Definition: Review the results of the Agency Action Review Meeting (AARM).
Criteria: Expect few additional actions, with a stable or declining trend.
Lead: NRR/DIRS
Goals Supported: Understandable, Predictable, Objective

AS-3 Assessment Program Results (Assessment Reviews, Assessment Letters, and Public Meetings) Are Completed in a Timely Manner

Definition: Track the number of instances in which the timeliness goals stipulated in IMC 0305, "Operating Reactor Assessment Program," were not met for: (1) the conduct of quarterly, mid-cycle, and end-of-cycle reviews; (2) the issuance of assessment letters; and (3) the conduct of public meetings.
Criteria: Expect few instances in which timeliness goals were not met, with a stable or declining trend.
Lead: Regions, NRR/DIRS
Goals Supported: Effective, Open, Predictable

AS-4 The NRC's Response to Performance Issues Is Timely

Definition: Count the number of days between issuance of an assessment letter discussing an issue of more than very low safety significance and completion of the supplemental inspection (by exit meeting date, not issuance of the inspection report).
Criteria: Expect a stable or declining trend.
Lead: Regions, NRR/DIRS
Goals Supported: Effective, Predictable

AS-5 NRC Takes Appropriate Actions To Address Performance Issues

Definition: Survey external and internal stakeholders asking whether the NRC takes appropriate actions to address performance issues for those plants outside the Licensee Response Column of the Action Matrix.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Effective, Understandable, Open

AS-6 Assessment Reports Are Relevant, Useful, and Written in Plain Language

Definition: Survey external and internal stakeholders asking whether the information contained in assessment reports is relevant, useful, and written in plain English.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Understandable, Effective, Open

AS-7 Degradations in Plant Performance Are Gradual and Allow Adequate Agency Engagement of the Licensees

Definition: Track the number of instances each quarter in which plants move more than one column to the right in the Action Matrix (as indicated on the Action Matrix Summary).
Criteria: Expect few instances in which plant performance causes a plant to move more than one column to the right in the Action Matrix. Provide a qualitative explanation of each instance in which this occurs. Expect a stable or declining trend.
Lead: NRR/DIRS
Goals Supported: Risk-Informed, Predictable

AS-8 Perceived Effectiveness of Safety Culture Enhancements to ROP

Definition: Survey external and internal stakeholders asking whether the ROP safety culture enhancements help in identifying licensee safety culture weaknesses and focusing licensee and NRC attention appropriately.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Effective, Open

V. OVERALL ROP METRICS

O-1 Stakeholders Perceive the ROP To Be Predictable and Objective

Definition: Survey external and internal stakeholders asking if ROP oversight activities are predictable (i.e., controlled by the process) and reasonably objective (i.e., based on supported facts, rather than relying on subjective judgment).
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Objective, Predictable, Effective, Open

O-2 Stakeholders Perceive the ROP To Be Risk-Informed

Definition: Survey external and internal stakeholders asking if the ROP is risk-informed, in that actions and outcomes are appropriately graduated on the basis of increased significance.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Risk-Informed, Effective, Open

O-3 Stakeholders Perceive the ROP To Be Understandable

Definition: Survey external and internal stakeholders asking if the ROP is understandable and if the processes, procedures, and products are clear and written in plain English.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Understandable, Effective, Open

O-4 Stakeholders Perceive That the ROP Provides Adequate Regulatory Assurance That Plants Are Operated and Maintained Safely and Securely

Definition: Survey external and internal stakeholders asking if the ROP provides adequate regulatory assurance, when combined with other NRC regulatory processes, that plants are being operated and maintained safely and securely.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Effective, Open

O-5 Stakeholders Perceive the ROP To Be Effective (e.g., High Quality, Efficient, Realistic, and Timely)
Definition: Survey external and internal stakeholders asking whether NRC actions related to the ROP are high quality, efficient, realistic, and timely.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Effective, Open

O-6 Stakeholders Perceive That the ROP Ensures Openness

Definition: Survey external and internal stakeholders asking if the ROP ensures openness in the regulatory process.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Open, Effective

O-7 Opportunities for Public Participation in the Process

Definition: Survey external and internal stakeholders asking if there are sufficient opportunities for the public to participate in the process.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Open, Effective

O-8 Stakeholders Perceive the NRC To Be Responsive to Their Inputs and Comments

Definition: Survey external and internal stakeholders asking if the NRC is responsive to the public's inputs and comments on the ROP.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Open, Effective

O-9 Stakeholders Perceive That the ROP Is Implemented as Defined

Definition: Survey external and internal stakeholders asking if the ROP has been implemented as defined by program documents.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Predictable, Understandable, Open

O-10 Stakeholders Perceive That the ROP Does Not Result in Unintended Consequences

Definition: Survey external and internal stakeholders asking if the ROP results in unintended consequences.
Criteria: Expect stable or increasingly positive perception over time.
Lead: NRR/DIRS
Goals Supported: Effective, Open

O-11 Analysis of NRC's Responses to Significant Events

Definition: Review reports from incident investigation teams (IITs) and augmented inspection teams (AITs) to collect lessons learned regarding ROP programmatic deficiencies (e.g., did the baseline inspection program inspect this area? Did the SDP accurately characterize resultant findings?). IITs already have the provision to determine NRC program deficiencies. AITs will be reviewed by NRR/DIRS to identify any weaknesses.
Criteria: Expect no major programmatic voids.
Lead: NRR/DIRS
Goals Supported: Effective, Predictable

O-12 Analysis of Inspection Hours and Resource Expenditures

Definition: Annually, collect and analyze resource data (e.g., direct inspection effort, preparation/documentation, plant status hours) for Baseline, Supplemental/Plant-Specific, and Safety Issues Inspections, and other ROP activities.

Criteria: (1) Significant deviations are not expected on an annual basis. Explore reasons for any deviations that may be evident.
(2) Track and trend resource usage for the baseline inspection program and supplemental/plant-specific inspections. Analyze causes for any significant departure from established trend.
(3) Track and trend resource usage for preparation, documentation, and other ROP activities, and assess the effects on budgeted resources.
NOTE: This metric is intended primarily for tracking and trending resource usage for the ROP. The results are used to improve the efficiency and effectiveness of the ROP and to make management and budget decisions. A detailed ROP resource analysis is included in the annual ROP self-assessment Commission paper.
Lead: NRR/DIRS
Goals Supported: Effective, Predictable

O-13 Analysis of Resident Inspector Demographics and Experience

Definition: Annually, collect and analyze data in order to determine the relevant inspection experience of the resident inspector (RI) and senior resident inspector (SRI) population. The following four parameters will be measured and analyzed for both RIs and SRIs to ensure that the NRC maintains a highly qualified resident inspection staff:
(1) NRC time - the total time the individual has accumulated as an NRC employee.
(2) Total resident time - the total time the individual has accumulated as an RI or SRI.
(3) Current site time - the total time the individual has spent as an RI or SRI at the current site.
(4) Relevant non-NRC experience - the total time the individual has gained relevant nuclear power experience outside of the NRC.
Examples of relevant non-NRC experience are operation, engineering, maintenance, or construction experience with commercial nuclear power plants, naval shipyards, Department of Energy facilities, and/or the U.S. Navy nuclear power program.
Criteria: None; trend only. Provide reasons for any meaningful increase or decrease in these resident demographic metrics.
NOTE: This metric is intended primarily for tracking and trending resident inspection experience. The results are used to make any necessary modifications to the RI and/or SRI programs in order to attract and retain highly qualified inspectors to the respective programs. A detailed resident demographic and staffing analysis is included in the annual ROP self-assessment Commission paper.
Lead: NRR/DIRS with assistance from HQ and regional HR staff
Goals Supported: Effective, Predictable

O-14 Analysis of Site Staffing

Definition: Annually, collect and analyze data in order to measure the permanent inspector staffing levels at each of the reactor sites for both RIs and SRIs in order to evaluate the agency's ability to provide continuity of regulatory oversight.
Criteria: The criterion is set at 90 percent program-wide. Any single site that falls below 90 percent will be individually evaluated. Provide reasons for any meaningful increase or decrease in the inspector staffing level at reactor sites.
NOTE: Inspectors assigned to the site permanently or through a rotation with a minimum duration of 6 weeks shall be counted. Inspectors on 6-week or longer rotational assignments will be identified as such. Inspectors assigned to the site for less than 6 weeks will not be counted, but should be indicated as such. Additionally, the regions shall indicate sites where permanently assigned resident or senior resident inspectors are away from the site for an extended period of time (one continuous period greater than 6 weeks). Only inspectors who have attained at least a basic inspector certification status, as defined by Appendix A to Inspection Manual Chapter 1245, shall be counted.

Data will indicate the number of days a qualified resident and senior resident inspector are permanently assigned to the site during the year, divided by the number of days in the year. The number of days spent on training, meetings away from the site, participation in team inspections, leave, or other temporary duties (e.g., acting for a branch chief in his/her absence) will not be counted against the metric unless the absence exceeds 6 continuous weeks.
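The data rule above amounts to counted assigned days divided by days in the year, checked against the 90 percent criterion. A minimal sketch, with hypothetical site names and day counts:

```python
DAYS_IN_YEAR = 365  # use 366 for a leap year

# Hypothetical counted days per site (days a qualified RI/SRI was assigned).
sites = {"Site A": 360, "Site B": 300, "Site C": 340}

# Per-site staffing level: assigned days / days in the year.
levels = {name: days / DAYS_IN_YEAR for name, days in sites.items()}

# Program-wide level across all sites.
program_wide = sum(sites.values()) / (DAYS_IN_YEAR * len(sites))

# Any single site below 90 percent is evaluated individually.
below_threshold = sorted(name for name, level in levels.items() if level < 0.90)
```

In this made-up sample, the program-wide level clears 90 percent, but one site falls below it and would be individually evaluated.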
Lead: Regions, NRR/DIRS
Goals Supported: Effective, Predictable

O-15 Analysis of ROP Training and Qualifications

Definition: Annually, evaluate the implementation of IMC 1245, "Qualification Program for the Office of Nuclear Reactor Regulation Programs," particularly as it pertains to ROP implementation.
Criteria: None; trend only. Summarize and evaluate the training accomplished over the previous year and propose program improvements as necessary to address noted concerns.
NOTE: This metric is intended primarily for tracking and trending the effectiveness of the ROP training and qualifications programs. An evaluation of training effectiveness is included in the annual ROP self-assessment Commission paper.
Lead: NRR/DIRS with assistance from regional staff
Goals Supported: Effective, Predictable, Understandable

O-16 Analysis of Regulatory Impact

Definition: Annually, collect and analyze licensee feedback and develop a summary of regulatory impact forms that are critical of the ROP.
Criteria: None; trend only. Summarize and evaluate the feedback received and propose program improvements as necessary to address common concerns.
NOTE: This metric is intended primarily for tracking and trending regulatory impact. A detailed regulatory impact summary is included in the annual ROP self-assessment Commission paper.
Lead: NRR/DIRS with assistance from regional staff
Goals Supported: Effective, Open, Understandable
ATTACHMENT 1
Revision History for IMC 0307, Appendix A

Commitment Tracking Number | Issue Date | Description of Change | Training Needed | Training Completion Date | Comment Resolution Accession Number
N/A | 12/12/02 | Revised significantly to include a more detailed discussion of the role of inspectable and program area leads, the annual review of the baseline inspection program, and other aspects of the self-assessment program. The specific metrics for these roles were added to Appendix A. | None | N/A | N/A
N/A | 12/12/03 | Revised to provide greater detail for documenting the results of the annual inspection procedures reviews, and some metrics in Appendix A were modified to better align with the operating plan metrics and other program commitments. | None | N/A | N/A
N/A | 01/14/04 | Based on a decision at the DRP/DRS counterpart meeting held on December 17-18, 2003, metric IP-5 was revised to change the inspection report timeliness to 45 calendar days for all inspection reports, with the exception of reactive inspection reports, which will stay at 30 days. | None | N/A | N/A
N/A | 02/20/06 | Revised to support the new safety performance measures of the NRC's Strategic Plan, to better define the ROP goals and intended outcomes, and to consolidate and clarify several of the performance metrics. Completed 4-year historical CN search. | None | N/A | ML060110235
N/A | 11/28/06 | Revised to measure the effectiveness of the safety culture enhancements to the ROP, to clarify expectations regarding the resident demographics and staffing metrics, and to include a discussion of the consolidated response to external survey questions. | None | N/A |
N/A | 01/10/08 | Revised to eliminate and consolidate several metrics, to separate Appendix A from the base IMC to serve as a stand-alone document, and to summarize and link to Appendix B on the ROP realignment process. | None | N/A | ML073510410 CN 08-002
W200800299 | 03/23/09 | Revised to address the Commission SRM dated June 30, 2008, to reflect the recently issued Strategic Plan for FY 2008-2013, and to reincorporate the security cornerstone in the ROP self-assessment process; some metrics were revised for clarification purposes while others were removed to eliminate redundancy or unnecessary burden. | None | N/A | ML090300620 CN 09-010