RIL 2024-011, Research Information Letter: Use of Expert Elicitation Guidance to Inform a Small-Scale Knowledge Judgment Application
| Accession Number: | ML24166A231 |
| Issue date: | 05/31/2024 |
| From: | J. Baweja, G. Coles, B. Jefferson (Pacific Northwest National Laboratory); Jing Xing (NRC/RES/DRA/HFRB) |
| Shared Package: | ML24166A227 |
| References: | RIL 2024-011 |
RIL 2024-011
Use of Expert Elicitation Guidance to Inform a Small-Scale Knowledge Judgment Application
May 2024
Prepared by:
G. Coles, J. Baweja, B. Jefferson
Pacific Northwest National Laboratory
P.O. Box 999, Richland, WA 99352
Jing Xing, NRC Technical Monitor
Y. James Chang, NRC Project Manager
Research Information Letter
Office of Nuclear Regulatory Research
Disclaimer
Legally binding regulatory requirements are stated only in laws, NRC regulations, licenses (including technical specifications), or orders, not in Research Information Letters (RILs). A RIL is not regulatory guidance, although NRC's regulatory offices may consider the information in a RIL to determine whether any regulatory actions are warranted.
ABSTRACT
This Research Information Letter (RIL) documents the process and lessons learned when applying NUREG-2255, Guidance for Conducting Expert Elicitation in Risk-Informed Decisionmaking, to a small-scale expert knowledge elicitation effort. The RIL demonstrates a structured way to gather information from experts. This document can help with deciding whether to undertake, and with planning, designing, and conducting, knowledge elicitation efforts such as topic evaluations and the Phenomena Identification and Ranking Table (PIRT) process.
The U.S. Nuclear Regulatory Commission funded Pacific Northwest National Laboratory to evaluate parts of the technical basis used in NUREG-2256, Integrated Human Event Analysis System for Event and Condition Assessment (IDHEAS-ECA). This evaluation included generating insights on ways to determine the probability that the time required for an operator to perform a task exceeds the time available before undesired consequences occur. An elicitation process was used to gather knowledge from experts about experiences and lessons learned, good practices, limitations, and cautions concerning the estimation of time required and time available to respond to an event. Determining these times includes characterizing their probability distributions to determine the impact on the human error probability for certain human failure events. Although the expert knowledge elicitation described in this report was a small-scale effort, the process followed the guidance on performing expert elicitation in NUREG-2255.
The focus of this document is on describing the process of using guidance to inform an expert knowledge elicitation, rather than on the elicited technical knowledge, which is reported elsewhere.
Although the expert knowledge elicitation described in this report was a small-scale effort, the guidance on performing expert elicitation from NUREG-2255 provided a structured, systematic approach designed to reduce systematic biases such as cognitive and social biases. It should be noted that this application was a knowledge elicitation based on expert knowledge and experience about specific approaches to estimating times, not an elicitation of numerical value judgments, and it was not quantitative or complex enough to require a database. This report provides key insights from the use of expert elicitation guidance in a small-scale application and provides an example of applying NUREG-2255 guidance to a small-scale, non-numeric knowledge elicitation.
TABLE OF CONTENTS
ABSTRACT
ABBREVIATIONS AND ACRONYMS
1 INTRODUCTION
2 APPLICABLE GUIDANCE FROM NUREG-2255 ON CONDUCTING EXPERT ELICITATION
  2.1 Basic Principles of Eliciting Expert Judgement
  2.2 Steps of the Expert Elicitation Process
    2.2.1 Phase 1: Planning and Preparation
    2.2.2 Phase 2: Pre-Elicitation Work
    2.2.3 Phase 3: Elicitation
    2.2.4 Phase 4: Final Documentation and Sponsor Review
  2.3 List of Questions Addressed in NUREG-2255
  2.4 Technical Bases for Expert Elicitation
3 SPECIFIC APPLICATION OF GUIDANCE ON EXPERT ELICITATION TO A SMALL-SCALE APPLICATION
  3.1 Overview of the Expert Knowledge Elicitation Application
  3.2 Developing Expert Knowledge Elicitation Information Requests
  3.3 Selecting Experts
  3.4 Expert Elicitation Virtual Workshop - Preparing, Holding, and Documenting
    3.4.1 Preparing for the Workshop
    3.4.2 Holding the Workshop
    3.4.3 Documenting the Workshop Results
  3.5 Use of Expert Elicitation Guidance
    3.5.1 Representation of the Technical Community
    3.5.2 Independent Intellectual Ownership
    3.5.3 Avoidance of Conflicts of Interest
    3.5.4 Interaction and Integration
    3.5.5 Structured Process
    3.5.6 Transparency
  3.6 Observations About Workshop Process and Results
    3.6.1 Observations About the Process
    3.6.2 Nature of Insights Gained
    3.6.3 Importance of Information Elicited
    3.6.4 Separate Email Solicitation from Industry Leaders
4 INSIGHTS AND CONCLUSIONS
5 REFERENCES
ABBREVIATIONS AND ACRONYMS
ASME/ANS - American Society of Mechanical Engineers/American Nuclear Society
EDT - Eastern Daylight Time
HRA - human reliability analysis
IDHEAS-ECA - Integrated Human Event Analysis System for Event and Condition Assessment
NPP - nuclear power plant
NRC - United States Nuclear Regulatory Commission
PIRT - Phenomena Identification and Ranking Table
PNNL - Pacific Northwest National Laboratory
PRA - probabilistic risk assessment
RIL - Research Information Letter
SSHAC - Senior Seismic Hazard Analysis Committee
TSTF - Technical Specification Task Force
1 INTRODUCTION
The U.S. Nuclear Regulatory Commission (NRC) funded Pacific Northwest National Laboratory (PNNL) to evaluate parts of the technical basis used in NUREG-2256, Integrated Human Event Analysis System for Event and Condition Assessment (IDHEAS-ECA) (NRC 2022a). This evaluation included generating insights on ways to determine the probability that the time required for an operator to perform a required task exceeds the time available before undesired consequences occur. PNNL used an expert elicitation process to gather knowledge from experts about experiences and lessons learned, good practices, limitations, and cautions concerning the estimation of time required and time available. Determination of these times includes characterization of their probability distributions to determine the impact on the human error probability for certain human failure events. Although the expert knowledge elicitation undertaken in this evaluation was a small-scale effort, a formal, structured process was used to perform the expert elicitation. The focus of this Research Information Letter (RIL) is on the process and benefits of using guidance to inform an expert knowledge elicitation rather than on the technical insights from the elicitation, which are reported elsewhere.
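As a schematic illustration of the quantity of interest (a sketch for orientation, not a formulation taken from NUREG-2256), if the time required and the time available are treated as independent random variables $T_{\text{req}}$ and $T_{\text{avail}}$ with probability density functions $f_{\text{req}}$ and $f_{\text{avail}}$, the probability of exceeding the time available can be written as

$$
P_{\text{exceed}} = \Pr\left(T_{\text{req}} > T_{\text{avail}}\right)
= \int_{0}^{\infty} f_{\text{avail}}(t_a) \left[ \int_{t_a}^{\infty} f_{\text{req}}(t_r)\, dt_r \right] dt_a .
$$

The independence assumption is made here only to keep the illustration simple; characterizing the two distributions and their relationship is the subject of the elicitation described in this RIL.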
NUREG-2255, Guidance for Conducting Expert Elicitation in Risk-Informed Decisionmaking (NRC 2022b) describes the general principles and an approach for eliciting and integrating expert judgment using a structured process. The guidance in NUREG-2255 was used to inform an expert knowledge elicitation process for estimating the time required and time available and corresponding probability distributions. This application of expert elicitation is relatively limited but illustrates that the guidance in NUREG-2255 can be applied to small-scale efforts as well as large, detailed studies.
The NUREG-2255 guidance is based on the cognitive science literature, existing expert elicitation guidance, experience in conducting expert elicitation, and lessons learned from three pilot applications in which the draft guidance was used to support elicitation of parameter inputs for risk analysis methods and models. The guidance includes an introduction to expert elicitation and its use within the NRC, the basic principles of using expert judgment, and guidance and a recommended process for conducting a formal expert elicitation. The guidance also contains supplemental information based on lessons learned from the research literature, existing guidance documents published by the NRC and other organizations, and recent expert elicitations performed by NRC staff.
This RIL describes an example application of the guidance in NUREG-2255 to a small-scale, non-numeric expert knowledge elicitation effort. The example application was also limited in terms of time commitments from experts for preparation and participation in the two-and-one-half-hour workshop. The volume and complexity of material sent to the experts in preparation for the workshop were modest: (1) a one-page invitation that defined the purpose of the workshop, (2) a three-page follow-up to the consenting experts providing background information, information requests, and information about workshop logistics, and (3) a three-page package containing the final agenda, the finalized information requests, and an introduction to the basic principles of expert elicitation from NUREG-2255.
The workshop consisted of four NRC human reliability analysis (HRA) experts, two probabilistic risk assessment (PRA) industry experts, and three official observers, as described in Section 3.4 of this RIL. Two other industry experts participated by email only. One of the NRC managers, who also acted as one of the official observers, generated the invitations. Workshop preparation, implementation, and documentation of the results were done primarily by the PNNL facilitator with some help from the PNNL team (e.g., help with formulating the information requests and providing a workshop notetaker). Accordingly, with minimal extra effort to apply certain principles, the advantages of employing a structured, systematic expert elicitation effort were realized.
Section 2 of the RIL briefly summarizes key elements of the general guidance in NUREG-2255 for conducting an expert elicitation process and guidance on commonly asked questions about steps of the process. Section 3 of the RIL describes the expert knowledge elicitation that was performed on determining the probability of exceeding time available and how guidance from NUREG-2255 was implemented. Section 4 provides conclusions and insights.
2 APPLICABLE GUIDANCE FROM NUREG-2255 ON CONDUCTING EXPERT ELICITATION
This section provides a brief overview of applicable content from the guidance found in NUREG-2255 on conducting expert elicitation. This overview presents general guidance for conducting an expert elicitation process and is included to provide a basic level of familiarization to support the discussion in Section 3, which describes application of the guidance. The NUREG-2255 guidance is based primarily on existing expert elicitation guidance, experience conducting expert elicitation, literature studies, and lessons learned from three pilot applications in which the draft guidance was used to support the elicitation of parameter inputs for risk analysis methods and models. Key predecessor documents are NUREG-1563 (NRC 1996),
NUREG/CR-6372 (NRC 1997), NUREG-2117 (NRC 2012), and NUREG-2213 (NRC 2018).
The material summarized here is presented at a high level and consists of (1) basic principles of eliciting expert judgment, (2) steps of the elicitation process, (3) questions from experience that are addressed in NUREG-2255, and (4) a brief discussion of the technical bases for expert elicitation.
2.1 Basic Principles of Eliciting Expert Judgement
NUREG-2255 defines the basic principles of eliciting expert judgment. Application of these principles is specifically discussed in Section 3. Section 2.2 in NUREG-2255 states that the ultimate objective of conducting an expert elicitation is to appropriately represent the center, body, and range of the technical community's views about a technical issue. When expert judgment is used to support decision-making, the elicitation should be performed in a manner that ensures confidence in the results. As such, expert elicitation should conform to the principles discussed below, regardless of the scope, level of effort, and the method or procedures employed for the elicitation process. (The following is taken directly from Section 2.2 of NUREG-2255.)
- 1. Representation of the technical community - The purpose of expert elicitation is not to create new knowledge, but rather to obtain the center, body, and range of the views of the composite distribution of the technical community about the state of knowledge. While it is impractical to engage an entire technical community in the elicitation process, the expert panel should (a) be an adequate sample of the overall technical community, (b) have sufficient breadth of knowledge that it can evaluate the available data, and (c) include leaders in the technical field who can capture the community's degree of consensus and diversity. The resultant expert judgment should represent the overall community's views and beliefs about the state of knowledge for the technical problem.
- 2. Independent Intellectual Ownership - While the project sponsors have legal ownership of the project deliverables, the expert panel collectively has intellectual ownership of the results. Intellectual ownership means that the expert panel takes responsibility for the robustness and defensibility of the results. To ensure intellectual ownership, all inputs to the elicitation should be shared with every expert. To maintain the independence of intellectual ownership, expert judgment must be based on the expert's knowledge and expertise, not the positions of the project sponsors or the organizations the experts are associated with. The experts must clearly understand that they are not representing their employer or organization on the panel but are serving in their own right as a recognized leader in their respective field. Each expert should also maintain independence from the other experts in the team in order to avoid (or mitigate) a groupthink bias risk.
- 3. Avoidance of Conflicts of Interest - To minimize bias in the elicitation, careful consideration should be given to potential conflicts of interest prior to selecting experts. Experts should be free from direct and potential conflicts of interest to the extent practical. In all cases, potential conflicts of interest or even the appearance of conflicts of interest should be disclosed up front.
- 4. Breadth of State of Knowledge - The expert panel should evaluate a range of data and models that are representative of the overall technical community in order to obtain the range of knowledge and interpretations about the technical issue.
- 5. Interaction and Integration - To represent the knowledge and interpretations of the technical community, experts should interact with each other as they accumulate and evaluate existing knowledge and make interpretations. The expert panel cannot simply accumulate and evaluate inputs from the literature or elicit the judgment of one or more experts. Instead, individual experts should make their interpretations based on the integration of their own knowledge and inputs from other experts. The final results should be the integration of the individual judgments to represent the center, body, and range of the state of knowledge about the technical issue.
- 6. Structured Process - An expert elicitation should employ a structured process to facilitate interaction and integration, and to reduce biases in the outcomes.
- 7. Transparency - Often the results of an expert elicitation serve a range of users with different needs. To assure that the results are used appropriately, the information generated must be documented in a transparent way. Transparency includes the input data and models that were considered, the process employed, the results obtained, and the caveats and limitations of the inputs, process, and results. Transparency also helps to demonstrate the stability and integrity of the results as a whole.
2.2 Steps of the Expert Elicitation Process
NUREG-2255 explains that when eliciting expert judgement, a structured and systematic process encompassing the basic principles of expert elicitation described above in Section 2.1 should be used. For developing a structured and systematic process, the guidance identifies the following four phases and ten steps.
Phase 1: Planning and Preparation
The purpose of this phase is to ensure the elicitation problem is sufficiently defined to address the regulatory application of interest; that the project team, expert panel, and elicitation process are adequate to address the elicitation problem; and that the experts are provided with necessary information prior to the actual elicitation. The following steps are included in this phase.
Step 1 - Define the expert elicitation (i.e., construct the project)
Step 2 - Form the expert panel
Step 3 - Develop the project plan
Phase 2: Pre-Elicitation Work
The purpose of this phase is to ensure that the expert panel is involved in compiling the dataset and that all team members understand the project, the technical problems, the individual's role/responsibilities, and the theories of probabilities and uncertainties. The following steps are included in this phase.
Step 4 - Assemble and disseminate the dataset
Step 5 - Familiarize and refine the technical issues
Step 6 - Conduct training and piloting
Phase 3: Elicitation
The purpose of this phase is to elicit expert judgments through interactive workshops. The expert panel interacts to evaluate the data and models, make interpretations, form initial judgments, and integrate the judgments to represent the distribution of the views of the technical community. The following steps are included in this phase.
Step 7 - Elicit expert judgments
Step 8 - Integrate expert judgments
Phase 4: Final Documentation and Sponsor Review
The purpose of this phase is to develop final documentation of the process and results and to have the technical staff of the sponsor organization review the documentation for use in regulatory decision-making. The following step is included in this phase.
Step 9 - Document the process and results and conduct sponsor technical review
All Phases: Participatory Review
Step 10 - Participatory review
While referred to as Step 10, a participatory review is not actually a step but rather involves one or more experts who participate to monitor the process to avoid significant systematic biases and enhance the technical breadth of the knowledge on which the judgments are based. NUREG-2255 recommends having a participatory review throughout all the phases. The expert knowledge elicitation described in Section 3 did not have sufficient resources to include a participatory review.
2.3 List of Questions Addressed in NUREG-2255
NUREG-2255 also provides what it calls supplemental guidance regarding good practices, lessons learned, and examples for implementing an expert elicitation process. The material is organized by the 10 process steps discussed above in Section 2.2 and presents answers to common questions asked for those steps. The following list provides those questions because they give a sense of applicable issues and concerns and serve as a resource for guidance on the same or similar questions. Because the example application is a knowledge elicitation, not an elicitation of numerical value judgments, and not quantitative or complex enough to require a database, many of these questions do not apply to the small-scope example of an expert knowledge elicitation presented in Section 3. Reviewing this list before or during planning of an expert elicitation helps design the process and prepare for potential challenges and pitfalls.
- 1. Are the levels of effort in the current process comparable to the Senior Seismic Hazard Analysis Committee (SSHAC) levels?
- 2. How should the technical issues be identified and prioritized?
- 3. What if the technical issue to be addressed by the expert panel requires a variety of areas of expertise?
- 4. Can one expert serve multiple roles?
- 5. How should experts' conflicts of interest be addressed?
- 6. How many proponent experts should a formal expert elicitation have?
- 7. How should time be estimated for expert participation?
- 8. What is a reasonable duration for a workshop session?
- 9. Can the workshops be combined to reduce cost?
- 10. How relevant and adequate are existing data?
- 11. When should the datasets be made available to the expert panel?
- 12. How can experts achieve common understanding of the technical issues or questions being asked?
- 13. Can experts be trained to avoid cognitive biases?
- 14. What should be included in training about anchoring biases?
- 15. What are some strategies for training experts to estimate probabilities?
- 16. Why are the workshops important?
- 17. Why are facilitated interactive workshops preferred over dynamic workshops?
- 18. Can one person fulfill all of the lead technical integrator responsibilities?
- 19. Why is challenging and defending important during the workshops?
- 20. What cautions should be observed when screening out data?
- 21. What is subjective probability?
- 22. What strategies are useful to express likelihood in probabilities?
- 23. How should time be managed during the workshops?
- 24. How should mental fatigue be managed during the workshops?
- 25. How should social pressure be managed during the elicitation process?
- 26. How should deviations from the workshop procedures be addressed?
- 27. How should expert no shows be managed during the workshops?
- 28. What are good practices for running the workshops?
- 29. How to avoid negative group dynamics in a workshop?
- 30. Why is the integration approach important?
- 31. How should the available resources be balanced with the thoroughness of the integration?
- 32. What are some mathematical and behavioral approaches for combining individual judgments?
- 33. Which combination rule is better: linear (equal-weight) or geometric?
- 34. What are some techniques or methods for mathematical aggregation?
- 35. Can software tools be used to combine probability distributions?
- 36. What are some challenges in integrating expert judgments?
- 37. Should the experts be identified in the documentation?
2.4 Technical Bases for Expert Elicitation
NUREG-2255 also provides a condensed discussion of the technical bases associated with expert judgment. It states that the concepts of human cognition, decision-making, and cognitive biases are reviewed and structured into a coherent framework to inform the formal expert elicitation process. Accordingly, NUREG-2255 (1) discusses the concept of expert judgment, (2) provides the definition of experts, (3) discusses cognitive bases for generating expert judgment, (4) discusses cognitive biases and debiasing strategies, (5) reviews existing guidance on expert elicitation, and (6) describes consensus versus community distribution.
The following discussion is a summary of Section 4 of NUREG-2255.
Expert judgement is information provided by an expert in response to a technical question. It represents an informed opinion or belief about the state of knowledge of a technical issue based on the expert's training and background. Expert judgment can be either qualitative or quantitative.
Experts are those who have the knowledge and capability to support assessment and informed judgment about the technical issue of interest. An expert must have (1) specific knowledge and expertise in the technical area of interest and (2) the ability to assess the qualitative or quantitative aspects of the technical issue. An expert uses mental frames to interpret the evidence and relate the evidence to the questions asked.
The basis for making judgments resides in the human cognitive process for understanding a problem and making decisions based on that understanding. Key elements in the decision-making process are the individual's mental models. An individual organizes knowledge into structured, meaningful patterns, referred to as mental models; stores the patterns in memory; and uses them to interact with the environment. Mental models help people describe, explain, and predict system behaviors. Inputs to the decision-making process are the questions being asked, data relevant to the question, and models that relate the data to the question. One then uses mental models to process the inputs. The output of the process is the belief about the state of the inputs (i.e., the decision or judgment).
Humans are not equipped with a competent mental statistical processor. Rather, in making judgments in the face of uncertainty, humans unconsciously use a variety of cognitive heuristics.
As a consequence, when asked to make probabilistic judgments, either in a formal elicitation or in a less formal setting, people's judgments are often biased. Cognitive biases have been extensively studied and reported in the literature, and many cognitive biases have been identified.
3 SPECIFIC APPLICATION OF GUIDANCE ON EXPERT ELICITATION TO A SMALL-SCALE APPLICATION
Section 3 describes an expert knowledge elicitation to produce insights on ways to determine the probability that the time available to perform a task is exceeded. The focus was on determining ways to estimate the time required and time available and their corresponding probability distributions. The sections below provide discussion concerning a small-scale application and include: (1) an overview of the expert knowledge elicitation application, (2) selection of experts for that process, (3) description of the workshop held for that process, (4) use of expert elicitation guidance for that process, (5) observations and insights about the elicitation process and results, and (6) use of email solicitation to supplement the workshop process and those results.
3.1 Overview of the Expert Knowledge Elicitation Application
There are different purposes for performing an expert elicitation. The purpose of this elicitation was to gather and evaluate the experience and knowledge of a larger technical community on a technical issue. There are many possible purposes of elicitation, including identifying available technical evidence relevant to a technical issue and disseminating and sharing common databases among experts. These latter elicitation efforts are often formal and time-consuming but can generate important information on a topic. Though nominally an expert elicitation, this effort was limited in scope and available resources (e.g., the experts donated their time for a 2½-hour workshop and time to review and think about the information requests). However, as discussed below, expert elicitation principles were used to help assure the quality and transparency of results and to reduce systematic biases such as social biases.
The expert knowledge elicitation process involved a primary activity consisting of a virtual online workshop augmented by email feedback from two prominent PRA and HRA leaders. The workshop participants consisted of industry and NRC HRA experts, while the email feedback came from experts with significant history in and perspective on the development of HRA modeling. Two experts participated separately by email because it was difficult to schedule their time for the session and because their participation may have biased the workshop given their prominence; both are prior members of the NRC Advisory Committee on Reactor Safeguards.
The selection of experts and associated details are discussed further in Section 3.3.
This expert knowledge elicitation process consisted of the following elements:
- 1. Developing the information requests from experts needed to address basic questions about development of point estimates and probability distributions for time available and time required to inform the IDHEAS-ECA guidance
- 2. Identifying and recruiting experts
- 3. Disseminating the expert knowledge elicitation workshop information requests and preliminary instructions by email
- 4. Following up with each workshop expert after sending the email messages
- 5. Holding the expert knowledge elicitation workshop
- 6. Sending each expert a record of the responses drafted by the workshop facilitator and observers and giving them a chance to refine their responses
- 7. Finalizing the workshop responses and distilling insights
- 9. Consolidating the final insights and recommendations
These steps are consistent with the four phases of the expert elicitation process described in Section 2.3 of NUREG-2255 (NRC 2022b) and discussed in Section 2.2 above:
- Planning and preparation
- Pre-elicitation work
- Elicitation
- Final documentation and sponsor review
Certain steps from Section 2.3 of NUREG-2255 described or cited in Section 2 of this RIL were not applicable or were less important for this elicitation because it was a knowledge elicitation based on the experts' knowledge and experience about specific approaches to estimate times, not an elicitation of numerical value judgments, and it was not quantitative or complex enough to require a database (as indicated in Step 4 in Section 2.3 of NUREG-2255).
Also, the effort was limited so there were no resources available for piloting and training (as discussed in Step 6 in Section 2.3 of NUREG-2255). Regardless, a very brief primer on basic principles was provided 2 days before the workshop in preparatory material sent to participants and then verbally at the start of the workshop.
The following subsections describe the steps identified above and how guidance from NUREG-2255 was applied.
3.2 Developing Expert Knowledge Elicitation Information Requests
Information requests were developed to gather knowledge to inform IDHEAS-ECA guidance intended for HRA analysts about how to estimate the probability of exceeding the time available for required operator tasks at nuclear power plants (NPPs). NRC's HRA methodology, IDHEAS-ECA, uses a time uncertainty model to estimate the contribution to the human error probability from the probability of exceeding the time available, which is determined by considering the probability distributions of time available and time required for a human action.
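As a minimal illustrative sketch of the kind of calculation this guidance supports (not the IDHEAS-ECA implementation and not based on elicited values), the exceedance probability could be approximated by Monte Carlo sampling if one assumed, for example, hypothetical lognormal distributions for the two times. All parameter values below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000  # number of Monte Carlo samples

# Hypothetical lognormal distributions for illustration only; the medians and
# spreads are placeholders, not values from NUREG-2256 or the elicitation.
t_required = rng.lognormal(mean=np.log(15.0), sigma=0.4, size=n)   # minutes
t_available = rng.lognormal(mean=np.log(30.0), sigma=0.3, size=n)  # minutes

# Fraction of samples in which the time required exceeds the time available,
# assuming the two times are independent.
p_exceed = np.mean(t_required > t_available)
print(f"Estimated P(T_required > T_available) = {p_exceed:.4f}")
```

The sketch simply illustrates why the probability distributions, and the factors driving their uncertainty, are the focus of the information requests described below.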
The information request included three overarching questions (e.g., how are estimates of time available currently made in support of developing a human error probability), each with three or four more specific questions (e.g., for methods identified, describe the factors that could impact the uncertainty of the estimate).
The information requests were designed to elicit the experts' knowledge about obtaining time required and time available estimates. The requests were not focused on judgments about methodology or on specific failure rates or probability values used in HRA. Rather, the information requests were designed to elicit the experts' knowledge and experience on considerations in making the time estimates and factors that could impact the uncertainty of those estimates.
As described above in this section, these information requests, which can be thought of as questions, were sent to the six workshop experts ahead of the workshop and were the same requests used in the workshop. The information request was also sent in the email solicitation to the two experts who were not participating in the workshop.
3.3 Selecting Experts
To identify a set of experts with knowledge representative of the larger technical community for the workshop and email solicitation, participants were selected with the following expertise:
- 1. Knowledge of how NPP operators perform tasks in normal, abnormal, and emergency situations
- 3. NRC staff who use HRA and risk concepts to support risk informed decisions on NPP issues. This includes application of the Significance Determination Process or Accident Sequence Precursor analyses.
- 4. PRA and HRA industry leaders with a significant history and perspective about HRA approaches and its applications. This includes HRA for formally reviewed and approved PRA models per the American Society of Mechanical Engineers/American Nuclear Society (ANS) PRA standard (ASME/ANS 2009), and Regulatory Guide 1.200, Revision 2 (NRC 2009) and Revision 3 (NRC 2020).
The selected HRA industry experts who were willing to help are well-known consultants to licensees of NPPs and have been instructors and researchers for the Electric Power Research Institute. They are especially current on the HRA approaches used in support of risk-informed applications to the NRC such as those under Title 10 of the Code of Federal Regulations, Part 50.69, Risk-Informed Categorization and Treatment of Structures, Systems and Components for Nuclear Power Reactors, and license amendment requests to revise technical specifications to adopt risk-informed completion times through NRC Technical Specification Task Force (TSTF)-505, Revision 2 (TSTF 2018). The NRC HRA experts have recent industry experience at NPPs as Senior Reactor Operators or academic credentials specifically related to risk and reliability. The PRA and HRA industry leaders are well known and respected across industry, the NRC, and internationally.
It is important to note that the experts were not compensated by PNNL for their time; they gave their time voluntarily. The experts' time commitment was modest, as can be seen from the description of the expert elicitation process in the next section.
In summary, eight experts agreed to be part of the elicitation process. Six of the experts agreed to participate in an expert knowledge elicitation workshop. Two of those six workshop experts were industry experts who consult in the areas of HRA and PRA, and four were NRC HRA experts who use HRA in regulatory reviews of NPP risk issues. The final two experts responded to parts of the information requests in email messages, which added value to the process. As described above, these two experts participated by email, separately from the workshop, because it was difficult to schedule their time for the session and because their participation might have biased the workshop given their prominence. Also, they both acknowledged that their practice of PRA is not current enough for them to be included as participants in the workshop elicitation. Therefore, these experts chose to provide email insights from their long experience and knowledge of HRA concerns and development.
3.4 Expert Elicitation Virtual Workshop - Preparing, Holding, and Documenting
This section discusses the preparation for the virtual workshop, the management and logistics of holding the workshop, and the documentation of the results of the workshop. The email solicitation is described later in a separate section.
3.4.1 Preparing for the Workshop
In June 2022, the NRC sent a one-page invitation to industry and NRC HRA experts inviting them to a virtual workshop to be held in July 2022 and providing context for the purpose of the workshop. On July 7, 2022, PNNL followed up with the experts who were willing to participate by emailing a three-page attachment that provided information requests, background information, and a brief description of the structure of the virtual workshop. A final three-page advance package containing the workshop agenda, the finalized information requests, and an introduction to the basic principles of expert elicitation was provided to the experts 2 days before the workshop. Some experts wrote down their responses ahead of the workshop, but this was not a requirement imposed by PNNL. Along with the information request, the experts were told how the workshop would be structured. It was explained to the workshop experts that:
- 1. The workshop would be organized by soliciting responses to the three overarching information requests considering the more specific requests made under each overarching request as described in Section 3.2.
- 2. Each participant would be given a set number of minutes to provide their response, and each participant would provide their response to the requests before the group proceeded to the next request. The experts were told that up to six experts would participate in the workshop. Participants could choose not to answer certain requests if they felt that they did not have the experience or expertise.
- 3. PNNL would document the responses from each expert.
- 4. PNNL would send the documented transcribed responses to the participating experts after the workshop for verification. This step is considered important given that the insights were intended to be used to enhance the IDHEAS guidance on determining time available and time required.
- 5. The purpose of the workshop was not to achieve consensus about a method (or methods) to determine time available and time required. Rather, the purpose was to gain an understanding of current practice or what might be done that could be used to enhance the IDHEAS guidance.
The PNNL staff member assigned to facilitate the workshop followed up with each expert by telephone and email to address any potential confusion or concern about the requests, the workshop process, or the expectations of the experts. A key objective was to help ensure that the experts and facilitator had a common understanding of the information requests and of the expectations of the experts during the workshop (referred to in Section 3.5.1 of NUREG-2255).
The experts also were informed about the participation of official observers and consented to three observers attending the workshop.
There were three official observers. Two were industry representatives with broad NPP operational experience and familiarity with HRA who were invited by the NRC staff. The third observer was one of the lead NRC HRA researchers for this project. The observers were given an opportunity to interact with the experts after the elicitation for each of the three information
requests. Their role in this case was to seek clarification of responses. Per NUREG-2255, the observers' role is to observe the workshops only and not participate in workshop discussion, to avoid organizational biases. If observers have concerns, those comments are made to the project team, not the experts.
Two days before the workshop, on July 18, 2022, the experts were sent the final agenda for the workshop and some basic notes about the elicitation process itself. The information provided was consistent with earlier information provided to the experts by email and by telephone.
3.4.2 Holding the Workshop
The workshop was performed on July 20, 2022, starting at 1:00 pm Eastern Daylight Time (EDT), according to the agenda presented in Table 1. The workshop lasted about 2½ hours and followed the proposed agenda with some real-time adjustments to accommodate timing needs.
The schedule was followed as a way to get through the responses from each expert for all information requests.
Table 1 Expert Knowledge Workshop Schedule
| Information Request | Duration | Started at |
|---|---|---|
| Introductions and Briefing | 10 minutes | 1:00 pm EDT |
| Information Request 1 Responses (4 minutes per participant) | 25 minutes | 1:10 pm EDT |
| Follow-up and Observer Comments | 5 minutes | 1:35 pm EDT |
| Information Request 2 Responses (4 minutes per participant) | 25 minutes | 1:45 pm EDT |
| Follow-up and Observer Comments | 5 minutes | 2:10 pm EDT |
| Break | 10 minutes | 2:20 pm EDT |
| Information Request 3 Responses (5 minutes per participant) | 25 minutes | 2:30 pm EDT |
| Follow-up and Observer Comments | 5 minutes | 2:55 pm EDT |
| Change of opinion and follow-up discussion | 20 minutes | 3:00 pm EDT |
| Closeout | 10 minutes | 3:20 pm EDT |
| Meeting Complete | | 3:30 pm EDT |
The PNNL facilitator initiated the workshop by presenting the purpose and structure of the workshop and briefing the participants on the basic principles of expert elicitation from NUREG-2255. The participants were told that the purpose of this elicitation was to gather and evaluate the experience and knowledge of a larger technical community on a technical issue. It was emphasized that the objective of the workshop was not necessarily to come to consensus positions.
The participants were then briefed on the meaning of the following basic elicitation principles:
- Representation of the technical community
- Independent intellectual ownership
- Avoidance of conflicts of interest
- Breadth of state of knowledge
- Interaction and integration
- Structured process
- Transparency
An explanation of how these principles of expert elicitation were addressed in the design of the expert elicitation workshop is presented in Section 3.5 of this RIL.
After the introduction and initial briefing, each expert was given a set time (about 4 minutes per expert) to provide a response to one of the three information requests. After all experts were finished providing their responses to one of the requests, there was a short time period (i.e., 5 minutes) for summary remarks by the facilitator and follow-up comments by experts.
After all three information requests were addressed, there was a more extended time (20 minutes) for follow-up discussion and comments by experts and observations from the observers.
The workshop knowledge elicitation was structured to solicit information on applicable technical issues and to provide each expert an equal opportunity to share their knowledge and to reduce systemic biases. After the elicitations from the individual experts were finished for all requests, the experts were given a specific opportunity to change or refine their official response. The last half hour following the scheduled elicitation response time was used for follow-up questions by the facilitator and observers and responses and comments by the experts themselves. This discussion was useful to clarify the responses of certain experts.
The structure of the sessions was designed to avoid negative group dynamics, in which experts who might be highly opinionated, dominating, or not interested or willing to hear the perspective of others might have undue influence in the workshop. At the same time, experts were encouraged to express their views from their perspective and within their frame of reference.
The expectation that each expert would use his or her allotted time created the atmosphere for them to be mentally present in the workshop. Avoiding negative group dynamics is discussed in Section 3.7.14 of NUREG-2255.
3.4.3 Documenting the Workshop Results
Based on PNNL facilitator and observer notes (and notes from a PNNL note-taker who did not participate as an expert or as an official observer), PNNL put together a transcript of the responses and follow-up discussions held during the workshop. This was done without the aid of an audio recording. These results are documented in Appendix A of PNNL-33867 (Coles et al. 2023a).
Approximately one week after the elicitation, PNNL sent a record of the workshop responses (i.e., the transcript) to the six workshop experts and asked them to confirm, correct, add to, subtract from, clarify, or augment the responses so that the record reflected the messages they intended to present. PNNL followed up until the experts confirmed the record or sent adjustments that PNNL incorporated.
Based on this post-workshop review step, no changes were made to the original record that significantly changed the insights and conclusions from the workshop. Regardless, this process of showing the experts a record of the workshop responses had two benefits. It provided opportunities (1) for further integration of responses made in the workshop and (2) to make changes, if needed, informed by the inputs from other experts at the workshop. This afforded a measure of integration and a high degree of transparency and traceability of the experts' inputs. The concepts of integration and transparency and traceability are discussed in Section 2.3.8 of NUREG-2255.
The names of the experts are documented in PNNL-33867 (Coles et al. 2023a). In general, per guidance from NUREG-2255, Section 3.9.1, users of expert judgment want to know the origin of the information. Therefore, the names and backgrounds of the experts should be included in the documentation. Also, it is important for experts to stand behind their views and judgments.
Providing the experts' names in the documentation of the expert elicitation provides a level of transparency and accountability.
3.5 Use of Expert Elicitation Guidance
The workshop was designed to reduce systematic biases such as social and cognitive biases by employing principles presented in NUREG-2255. Guidance in NUREG-2255 was developed based on cognitive research as well as past use of expert judgment in PRA and important applicable NRC guidance documents. The guidance incorporated various debiasing strategies from the literature and past experience. In addition to the explicit discussion here about the basic principles of expert elicitation, references to the application of guidance provided in NUREG-2255 are included throughout Section 3, which describes this small-scale application.
Regarding guidance on the basic principles of expert elicitation, the workshop facilitator provided a briefing at the beginning of the elicitation workshop explaining them. The following subsections provide brief summaries of the basic principles and include explanations of how the basic principles were implemented in the small-scale application.
3.5.1 Representation of the Technical Community
The purpose of the expert elicitation was not to create new knowledge, but rather to obtain the center, body, and range of the views and judgment of the technical community on the state of knowledge about an issue. The experts were asked to represent, to the best of their ability given the circumstances, the community's views and practices concerning exceedance of time available to respond to an event.
3.5.2 Independent Intellectual Ownership
The expert panel members were told that they were not representing their employer or organization on the panel but were providing their own expertise. Each expert was asked to maintain independence from the other experts in the team to avoid (or mitigate) a groupthink bias risk.
The objective of the workshop was not to come to consensus positions.
3.5.3 Avoidance of Conflicts of Interest
The expert panel was told that it should be representative of the larger technical community to obtain a range of knowledge and interpretations about the technical issue. The experts in this panel were industry consultants and instructors in HRA and NRC staff who worked at NPPs or have related academic credentials. All the experts were asked about and declared no conflicts of interest with the outcomes of the elicitation or with the intended use of the results.
3.5.4 Interaction and Integration
Each expert was given time to express their knowledge as well as an opportunity to augment or amend what they said after hearing the contributions from other participants. Additionally, after the workshop, the experts were given a summary of their responses, which they were asked to confirm or amend.
3.5.5 Structured Process
An expert elicitation should employ a structured process to facilitate interaction and integration and to reduce biases in the outcomes. The workshop was structured so that each expert had a dedicated time period during which they could express their view or judgment without interruption. Interaction and discussion among the experts occurred at a specific time period at the conclusion of the responses for each of the three questions.
3.5.6 Transparency
To assure that the results are used appropriately, the information was generated and documented in a transparent way. This transparency includes description of the process, the results obtained, the intended use of the results, and the caveats and limitations of the process. Transparency helps demonstrate the stability and integrity of the results. PNNL sent the experts a record of their responses during the knowledge elicitation workshop on exceedance of time available based on PNNL observer and facilitator notes. PNNL asked the experts to confirm, correct, add to, subtract from, clarify, or augment the responses so that the record reflected the messages they intended to present.
3.6 Observations About Workshop Process and Results
As discussed earlier, the purpose of performing the expert knowledge elicitation was to gain specific technical insights. These technical insights are documented in PNNL-33867 (Coles et al. 2023a) and in Appendix C of a draft RIL (Coles et al. 2023b). This section does not provide those specific technical results and conclusions but rather observations about exercising the expert knowledge elicitation process and assessing the results. This section makes observations about how the process worked, the nature of the specific and general insights gained, and the value of the information elicited.
3.6.1 Observations About the Process
The one-on-one follow-up telephone calls and email messages by the PNNL workshop facilitator were valuable. It appeared that these interactions helped clarify the purpose of the workshop for the participants and the expectation for each individual's role in the workshop.
Assimilation appeared to be increased by following the initial invitation with the specific information request, then telephone calls (in some cases email) to clarify the purpose and expectation of the expert knowledge elicitation workshop, followed by explicit instructions just a few days ahead of the workshop.
As stated in Section 3.3, the experts were not compensated by PNNL for their time, so their involvement was voluntary. PNNL notes that the time commitment was modest, as it primarily involved the 2½-hour workshop itself and some preparation time to understand and think about the information request. However, participants had a genuine professional interest in the questions and in how their responses might affect IDHEAS-ECA guidance. For a small-scale approach, asking experts to donate their time for a worthwhile professional cause appeared to work well for the expert knowledge elicitation.
The issue of experts who wanted to change or refine their responses after hearing the responses of others was explicitly addressed. As explained in Section 3.4, the schedule provided a specific opportunity for experts to make changes after all the elicitations from the individual experts were finished for all requests. Experts made some minor adjustments to their responses during this scheduled time slot but did not make significant changes. PNNL noted that the experts tended to refer to each other's responses to build their own responses. The responses seemed candid and unfiltered. It appears that the structure of the elicitation workshop allowed the experts to remain aligned with their professional opinions while also allowing them to relate their views to the views of others.
PNNL notes that the participation of the observers who asked clarifying questions was valuable because, in certain cases, it helped refine what was meant by the response of certain experts.
Knowing that the observers were listening provided a further sense of formality that helped maintain focus on the structure of the elicitation.
The structure of the elicitation allowed each expert equal and sufficient opportunity to share their knowledge without feedback until the participant was finished. In some cases, an expert needed more time than was scheduled, but these cases seemed to be offset by experts who needed less than the scheduled time. The result was that each element of the workshop session was completed within its total allotted time.
The role of the facilitator was important to the integrity of the sessions and the elicited responses. It was important for the facilitator to keep the responses on topic, within the time window, and to clarify the meaning of responses without projecting a personal interpretation onto the response.
3.6.2 Nature of Insights Gained
This section discusses the nature of the insights gained by providing some key examples. The section also discusses how the elicitation process and structure played a role in the nature of the insights gained.
As explained in Section 3.3, the primary difference between the six workshop experts was that two of them were industry experts who consult in the areas of HRA and PRA and four were NRC PRA/HRA experts who use HRA in regulatory reviews of NPP risk applications. For certain information requests, there tended to be a difference in the responses offered by the industry experts versus the NRC experts. One source of these differences was their context for applying HRA. The industry perspective is colored by the fact that industry experts are typically involved with supporting NPPs to build PRA models that must meet the rigorous requirements in the ASME/ANS PRA standard (ASME/ANS 2009). In contrast, for certain NRC HRA experts, a formal and complete PRA model may not be needed to assess the risk that involves operator errors, because event analysis requires specific event information that is not in the standard PRA models. The difference resulted in better representation of the range of the informed technical community's views on estimating time required and time available.
Consideration of an expert's relationship with the elicited information is important in the selection of experts to ensure that the expert panel represents the overall informed technical community.
Many suggestions and recommendations provided by the experts were similar, but in many other cases, opinions differed. In most cases, the aggregate of the responses contributed to a more comprehensive response that was accepted by the group. This was particularly true in cases where the desired outcome was a list of options. In this case having multiple experts representing different informed technical domains is important.
In certain cases, experts engaged in out-of-the-box thinking, and the resulting responses were very helpful. These responses were not directly related to the request but presented a paradigm illustrating that operator responses to an event could vary depending on certain factors. One of the industry HRA experts presented a general observation in the first round that was quickly picked up and referred to throughout the rest of the elicitation by the other experts. Specifically, the expert stated that there are certain operator actions (which they called Case 1) for which the time required clearly exceeds the time available, so no further work, such as determination of probability distributions, is needed. Likewise, it might be very clear that the time required is significantly less than the time available, so the probability of exceeding the time available is negligible (they called this Case 2), and again no further work such as determination of probability distributions is needed. The remaining case (Case 3) is when more refined estimates of time available and time required are needed. Because the expert received the information request sufficiently ahead of time, they could work out the paradigm before the workshop. Because of the structure of the workshop, the other experts could build on this concept in expressing their own views.
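To illustrate the Case 1/2/3 paradigm described above (an illustration added here, not material elicited from the experts), the following minimal Python sketch classifies an operator action based on point estimates of time required and time available. The margin factor and function name are hypothetical choices made only for this example.

```python
def screen_timing_case(time_required, time_available, margin=2.0):
    """Classify an operator action using the Case 1/2/3 screening paradigm.

    time_required, time_available: point estimates in minutes.
    margin: hypothetical screening factor; an actual screening criterion
            would come from analyst judgment or program guidance.
    """
    if time_required >= time_available:
        # Case 1: time required meets or exceeds time available;
        # no probability distribution work is needed.
        return "Case 1: action infeasible as modeled"
    if time_required * margin <= time_available:
        # Case 2: ample margin; the probability of exceeding the time
        # available is treated as negligible.
        return "Case 2: negligible probability of exceeding time available"
    # Case 3: margins are tight; refined estimates (e.g., probability
    # distributions for both times) are warranted.
    return "Case 3: refined estimates needed"


print(screen_timing_case(time_required=12, time_available=60))  # Case 2
print(screen_timing_case(time_required=45, time_available=60))  # Case 3
```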
Another kind of insight had to do with identifying additional challenges to implementing part of the guidance in Section 3.6 of NUREG-2256 (NRC 2022a). Specifically, there was one insight regarding the challenges associated with developing a probability distribution for time available.
Experts pointed out that there are more contributors to the uncertainties associated with making this estimate than the one acknowledged in Section 3.6 of NUREG-2256 (i.e., the uncertainty of thermal-hydraulic code inputs). The experts suggested that a new study is needed to develop guidance on how to reliably develop probability distributions for time available. Defining future work was not a stated goal of the expert knowledge elicitation, but it was an outcome of the experts responding to the detailed subpart of the overarching request associated with identifying approaches to estimating time available distributions.
3.6.3 Importance of Information Elicited

The importance of the information obtained in the expert knowledge elicitation is threefold.
First, information from the workshop was used directly to confirm the validity of certain guidance presented in Section 3.6 of NUREG-2256 (NRC 2022a). Second, information from the workshop was used directly in guidance developed by PNNL in a draft RIL (Coles et al. 2023b). Third, information from the workshop was used to identify the need for specific additional research related to determining the probability of exceeding the time available.
Certain information from the workshop directly confirmed the validity of the guidance presented in Section 3.6 of NUREG-2256, primarily as it pertains to the ways of estimating time required and determining a corresponding probability distribution. Information from the workshop that can be used directly for developing supplemental guidance to NUREG-2256 (NRC 2022a; Coles et al. 2023b) pertained to estimating time required and time available. Moreover, the workshop also identified concepts that can serve as guidance for when to consider using probability distributions versus point estimates in estimating the probability of exceeding the time available to perform needed operator actions.
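As a purely illustrative sketch (not taken from NUREG-2256 or the workshop results), the following Python example shows how the probability of exceeding the time available might be estimated by Monte Carlo sampling when probability distributions, rather than point estimates, are used for both times. The lognormal parameters are hypothetical values chosen only for this example.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000  # number of Monte Carlo samples

# Hypothetical distributions (minutes); a real analysis would derive these
# from simulator data, thermal-hydraulic analyses, and expert judgment.
time_required = rng.lognormal(mean=np.log(30), sigma=0.3, size=n)
time_available = rng.lognormal(mean=np.log(45), sigma=0.2, size=n)

# Probability that the operator action is not completed in time.
p_exceed = np.mean(time_required > time_available)
print(f"P(time required > time available) ~ {p_exceed:.3f}")

# A point-estimate comparison (30 min vs. 45 min) would suggest the action
# always succeeds; the distributions reveal a nonzero failure probability.
```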
As noted above, identifying needs for future research was not a stated goal of the expert knowledge elicitation. However, it was an outcome of the experts responding to the detailed subparts of the request associated with identifying approaches to determine a probability distribution for time available. This insight is viewed as important to deployment of guidance on addressing the probability of exceeding time available using the IDHEAS approach.
3.6.4 Separate Solicitation from Industry Leaders

In addition to the expert elicitation, PNNL also solicited insights on the workshop information requests through email communications with two PRA and HRA industry leaders who have perspectives on HRA approaches and their applications. The two experts were suggested to PNNL by the NRC as having considerable experience providing guidance to the NRC in the PRA and HRA domains. As described above, the two experts participated separately because it was difficult to schedule their time for the workshop session. More importantly, their participation might have influenced the other experts' judgments given their prominence and significant history of leadership in the specific technical area of the workshop information requests. Both are prior members of the NRC Advisory Committee on Reactor Safeguards. Also, they both acknowledged that their practice of PRA is not current enough for them to participate in the expert elicitation.
PNNL provided these two industry leaders with the same information requests that were provided to the workshop experts and asked them to provide their insights either by responding to the same information requests as the workshop experts or by offering general advice about the topics addressed. These experts chose not to respond to the requests directly but instead provided insights from their long experience and knowledge of HRA concerns and development. PNNL received an email response from one non-workshop expert in August 2022 and from the second expert in September 2022. These responses are included in Appendix B of PNNL-33867 (Coles et al. 2023a) and served to supplement the results of the expert knowledge elicitation workshop.
The responses from these experts confirmed many of the views offered in the workshop and augmented a few in important ways. One of the experts expressed the view that the uncertainty associated with estimating time available is not dominated by the uncertainty associated with thermal-hydraulic code inputs, as is commonly assumed and as stated in NUREG-2256 (NRC 2022a). Rather, the dominant source of uncertainty is related to how the results from limited thermal-hydraulic runs are used in HRA. The other expert provided a view that augments this view (and the views of the workshop experts): the use of simulator experiment data does not account for the differences between a simulator and a real operating plant, which they listed. These differences can affect the estimation of time available (as well as time required). These two comments reinforced the important insights from the workshop about defining future research associated with estimating time available and the impact of the multiple sources of uncertainty in time available.
The comment by the second expert about the limitations of data from simulator experiments is acknowledged and addressed in detail in the draft RIL (Coles et al. 2023b). Another comment was that time and experience have shown that when time estimates are made using expert judgment, starting with a point value creates a strong bias in favor of the point value as the mean value. The expert expressed that it is better to start with the minimum and maximum and then work toward the mean. This comment was combined with the comments by the workshop experts and used in part in the draft RIL cited above.
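As an illustration of this elicitation ordering (added here as an example, not a method prescribed in the draft RIL), the following Python sketch constructs a simple triangular distribution by eliciting the minimum and maximum first and the most likely value last; all values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# Hypothetical elicited bounds (minutes), asked for before any central value
# so the analyst is not anchored on a single point estimate.
t_min, t_max = 20.0, 70.0
t_mode = 35.0  # most likely value, elicited last

# A simple triangular distribution consistent with the elicited values.
samples = rng.triangular(left=t_min, mode=t_mode, right=t_max, size=100_000)
print(f"mean ~ {samples.mean():.1f} min, "
      f"95th percentile ~ {np.percentile(samples, 95):.1f} min")

# The mean, (t_min + t_mode + t_max) / 3 = 41.7 min, differs from the most
# likely value, which a point-estimate-first elicitation tends to anchor on.
```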
In summary, the comments from experts outside the panel are considered supplementary to the insights generated in the workshop because these experts chose which points they considered important to discuss and elaborate in their responses rather than providing a complete discussion of all points in the three information requests. Accordingly, their comments were used to confirm, augment, and reinforce the views and insights from the workshop. As it turned out, they communicated no views that competed with those of the workshop experts. The prominence of the experts in this separate solicitation adds to the validity of their supplemental comments, particularly in cases in which their views augmented the views expressed in the workshop.
4 INSIGHTS AND CONCLUSIONS

Following are summaries of key insights from the use of expert elicitation guidance in this small-scale application:
- 1. Common Understanding - Iteration and follow-up with the experts before the workshop were important to achieving a common understanding of the information requests and what was expected from the experts during the workshop. In June 2022, the NRC invited HRA experts to a virtual workshop. PNNL followed up with the experts willing to participate on July 7, 2022, by emailing the information requests developed for the workshop and a brief description of the structure of the virtual workshop. The PNNL facilitator followed up with each expert individually by telephone or email to address any questions or concerns. Two days before the workshop, which was held on July 20, 2022, PNNL transmitted the final information requests, workshop schedule, and workshop notes.
- 2. Selection of Experts - Consideration of an expert's relationship with the elicited information was important in the selection of experts to ensure that the expert panel represents the overall informed technical community.
- 3. Avoiding Negative Group Dynamics - The workshop structure provided dedicated but limited time for each expert to initially present expert views and judgments without interruption or feedback. This helped reduce negative group dynamics in which experts who might be highly opinionated, dominating, or not interested or willing to hear the perspective of others might have undue influence in the workshop.
- 4. Engagement - The expectation that the experts use their allotted time, along with encouragement to express their views from their own perspective and frame of reference, created an atmosphere in which the experts were mentally present in the workshop.
- 5. Integration - After all experts finished providing their responses to an information request, the workshop structure provided dedicated time for summary remarks by the facilitator, follow-up comments and questions by experts, and comments by the observers. This allowed a certain level of information integration to occur. Also, at the end of the workshop there was dedicated time for the experts to modify or refine their responses based on the inputs and discussion.
- 6. Documentation - A week after the elicitation, a record of the workshop input was sent to the six workshop experts along with a request to confirm, correct, add to, subtract from, clarify, or augment the responses so that the record reflected the messages they intended to present. Follow-up was performed until the experts confirmed the record or sent adjustments. This provided a measure of integration and a high degree of transparency and traceability of the experts' inputs.
- 7. Observers - Participation of the observers was valuable in helping refine what was meant by the experts' responses.
- 8. Facilitator Role - The role of the facilitator was important to the integrity of the sessions and the elicited responses. It was important for the facilitator to keep the responses on topic and within the time window and to clarify the meaning of responses without projecting a personal interpretation onto them.
- 9. Nature of Results - For this application, the aggregate of the responses seemed to contribute to a more comprehensive response that was accepted by the group. This was particularly true in cases where the desired outcome was a list of options. In those cases, having multiple experts seemed important to getting a sufficient response.
- 10. Significance of Results - The expert knowledge elicitation application described in this RIL provided several important insights that can be used directly to develop supplemental guidance for NUREG-2256 (NRC 2022a). In one instance the need for further study or research was identified.
Although the expert knowledge elicitation described in this RIL was a small-scale effort, use of guidance on performing expert elicitation for a structured, systematic approach helped reduce systematic biases. Small-scale in this case means that the example application was a knowledge elicitation based on the experts' knowledge and experience about specific approaches to estimating times, not an elicitation of numerical value judgments, and not quantitative or complex enough to require a database. It was also limited in terms of the time commitments from the experts for preparation and for participation in the 2.5-hour workshop. However, the results show that, with minimal extra effort to apply the expert elicitation principles, the advantages of employing a structured, systematic expert elicitation effort were realized.
5 REFERENCES

ASME/ANS, 2009. Addenda to ASME/ANS RA-S-2008, Standard for Level 1/Large Early Release Frequency Probabilistic Risk Assessment for Nuclear Power Plant Applications, ASME/ANS RA-Sa-2009. New York: American Society of Mechanical Engineers.
Coles et al., 2023a. Expert Knowledge Elicitation Process on Methods to Determine the Probability of Time Exceeded - In Support of Task 4, PNNL-33857, January 2023, by Coles GA, Prasad R, Baweja JA, and Jefferson BA, Pacific Northwest National Laboratory, Richland, Washington.
Coles et al., 2023b. Integrated Human Event Analysis System (IDHEAS) - Consideration for Supplemental Guidance for NUREG-2256, draft RIL, December 2023, by Coles GA, Jefferson BA, Baweja JA, and Prasad R.

NRC, 1996. Branch Technical Position on the Use of Expert Elicitation in the High-Level Radioactive Waste Program, NUREG-1563, November 1996, U.S. Nuclear Regulatory Commission, Washington, DC. Accessed at https://www.nrc.gov/reading-rm/doc-collections/nuregs/staff/sr1563/index.html.
NRC, 1997. Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, NUREG/CR-6372, Vols. 1 and 2, April 1997, U.S. Nuclear Regulatory Commission, Washington DC. Accessed at https://www.nrc.gov/reading-rm/doc-collections/nuregs/contract/cr6372/index.html.
NRC, 2009. An Approach for Determining the Technical Adequacy of Probabilistic Risk Assessment Results for Risk Informed Activities, Regulatory Guide (RG) 1.200, Revision 2, dated March 2009. Accessed at https://www.nrc.gov/docs/ML0904/ML090410014.pdf.
NRC, 2012. Practical Implementation Guidelines for SSHAC Level 3 and 4 Hazard Studies, NUREG-2117, Rev. 1, April 2012, U.S. Nuclear Regulatory Commission, Washington DC.
Accessed at https://www.nrc.gov/reading-rm/doc-collections/nuregs/staff/sr2117/index.html.
NRC, 2018. Updated Implementation Guidelines for SSHAC Hazard Studies, NUREG-2213, 2018, U.S. Nuclear Regulatory Commission, Washington DC. https://www.nrc.gov/reading-rm/doc-collections/nuregs/staff/sr2213/index.html.
NRC, 2020. An Approach for Determining the Technical Adequacy of Probabilistic Risk Assessment Results for Risk Informed Activities, Regulatory Guide (RG) 1.200, Revision 3, dated December 2020. Accessed at https://www.nrc.gov/docs/ML2023/ML20238B871.pdf.
NRC, 2022a, Integrated Human Event Analysis System for Event and Condition Assessment (IDHEAS-ECA), NUREG-2256, October 2022, U.S. Nuclear Regulatory Commission, Washington DC. Accessed at https://www.nrc.gov/reading-rm/doc-collections/nuregs/staff/sr2256/index.html.
NRC, 2022b. Guidance for Conducting Expert Elicitation in Risk-Informed Decisionmaking Activities, NUREG-2255, September 2022, U.S. Nuclear Regulatory Commission, Washington, DC.
TSTF, 2018. Final Revised Model Safety Evaluation of Traveler TSTF-505, Revision 2, Provide Risk Informed Extended Completion Times - RITSTF Initiative 4B, dated November 29, 2018, Technical Specification Task Force (TSTF) to the Nuclear Regulatory Commission, Washington, DC. Accessed at https://www.nrc.gov/docs/ML1826/ML18269A041.html.