ML22229A500

Person / Time
Issue date: 08/03/2022
From: Dennis M, Office of Nuclear Regulatory Research
To: Dennis M
References: NRC-2045
Download: ML22229A500 (83)

Text
Official Transcript of Proceedings
NUCLEAR REGULATORY COMMISSION

Title: Public Meeting on NRC Draft Artificial Intelligence Strategic Plan for Fiscal Years 2023-2027
Docket Number: (n/a)
Location: teleconference
Date: Wednesday, August 3, 2022
Work Order No.: NRC-2045
Pages: 1-82

NEAL R. GROSS AND CO., INC.
Court Reporters and Transcribers
1716 14th Street, N.W.
Washington, D.C. 20009
(202) 234-4433
UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION
+ + + + +
PUBLIC MEETING ON NRC DRAFT ARTIFICIAL INTELLIGENCE STRATEGIC PLAN FOR FISCAL YEARS 2023-2027
+ + + + +
WEDNESDAY, AUGUST 3, 2022
+ + + + +
The public meeting convened via Video Teleconference, at 1:00 p.m. EDT, Brett Klukan, Facilitator, presiding.

PRESENT:
BRETT KLUKAN, Meeting Facilitator
THERESA LALAIN, Deputy Director, Division of Systems Analysis, Office of Nuclear Regulatory Research
LUIS BETANCOURT, P.E., Chief, Accident Analysis Branch, Office of Nuclear Regulatory Research
MATT DENNIS, Reactor Systems Engineer (Data Scientist), Division of Systems Analysis, Office of Nuclear Regulatory Research
TABLE OF CONTENTS
                                                        Page
Opening Remarks and Agenda . . . . . . . . . . . . . . .  3
Artificial Intelligence Strategic Plan Presentation  . .  7
Public Comment . . . . . . . . . . . . . . . . . . . . . 21
Adjourn  . . . . . . . . . . . . . . . . . . . . . . . . 82
3 1 P R O C E E D I N G S 2 1:01 p.m.
3 MR. KLUKAN: Welcome, everyone, again.
4 Next slide, please. So good afternoon, my name is 5 Brett Klukan. Normally I serve as the Regional 6 Counsel for Region I of the U.S. Nuclear Regulatory 7 Commission, but this afternoon I will be acting as the 8 facilitator for this meeting.
9 This meeting is being held to discuss the 10 NRC's Draft Artificial Intelligence Strategic Plan for 11 fiscal years 2023-2027. For this meeting, we are 12 using Microsoft Teams.
13 We hope that the use of Microsoft Teams 14 will allow you, the stakeholders, to participate more 15 freely during the meeting. But this will also require 16 us to continually ensure that we are muted when we are 17 not speaking and to do our best not to speak over each 18 other.
19 To help facilitate the question and answer 20 portion of the meeting, we also recommend that you use 21 the raise-hand feature in Teams so we can more easily 22 identify who has a question or a comment and then call 23 on that individual to then pose their question or 24 comment.
25 You can also use the chat feature to alert NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
4 1 us if you have a technical question. We would ask 2 that you please do not use the chat to address any 3 substantive questions. The chat will not be 4 considered part of the official meeting record and is 5 intended to be used mainly for handling virtual meeting 6 logistical issues. For example, I can't hear the 7 speakers, the slides aren't appearing for me, whatnot.
8 Next slide, please.
9 So before I turn to the meeting agenda, 10 just a few logistical matters. This meeting is being 11 again hosted as a Microsoft Teams meeting and is being 12 recorded and transcribed.
13 Should you have trouble using the 14 Microsoft Teams application, I would recommend that 15 you, one, or first, use the Microsoft Teams link 16 provided in the meeting notice as opposed to the 17 Microsoft Teams app, or application.
18 Second, you can try disconnecting and 19 trying to reconnect to this Teams meeting. Or third, 20 as a final alternative, you can use the teleconference 21 number that has also been provided in the meeting 22 notice to listen to and speak during the meeting.
23 This public meeting has been categorized 24 as a comment-gathering meeting, which means that this 25 meeting is for NRC staff to meet directly with NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
5 1 individuals to receive comments from participants on 2 specific NRC decisions and actions to ensure that the 3 NRC staff understands their views and concerns.
4 Lastly, I would ask again that you please 5 endeavor to limit any interruptions during the 6 meeting. We ask again that you remain on mute until 7 you are ready to engage the participants. Prior to 8 addressing the meeting participants, please speak 9 clearly, identifying yourself and any group 10 affiliations.
11 The slides for this meeting have been 12 posted on the public meeting notice site and can be 13 found in ADAMS, and I'm going to read the ML number to 14 you for those of you participating via the phone.
15 The ML number for the slides within ADAMS is 16 ML-222-13A-273. Again, that's ML-222-13A-273.
17 If you did not register for this meeting 18 using the Teams webinar link, please send an email to 19 Matthew Dennis, M-A-T-T-H-E-W dot D-E-N-N-I-S at 20 NRC.gov so we can document your attendance in the 21 meeting summary. Contact meeting for this public 22 meeting -- the public meeting organizing staff can 23 also be found on the meeting's public notice.
24 And now we move on to today's agenda.
25 Next slide, please. So today's meeting will feature NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
6 1 a discussion again of the NRC Draft Artificial 2 Intelligence Strategic Plan and will be followed by a 3 public comment-gathering period for interested 4 attendees.
5 First, the NRC will be providing opening 6 remarks -- as opening remarks, an overview of the 7 Draft AI Strategic Plan. Next, the NRC staff will 8 present on the regulatory purpose, AI definition, 9 preparatory activities, supporting development of the 10 strategy, and then overview of the strategy and next 11 steps.
12 Following the NRC presentation, we will 13 then have an opportunity for the public to provide 14 comments and ask questions of the NRC staff. Other 15 meeting participants are welcome to provide responses 16 but are not obligated to do so.
17 This meeting is scheduled to run from 1:00 18 p.m. to 3:00 p.m. Eastern Time. We will try to allow 19 for as much public input as possible, but again, we 20 will try to adhere to the meeting schedule.
21 At this time, I would now like to turn the 22 meeting over to Dr. Teri Lalain, a Deputy Division 23 Director for the Division of Systems Analysis in the 24 Office of Regulatory Research, who will provide the 25 NRC's opening remarks. Thank you.
7 1 DR. LALAIN: Great, thank you, Brett.
2 Welcome, everybody. I'm pleased to be here today to 3 discuss our draft strategic plan with you, and thank 4 you for your interest in joining us today for this 5 public meeting. Next slide, please.
6 Artificial intelligence is one of the 7 fastest growing technologies globally and an area the 8 NRC must be prepared to evaluate technological 9 adoption by the nuclear industry.
10 As a result, there's interest, industry 11 interest in deploying these technologies to meet our 12 future national energy needs. And as a modern risk-13 informed regulator, we must keep pace with these 14 innovations while reducing barriers and ensuring safe 15 and secure use of AI.
16 To increase our awareness, during 2021 we 17 held three public workshops bringing together the 18 nuclear community to discuss the current and future 19 state. And one of the takeaways that we'll talk about 20 today is that AI is a very general term and it can 21 span a range of techniques and varying levels of 22 autonomy. So it's important that we're clear in our 23 language to achieve that common understanding when 24 we're discussing AI.
25 To prepare the Agency for this future, NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
8 1 Research prepared the draft AI strategic plan that 2 includes several goals that will be discussed further 3 today. I am also joined today by Mr. Luis Betancourt, 4 the Chief of our Accident Analysis Branch, and Mr.
5 Matt Dennis, our Data Scientist in the Accident 6 Analysis Branch. Next slide, please.
7 All right, Matt, over to you.
8 MR. DENNIS: All right, thank you very 9 much, Teri. As Teri said, my name is Matt Dennis. I 10 am a Reactor Systems Engineer Data Scientist in the 11 Office of Nuclear Regulatory Research. And today I 12 will be presenting an overview of the strategic plan.
13 So first off is the purpose. As you might 14 already be aware, and Teri just mentioned, that AI is 15 becoming present in a lot of our -- in our personal 16 lives and our lives across all the things that we are 17 involved in. And that -- and the nuclear industry is 18 no exception.
19 So it has the potential to transform the 20 industry by providing new and vital insights into vast 21 amounts of data generated during the design and 22 operation of a nuclear facility. And in addition, 23 these technologies offer new opportunities to improve 24 safety and performance, operational performance, and 25 implement autonomous control and operation.
9 1 And as a result, the industry, we are 2 aware, is researching these applications, as Teri 3 mentioned. And through those workshops, which we'll 4 discuss a little bit more in a minute, we got a flavor 5 of what is going on in the industry and what is 6 planned for the future through research projects or 7 other opportunities that are being undertaken.
8 So it's critical for us to determine how 9 these factors are evolving and keep abreast of all of 10 those, again, as Teri mentioned. So we need to look 11 at what is being done. AI is being used over a wide 12 range of nuclear power plant operations, from mining 13 nuclear data for predictive maintenance and 14 understanding core dynamics for more accurate reload 15 fuel planning.
16 So we recognize that there is a regulatory 17 decisionmaking component to this potentially, and we 18 are interesting -- interested in understanding the 19 possible regulatory implications on using AI within 20 the industry and to ensure that these technologies are 21 developed in a safe and secure manner.
22 So we see this as an opportunity to shape 23 the norms and values that enable the responsible use 24 of -- responsible and ethical use of AI. And we must 25 be prepared to effectively and efficiently review and NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
10 1 evaluate the use of AI in NRC-regulated activities.
2 So next we'd like to discuss a little bit 3 of how we envision achieving a common understanding of 4 the term AI, and specifically, more specifically, what 5 that means in the context of the nuclear industry.
6 So you'll notice on the slide we have a 7 few key words there that are underlined in our -- in 8 the definition of artificial intelligence. I will 9 note that this definition is taken as a modification 10 from the National Defense Authorization Act of 2021.
11 So the AI strategic plan considers an 12 evolving landscape where computers use data and unseen 13 behavior to construct underlying algorithmic models, 14 draw inference, and define the rules to achieve a 15 task. So the AI strategic plan focuses on a broad 16 spectrum of sub-specialties, for example, natural 17 language processing, machine learning, and deep 18 learning, which could encompass various algorithms and 19 application examples for which the NRC has not 20 previously reviewed or evaluated.
21 And so some of those words there that are 22 underlined include emulate human-like perception, 23 cognition, planning, learning, and communication or 24 physical action. AI is intended to perceive real and 25 -- both real and virtual environments, that's an
11 1 important distinction here. And AI is the top level 2 of -- which encompasses machine learning. And so 3 that's why we have the definition of machine learning 4 down there.
5 And the definition, again, taken from the 6 Defense Authorization Act, is an application -- it is 7 an application of artificial intelligence that is 8 characterized for -- by providing systems the ability 9 to learn and improve, those are two key words, on the 10 basis of data. And data is very important for any of 11 these activities.
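For a concrete picture of that "learn and improve on the basis of data" idea, here is a minimal, purely illustrative sketch; it is not part of the strategic plan or the spoken record, and the quantities and the true_process function are invented for the example. A toy least-squares model's error on points it never saw tends to shrink as the model is given more data.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_process(x):
    # Hypothetical relationship we want the model to learn from data.
    return 2.5 * x + 1.0

def fit_and_score(n_train):
    # "Learning": fit a line to noisy observations of the process.
    x_train = rng.uniform(0.0, 10.0, n_train)
    y_train = true_process(x_train) + rng.normal(0.0, 0.5, n_train)
    slope, intercept = np.polyfit(x_train, y_train, deg=1)
    # "Improving": measure error on points the model never saw.
    x_test = np.linspace(0.0, 10.0, 200)
    return float(np.mean((slope * x_test + intercept - true_process(x_test)) ** 2))

for n in (10, 100, 1000):
    print(f"training points: {n:5d}   held-out mean squared error: {fit_and_score(n):.5f}")
```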
12 So next I want to give everyone a little 13 bit of background information on how we had been 14 preparing the AI -- the draft AI strategic plan, what 15 we are learning, and how we are getting ready for the 16 future.
17 So the NRC is anticipating that the 18 industry may deploy AI technology that could require 19 some regulatory review and approval within the next 20 five years. And we are proactively developing the AI 21 strategic plan to better position the Agency in AI 22 decisionmaking.
23 The plan includes goals for AI 24 partnerships, cultivating an AI-proficient workforce, 25 building an AI foundation through AI usage, and NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
12 1 assuring NRC readiness for AI decisionmaking. We plan 2 to use the strategic plan as a tool to increase 3 regulatory stability and certainty -- not uncertainty, 4 certainty.
5 This plan will also facilitate 6 communication and enable the staff to provide timely 7 regulatory information to our internal and external 8 stakeholders.
9 In developing this plan, we formed a group 10 -- an interdisciplinary team of AI subject matter 11 experts across the Agency and to increase awareness of 12 AI's technological adoption in the industry, we held 13 three public workshops that Teri mentioned early in 14 2021.
15 There is a link to the public workshops, 16 a hyperlink, and the text at the bottom of this slide 17 if you would like to go to look at them. We have the 18 presentations from all of the workshop participants 19 there.
20 And those workshops brought together the 21 nuclear community to discuss the current and future 22 state of AI. And in 2021, additionally the NRC issued 23 a Federal Register notice on the role of artificial 24 intelligence, tools in the U.S. commercial nuclear 25 power operations.
13 1 The insights gained from that public 2 comment were used to develop NUREG/CR-7294. The title 3 of that report is Exploring Advanced Computational 4 Tools and Techniques within Artificial Intelligence 5 and Machine Learning in Operating Nuclear Plants. The 6 document -- it documents the current state of practice 7 of AI tools in the nuclear industry.
8 And lastly, we are committing to provide 9 -- committed to providing opportunities for the public 10 to participate meaningfully in our decisionmaking 11 process. As we continue the development of this plan, 12 we plan to solicit comments from the public and 13 feedback from the Advisory Committee on Reactor 14 Safeguards in the fall of 2023 -- I'm sorry, 2022.
15 So next, we want to give some time to talk 16 about the individual strategic goals of the strategic 17 plan. And so on the left you see that's the title 18 page of our artificial intelligence strategic plan, 19 NUREG 2261.
20 It covers fiscal years 2023-2027 and 21 establishes the vision and goals for the NRC to 22 continue to improve its skills and capabilities to 23 review and evaluate the application of AI in NRC-24 regulated activities, maintain awareness of 25 technological innovations, and ensure the safe and NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
14 1 secure use of AI in NRC-regulated activities.
2 The AI strategic plan, as you see on the 3 slide, includes five goals. The first goal is to 4 ensure NRC readiness for regulatory decisionmaking.
5 Second is establish an organizational framework to 6 review AI applications.
7 Third is to strengthen and expand AI 8 partnerships. Fourth, cultivate an AI-proficient 9 workforce. And five is to pursue use cases to build 10 an AI foundation across the NRC.
11 Those goals listed are in order of 12 priority and expected to be initiated during varying 13 timeframes.
14 So I wanted to talk a little bit more 15 specifically about the intent of each one of those 16 goals. So the first strategic goal is the ultimate 17 outcome of the implementation of this strategic plan, 18 which is to continue to keep pace with technological 19 innovations to ensure safe and secure use in NRC-20 regulated activities through existing or new 21 regulatory guidance, rules, inspection procedures, or 22 oversight activities.
23 And for this goal, the successful outcome 24 would be providing, as needed, the regulatory guidance 25 and tools to ensure readiness for reviewing the use of NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
15 1 AI in NRC-regulated activities.
2 The second strategic goal -- sorry. AI 3 goals, strategic goals 2 through 5, directly support 4 preparatory activities that culminate in successfully 5 supporting technical readiness for regulatory 6 decisionmaking activities desired in goal number 1.
7 So goal number 2 -- goals number 2 through 8 5 support goal number 1. The establish -- and goal 9 number 2, the establishment of the organizational 10 framework, would be undertaken, which ensures all 11 aspects of the NRC are represented in the preparation 12 for reviewing AI in NRC-regulated activities.
13 The successful outcome of this goal is an 14 organization that facilitates effective coordination 15 and collaboration across the NRC for those intended 16 purposes of NRC use of AI in NRC-regulated activities.
17 Goal number 3 is strong partnerships, and 18 those are essential to ensuring the safe and secure 19 use of AI in the nuclear industry. The NRC is 20 committed to engaging the industry and relevant 21 stakeholders to maintain awareness of industry efforts 22 and prepare for regulatory reviews.
23 When achieved, this goal will provide a 24 mechanism to, one, maintain awareness of industry plans.
25 Two, establish communication forums to discuss future NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
16 1 plans and regulatory needs. And three, effectively 2 partner with other agencies on AI topics of mutual 3 benefit.
4 Strategic goal number 4, the NRC will also 5 engage in workforce development and acquisition to 6 ensure that the NRC staff and contractors have the 7 critical skills required to evaluate AI, the use of 8 AI, in NRC-regulated activities. The goal is to adopt 9 appropriate training programs and tools to develop 10 the requisite skills in the NRC workforce.
11 A successful outcome of this goal is to 12 ensure appropriate qualifications, training, 13 expertise, and access to tools -- and access to tools 14 exists for the NRC workforce to review and evaluate AI 15 usage in NRC-regulated activities.
16 And then finally, strategic goal number 5, 17 the NRC recognized the establishment of a foundation 18 in AI requires a broad swath of data science 19 principles as a fundamental requirement for evaluating 20 AI applications. And therefore, the NRC will build 21 the necessary AI foundation to pursue use cases across 22 the NRC.
23 This will help build an organizational 24 experience that supports future regulatory reviews and 25 oversight activities. For this goal, a successful NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
17 1 outcome is one in which the NRC staff possesses an 2 ecosystem that supports AI analysis, integration of 3 emerging AI tools, and hands-on talent development for 4 reviewing AI applications from the nuclear industry.
5 As was mentioned earlier, the public 6 meeting notice has a link to the actual strategic 7 plan. It is also linked here. It's also linked in 8 the PDF of those slides. You can click on the actual 9 NUREG to the left there. Or it's available if you 10 search in ADAMS with the accession number 11 ML-221-75A-206.
12 Next, we want to provide a little more 13 discussion on what we mean by notional AI -- we want 14 to discuss a notional AI and autonomy levels in 15 commercial nuclear activities. So as we're aware, AI 16 technology provides the underlying capability for 17 autonomous systems. While AI enables autonomy, 18 though, not all cases of AI are autonomous. And we 19 want to make that distinction.
20 So for example, many AI capabilities may 21 be used to augment human decisionmaking rather than 22 replace it. And so the table shown here provides a 23 notional AI and autonomy level framework in potential 24 commercial nuclear activities. Higher autonomy levels 25 indicate less reliance on human intervention or NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
18 1 interaction, and therefore may require greater 2 regulatory scrutiny of the AI system.
3 AI strategic goal number 1 will assess the 4 current regulatory framework and in part establish the 5 appropriate regulatory requirements for varying 6 degrees of AI and autonomy.
7 In the figure that you see on the screen 8 here, with increasing machine independence, you have 9 decreasing human involvement. And so at level one, we 10 have named each category level one, two, three, 11 and four, with notional AI autonomy levels of insight, 12 collaboration, operation, and full autonomy.
13 It's worth noting to point out that at 14 level one, that would be the lowest level of machine, 15 human-machine interaction, where the human 16 decisionmaking is merely assisted by a machine, all 17 the way up to level four, which would be fully 18 autonomous, where the machine is making decisionmaking 19 with no human intervention.
20 And so our regulatory framework will need to 21 consider a graded approach in which each one of these 22 autonomy levels, four different systems, may be applied at varying levels of regulatory scrutiny.
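As a purely illustrative aside, one way to picture the graded approach just described is to encode the four notional levels from the slide and attach a placeholder degree of review to each. The review labels below are assumptions invented for this sketch, not positions stated in the draft strategic plan.

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    # Notional levels from the slide: less human involvement as the value rises.
    INSIGHT = 1        # human decisionmaking merely assisted by the machine
    COLLABORATION = 2  # human and machine share the decisionmaking
    OPERATION = 3      # machine operates with human supervision
    FULL_AUTONOMY = 4  # machine decides and acts with no human intervention

# Hypothetical graded scrutiny (placeholder labels, not NRC guidance):
# higher autonomy suggests potentially greater regulatory attention.
NOTIONAL_REVIEW = {
    AutonomyLevel.INSIGHT: "baseline review",
    AutonomyLevel.COLLABORATION: "added review of the human-machine handoff",
    AutonomyLevel.OPERATION: "detailed review of machine actions",
    AutonomyLevel.FULL_AUTONOMY: "most intensive review",
}

for level in AutonomyLevel:
    print(f"Level {level.value} ({level.name.replace('_', ' ').title()}): {NOTIONAL_REVIEW[level]}")
```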
24 Lastly, in conclusion, we want to talk a 25 little bit more about the next steps for the NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
19 1 artificial intelligence strategic plan and where we 2 can -- where you can be involved or where to look out 3 for more information.
4 We'd like to discuss where we currently 5 are with the draft strategic plan in development, and 6 that's what we've been presenting here today at this 7 public meeting, and our plans moving forward. Our 8 ultimate goal is to issue the AI strategic plan in 9 spring 2023.
10 Our early coordination dialog and 11 preplanning are key to increasing our regulatory 12 stability and certainty for the industry to deploy AI 13 technologies. This includes engagements across the 14 federal government, industry, and international 15 organizations.
16 And early engagement and information 17 exchange supports staff knowledge management and 18 development, which enables timely development and 19 execution of the AI strategic plan.
20 To ensure stakeholder engagement, we have 21 developed the timeline shown in this slide of our 22 current for the remainder of the year, and we 23 encourage your participation and comments.
24 As part of our communication strategy, we 25 recently unveiled -- unveiled our AI public website, NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
20 1 making it easier for members of the public to obtain 2 information about the staff's AI activities, such as 3 public meetings like this, workshops, and other 4 activities.
5 We recognize that there is a public 6 interest in the potential regulatory implications of 7 AI in nuclear activities, and we want to provide 8 opportunities for the public to be heard.
9 So as you can see on the screen here, in 10 November we plan to brief the Advisory Committee on 11 Reactor Safeguards Joint Subcommittee on AI. I 12 already mentioned that in spring 2023, we plan to 13 issue the final strategic plan.
14 And as a reminder, this is a public 15 meeting on -- for comment-gathering on our AI 16 strategic plan, so we would direct you, you can use the 17 hyperlink in the slides.
18 Or it is also referenced in the public 19 meeting notice to go to the Federal Register notice at 20 regulations.gov using the docket ID number for this 21 FRN and submit your written public comments. So 22 again, you can use the link here on the slides or find 23 that in the public meeting notice.
24 And also the hyperlink there to the NRC AI 25 public website is provided, as it is down at the NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
21 1 bottom of the slide in text form.
2 So that concludes our prepared comments 3 and slides on the AI strategic plan. I will turn it 4 back over now to Brett.
5 MR. KLUKAN: Great, thank you. So again, 6 this brings us to the public question and comment 7 portion for this meeting. I would ask individuals 8 when they begin speaking to please identify 9 themselves, name and any affiliation if you so choose 10 to list an affiliation, before you begin with your 11 questions or comments.
12 The NRC welcomes comments from the public 13 on any areas they believe are relevant to the topics 14 we've discussed today. With that said, the NRC is 15 particularly interested in receiving input on the 16 following questions, and I will read those out loud 17 for those of you on the phone.
18 Are there any specific recommendations for 19 improvements to consider in the development of the AI 20 strategic plan? What goals, objectives, or strategies 21 within the NRC's current strategic plan should be 22 added, enhanced, or modified in the AI strategic plan?
23 What are the potential near-term or far-24 term AI activities that the NRC should be aware when 25 finalizing and prioritizing the AI strategic plan or NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
22 1 associated supporting research? And finally, what are 2 potential challenges the NRC should be aware of when 3 preparing to review the potential use of AI in nuclear 4 applications?
5 So again, for those of you using the Teams 6 application or in the browser, please use the raise-7 hand function, and then a presenter will allow you to 8 unmute. And then you will need to unmute yourself and 9 then begin your comment.
10 For those of you calling in, the process 11 is a little different. In order to raise your hand, 12 press *5, again, that's *5. When you are called upon, 13 the presenter will unmute you. You will then need to 14 press *6 to unmute yourself within the system.
15 Again, that is *6 to unmute yourself 16 within the system. And then separately, if you've 17 muted yourself separately on your phone, you would 18 need to unmute that as well, okay. So again, that's 19 *5 to raise your hand and *6 to unmute yourself.
20 And for those participating via the app or 21 via browser, use the raise-hand function, which is in 22 the top right-hand corner of your screen under 23 reactions or react. And I think it's located in the 24 same, generally the same area for those of you 25 participating in the browser.
23 1 So with that said, we are ready to begin 2 public comments and questions. So please feel free to 3 raise your hands, and I'll call on the order of 4 individuals in terms of the order in which they raise 5 their hands. Thank you.
6 So James Slider, I am going to -- go right 7 ahead, James Slider.
8 MR. SLIDER: Thank you, Brett. This is 9 Jim Slider, I just want to verify you can hear me.
10 MR. KLUKAN: We can hear you.
11 MR. SLIDER: Okay, great, thank you. I'm 12 Jim Slider from the Nuclear Energy Institute. And I 13 have lead responsibility for our work in supporting 14 innovations for the operating plants, including 15 adapting artificial intelligence for use in our 16 operating facilities.
17 Just a couple of questions to get the 18 conversation started. Matt, the timeframe given for 19 this plan is fiscal years '23 through '27, and I'm 20 curious, you're going to be halfway through fiscal '23 21 before you have the final plan out.
22 And I'm wondering, there's nothing in this 23 document that indicates what's going to be happening 24 over those succeeding couple of years in the scope of 25 this document. Can you describe or give any sort of NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
24 1 overview of what we can expect the NRC to focus on in 2 the latter half of fiscal '23 out through '27?
3 MR. DENNIS: I can -- I will give a brief 4 response, and then actually ask Teri to give the more 5 fulsome answer. But yes, in the latter part of '23 6 once this is finalized, one of the first action items 7 is going to be that prioritization of organizational 8 framework.
9 And so there is some mention of that with 10 the strategic goal number 1, and that is standing up 11 the, either the working groups or community of 12 practice for that AI, to support the remainder of the 13 AI and do the prioritization of activities within 14 budget constraints and such.
15 So Teri, if you want to elaborate on that.
16 DR. LALAIN: Yeah. Hey, Jim. So the 17 strategic plan years map to the NRC's overarching 18 strategic plan. So that's where the year increments 19 tie back to. And as Matt said, you know, we'll be 20 kicking off, we'll be getting the pieces in place, 21 we'll be working on each of those tasks.
22 So we anticipate, for example, in the 23 external awareness one, we're going to continue with 24 having workshops and other activities, so we'll 25 continue having the dialogs in a public forum. And NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
25 1 that we're tracking together.
2 So all of the things will start to launch 3 out when we officially kick off in '23. Over.
4 MR. SLIDER: Okay, thank you, Teri. I'm 5 just trying to get a mental picture of what we can 6 expect to see from the NRC as you start to move out on 7 the actions that fall under the scope of this plan.
8 Thank you.
9 MR. KLUKAN: Well, thank you, Jim, for 10 your questions and for participating today.
11 Next we'll turn to Tyler Cody. Tyler 12 Cody, whenever you're ready, please feel free to 13 unmute yourself.
14 DR. CODY: Thank you, yes, I'm Tyler. I 15 am with the Virginia Tech National Security Institute.
16 I'm a research assistant professor, and my background 17 is in systems engineering and where systems 18 engineering meets artificial intelligence.
19 I had a comment on the question one here 20 about specific recommendations or improvements. On --
21 (inaudible) pretty specific, but on line 34, page 4-2 22 in the strategic goal 1 section, like 4.1, the 23 statement reads, The NRC will undertake research to 24 develop an AI framework to determine the approach to 25 assess areas such as, but not limited to, NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
26 1 explainability, trustworthiness, bias, robustness, 2 ethics, security risks, and technical readiness of AI.
3 So while these topics are definitely top 4 of mind in the computer science community and like the 5 people who are very close to artificial intelligence, 6 from a systems engineering perspective, I think that 7 these various topics are akin to properties we would 8 like an AI solution to have. And that there are other 9 concerns which are fundamental to engineering 10 generally, but may take on different forms in AI that 11 it's important to consider.
12 So I just want to mention two basic 13 and related ones. And the first is test and evaluation, 14 which is you need to just even see if these properties 15 are actually possessed by the AI. So it's one thing 16 to say, oh, we have a method for explainability, but 17 how do we even know that method's working? What kind 18 of scenarios will we test that method in?
19 So under test and evaluation, there are 20 two sort of issues I think about and I just want to 21 highlight them for the sake of being a little thought-22 provoking. But the first is that there's this 23 prevailing focus on using held-out but identically 24 distributed test sets of data.
25 And identically distributed testing checks NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
27 1 that the algorithm's working, but like the learning 2 algorithm or the AI algorithm, but it's not actually 3 testing if, say, the function approximation that it 4 produces, right. Because the AI produces some 5 function that tells us, okay, give me these inputs, 6 I'll give you those outputs.
7 It's not actually telling us if that's 8 going to work as intended in the variety of scenarios 9 we're going to face during operation. So the first 10 part is this identically distributed testing.
11 But the second is sort of built on that, 12 because while the AI solutions themselves will be like 13 these input-output components for most part, they also 14 have these system-level influences. So, and they 15 create system-level effects, and you know, system-16 level outcomes that we look at.
17 And so just the input-output component 18 level testing might lack the scope to properly 19 identify operating envelopes of AI solutions in terms 20 of their system or the environment they're in, like 21 what the operating conditions are, etc.
22 So backing out, I think testing evaluation 23 has these dual challenges where the -- right, 24 currently the component-level tests are insufficient.
25 But also, the component-level tests won't be all that NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
28 1 we need to make sure that, you know, we're producing 2 the outcomes we want from the AI.
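A small sketch can make the commenter's point concrete; this is an invented example, not something presented at the meeting. A model fit to data from one operating range can score well on a held-out, identically distributed test set and still perform poorly once the operating conditions shift, which is why component-level i.i.d. testing alone may not bound the operating envelope.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n, lo, hi):
    # Hypothetical plant quantity as a function of an operating variable x.
    x = rng.uniform(lo, hi, n)
    y = np.sin(x) + rng.normal(0.0, 0.05, n)
    return x, y

# Learn from data collected in the 0-3 operating range.
x_train, y_train = simulate(500, 0.0, 3.0)
coeffs = np.polyfit(x_train, y_train, deg=3)   # simple learned approximation

def mse(lo, hi):
    x, y = simulate(500, lo, hi)
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

print(f"held-out, identically distributed test (0-3): {mse(0.0, 3.0):.4f}")
print(f"shifted operating conditions (4-6):           {mse(4.0, 6.0):.4f}")  # far worse
```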
3 And I know I'm going on here, but briefly, 4 the second related issue is on the life cycle, and 5 it's very closely related, right. So AI solutions do 6 have life cycles. But that's kind of an understudied 7 topic. So whatever framework that's proposed should 8 involve some plans for test and evaluation over the 9 life cycle to sort of monitor the health of the AI, if 10 you will.
11 And also address the concepts of, like, 12 system maintenance that we have generally, right. So 13 for the AI, what is it to retrain it or 14 recalibrate it, etc. As well as retirement, you know, 15 what is the process for retiring an AI model.
16 So I will concede that, you know, this is 17 really relevant in dynamic settings. So you know, in 18 -- in a nuclear setting, things are very constrained.
19 So it's important to say okay, how much variance can 20 we expect, where will that variance come from. And 21 that's where life cycle and test and evaluation I 22 think should be focused on.
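To illustrate the life-cycle point in the same spirit, here is an invented monitoring sketch, not an NRC or industry method: a simple statistic that compares recent operating data against the conditions the model was trained on and flags when retraining or recalibration may be warranted. The threshold and distributions are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Conditions represented in the original training data (hypothetical).
train_inputs = rng.normal(loc=0.0, scale=1.0, size=5000)
train_mean, train_std = train_inputs.mean(), train_inputs.std()

def needs_recalibration(recent_inputs, z_threshold=3.0):
    # Flag when recent operating conditions drift far from the training mean.
    stderr = train_std / np.sqrt(len(recent_inputs))
    drift = abs(recent_inputs.mean() - train_mean) / stderr
    return drift > z_threshold, drift

for label, batch in [("nominal week", rng.normal(0.0, 1.0, 200)),
                     ("shifted week", rng.normal(0.8, 1.0, 200))]:
    flag, drift = needs_recalibration(batch)
    print(f"{label}: drift statistic = {drift:5.1f}   recalibrate? {flag}")
```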
23 So in short, my specific recommendation is 24 to consider an emphasis on test and evaluation and 25 life cycle management, in addition to these pillars or NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
29 1 these properties that are currently listed on line 34.
2 MR. KLUKAN: Tyler, thank you so much for 3 participating and for your comments today. And 4 please, no worries, the purpose of this meeting is to 5 hear from -- from you and from others about, you know, 6 your comments on the report and on the questions that 7 we pose here today. So again, thank you.
8 And again, NRC staff, please feel free to 9 raise your hands as well if you want to jump in at any 10 point to respond to anything that anyone said. But 11 next, we will turn to Ian Davis. Ian Davis, whenever 12 you're ready, please feel free to unmute yourself and 13 begin your comments.
14 MR. DAVIS: Hi, thank you, Brett. This is 15 Ian Davis from X-energy. My questions are related to 16 the level of industry involvement with these 17 activities, and specifically in two areas.
18 One, there's a mention of an AI community 19 of practice, and I was curious if this was an internal 20 community or if this was going to be open to the 21 public.
22 And another section talks about pilot 23 projects and proof-of-concept projects, and I was 24 wondering if that was also going to be entirely 25 internal to the NRC, or if that would involve NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
30 1 partnerships with industry, and if so, what would that 2 look like. Thank you.
3 MR. KLUKAN: Thank you for your -- for 4 your questions. I'll turn it over to Matt.
5 MR. DENNIS: Thanks, yeah, I was going to 6 -- I unmuted myself and I'll turn my camera back on, 7 so.
8 Yes, in response to Ian's first question 9 about the AI community of practice that is referenced 10 in, I was trying to flip back through and find the 11 exact place in the document. I think that's in 12 Section 4.2 somewhere around line 15, To support staff 13 engagement, the NRC will establish an AI community of 14 practice.
15 And that is intended to be comprised of 16 internal Agency staff members. So that's really 17 focusing on much like we have other communities of 18 practice for other specific technical topic areas in 19 the Agency, for us to have a community of practice 20 internal to the Agency for AI-related development.
21 So I hope that answers the first question.
22 The second question referencing the pilot projects, 23 that is -- and I will ask that Luis or Teri please 24 chime in on this answer as well. But that was 25 intended to be a pilot project with industry or other NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
31 1 external entity to ultimately pilot whatever may be 2 potentially determined as a framework for assessment 3 and evaluation.
4 MR. BETANCOURT: Yeah, hi, this is Luis 5 Betancourt, I am the Branch Chief for AI. So really 6 good question. So on the second one, the idea behind 7 the pilot is that once we're developing the framework 8 and we just see these levels of autonomy, we want to 9 basically have an early (inaudible) to go to that 10 framework.
11 And it will be ideal for industry to let 12 us know like what are the areas that they want us to 13 focus for each one of those levels so we can better 14 tailor our priorities so we can better start 15 identifying these early pilots.
16 Because the idea is that we need to start 17 baby steps in a way to be able to see whether or not 18 the framework is effective in order to provide greater 19 stability to the industry. And whether or not this 20 could be part of the licensing framework.
21 So that's the idea behind those pilots and 22 that's what we would like to further engage with 23 industry on. What are those things that we need to be 24 looking upon now so we can be looking upon the 25 framework.
32 1 And I see Teri just put her camera on, so 2 Teri.
3 DR. LALAIN: Yeah, I'm going to go back to 4 the first part. So while the community of practice is 5 going to be internal, that's to help us with all the 6 different areas that NRC has responsibility for for 7 that communication and coordination inside.
8 One of the goals in the strategic plan is 9 to strengthen and expand those AI partnerships. So 10 I've mentioned that, you know, we be continuing with 11 the workshops and talking about the workshops from 12 '21. So there will be public engagements to where we 13 can bring the community together in a public 14 environment to have those types of continuing dialogs 15 in that.
16 So there will be inclusion through goal.
17 Over.
18 MR. DAVIS: Thank you.
19 MR. KLUKAN: Thank very -- oh, don't want 20 to jump in there if the NRC has more to say on this 21 topic. Okay.
22 MR. DAVIS: No, sorry, Brett, that was me.
23 I was just saying thank you.
24 MR. KLUKAN: Okay, great, well, thank you 25 for the questions. And again, if any of our speakers NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
33 1 have followup, please feel free to raise your hand 2 again, and you know, we can go through a second round 3 of questions or comments if anything that you hear 4 said by another speaker or by the NRC staff sparks 5 something.
6 So our next speaker will be Yadav Vaibhav.
7 And if I am mispronouncing anyone's name, I apologize 8 profusely in advance. But our next speaker again will 9 be Yadav. Feel free to unmute yourself whenever 10 you're ready and then state your name and any 11 affiliation.
12 DR. YADAV: Thank you, Brett, can you hear 13 me?
14 MR. KLUKAN: Yes, yes, we can.
15 DR. YADAV: Thank you. My name is Vaibhav 16 Yadav. I am a Senior Scientist in the Nuclear Safety 17 and Regulatory Research Division at the Idaho National 18 Lab.
19 And my question is related, actually, to 20 the previous question by Ian. Is there -- is -- does 21 NRC have a structure or a mechanism to enable that, 22 you know, industry engagement? Is there a -- you 23 know, we're talking about 2027.
24 Are there any tangible objectives and 25 goals and timelines defined, or does NRC has a vision NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
34 1 to, you know, have a like a competitive engagement 2 released in next year or so that will kick off that 3 type of industry engagement have a research 4 development or demonstration or use case demonstrated 5 to achieve those objectives over the course of next 6 three to four years?
7 MR. DENNIS: I will --
8 MR. KLUKAN: No, go ahead, Matt.
9 MR. DENNIS: Sorry, Brett. I will -- I 10 will answer that the best I can and say that the --
11 this is -- so the strategic plan is written to be high 12 level objective with implementing actions that will 13 come about once we finalize the plan.
14 And that is, to answer your question, 15 Vaibhav, that is part of the goal, is that in 16 strategic goal -- sorry, I always have to go back and 17 look at these to remember which one's which.
18 Strategic goal number 5 with the pilot applications.
19 There would be an -- so one of the first 20 steps of the pilot engagement would be, as Luis 21 mentioned, we first need to find out what do we want 22 to go after first and what is the area that -- or what 23 is the use case, the area, the application, all of 24 that kind of stuff that would make a good pilot.
25 And then of course the follow-on to that NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
35 1 is planning it and scheduling it. And there is 2 precedent for this for pilot analyses that have been 3 done for other applications. So this isn't new 4 territory for us.
5 However, we are just starting this and so 6 as part of this five-year plan, which would come about 7 in hopefully the next fiscal year, is that we would 8 start to coalesce on what kind of pilot we would want 9 to look at, what framework.
10 We've also, as we mentioned, we've had the 11 three previous data science and regulatory 12 applications workshops, and we kind of got some 13 indication of what might make a good application. But 14 again, we would need engagement to further hone in on 15 that. And we are planning to have another biannual 16 workshop next year, so in 2023.
17 And so that would also be supporting 18 strategic goal number 5 in that area of the use case 19 development. And so again, to kind of summarize that, 20 that yes, there would be something within that --
21 within the horizon of our strategic goal fiscal years 22 to look out and have identified a pilot project, what 23 that's going to be, and then go about the minutiae of 24 implementing something like that.
25 MR. KLUKAN: Thank you, Matt. And again, NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
36 1 Yadav, thank you for the question.
2 Before we go to our next speaker I just 3 want to remind, because we do have several people 4 participating via phone, that in order to raise your 5 hand press *5. Again, if you are participating via 6 phone and you'd like to speak or, you know, make a 7 comment, pose a question, please press *5, and that 8 lets me know that you'd like to, you know, to speak 9 during the meeting.
10 So our next speaker with be Catherine 11 Buchaniec. And I again apologize if I'm 12 mispronouncing your last name. Again, our next 13 speaker will be Catherine Buchaniec. So whenever 14 you're ready, feel free to unmute yourself and state 15 your name and any affiliation.
16 MS. BUCHANIEC: No, you actually got my 17 name 100% correct, you're good. My name is Catherine 18 Buchaniec, I'm a reporter with Defense News, where I 19 cover artificial intelligence.
20 And so my question is in the draft 21 framework, it said that the NRC is committed to 22 ensuring that the use of new technologies is safe and 23 secure, and increasing AI with like any organization 24 or mechanism can lead to increased risks.
25 And so I'm just wondering what NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
37 1 coordination, if any, is being done with like the 2 country's national security agencies or departments to 3 address any security concerns with the strategic plan 4 or creating kind of like an across-the-board approach 5 framework for ethical AI. Just, what's going on with 6 cooperation or coordination?
7 MR. KLUKAN: Thank you for the question.
8 I'm just verifying who from the NRC -- all right, 9 Matt, it looks like you turned your camera on first, 10 so we'll go to you.
11 MR. DENNIS: There we go, I'm unmuted.
12 I'll let Luis take it. I was letting him get his 13 camera back on. But I will mention that strategic 14 goal number 3 in response to the question discusses 15 strengthening and expanding AI partnerships.
16 And that is, again, as this is a kickoff 17 of our AI strategic plan, that is a component across 18 our engagement -- across the federal agencies, 19 international counterparts, and within industry 20 through our memorandum -- our formal research and 21 development activities that are -- I should say our 22 formal research activities through NRC MOUs, 23 memorandums of understanding. We do have some that 24 exist.
25 Also, participation with other activities NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
38 1 that are already ongoing in the federal government 2 where the NRC has representatives on those activities 3 at other federal agencies who are implementing the 4 federal response to AI planning.
5 So that is -- that is already thought 6 about a little bit, and more will come within 7 strategic goal as this is -- as the AI strategic plan 8 is implemented in next fiscal year. So I'll let Luis 9 answer some more.
10 MR. BETANCOURT: No, Matt, I think you 11 answered basically what I want to mention. But to 12 answer your question, Catherine, yes, we are engaging 13 with other federal agencies at the moment trying to 14 learn about how they're deploying basically safe and 15 secure uses of AI.
16 At the same time, we're not the only 17 agency, not from the nuclear industry, that is facing 18 similar challenges. The other international 19 regulators that are basically on the same situation as 20 we are, and we are talking with the international 21 community to better understand how we can better 22 safely as well as secure these activities in our 23 regulatory activities.
24 MR. KLUKAN: Well, thank you, Matt and 25 Luis. And Catherine, thank you for your question.
39 1 And again, if you'd like to pose a comment or raise a 2 question, raise your hand within the Teams app, on the 3 Teams browser. If you're having difficulties with 4 that, please let me know within the chat window, and then 5 I can -- that lets me know that you'd like to be 6 called upon.
7 And again, if you're participating via 8 phone, it's *5, again, that is *5 to raise your hand.
9 So thanks again, Catherine. Our next 10 speaker will be Dan Solitz, again, that is Dan Solitz.
11 Feel free to unmute yourself whenever you're ready and 12 state your name and the affiliation.
13 MR. SOLITZ: This is Dan Solitz, I'm a 14 private citizen. My background was the Naval Nuclear 15 Propulsion Program, and then later on I went and 16 worked in a particle board plant.
17 The purpose of my question is try to get 18 a better understanding of where you're going with this 19 AI. And let me go back to the particle board plant.
20 It was analog system. The press had 14 openings and 21 a cold place where coolant was -- solder (inaudible) 22 were pushed in and then pressed then pulled out.
23 And before they were pressed, the operator 24 made sure the push arm was clear and all the cold 25 plates had been cleared out. And then the operator NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
40 1 pushed the press button. They figured out that they 2 could save a -- they could probably do two to three 3 more loads a day if they just relied on the relays 4 that operated all these equipment to go ahead and 5 close the press.
6 And they didn't do it in my plant, they 7 did do it in another plant, and I watched it --
8 watched it happen. It made me very nervous.
9 So moving on to that to a nuclear power 10 plant, are you considering something like calculating 11 rod worth and fuel depletion and then automatically 12 withdrawing the rods to -- to compensate for that? Is 13 that the kind of use of artificial intelligence you're 14 thinking about?
15 MR. KLUKAN: Dan, thank you for the 16 question. I think Luis is going to answer for the 17 NRC.
18 MR. BETANCOURT: Hey, thank you, Dan. So 19 at the moment, like what I would refer to you like in 20 the data science workshops, you can see where the 21 industry is currently using AI applications at the 22 moment. At the moment right now, the majority of 23 applications are under assisted regulatory programs 24 where they can -- they can use AI within their 25 activities.
41 1 Some of the discussions that you mentioned 2 are about different levels of autonomy, and those are 3 the areas that we need to working upon with industry 4 to figure out where they want to use AI for, as part 5 of their licensing strategy as well as oversight.
6 At the moment, we have some idea of where 7 they want to -- they want to use it, but it isn't 8 clear at this point which application is going to come 9 into that (inaudible), and that's why we need to 10 engage with industry to decide which area we want to 11 tackle first as part of our regulatory activities.
12 I hope that answered your question, Dan.
13 MR. SOLITZ: Well, if you could give me a 14 range of things you're considering, it would be 15 helpful.
16 MR. BETANCOURT: Range of where the 17 industry wants to use it? At this time, like that's 18 why we need to engage with industry. Because some of 19 the areas that you mentioned are under regulatory 20 activities, and I cannot speak on behalf of industry 21 of where they want to use AI.
22 MR. SOLITZ: Is there anyone from industry 23 on the call here that could?
24 MR. BETANCOURT: I think like since this 25 is like an NRC call (inaudible), correct me if I'm NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
42 1 wrong, Brett, the industry cannot answer that 2 question. Like the NRC is the only one who can 3 provide that answer, unless they want to provide that 4 voluntarily.
5 MR. KLUKAN: Yeah, Dan, thank you very 6 much for the question. Recognize you're interested in 7 -- you know, from our perspective at the NRC, we're 8 collecting comments at this time. The question you 9 asked, as Luis has indicated, is really one, you know, 10 of, you know, the industry figuring itself how it 11 wants to use this AI.
12 So again, if there are any speakers who 13 would like to, you know, from industry, who would like 14 to address that comment, please feel free, you're 15 welcome to do so during your portion to speak. So but 16 again, Dan, thank you very much for the question.
17 We'll next actually turn to Jim Slider 18 again. Again, that -- Jim Slider, whenever you're 19 ready.
20 MR. SLIDER: Thanks, Brett. Yeah, this is 21 Jim Slider from NEI. Thank you. Just a follow-up 22 question to some of the previous comments in the last 23 few minutes.
24 On the question about levels of autonomy, 25 I don't know if this goes to Matt or Luis, but I'm NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
43 1 curious, the table that you showed implies that 2 there's one criterion that goes into defining those 3 levels.
4 And I think that the answer is more 5 complicated than that, and I'm just wondering if you 6 have any thoughts at this point about how -- how you 7 might be distinguishing, say, levels two and three in 8 more ways than just that description of -- of autonomy 9 versus independence and so forth, as you showed on 10 your chart.
11 And if you don't have an answer to that 12 question, I'm just curious where in the timeframe of 13 this plan you would expect to flesh out those criteria 14 that distinguish, say, AI applications at level two 15 that you might require some regulatory review from, 16 say, levels -- AI level three applications where it 17 might not be -- might not fall under NRC's aegis.
18 I'm just curious if you have any thoughts 19 on that or when that would be defined.
MR. DENNIS: Yes, I'll -- this is Matt Dennis again, I'll answer that question to the best of my ability. And first say, Jim, I think you raise the question that we have ourselves, and that is, again, the point of what we're going to do within the scope of the strategic plan is to better define the level two to three bleed-over.
And I think I said this a couple times in other presentations, and I'll say it again here, I do agree that the difficulty for us is figuring out what that is, and it's at the level between collaboration and operation. So your question or comment is very pertinent, and I agree with you that that is an area where we don't have an answer at this point. And we will have to figure that out.
The one comment I will make in response to your question -- and maybe I need some clarification on the comment -- is whether the table is communicating that there's a single criterion.
If that's the intent, or if that's the comment that you're providing, I don't think that was our intent, and the table should be more nuanced than that. There is not a single criterion based on the distinction between a human making a decision versus a machine making a decision.
So it is not as clear-cut as that, and there will be multiple components to defining what that tipping point is from level two to level three.
So I guess my return question, just for clarification, is: is the comment that the table conveys that there's just a singular criterion?
MR. SLIDER: That's right, Matt, fundamentally that's my question. And the answer I hear you saying is you agree it's more complicated than that. And that's fine. The table is very instructive and very helpful as a place to begin the conversation. I just wanted to make sure that you were thinking about it multi-dimensionally.
Matt, if I could ask a follow-up question on your slide 10, on the second bullet, where you talked about the interdisciplinary team of AI subject matter experts across the Agency. I'm curious what other disciplines besides expertise in AI are important to the success of that team?
MR. DENNIS: Thanks for the question.
Yeah, the interdisciplinary team of AI subject matter experts across the Agency engaged social scientists, human factors, regional representatives such as inspectors, and a variety of others. So other engineering disciplines or other degreed disciplines -- computer scientists, traditional nuclear engineers, data scientists, computer engineers, applications.
So we have a wide range -- we tried to capture people also expert in safeguards and security.
So we tried to, in that interdisciplinary team, capture a lot of the subject matter expertise in a variety of areas across the Agency that have some engagement with artificial intelligence, so that it wasn't purely just reactor operations or traditional nuclear engineering.
So we tried to look across the Agency where AI might be implemented across all of the offices, and then get people involved in that interdisciplinary team that represented a variety of stakeholders internal to the Agency, as well as experience and expertise in a variety of different areas.
MR. SLIDER: Okay, thank you. Yeah, it sounds like you really wanted to touch all of the different functional areas across the Agency. I just wanted to clarify that, thank you. Thank you, you answered my question.
I have more questions, but I'm going to yield the floor to Tyler or others who have raised their hands and come back later. Thank you.
MR. KLUKAN: Well, thank you, again, Jim.
Just a reminder before we go to Tyler: if you are on the phone, please feel free to press *5 to raise your hand, again, that is *5. And again, if you're having any troubles with the Teams app, it can happen, you know, please feel free to let me know in the chat. Again, just for technical difficulties with the software. And then I can know to call on you based upon that.
So next again, we will go to Tyler. Whenever you're ready.
DR. CODY: Thank you. Yeah, again, Tyler Cody here, research assistant professor with Virginia Tech National Security Institute.
I had a comment for the third question here about potential near-term and far-term AI activities. I just want to bring awareness to this group that the systems engineering research community has a growing and active interest in systems engineering methods and best practices for AI and AI-enabled systems.
And these lines of research are directly concerned with the engineering process -- all the way from needs analysis through decomposition to components and recomposition to a system, deployment, maintenance, retirement, etc. And so in the near term, there is this ongoing work that I think will address a gap in the literature as well as, increasingly, in the research community and the workforce.
Because currently there's a lot of focus on the AI algorithms or on those properties like explainability, trustworthiness, privacy, etc. But just not as much research into what's the engineering process that, if followed, will produce the AI solution that fits our requirements and our needs.
So I think that's a near-term area that is being worked on elsewhere and is very relevant to this community. And in the farther term, the same community is looking at the use of digital models as part of digital engineering and model-based systems engineering activities.
And those activities will expand to AI-enabled systems. Which basically means that increasingly there will be digital processes for verification and validation and accreditation of AI solutions using things like digital twins, and model-based systems engineering to make use of those digital twins to test the performance, to test against requirements, to vary the conditions in those virtual environments.
So I just, yeah, I wanted to mention the Department of Defense is funding this research, and they have a growing portfolio of this research. And so with respect to strategic goals 2 and 3, I'd really suggest trying to form some connections with the systems engineering research community.
Thank you.
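[To make the digital-twin-based verification idea above concrete, here is a minimal, purely illustrative Python sketch of testing a model against a requirement while conditions are varied in a simulated environment. Everything in it -- the simulate_plant_state and model_predict stand-ins, the 5-degree error requirement -- is a hypothetical assumption for illustration, not anything taken from the draft strategic plan or from any NRC or vendor tool.]

```python
# Hypothetical sketch only: evaluate a trained model against a requirement across
# varied conditions in a simulated "digital twin" environment, as described above.
import random
import statistics

def simulate_plant_state(ambient_temp_c, load_fraction):
    """Toy stand-in for a digital-twin simulation of one plant condition."""
    # Made-up relationship purely for illustration.
    coolant_temp = 280.0 + 0.5 * ambient_temp_c + 40.0 * load_fraction
    return {"ambient_temp_c": ambient_temp_c,
            "load_fraction": load_fraction,
            "coolant_temp_c": coolant_temp}

def model_predict(state):
    """Toy stand-in for an AI/ML model whose accuracy we want to test."""
    return 282.0 + 0.48 * state["ambient_temp_c"] + 39.0 * state["load_fraction"]

def evaluate_against_requirement(n_trials=1000, max_abs_error_c=5.0):
    """Vary conditions in the virtual environment and test against a requirement."""
    errors = []
    for _ in range(n_trials):
        state = simulate_plant_state(ambient_temp_c=random.uniform(-10, 40),
                                     load_fraction=random.uniform(0.2, 1.0))
        errors.append(abs(model_predict(state) - state["coolant_temp_c"]))
    worst = max(errors)
    return {"mean_error_c": statistics.mean(errors),
            "worst_error_c": worst,
            "meets_requirement": worst <= max_abs_error_c}

if __name__ == "__main__":
    print(evaluate_against_requirement())
```

[In practice the simulator would be a qualified plant model and the acceptance criterion would come from the applicable requirements, but the loop structure -- sample conditions, compare model output to the twin, check against a requirement -- is the pattern being described.]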
MR. DENNIS: This is Matt Dennis again from the NRC, and I know there wasn't necessarily a question in that from Tyler, but I would like to elaborate on an issue that he brings up that I think is very pertinent to the strategic plan and the components of what we envision going forward. And that is that standards development is quite crucial, and has been at the agency for a long time.
The agency does look towards standards development organizations as a cooperative and collegial way to have applicable standards that the NRC can endorse in certain areas, and we find that very successful. So, as a component of the strategic plan, there is a recognition that we will participate in, and potentially endorse, standards that are applicable to artificial intelligence, or other areas.
So, it is a very pertinent comment, and one that we are very much interested in, and it will be a component of the strategic goals. Specifically, probably strategic goal number three, that's the closest thing to expanding our partnerships, which do include standards development organizations. And then lastly, real quick, maybe not an answer, but just a little elaboration on the digital twin aspect.
The agency does have quite an initiative within its Office of Research on digital twins. We have done a number of workshops on the issues related to digital twins for nuclear applications. And so while that is its own standalone activity within the NRC, there is a nexus, an overlap, because there is an obvious artificial intelligence component to digital twinning and model-based engineering.
So, that is something that we are engaged with. For that particular group, I will direct you to the NRC website to find out more information about the digital twin activities that have been happening at the agency.
MR. KLUKAN: Thank you, Matt, and again, thank you, Tyler, for your comments and for participating in the meeting today. Our next speaker will be Marty Murphy, again, that is Marty Murphy. Whenever you're ready, please feel free to unmute yourself, state your name, and any affiliation. So, I'll turn it over to you, Marty.
MR. MURPHY: This is Marty Murphy with Curtiss-Wright Scientech. So, I think this pertains to item number four for potential challenges. And my question really goes to exactly how the NRC is going to consider the generation of new staff positions as it develops and works through the implementation of the strategic plan, especially for those applications that don't require specific NRC approval via amendment, or specific review.
And what I'm talking about is inspection procedures. I didn't exactly see within the strategic plan where that was considered, or whether, say, the CRGR was going to be applied to products that the NRC produces that don't typically go through the Committee to Review Generic Requirements.
MR. KLUKAN: Thank you very much for the question, I'll turn it over to the NRC staff.
MR. BETANCOURT: Hey, Marty. I believe we did mention the possibility for us to evaluate our regulatory framework. Because like you mentioned, there are areas where there are inspection procedures that we may need to update. And I think that's an area that we need to talk about first, to be able to start identifying, okay, which guidance is available.
It's insufficient at this point, and that's one of the reasons that we're engaging with our reactor inspectors, to figure that out. But you bring up a really good point, so I'll take note of that and see if we can also do better, refine that aspect. But one of the things that we need to do is a potential gap analysis of our regulatory framework.
Do we need new guidance? Do we need new special procedures? And that's an area that we need to talk over initially.
I hope that answered your question, Marty.
MR. MURPHY: Yeah, thanks very much, Luis, I appreciate it. And I think it's great that the NRC is proactively developing this plan and engaging with its stakeholders to get that kind of feedback. Because like I said, I do think there's the potential, and it would be great to have that openness and transparency as we move forward with this. So, thank you very much.
MR. BETANCOURT: Yeah, and one thing that I forgot to mention, on goal number three, which I think is the one on cultivating our workforce: one other thing that we want to do is to have our staff be more AI savvy. Because there are not many people at the NRC that are knowledgeable about AI, and one of the things that we need to do is to train our staff, including the resident inspectors. So, that's an area that we can also be talking about eventually.
MR. MURPHY: Yeah, thanks very much.
MR. KLUKAN: All right, Marty, thank you again for the questions and for participating today. Our next speaker will be Brian Golchert. Brian Golchert, whenever you're ready, please state your name and any affiliation.
DR. GOLCHERT: Good afternoon, my name is Brian Golchert, I'm with Westinghouse. I have some concerns about the long range for NRC regulation.
It's a broad topic, AI/ML, whatever you want to call it; it covers a lot of things. And from a practical point of view, I'm worried that any future regulations will -- I'll use the term -- 'what if' us to death, that it will be cost ineffective for us to use these tools if you say do this model, check that model, do these uncertainty studies.
I was just wondering, for the strategic vision, if there had been thought to have a more intimate relationship with the potential applicants, to have discussions on future regulations, future guidance, to say you can't ask us to do this, it'll kill us cost effectively. Has that been considered as part of the strategic plan?
MR. BETANCOURT: Yeah, Brian, that's an area where we cannot be a barrier; to your point, if industry wants to use these technologies, we need to be able to enable these technologies. To your point, we have been thinking, for example, about whether we need to do confirmatory analysis in order to have industry provide us the data, yes or no. I don't see AI being different than other types of technology.
Where the existing framework can be used to get it out the door, or we need to have that discussion with industry -- what are the things that we really need to require for AI applications? Because this is a new area that is constantly evolving. So, to your point, Brian, that's something that is in the back of our minds while we're developing this AI strategy.
DR. GOLCHERT: Thank you.
MR. BETANCOURT: Good question.
MR. KLUKAN: Thank you again for the question. And again, just as a reminder to everyone, if you'd like to pose a comment or raise a question, raise your hand within the Teams app or the Teams browser, or press star five on your phone if you're participating via phone. We will next go back to Jim Slider.
MR. SLIDER: Thanks, Brett. Just to follow up on Brian Golchert's last comment, Luis, I just want to exhort you, and Matt, and Teri to please engage industry early and often as this work unfolds. I think what I hear in Brian's comment and others is a high level of interest in working with the NRC collaboratively to find an appropriate level of regulatory scrutiny that assures public confidence in these tools.
But at the same time, doesn't kill them in the crib. Because they do offer the prospect of improving safety in our operating plants, and we very much want to do those things that enhance safety, but it's important that the level of regulation be appropriate and fitting for the technological challenges that these represent.
DR. LALAIN: Agree, Jim, and that's why engagements like this to get comments today, the workshops that we had last year, the future workshops -- all of those engagements are key, so that we keep the communication going and, with that awareness, can work for that future together. So, point's well taken, thank you.
MR. SLIDER: Thank you. A follow-up question on the five-year horizon that's posited at the beginning of this plan. On the one hand, I would urge the NRC to be as prompt as possible in getting down the road and developing the regulatory guidance. Because the feedback I've gotten from my industry stakeholders is that in the absence of clear regulatory guidance, they're unlikely to tackle AI applications that require NRC review.
So, you'll see -- but on the other hand, that's also a choice to help industry build experience with AI applications on business processes, for example, like AI applied to the corrective action program, and so forth. But we would like, as quickly as we can, to begin to extend the reach of these AI tools for the benefit of safety, and it's going to be important for the NRC to get its regulatory guidance out in a timely way that helps industry to understand what the regulatory challenges are, and move ahead with those more ambitious applications.
MR. BETANCOURT: That's a great question.
I think that's where the table on the levels of autonomy will come into play, where, like you mentioned, the majority of the AI applications that we have been seeing are being used within the existing regulatory framework (inaudible). But if industry, if we can know from a sector of industry where they see the first applications, we can have that conversation just on the levels of autonomy in the table.
That can help us to better develop that guidance. So, yeah, that's the reason that we wanted to have this strategy out the door, so we can start having this conversation with industry.
MR. SLIDER: Thanks, Luis.
MR. KLUKAN: There we go. Jim, thank you again for your comments thus far in the meeting. Next we're going to turn to Bradley Fox. Again, that is Bradley Fox, whenever you're ready, state your name and any affiliation.
MR. FOX: Hey there, thanks for the questions. I'm Brad Fox, I'm the CEO and co-founder of NuclearN.ai, and we actually have, I guess, what would be categorized as level one solutions deployed at operating plants across the U.S. and Canada right now. My only comment here would be to be cautious of the verbiage 'could impact.' I think it would be good to clarify that a little further.
Kind of looking back to my years in engineering, when we had that word 'could,' there was a wide variety of what that could mean, and how that was interpreted by the inspectors at the plants, or various individuals, and that led to evaluations and discussions. So, I think looking at how that would be interpreted up front would be great. I'd also love it if you could look into impacts of AI on software quality assurance.
And the different ways that might interact with the requirements for SQA at plants. Because we do get a lot of SQA impact questions from the operating facilities and the folks involved in SQA there. That's it from me.
MR. KLUKAN: All right, well, thank you. I think the NRC staff -- here comes Matt.
MR. DENNIS: Yeah, I did have a clarifying question back at Brad, just to get a better sense of what the comment was. Because we appreciate the comments, and I want to get a good handle on what exactly, with a little more specificity if you can provide it, you're mentioning: are you just talking about the language of the word 'could' in general, about what we're discussing?
Or specifically something within the text of the strategy at this point? And the only reason I ask is because if you had a specific thing, where it's line 14 through 20 of something talks about this, and you have a comment about that, or if it's just a general wanting of more specificity about applications or guidance, that would be appreciated, so thanks.
MR. FOX: Yeah, thanks, Matt. I think it would be the difference between your level one and your level two definitions in the table you showed a couple slides earlier. I think we have product running in level one right now, and you could theoretically make a case that a lot of systems that plants have could affect safety -- everything from search algorithms that might exist today, to corrective action program AI, things like that.
And saying things like 'could affect plant safety' can be carried out, or interpreted, very differently by whomever is looking at that, or later evaluating that definition and that classification, so to speak. So, clarity around there is always really good. If there's guidance -- in the engineering world there are NUREGs and things -- that would help us understand what that means.
MR. DENNIS: Okay, thank you, that clarifies it for the NRC staff I think, and we appreciate it.
MR. KLUKAN: Thank you again for participating in the meeting and for raising that comment. Next speaker will be Ross. Ross, whenever you're ready, please feel free to unmute yourself and state your name and any affiliation.
MR. MOORE: Ross Moore, Oklo. I just want to make a quick comment slash question, I guess, with regard to the table. When I look at the table, and the levels of notional AI and autonomy levels, level four is described as machine decision making with no human intervention, which I think potentially blends automation with AI. Where you can have an automated process that controls machine operability without actually integrating, or incorporating, AI characteristics.
And I'm curious whether that's the intent of the strategic plan, or if there is a potential way to delineate, within the strategic plan, the difference between AI and automation, for which the NRC already has a regulatory framework.
MR. DENNIS: I'll take a first crack at the interpretation of this, and I think it is a very good and appreciated point, which we can go back and make some improvement on. The table is not, and rightfully so should not be, conveying that there is a similarity between automatic operation and autonomous operation. So, there is a distinction, and there is a component that autonomy -- and I think I kind of mentioned this when I was presenting.
And that is that while AI enables autonomy -- and I think that's what you were mentioning in your comment -- not all uses of AI are necessarily autonomous, or vice versa. So, the intent of the table, because this is the AI strategic plan, is to not be focused on the traditional automatic operation of systems, where we do, as you pointed out, have a framework in place at the NRC for automatic safety systems, and actuation, and control, and logic, and all of that.
But this is layering on top of that the AI component, which may have other consequences. So, I think that is a fair comment, and we appreciate the comment; we'll go back and make some clarifications on the table's intent.
MR. BETANCOURT: Okay, Ross, good comment.
I don't know if you saw, under page 1-2 at the end, as well as 1-3, we did recognize that autonomy and automation are quite different. So, we didn't want, to your point, to reinvent the wheel for everything that has already been under the existing framework. But we'll take a look back at the table; there's a footnote, number three, that acknowledges that automation is different than autonomy.
MR. KLUKAN: Thank you, Ross, again, for your questions and your comment. We will now turn again to Jim Slider, whenever you're ready, feel free to unmute yourself.
MR. SLIDER: Thanks, Brett. Just, I'm eager for Matt to make all of this scope of work happen immediately, and I say that with tongue in cheek, Matt. But whatever the agency can do to accelerate the work that is called for in this plan is, I think, important and beneficial to NRC and the industry. In that spirit, I would like to ask: in section four, the draft plan mentions working with other federal agencies to inform the drafting of AI standards and guidance documents.
And I'm curious what other federal regulatory agencies have experience with AI that is relevant to the NRC's mission. I wonder if you might be able to comment on that.
DR. LALAIN: Hi, Jim, it's Teri. So, NIST has an AI standards working group that has brought together a broad range of participants across the federal government, from the implementing agencies like the DOD to the regulatory agencies, and in that forum there's a lot of conversation happening on what would the standards look like, how would AI be implemented.
And it's an opportunity for all the agencies to connect and talk about their specific roles and aspects as AI is used more broadly across the federal government. So, that is one of the forums where we get those engagements, organizationally.
MR. SLIDER: So, Teri, just to pick up on that, for clarification, so if the NIST group, or NIST, publishes a standard on, say, explainability, you would see the NRC as -- the NRC would take that guidance on what explainability means, and how you demonstrate it, et cetera, et cetera, and interpret that in the NRC's domain of safety regulation, is that how that would flow down?
DR. LALAIN: So, to be determined. I'm sorry that's not a more direct answer to your question. We're waiting to see what comes out of that forum, because ultimately we need to look at how it's going to work for our industry as well, and we've got the breakout process and everything else that we're going to have to go through. But to the question about getting that cross-organizational input, that's one of the forums that we have there.
We also have our MOU with DOE, which is another forum for us to have conversations about things that are happening there.
MR. SLIDER: All right, thank you.
DR. LALAIN: Yeah.
MR. KLUKAN: Jim, thank you again. Next up will be Tyler.
DR. CODY: Thanks. This time I have a comment on the fourth item about potential challenges, and it relates to the question of where to place regulations, I guess. It's a bit academic, but it's pretty important. Regarding operations, there are many no free lunch theorems of statistical learning theory that suggest no single model can be optimized for all conditions at once.
So, the concept of test and evaluation as a pre-deployment activity, or accreditation as an exclusively pre-deployment activity, is in pretty strong conflict with the first principles behind machine learning solutions. So, basically, as conditions change, for example between operations, or as platforms degrade, or with changes in use, if there is a material difference in the data that flows through the model, then the performance of the model is expected to change.
And if it was well fit originally, then by definition it has to be worse fit now, because it was well fit to conditions that are no longer occurring. So, this all goes to suggest that domain adaptation -- the fact that we have to update our models -- is the rule, not the exception. And that so-called universal models, for example general purpose vision models, are the exception, not the rule.
So, I think when you think about that table, and the criterion for what level of autonomy we are at, there are other things to consider, like: is this thing going to need to adapt over time? Is this a stationary environment or not? Meaning, is the distribution of the data changing over time, is this thing doing online learning? And all those little check boxes sort of back you into the quadrant of how worried I need to be about unexpected changes in the behavior of my model.
Because some technician tightened something somewhere, does that need to be folded into how -- do you need to incorporate the policies for how you change the physical system with the policies for the AI? And do those things need to be coupled, et cetera. I think it comes down to those kinds of analyses.
Again, just a comment, but I hope thought-provoking.
Thank you.
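[One common way to operationalize the concern above -- that a model well fit to yesterday's conditions may quietly degrade as the data distribution shifts -- is to monitor incoming data against the data the model was trained on. The following is a minimal, illustrative Python sketch only; the feature layout, the choice of a Kolmogorov-Smirnov test, and the p-value threshold are assumptions made for the example, not anything prescribed by the draft strategic plan or the speakers.]

```python
# Hypothetical sketch only: flag a possible shift in the input data distribution
# relative to the data a deployed model was trained on, per feature, using a
# two-sample Kolmogorov-Smirnov test. Thresholds and names are illustrative.
import numpy as np
from scipy.stats import ks_2samp

def drift_flags(train_features, recent_features, p_threshold=0.01):
    """Return per-feature flags where recent data looks unlike the training data.

    train_features, recent_features: 2-D arrays of shape (n_samples, n_features).
    """
    flags = {}
    for j in range(train_features.shape[1]):
        stat, p_value = ks_2samp(train_features[:, j], recent_features[:, j])
        flags[f"feature_{j}"] = {"ks_stat": float(stat),
                                 "p_value": float(p_value),
                                 "drifted": p_value < p_threshold}
    return flags

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train = rng.normal(0.0, 1.0, size=(5000, 3))
    recent = train[:500].copy()
    recent[:, 2] += 0.5            # simulate a shifted sensor channel
    for name, result in drift_flags(train, recent).items():
        print(name, result)
```

[A flag from a monitor like this would normally trigger human review or retraining under a defined change-control process, which is essentially the coupling between the AI update policy and the physical-system change policy that the comment raises.]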
MR. DENNIS: I will just -- this is Matt Dennis again from the NRC. I will just confirm -- thank you for the comment, and I think it goes along with your previous comment on system maintenance and online learning, which we took note of before, but I think it is a good one, and we appreciate it, thank you.
MR. KLUKAN: Thank you again, Tyler. And I apologize for those little pauses, I just want to make sure I give the NRC staff an opportunity to jump in before I thank you myself. So, thank you again, Tyler. Next up we'll have Ian Davis. Ian Davis, whenever you're ready, feel free to unmute yourself and give your comment or question.
MR. DAVIS: Thank you. Ian Davis, again, from X-energy. I also wanted to address potential challenges in line item number four there. INPO regularly releases reports talking about operator-induced events, and I can foresee a conflict coming down the road where AI may have a goal of reducing operator-induced events further by removing the human from the loop to some degree.
And to do so, we must figure out how we can create trustworthy AI. So, there's going to be a conflict of whether or not we trust the human more, which we know the human is also fallible, or we trust the AI more, and I think focusing on how to quantify and evaluate the trustworthiness of both the human and the AI is going to be really important to make a decision about whether or not that application is acceptable, thank you.
MR. KLUKAN: Well, thank you, Ian, for that comment, again, we really appreciate it. Wasn't sure if Luis was trying to jump in or not. But next we'll turn to Brian Golchert. Brian Golchert, whenever you're ready.
DR. GOLCHERT: This is Brian Golchert from Westinghouse. And actually, to follow up on the trust issue, looking at level one, we're looking at the possibility of using AI/ML optimization algorithms to design new products or improve existing products. If we go back to the old-fashioned method of using -- sorry. If we use the new method of using AI/ML to design it, but then go back to the old method of testing the final product, are we going to be expected to submit these AI/ML models?
I realize this is not in the strategic plan, but it is something to consider in the long run for regulation. There's a lot of stuff that AI/ML can do behind the scenes, which may or may not need to be submitted, and I'm a little curious as to how the NRC plans to fold that in.
MR. DENNIS: So, this is Matt Dennis from the NRC again, and I will clarify that -- and on the spot, I cannot find the language, and I think we do have it in there. But this is a very good comment, and we will go back and confirm that we do include the language in here about -- and I think I mentioned this earlier -- that the AI strategic plan should not be viewed as purely something that is focused on operations.
Meaning human interaction and autonomy in a control room. So, the levels of autonomy is one example application in the broad AI umbrella of things. But we have given examples, people have given examples in this forum, about other business processes that would be heavily reliant on natural language processing, which falls under that AI umbrella.
And that is a component of what we are considering. So, to your example that you just gave about using AI, or in particular machine learning, to maybe do an analysis in a new way that has not been the traditional physics-based model -- or using a physics-based model based on experimental data, and instead using a machine learning model to design, or qualify, a component, or something along those lines as you were mentioning.
That is something that is being considered under the AI strategic plan. So, it is a component, and if not explicitly already stated in the strategy, I think it is, I can't go find the exact words right now, but it is a very pertinent comment, and we will go back and make sure that that is communicated.
DR. GOLCHERT: Well, actually, I understand that; the point was, if I go back and test the new design, it doesn't matter how I got to that point. In the old days you would just have here's my drawing, here's my test, here's my results submitted to the NRC. Now, if I use AI/ML to get to the 'here's my drawing,' and then do the tests, do I need to submit the AI?
MR. DENNIS: Okay, thank you for clarifying that, that is a different question, and a good comment. So, I can't answer that for sure, and I think that goes to what we were discussing earlier, mentioning that there is a graded approach to things, and just because it uses AI does not necessarily invalidate our current regulatory framework, and that's what Luis was mentioning earlier.
Part of the rollout will be doing a gap analysis to assess whether or not that is applicable in the specific situation. So, I'll turn it over to Luis to elaborate.
MR. BETANCOURT: Yeah, thank you for that, and I apologize for my camera, it's not up, it basically goes away. So, good comment, because sometimes AI can be used as a tool to demonstrate -- how do you demonstrate regulatory compliance. So, to your point, that's something maybe that we need to look at, but great comment, we'll take that back.
DR. GOLCHERT: Thank you.
MR. KLUKAN: Thank you again, Brian, for your comments and for participating today. Next we'll turn to Yadav. Whenever you're ready, feel free to unmute yourself and begin your question or comment. Again, next we will turn to Yadav -- it looks like you may have had your question answered? Go on, go on.
DR. YADAV: Sorry, I was muted. So, this is Vaibhav Yadav from the Nuclear Safety and Regulatory Research Division at INL. Talking about meeting the current regulatory framework, I was wondering if you are specifically considering Part 53, or the intersection of AI-based approaches to meet Part 53 requirements when it comes up? Thank you.
MR. DENNIS: I'll take a first crack at this and say yes. The AI strategic plan is not in a vacuum, and we do have representatives from our Nuclear Reactor Regulation Office, NRR, and so this is -- if anyone else from our group wants to chime in, please feel free to after I mention this. But the 10 CFR Part 53 rulemaking is currently in process, as you're aware.
And this, the AI strategic plan, is linked one way or the other; as an example, the autonomy discussion here is linked with that as well. So, it's not completed in a vacuum, and I actually think Brian has his hand raised, so I want to turn it over to him.
MR. GREEN: Hi, thanks. So, this is Brian Green, I'm the human factors team lead for NRR. I can't speak for all of Part 53, because the pieces are just all coming together at this point. But within the human factors regulation, the way we are currently handling this is actually under the concept of autonomy. So, if you were to create an autonomous system using AI, that's how we had initially conceived of it.
And we're having discussions inside to see if adjustments need to be made to align the terminology internally right now. But thus far, we have not precluded it, but we have been hesitant to go too far as to prescribing how it would happen. So, there's kind of a placeholder built in, so that as we get more comfortable with it, we would be able to potentially facilitate it.
But we're not speaking directly to it. I'm not sure if that holds true with all the other areas, but that's how human factors is handling it right now.
MR. KLUKAN: Brian, thank you for weighing in on that question, and thank you again, Yadav, for raising the question as well. And thank you, Matt. Next we will go to Jim, whenever you're ready.
MR. SLIDER: Thanks, Brett, it's Jim Slider from NEI again. Just a final question on process.
I'm just wondering, Matt, if you could just tell us what we can expect to see on this project over the next, say, six months, between now and when the final strategic plan is ready for release next spring. You mentioned in your previous slide that you've got an ACRS meeting planned for November. Comments on this draft are due August 19th.
And I'm just wondering if you can give us some idea of what we might be seeing in terms of public communications, or public releases of information, between now and next spring.
MR. KLUKAN: Matt, I think you're on mute, I'm sorry.
MR. DENNIS: I couldn't make it the entire meeting without having one microphone mess up, so I apologize. But thank you, Jim, I'm going to reiterate some of the things we already said, and you've just mentioned, and then also maybe clarify a few. So, as you mentioned, the AI strategic plan -- the plan is to finalize -- I'm sorry, the comment receipt period closes on August 19th.
So, this is, again, another reminder to please go submit your formal, written comments on regulations.gov for the docket number, which is provided in the public meeting notice. So, the strategic plan is intended to be finalized after receipt of public comments and consideration of any feedback from the Advisory Committee on Reactor Safeguards, which has stood up a joint committee specifically looking at this topic of artificial intelligence at the NRC and in the nuclear industry.
That public meeting is not scheduled yet, it's planned for November of 2022, and will be noticed in the ACRS public meetings on their website, and we will try to let everyone know that it is upcoming, and you can participate in the ACRS meeting. Additionally, for 2023, which is why we also mentioned that this will be finalized in 2023, we plan to have another follow-on workshop.
Which will be a continuation of those data science and AI regulatory applications public workshops. So, we had those three in 2021, and we will have another one that is planned for 2023. Additionally, this will probably -- most likely, cannot confirm, but odds are -- be a topic at the NRC's Regulatory Information Conference, which I believe is in March.
I don't have the date in front of me, but there will probably be a session at the RIC on artificial intelligence as well. So, those are another couple of opportunities. So, the ACRS meeting, the workshop, the RIC -- these will all be opportunities for us to engage with the public and update on the current status of the plan: when it's finalized, when we envision having any action plans to support the strategic goals, the implementation of -- probably not the framework, that's a little later on.
But at least working groups, or things that we're putting in place to move forward with the different strategic goals throughout FY '23 and FY '24. So, if I've forgotten any other particular things, Luis or Teri, please chime in.
MR. BETANCOURT: Yeah, I think, Jim, that's a good question, because what I have been hearing today is early engagement as soon as possible. And I think once we finalize the strategy, I do foresee us re-engaging the industry in a public setting after the strategy's out in the public domain, the final one. And that's what we will need to hear from industry: what are those topics that you see as the near term, or where do we need to emphasize our research?
As well as the regulatory framework, so we can start planning. Because as you know, our budget is two years in advance; we are budgeted for the next year to start doing some of this framework. If there's really interest from the industry in some of this area, that's where we need to engage now, rather than later. But I would consider -- industry could even consider possibly submitting potential white papers.
So that we could have some potential discussions of where do we see gaps and challenges, as well as areas that we need to consider. Once we are doing some of this initial AI strategy, this will take time. As you are aware, AI is a rapidly evolving technology, and we need to meet early with industry on this topic. And that's why we value this conversation. And please submit those types of comments in the FRN so we can consider that.
MR. SLIDER: Luis, this is Jim again, thank you. My ears perked up when you said the words 'white papers,' can you elaborate on that?
MR. BETANCOURT: Yeah, you know, like, at least on the advanced reactor side, there has been a process of submitting a white paper to have a better understanding of where industry wants to pursue regulatory changes, and that's what I was talking about with a white paper.
MR. SLIDER: Okay, thank you.
MR. BETANCOURT: Hope that answered that question.
MR. SLIDER: Got it, thanks very much, Luis.
MR. DENNIS: This is Matt Dennis again, just one last plug on this topic of what our plans are. As we've said, it is very important to re-emphasize that the public comment period for this draft strategic plan concludes on August 19th, and I did put it in the chat, I finally copied it. So, anyone here who is participating in the Teams meeting, feel free to follow that link to regulations.gov.
It's docket number -- and that's what I was looking for specifically -- it's docket number NRC-2022-0095-0001. And you can submit your public comments at that link on regulations.gov. Thanks.
MR. KLUKAN: And thank you again, Jim, and thank you, Matt and Luis. So, at this time, I don't see anyone in the queue, or individual hands raised. So, again, if you're on the phone, please hit star five if you'd like to speak, or to raise your hand. And again, if you're on Teams, use the app or the browser, the react function, to raise your hand. And we have one person who has entered the queue, and that is Sherman Remer, whenever you're ready.
MR. REMER: Yes, thank you very much. Jason Remer from Idaho National Laboratory, and a question for you just real quick. In the past we've been working a lot in promoting digital systems as replacements for analog systems in our plants. And we worked through a lot of issues with the NRC, and I think we've got a good plan.
But I just want to make sure that whatever we do in the AI area, we also just kind of incorporate all the good things we've learned about how we're going to deal with common cause failure, how redundancy and diversity affect things. And so that whatever we come up with for AI and ML, it helps us move forward as a whole. We've found sometimes we think we've solved all the problems, and we look around, and something suddenly pops up.
So, just a note of consideration, and I appreciate the NRC working on this issue, I think it's very important. But I want to make sure that we can -- we kind of need to crawl before we can run, and AI/ML is a little bit running, and replacing analog with digital is kind of crawling. So, just a note, and again, appreciate you having this.
MR. BETANCOURT: Jason, great comment.
Actually, my background is digital I&C, so I know exactly what you're talking about. We need to keep doing that coordination with our digital I&C folks just to make sure that, in some of the areas that you mentioned, the introduction of AI, once it's in conducting operations, will not negatively affect the safety and the security of operations, so absolutely.
MR. REMER: Yeah, thanks, Luis, and you and I have worked together a little bit already on this, so appreciate your understanding, thank you.
MR. BETANCOURT: Yeah.
MR. KLUKAN: Okay, thank you again. Matt, did you want to chime in?
MR. DENNIS: I apologize, I was going to say -- and Luis pretty much said the same thing I was going to say -- that the topic of digital I&C, and lessons learned in the implementation of digital I&C within industry and the NRC, is not a topic that is lost on us as we consider how we do this. So, there are a lot of lessons learned from implementing a new technology in the nuclear industry that we can carry over into doing this same exact application.
And that is something we have discussed as something we need to consider for our purposes going forward, so we don't make the same mistakes. Or at least learn from the processes that were implemented as part of the digital I&C applications. So, it's something we have definitely thought about in considering the lessons learned from digital I&C.
MR. KLUKAN: Thank you, Matt, and again, thank you for the question. We have about 15 minutes left, so if you have any remaining questions or comments, please feel free to raise your hand. Again, if you're on the phone, press star five. Again, that's star five. And we have a few moments here to see if anyone enters the queue. Again, if you'd like to make a comment, please feel free to raise your hand.
And again, if you're on the phone, press star five. Again, that is star five. Doing a little auctioneer countdown in my head, it doesn't look like we have anyone else. I'm going to turn it over to Teri.
DR. LALAIN: Thanks, Brett. I just want to thank everyone for the great feedback today, the comments, the questions; that's the purpose and the goal today, so thank you for helping us achieve that goal of getting the input, really appreciate all the comments. Great dialogue today, so with that, Brett, I'll turn it over to you to close us out, thank you.
MR. KLUKAN: All right, great. Can we -- yeah, there we go, we have the last slide up. So again, in closing, what you can see here on the slide is the contact information if you want to reach out to any of the NRC staff you heard during this meeting today, beyond or after this public meeting. Again, I would like to thank all of the meeting participants for taking time out of their day to engage in this very dynamic discussion.
As was mentioned at the start of the meeting, no regulatory decisions were made during the course of this NRC public meeting. With that, in closing, I would reiterate, if you would like to have your attendance in the meeting reflected in the meeting summary, please email Matt Dennis at matthew.dennis@nrc.gov, that's M-A-T-T-H-E-W.D-E-N-N-I-S@nrc.gov.
His email address is up on the screen. Feedback on the format of this meeting may also be emailed to Matt, or provided through the NRC's public meeting website.
Again, I'd like to thank you all for this great discussion today. And with that, I will adjourn the meeting. Have a good rest of your afternoon, everyone, thank you again for participating.
(Whereupon, the above-entitled matter went off the record at 2:49 p.m. and resumed at 2:59 p.m.)
MR. DENNIS: Hi, Steve, this is Matt Dennis from NRC, I noticed you joined, and unfortunately the meeting and public comment portion ends in one minute. So, if you have a comment or anything to say -- if not, if it's just a time change problem, let me know. Otherwise we're going to be ending the meeting in about a minute. Hi, for anyone that's just joined the meeting, the public comment portion of the meeting was from 1:30 to 3:00 p.m. Eastern.
If you have anything you'd like to say, I can take that real quick, but otherwise I'm going to be ending the meeting, since it is after 3 o'clock.
I was literally getting ready to end the meeting, but, Rafa, I saw you -- did someone raise their hand? Okay, well, all right, thank you for attending the NRC's public meeting. Hearing no responses, I'm going to close the meeting, as it is 3:02 p.m. Eastern.
Thanks.
(Whereupon, the above-entitled matter went off the record at 3:02 p.m.)