ML22229A500

| Person / Time | |
|---|---|
| Issue date: | 08/03/2022 |
| From: | Dennis M, Office of Nuclear Regulatory Research |
| To: | Dennis M |
| References: | NRC-2045 |
| Download: | ML22229A500 (83) |
Official Transcript of Proceedings NUCLEAR REGULATORY COMMISSION
Title: Public Meeting on NRC Draft Artificial Intelligence Strategic Plan for Fiscal Years 2023-2027
Docket Number: (n/a)
Location: Teleconference
Date: Wednesday, August 3, 2022
Work Order No.: NRC-2045
Pages: 1-82

NEAL R. GROSS AND CO., INC.
Court Reporters and Transcribers
1716 14th Street, N.W.
Washington, D.C. 20009
(202) 234-4433
UNITED STATES OF AMERICA
NUCLEAR REGULATORY COMMISSION

+ + + + +

PUBLIC MEETING ON NRC DRAFT ARTIFICIAL INTELLIGENCE
STRATEGIC PLAN FOR FISCAL YEARS 2023-2027

+ + + + +

WEDNESDAY, AUGUST 3, 2022

+ + + + +

The public meeting convened via Video Teleconference, at 1:00 p.m. EDT, Brett Klukan, Facilitator, presiding.

PRESENT:

BRETT KLUKAN, Meeting Facilitator
THERESA LALAIN, Deputy Director, Division of Systems Analysis, Office of Nuclear Regulatory Research
LUIS BETANCOURT, P.E., Chief, Accident Analysis Branch, Office of Nuclear Regulatory Research
MATT DENNIS, Reactor Systems Engineer (Data Scientist), Division of Systems Analysis, Office of Nuclear Regulatory Research
TABLE OF CONTENTS

                                                      Page
Opening Remarks and Agenda ............................. 3
Artificial Intelligence Strategic Plan Presentation .... 7
Public Comment ........................................ 21
Adjourn ............................................... 82
P R O C E E D I N G S

1:01 p.m.

MR. KLUKAN: Welcome, everyone, again. Next slide, please. So good afternoon, my name is Brett Klukan. Normally I serve as the Regional Counsel for Region I of the U.S. Nuclear Regulatory Commission, but this afternoon I will be acting as the facilitator for this meeting.

This meeting is being held to discuss the NRC's Draft Artificial Intelligence Strategic Plan for fiscal years 2023-2027. For this meeting, we are using Microsoft Teams.

We hope that the use of Microsoft Teams will allow you, the stakeholders, to participate more freely during the meeting. But this will also require us to continually ensure that we are muted when we are not speaking and to do our best not to speak over each other.

To help facilitate the question and answer portion of the meeting, we also recommend that you use the raise-hand feature in Teams so we can more easily identify who has a question or a comment and then call on that individual to pose their question or comment.

You can also use the chat feature to alert us if you have a technical question. We would ask that you please do not use the chat to address any substantive questions. The chat will not be considered part of the official meeting record and is intended to be used mainly for handling virtual meeting logistical issues. For example, I can't hear the speakers, the slides aren't appearing for me, whatnot.

Next slide, please.

So before I turn to the meeting agenda, just a few logistical matters. This meeting is again being hosted as a Microsoft Teams meeting and is being recorded and transcribed.

Should you have trouble using the Microsoft Teams application, I would recommend that you, one, or first, use the Microsoft Teams link provided in the meeting notice as opposed to the Microsoft Teams app, or application.

Second, you can try disconnecting and trying to reconnect to this Teams meeting. Or third, as a final alternative, you can use the teleconference number that has also been provided in the meeting notice to listen to and speak during the meeting.

This public meeting has been categorized as a comment-gathering meeting, which means that this meeting is for NRC staff to meet directly with
individuals to receive comments from participants on specific NRC decisions and actions to ensure that the NRC staff understands their views and concerns.

Lastly, I would ask again that you please endeavor to limit any interruptions during the meeting. We ask again that you remain on mute until you are ready to engage the participants. Prior to addressing the meeting participants, please speak clearly, identifying yourself and any group affiliations.

The slides for this meeting have been posted on the public meeting notice site and can be found in ADAMS, and I'm going to read the ML number to you for those of you participating via the phone. The ML number for the slides within ADAMS is ML-222-13A-273. Again, that's ML-222-13A-273.

If you did not register for this meeting using the Teams webinar link, please send an email to Matthew Dennis, M-A-T-T-H-E-W dot D-E-N-N-I-S at NRC.gov, so we can document your attendance in the meeting summary. Contact information for this public meeting -- the public meeting organizing staff -- can also be found on the meeting's public notice.

And now we move on to today's agenda. Next slide, please. So today's meeting will feature
a discussion again of the NRC Draft Artificial Intelligence Strategic Plan and will be followed by a public comment-gathering period for interested attendees.

First, the NRC will be providing opening remarks -- as opening remarks, an overview of the Draft AI Strategic Plan. Next, the NRC staff will present on the regulatory purpose, AI definition, preparatory activities supporting development of the strategy, and then an overview of the strategy and next steps.

Following the NRC presentation, we will then have an opportunity for the public to provide comments and ask questions of the NRC staff. Other meeting participants are welcome to provide responses but are not obligated to do so.

This meeting is scheduled to run from 1:00 p.m. to 3:00 p.m. Eastern Time. We will try to allow for as much public input as possible, but again, we will try to adhere to the meeting schedule.

At this time, I would now like to turn the meeting over to Dr. Teri Lalain, a Deputy Division Director for the Division of Systems Analysis in the Office of Nuclear Regulatory Research, who will provide the NRC's opening remarks. Thank you.
DR. LALAIN: Great, thank you, Brett. Welcome, everybody. I'm pleased to be here today to discuss our draft strategic plan with you, and thank you for your interest in joining us today for this public meeting. Next slide, please.

Artificial intelligence is one of the fastest growing technologies globally and an area in which the NRC must be prepared to evaluate technological adoption by the nuclear industry.

As a result, there's interest, industry interest, in deploying these technologies to meet our future national energy needs. And as a modern risk-informed regulator, we must keep pace with these innovations while reducing barriers and ensuring safe and secure use of AI.

To increase our awareness, during 2021 we held three public workshops bringing together the nuclear community to discuss the current and future state. And one of the takeaways that we'll talk about today is that AI is a very general term and it can span a range of techniques and varying levels of autonomy. So it's important that we're clear in our language to achieve that common understanding when we're discussing AI.

To prepare the Agency for this future,
Research prepared the draft AI strategic plan that includes several goals that will be discussed further today. I am also joined today by Mr. Luis Betancourt, the Chief of our Accident Analysis Branch, and Mr. Matt Dennis, our Data Scientist in the Accident Analysis Branch. Next slide, please.

All right, Matt, over to you.

MR. DENNIS: All right, thank you very much, Teri. As Teri said, my name is Matt Dennis. I am a Reactor Systems Engineer Data Scientist in the Office of Nuclear Regulatory Research. And today I will be presenting an overview of the strategic plan.

So first off is the purpose. As you might already be aware, and Teri just mentioned, AI is becoming present in a lot of our -- in our personal lives and our lives across all the things that we are involved in. And the nuclear industry is no exception.

So it has the potential to transform the industry by providing new and vital insights into vast amounts of data generated during the design and operation of a nuclear facility. And in addition, these technologies offer new opportunities to improve safety and performance, operational performance, and implement autonomous control and operation.
And as a result, the industry, we are aware, is researching these applications, as Teri mentioned. And through those workshops, which we'll discuss a little bit more in a minute, we got a flavor of what is going on in the industry and what is planned for the future through research projects or other opportunities that are being undertaken.

So it's critical for us to determine how these factors are evolving and keep abreast of all of those, again, as Teri mentioned. So we need to look at what is being done. AI is being used over a wide range of nuclear power plant operations, from mining nuclear data for predictive maintenance to understanding core dynamics for more accurate reload fuel planning.

So we recognize that there is potentially a regulatory decisionmaking component to this, and we are interested in understanding the possible regulatory implications of using AI within the industry and to ensure that these technologies are developed in a safe and secure manner.

So we see this as an opportunity to shape the norms and values that enable the responsible and ethical use of AI. And we must be prepared to effectively and efficiently review and
evaluate the use of AI in NRC-regulated activities.

So next we'd like to discuss a little bit of how we envision achieving a common understanding of the term AI, and specifically, more specifically, what that means in the context of the nuclear industry.

So you'll notice on the slide we have a few key words there that are underlined in our -- in the definition of artificial intelligence. I will note that this definition is taken as a modification from the National Defense Authorization Act of 2021.

So the AI strategic plan considers an evolving landscape where computers use data and unseen behavior to construct underlying algorithmic models, draw inference, and define the rules to achieve a task. So the AI strategic plan focuses on a broad spectrum of sub-specialties, for example, natural language processing, machine learning, and deep learning, which could encompass various algorithms and application examples which the NRC has not previously reviewed or evaluated.

And so some of those words there that are underlined include emulate human-like perception, cognition, planning, learning, and communication or physical action. AI is intended to perceive real and -- both real and virtual environments, that's an important distinction here. And AI is the top level, which encompasses machine learning. And so that's why we have the definition of machine learning down there.

And the definition, again, taken from the Defense Authorization Act, is an application -- it is an application of artificial intelligence that is characterized by providing systems the ability to learn and improve, those are two key words, on the basis of data. And data is very important for any of these activities.
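As a minimal sketch of that "learn and improve on the basis of data" idea -- the data, model, and variable names below are hypothetical and purely illustrative, not drawn from the strategic plan or the statutory definition -- a machine learning model is fit to example data and then used to draw an inference on an input it has not seen:

```python
# Minimal machine learning sketch: the system "learns" a mapping from example
# data and can then be applied to new, unseen inputs. All data are hypothetical.
from sklearn.linear_model import LinearRegression

hours_of_operation = [[10], [20], [30], [40], [50]]   # example inputs
sensor_reading = [1.1, 2.0, 3.2, 3.9, 5.1]            # example outputs

model = LinearRegression()
model.fit(hours_of_operation, sensor_reading)   # improvement on the basis of data

print(model.predict([[60]]))                    # inference for an unseen input
```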
So next I want to give everyone a little bit of background information on how we have been preparing the AI -- the draft AI strategic plan, what we are learning, and how we are getting ready for the future.

So the NRC is anticipating that the industry may deploy AI technology that could require some regulatory review and approval within the next five years. And we are proactively developing the AI strategic plan to better position the Agency in AI decisionmaking.

The plan includes goals for AI partnerships, cultivating an AI-proficient workforce, building an AI foundation through AI usage, and assuring NRC readiness for AI decisionmaking. We plan to use the strategic plan as a tool to increase regulatory stability and certainty -- not uncertainty, certainty.

This plan will also facilitate communication and enable the staff to provide timely regulatory information to our internal and external stakeholders.

In developing this plan, we formed a group -- an interdisciplinary team of AI subject matter experts across the Agency, and to increase awareness of AI's technological adoption in the industry, we held the three public workshops that Teri mentioned earlier, in 2021.

There is a link to the public workshops, a hyperlink, in the text at the bottom of this slide if you would like to go look at them. We have the presentations from all of the workshop participants there.

And those workshops brought together the nuclear community to discuss the current and future state of AI. Additionally, in 2021 the NRC issued a Federal Register notice on the role of artificial intelligence tools in U.S. commercial nuclear power operations.
The insights gained from that public comment were used to develop NUREG/CR-7294. The title of that report is Exploring Advanced Computational Tools and Techniques within Artificial Intelligence and Machine Learning in Operating Nuclear Plants. The document -- it documents the current state of practice of AI tools in the nuclear industry.

And lastly, we are committing to provide -- committed to providing opportunities for the public to participate meaningfully in our decisionmaking process. As we continue the development of this plan, we plan to solicit comments from the public and feedback from the Advisory Committee on Reactor Safeguards in the fall of 2023 -- I'm sorry, 2022.

So next, we want to give some time to talk about the individual strategic goals of the strategic plan. And so on the left you see the title page of our artificial intelligence strategic plan, NUREG-2261.

It covers fiscal years 2023-2027 and establishes the vision and goals for the NRC to continue to improve its skills and capabilities to review and evaluate the application of AI in NRC-regulated activities, maintain awareness of technological innovations, and ensure the safe and
secure use of AI in NRC-regulated activities.

The AI strategic plan, as you see on the slide, includes five goals. The first goal is to ensure NRC readiness for regulatory decisionmaking. Second is to establish an organizational framework to review AI applications.

Third is to strengthen and expand AI partnerships. Fourth is to cultivate an AI-proficient workforce. And fifth is to pursue use cases to build an AI foundation across the NRC.

Those goals are listed in order of priority and are expected to be initiated during varying timeframes.

So I wanted to talk a little bit more specifically about the intent of each one of those goals. So the first strategic goal is the ultimate outcome of the implementation of this strategic plan, which is to continue to keep pace with technological innovations to ensure safe and secure use in NRC-regulated activities through existing or new regulatory guidance, rules, inspection procedures, or oversight activities.

And for this goal, the successful outcome would be providing, as needed, the regulatory guidance and tools to ensure readiness for reviewing the use of
AI in NRC-regulated activities.

The second strategic goal -- sorry. AI goals, strategic goals 2 through 5, directly support preparatory activities that culminate in successfully supporting technical readiness for the regulatory decisionmaking activities desired in goal number 1.

So goal number 2 -- goals number 2 through 5 support goal number 1. And under goal number 2, the establishment of the organizational framework would be undertaken, which ensures all aspects of the NRC are represented in the preparation for reviewing AI in NRC-regulated activities.

The successful outcome of this goal is an organization that facilitates effective coordination and collaboration across the NRC for those intended purposes of NRC use of AI in NRC-regulated activities.

Goal number 3 is strong partnerships, and those are essential to ensuring the safe and secure use of AI in the nuclear industry. The NRC is committed to engaging the industry and relevant stakeholders to maintain awareness of industry efforts and prepare for regulatory reviews.

When achieved, this goal will provide a mechanism to, one, maintain awareness of industry plans. Two, establish communication forums to discuss future
plans and regulatory needs. And three, effectively partner with other agencies on AI topics of mutual benefit.

Under strategic goal number 4, the NRC will also engage in workforce development and acquisition to ensure that the NRC staff and contractors have the critical skills required to evaluate AI, the use of AI, in NRC-regulated activities. The goal is to adopt appropriate training programs and tools to develop the requisite skills in the NRC workforce.

A successful outcome of this goal is to ensure appropriate qualifications, training, expertise, and access to tools exist for the NRC workforce to review and evaluate AI usage in NRC-regulated activities.

And then finally, for strategic goal number 5, the NRC recognizes that the establishment of a foundation in AI requires a broad swath of data science principles as a fundamental requirement for evaluating AI applications. And therefore, the NRC will build the necessary AI foundation to pursue use cases across the NRC.

This will help build organizational experience that supports future regulatory reviews and oversight activities. For this goal, a successful
outcome is one in which the NRC staff possesses an ecosystem that supports AI analysis, integration of emerging AI tools, and hands-on talent development for reviewing AI applications from the nuclear industry.

As was mentioned earlier, the public meeting notice has a link to the actual strategic plan. It is also linked here. It's also linked in the PDF of those slides. You can click on the actual NUREG to the left there. Or it's available if you search in ADAMS with the accession number ML-221-75A-206.

Next, we want to provide a little more discussion on what we mean by notional AI -- we want to discuss notional AI and autonomy levels in commercial nuclear activities. So as we're aware, AI technology provides the underlying capability for autonomous systems. While AI enables autonomy, though, not all cases of AI are autonomous. And we want to make that distinction.

So for example, many AI capabilities may be used to augment human decisionmaking rather than replace it. And so the table shown here provides a notional AI and autonomy level framework in potential commercial nuclear activities. Higher autonomy levels indicate less reliance on human intervention or interaction, and therefore may require greater regulatory scrutiny of the AI system.

AI strategic goal number 1 will assess the current regulatory framework and in part establish the appropriate regulatory requirements for varying degrees of AI and autonomy.

In the figure that you see on the screen here, with increasing machine independence, you have decreasing human involvement. And so we have named each category level one, two, three, and four, with notional AI autonomy levels of insight, collaboration, operation, and full autonomy.

It's worth pointing out that level one would be the lowest level of human-machine interaction, where the human decisionmaking is merely assisted by a machine, all the way up to level four, which would be fully autonomous, where the machine is making decisions with no human intervention.

And so our regulatory framework will need to consider a graded approach in which each one of these autonomy levels, for different systems, may be applied with varying levels of regulatory scrutiny.
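A small sketch of how the notional framework described above could be written down, purely as an illustration: the level names and the "less human involvement, greater scrutiny" ordering follow the presentation, while the short descriptions of levels two and three and the scrutiny labels are assumptions, not part of the draft plan.

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    # Notional AI/autonomy levels from the slide, in order of increasing
    # machine independence and decreasing human involvement.
    INSIGHT = 1        # human decisionmaking merely assisted by a machine
    COLLABORATION = 2  # human and machine share decisionmaking (assumed reading)
    OPERATION = 3      # machine operates with human oversight (assumed reading)
    FULL_AUTONOMY = 4  # machine decides and acts with no human intervention

def notional_scrutiny(level: AutonomyLevel) -> str:
    # Graded approach: regulatory scrutiny increases with machine independence.
    # These labels are illustrative placeholders, not regulatory categories.
    return ["lowest", "moderate", "elevated", "highest"][level - 1]

for level in AutonomyLevel:
    print(f"Level {int(level)} ({level.name.lower()}): {notional_scrutiny(level)} scrutiny")
```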
Lastly, in conclusion, we want to talk a little bit more about the next steps for the artificial intelligence strategic plan and where we can -- where you can be involved or where to look out for more information.

We'd like to discuss where we currently are with the draft strategic plan in development, and that's what we've been presenting here today at this public meeting, and our plans moving forward. Our ultimate goal is to issue the AI strategic plan in spring 2023.

Our early coordination, dialogue, and preplanning are key to increasing our regulatory stability and certainty for the industry to deploy AI technologies. This includes engagements across the federal government, industry, and international organizations.

And early engagement and information exchange supports staff knowledge management and development, which enables timely development and execution of the AI strategic plan.

To ensure stakeholder engagement, we have developed the timeline shown on this slide of our current plans for the remainder of the year, and we encourage your participation and comments.

As part of our communication strategy, we recently unveiled -- unveiled our AI public website,
making it easier for members of the public to obtain information about the staff's AI activities, such as public meetings like this, workshops, and other activities.

We recognize that there is a public interest in the potential regulatory implications of AI in nuclear activities, and we want to provide opportunities for the public to be heard.

So as you can see on the screen here, in November we plan to brief the Advisory Committee on Reactor Safeguards Joint Subcommittee on AI. I already mentioned that in spring 2023, we plan to issue the final strategic plan.

And as a reminder, this is a public meeting for comment-gathering on our AI strategic plan, so we would direct you -- you can use the hyperlink in the slides, or it is also referenced in the public meeting notice -- to go to the Federal Register notice at regulations.gov, using the docket ID number for this FRN, and submit your written public comments. So again, you can use the link here on the slides or find that in the public meeting notice.

And also the hyperlink there to the NRC AI public website is provided, as it is down at the
bottom of the slide in text form.

So that concludes our prepared comments and slides on the AI strategic plan. I will turn it back over now to Brett.

MR. KLUKAN: Great, thank you. So again, this brings us to the public question and comment portion of this meeting. I would ask individuals when they begin speaking to please identify themselves, name and any affiliation, if you so choose to list an affiliation, before you begin with your questions or comments.

The NRC welcomes comments from the public on any areas they believe are relevant to the topics we've discussed today. With that said, the NRC is particularly interested in receiving input on the following questions, and I will read those out loud for those of you on the phone.

Are there any specific recommendations for improvements to consider in the development of the AI strategic plan? What goals, objectives, or strategies within the NRC's current strategic plan should be added, enhanced, or modified in the AI strategic plan?

What are the potential near-term or far-term AI activities that the NRC should be aware of when finalizing and prioritizing the AI strategic plan or
associated supporting research? And finally, what are potential challenges the NRC should be aware of when preparing to review the potential use of AI in nuclear applications?

So again, for those of you using the Teams application or in the browser, please use the raise-hand function, and then a presenter will allow you to unmute. And then you will need to unmute yourself and then begin your comment.

For those of you calling in, the process is a little different. In order to raise your hand, press *5, again, that's *5. When you are called upon, the presenter will unmute you. You will then need to press *6 to unmute yourself within the system.

Again, that is *6 to unmute yourself within the system. And then separately, if you've muted yourself separately on your phone, you would need to unmute that as well, okay. So again, that's *5 to raise your hand and *6 to unmute yourself.

And for those participating via the app or via browser, use the raise-hand function, which is in the top right-hand corner of your screen under reactions or react. And I think it's located generally in the same area for those of you participating in the browser.
So with that said, we are ready to begin public comments and questions. So please feel free to raise your hands, and I'll call on individuals in the order in which they raise their hands. Thank you.

So James Slider, I am going to -- go right ahead, James Slider.

MR. SLIDER: Thank you, Brett. This is Jim Slider, I just want to verify you can hear me.

MR. KLUKAN: We can hear you.

MR. SLIDER: Okay, great, thank you. I'm Jim Slider from the Nuclear Energy Institute. And I have lead responsibility for our work in supporting innovations for the operating plants, including adapting artificial intelligence for use in our operating facilities.

Just a couple of questions to get the conversation started. Matt, the timeframe given for this plan is fiscal years '23 through '27, and I'm curious, you're going to be halfway through fiscal '23 before you have the final plan out.

And I'm wondering, there's nothing in this document that indicates what's going to be happening over those succeeding couple of years in the scope of this document. Can you describe or give any sort of
overview of what we can expect the NRC to focus on in the latter half of fiscal '23 out through '27?

MR. DENNIS: I can -- I will give a brief response, and then actually ask Teri to give the more fulsome answer. But yes, in the latter part of '23, once this is finalized, one of the first action items is going to be that prioritization of the organizational framework.

And so there is some mention of that with strategic goal number 1, and that is standing up either the working groups or a community of practice for AI, to support the remainder of the AI strategic plan and do the prioritization of activities within budget constraints and such.

So Teri, if you want to elaborate on that.

DR. LALAIN: Yeah. Hey, Jim. So the strategic plan years map to the NRC's overarching strategic plan. So that's where the year increments tie back to. And as Matt said, you know, we'll be kicking off, we'll be getting the pieces in place, we'll be working on each of those tasks.

So we anticipate, for example, in the external awareness one, we're going to continue with having workshops and other activities, so we'll continue having the dialogues in a public forum. And
that we're tracking together.

So all of the things will start to launch out when we officially kick off in '23. Over.

MR. SLIDER: Okay, thank you, Teri. I'm just trying to get a mental picture of what we can expect to see from the NRC as you start to move out on the actions that fall under the scope of this plan. Thank you.

MR. KLUKAN: Well, thank you, Jim, for your questions and for participating today.

Next we'll turn to Tyler Cody. Tyler Cody, whenever you're ready, please feel free to unmute yourself.

DR. CODY: Thank you, yes, I'm Tyler. I am with the Virginia Tech National Security Institute. I'm a research assistant professor, and my background is in systems engineering and where systems engineering meets artificial intelligence.

I had a comment on the question one here about specific recommendations or improvements. On -- (inaudible) pretty specific, but on line 34, page 4-2, in the strategic goal 1 section, like 4.1, the statement reads, The NRC will undertake research to develop an AI framework to determine the approach to assess areas such as, but not limited to,
explainability, trustworthiness, bias, robustness, ethics, security risks, and technical readiness of AI.

So while these topics are definitely top of mind in the computer science community and like the people who are very close to artificial intelligence, from a systems engineering perspective, I think that these various topics are akin to properties we would like an AI solution to have. And that there are other concerns which are fundamental to engineering generally, but may take on different forms in AI, that it's important to consider.

So I just want to mention two basic and related ones. And the first is test and evaluation, which is, you need to just even see if these properties are actually possessed by the AI. So it's one thing to say, oh, we have a method for explainability, but how do we even know that method's working? What kind of scenarios will we test that method in?

So under test and evaluation, there are two sorts of issues I think about, and I just want to highlight them for the sake of being a little thought-provoking. But the first is that there's this prevailing focus on using held-out but identically distributed test sets of data.

And identically distributed testing checks
that the algorithm's working, but like the learning algorithm or the AI algorithm, but it's not actually testing, say, the function approximation that it produces, right. Because the AI produces some function that tells us, okay, give me these inputs, I'll give you those outputs.

It's not actually telling us if that's going to work as intended in the variety of scenarios we're going to face during operation. So the first part is this identically distributed testing.

But the second is sort of built on that, because while the AI solutions themselves will be like these input-output components for the most part, they also have these system-level influences. So, and they create system-level effects, and, you know, system-level outcomes that we look at.

And so just the input-output, component-level testing might lack the scope to properly identify operating envelopes of AI solutions in terms of their system or the environment they're in, like what the operating conditions are, etc.

So backing out, I think test and evaluation has these dual challenges where the -- right, currently the component-level tests are insufficient. But also, the component-level tests won't be all that we need to make sure that, you know, we're producing the outcomes we want from the AI.
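A brief sketch of the gap being described here, using synthetic data (everything below is hypothetical and illustrative, not part of the comment itself): a model can score well on a held-out test set drawn from the same distribution as its training data and still degrade sharply on data from outside that operating envelope, which component-level, identically distributed testing alone would not reveal.

```python
# Synthetic illustration of the i.i.d. testing gap: the held-out test set
# comes from the same operating region as training, so it does not exercise
# the shifted conditions the component may face in operation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def simulate(x):
    # Hypothetical "true" input-output behavior with a little sensor noise.
    return np.sin(2 * np.pi * x) + rng.normal(0.0, 0.05, size=x.shape)

X_train = rng.uniform(0.0, 1.0, size=(200, 1))
y_train = simulate(X_train[:, 0])

X_test_iid = rng.uniform(0.0, 1.0, size=(50, 1))    # same distribution as training
X_test_shift = rng.uniform(1.0, 2.0, size=(50, 1))  # outside the training envelope

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

print("R^2 on i.i.d. held-out data:", model.score(X_test_iid, simulate(X_test_iid[:, 0])))
print("R^2 on shifted data:        ", model.score(X_test_shift, simulate(X_test_shift[:, 0])))
```

The first score alone would suggest the component works; only the second hints at the operating-envelope and system-context questions raised above.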
And I know I'm going on here, but briefly, the second related issue is on the life cycle, and it's very closely related, right. So AI solutions do have life cycles. But that's kind of an understudied topic. So whatever framework that's proposed should involve some plans for test and evaluation over the life cycle to sort of monitor the health of the AI, if you will.

And also address the concepts of, like, system maintenance that we have generally, right. So for the AI, what does it mean to retrain it or recalibrate it, etc. As well as retirement, you know, what is the process for retiring an AI model.

So I will concede that, you know, this is really relevant in dynamic settings. So you know, in -- in a nuclear setting, things are very constrained. So it's important to say, okay, how much variance can we expect, where will that variance come from. And that's where life cycle and test and evaluation I think should be focused.
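One simple way to picture the "monitor the health of the AI" idea over the life cycle -- the statistic, window size, and threshold below are arbitrary illustrative choices, not anything prescribed in the comment or the draft plan -- is to compare recent operating data against the data the model was trained on and flag drift that might prompt recalibration, retraining, or retirement:

```python
# Illustrative life-cycle health check: compare a recent window of operating
# data against the training data for one input channel and flag possible drift.
import numpy as np
from scipy.stats import ks_2samp

def drifted(train_values, recent_values, alpha=0.01):
    # Two-sample Kolmogorov-Smirnov test; a small p-value suggests the recent
    # data no longer look like the data the model was trained on.
    statistic, p_value = ks_2samp(train_values, recent_values)
    return p_value < alpha

rng = np.random.default_rng(1)
training_data = rng.normal(300.0, 5.0, size=5000)    # hypothetical channel
recent_stable = rng.normal(300.0, 5.0, size=500)     # same operating conditions
recent_shifted = rng.normal(308.0, 5.0, size=500)    # drifted conditions

print("flag drift on stable window? ", drifted(training_data, recent_stable))
print("flag drift on shifted window?", drifted(training_data, recent_shifted))
```

In a constrained nuclear setting the expected variance may be small, which is exactly why a check like this, run over the life cycle, helps decide when retraining or recalibration is even in scope.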
So in short, my specific recommendation is to consider an emphasis on test and evaluation and life cycle management, in addition to these pillars or these properties that are currently listed on line 34.

MR. KLUKAN: Tyler, thank you so much for participating and for your comments today. And please, no worries, the purpose of this meeting is to hear from -- from you and from others about, you know, your comments on the report and on the questions that we pose here today. So again, thank you.

And again, NRC staff, please feel free to raise your hands as well if you want to jump in at any point to respond to anything that anyone said. But next, we will turn to Ian Davis. Ian Davis, whenever you're ready, please feel free to unmute yourself and begin your comments.

MR. DAVIS: Hi, thank you, Brett. This is Ian Davis from X-energy. My questions are related to the level of industry involvement with these activities, and specifically in two areas.

One, there's a mention of an AI community of practice, and I was curious if this was an internal community or if this was going to be open to the public.

And another section talks about pilot projects and proof-of-concept projects, and I was wondering if that was also going to be entirely internal to the NRC, or if that would involve
partnerships with industry, and if so, what would that look like. Thank you.

MR. KLUKAN: Thank you for your -- for your questions. I'll turn it over to Matt.

MR. DENNIS: Thanks, yeah, I was going to -- I unmuted myself and I'll turn my camera back on, so.

Yes, in response to Ian's first question about the AI community of practice that is referenced in -- I was trying to flip back through and find the exact place in the document. I think that's in Section 4.2, somewhere around line 15: To support staff engagement, the NRC will establish an AI community of practice.

And that is intended to be comprised of internal Agency staff members. So that's really focusing on, much like we have other communities of practice for other specific technical topic areas in the Agency, for us to have a community of practice internal to the Agency for AI-related development.

So I hope that answers the first question. The second question referencing the pilot projects, that is -- and I will ask that Luis or Teri please chime in on this answer as well. But that was intended to be a pilot project with industry or other
external entity to ultimately pilot whatever may be potentially determined as a framework for assessment and evaluation.

MR. BETANCOURT: Yeah, hi, this is Luis Betancourt, I am the Branch Chief for AI. So really good question. So on the second one, the idea behind the pilot is that once we're developing the framework and we see these levels of autonomy, we want to basically have an early (inaudible) to go to that framework.

And it will be ideal for industry to let us know what are the areas that they want us to focus on for each one of those levels, so we can better tailor our priorities and better start identifying these early pilots.

Because the idea is that we need to start with baby steps, in a way, to be able to see whether or not the framework is effective, in order to provide greater stability to the industry. And whether or not this could be part of the licensing framework.

So that's the idea behind those pilots and that's what we would like to further engage with industry on. What are those things that we need to be looking at now so we can be looking at the framework.
And I see Teri just put her camera on, so, Teri.

DR. LALAIN: Yeah, I'm going to go back to the first part. So while the community of practice is going to be internal, that's to help us with all the different areas that NRC has responsibility for, for that communication and coordination inside.

One of the goals in the strategic plan is to strengthen and expand those AI partnerships. So I've mentioned that, you know, we'll be continuing with the workshops, and talking about the workshops from '21. So there will be public engagements where we can bring the community together in a public environment to have those types of continuing dialogues.

So there will be inclusion through that goal. Over.

MR. DAVIS: Thank you.

MR. KLUKAN: Thank you very -- oh, don't want to jump in there if the NRC has more to say on this topic. Okay.

MR. DAVIS: No, sorry, Brett, that was me. I was just saying thank you.

MR. KLUKAN: Okay, great, well, thank you for the questions. And again, if any of our speakers
have a follow-up, please feel free to raise your hand again, and, you know, we can go through a second round of questions or comments if anything that you hear said by another speaker or by the NRC staff sparks something.

So our next speaker will be Yadav Vaibhav. And if I am mispronouncing anyone's name, I apologize profusely in advance. But our next speaker again will be Yadav. Feel free to unmute yourself whenever you're ready and then state your name and any affiliation.

DR. YADAV: Thank you, Brett, can you hear me?

MR. KLUKAN: Yes, yes, we can.

DR. YADAV: Thank you. My name is Vaibhav Yadav. I am a Senior Scientist in the Nuclear Safety and Regulatory Research Division at the Idaho National Lab.

And my question is related, actually, to the previous question by Ian. Is there -- is -- does NRC have a structure or a mechanism to enable that, you know, industry engagement? Is there a -- you know, we're talking about 2027.

Are there any tangible objectives and goals and timelines defined, or does NRC have a vision
to, you know, have, like, a competitive engagement released in the next year or so that will kick off that type of industry engagement, have a research, development, or demonstration or use case demonstrated to achieve those objectives over the course of the next three to four years?

MR. DENNIS: I will --

MR. KLUKAN: No, go ahead, Matt.

MR. DENNIS: Sorry, Brett. I will -- I will answer that the best I can and say that the -- this is -- so the strategic plan is written to be high-level objectives with implementing actions that will come about once we finalize the plan.

And that is, to answer your question, Vaibhav, that is part of the goal, is that in strategic goal -- sorry, I always have to go back and look at these to remember which one's which. Strategic goal number 5, with the pilot applications.

There would be an -- so one of the first steps of the pilot engagement would be, as Luis mentioned, we first need to find out what we want to go after first and what is the area that -- or what is the use case, the area, the application, all of that kind of stuff that would make a good pilot.

And then of course the follow-on to that
is planning it and scheduling it. And there is precedent for this, for pilot analyses that have been done for other applications. So this isn't new territory for us.

However, we are just starting this, and so as part of this five-year plan, which would come about in hopefully the next fiscal year, is that we would start to coalesce on what kind of pilot we would want to look at, what framework.

We've also, as we mentioned, we've had the three previous data science and regulatory applications workshops, and we kind of got some indication of what might make a good application. But again, we would need engagement to further hone in on that. And we are planning to have another biannual workshop next year, so in 2023.

And so that would also be supporting strategic goal number 5 in that area of the use case development. And so again, to kind of summarize that, yes, there would be something within that -- within the horizon of our strategic goal fiscal years to look out and have identified a pilot project, what that's going to be, and then go about the minutiae of implementing something like that.

MR. KLUKAN: Thank you, Matt. And again,
Yadav, thank you for the question.

Before we go to our next speaker, I just want to remind everyone, because we do have several people participating via phone, that in order to raise your hand, press *5. Again, if you are participating via phone and you'd like to speak or, you know, make a comment, pose a question, please press *5, and that lets me know that you'd like to, you know, to speak during the meeting.

So our next speaker will be Catherine Buchaniec. And I again apologize if I'm mispronouncing your last name. Again, our next speaker will be Catherine Buchaniec. So whenever you're ready, feel free to unmute yourself and state your name and any affiliation.

MS. BUCHANIEC: No, you actually got my name 100% correct, you're good. My name is Catherine Buchaniec, I'm a reporter with Defense News, where I cover artificial intelligence.

And so my question is, in the draft framework, it said that the NRC is committed to ensuring that the use of new technologies is safe and secure, and increasing AI, with like any organization or mechanism, can lead to increased risks.

And so I'm just wondering what
coordination, if any, is being done with, like, the country's national security agencies or departments to address any security concerns with the strategic plan, or creating kind of like an across-the-board approach or framework for ethical AI. Just, what's going on with cooperation or coordination?

MR. KLUKAN: Thank you for the question. I'm just verifying who from the NRC -- all right, Matt, it looks like you turned your camera on first, so we'll go to you.

MR. DENNIS: There we go, I'm unmuted. I'll let Luis take it. I was letting him get his camera back on. But I will mention that strategic goal number 3, in response to the question, discusses strengthening and expanding AI partnerships.

And that is, again, as this is a kickoff of our AI strategic plan, that is a component across our engagement -- across the federal agencies, international counterparts, and within industry through our memorandum -- our formal research and development activities that are -- I should say our formal research activities through NRC MOUs, memorandums of understanding. We do have some that exist.

Also, participation with other activities
that are already ongoing in the federal government, where the NRC has representatives on those activities at other federal agencies who are implementing the federal response to AI planning.

So that is -- that is already thought about a little bit, and more will come within that strategic goal as this is -- as the AI strategic plan is implemented in the next fiscal year. So I'll let Luis answer some more.

MR. BETANCOURT: No, Matt, I think you answered basically what I wanted to mention. But to answer your question, Catherine, yes, we are engaging with other federal agencies at the moment, trying to learn about how they're deploying basically safe and secure uses of AI.

At the same time, we're not the only agency, and not just in the nuclear industry, that is facing similar challenges. The other international regulators are basically in the same situation as we are, and we are talking with the international community to better understand how we can better ensure the safe as well as secure use of these technologies in our regulatory activities.

MR. KLUKAN: Well, thank you, Matt and Luis. And Catherine, thank you for your question.
And again, if you'd like to pose a comment or raise a question, raise your hand within the Teams app or the Teams browser. If you're having difficulties with that, please let me know within the chat window, and then I can -- that lets me know that you'd like to be called upon.

And again, if you're participating via phone, it's *5, again, that is *5 to raise your hand.

So thanks again, Catherine. Our next speaker will be Dan Solitz, again, that is Dan Solitz. Feel free to unmute yourself whenever you're ready and state your name and the affiliation.

MR. SOLITZ: This is Dan Solitz, I'm a private citizen. My background was the Naval Nuclear Propulsion Program, and then later on I went and worked in a particle board plant.

The purpose of my question is to try to get a better understanding of where you're going with this AI. And let me go back to the particle board plant. It was an analog system. The press had 14 openings and a cold place where coolant was -- solder (inaudible) were pushed in and then pressed then pulled out.

And before they were pressed, the operator made sure the push arm was clear and all the cold plates had been cleared out. And then the operator
pushed the press button. They figured out that they could save a -- they could probably do two to three more loads a day if they just relied on the relays that operated all this equipment to go ahead and close the press.

And they didn't do it in my plant, they did do it in another plant, and I watched it -- watched it happen. It made me very nervous.

So moving on from that to a nuclear power plant, are you considering something like calculating rod worth and fuel depletion and then automatically withdrawing the rods to -- to compensate for that? Is that the kind of use of artificial intelligence you're thinking about?

MR. KLUKAN: Dan, thank you for the question. I think Luis is going to answer for the NRC.

MR. BETANCOURT: Hey, thank you, Dan. So at the moment, like, what I would refer you to, like, in the data science workshops, you can see where the industry is currently using AI applications at the moment. At the moment right now, the majority of applications are under assisted regulatory programs where they can -- they can use AI within their activities.
Some of the discussions that you mentioned are about different levels of autonomy, and those are the areas that we need to be working on with industry to figure out what they want to use AI for, as part of their licensing strategy as well as oversight.

At the moment, we have some idea of where they want to -- they want to use it, but it isn't clear at this point which application is going to come into that (inaudible), and that's why we need to engage with industry to decide which area we want to tackle first as part of our regulatory activities.

I hope that answered your question, Dan.

MR. SOLITZ: Well, if you could give me a range of things you're considering, it would be helpful.

MR. BETANCOURT: A range of where the industry wants to use it? At this time, like, that's why we need to engage with industry. Because some of the areas that you mentioned are under regulatory activities, and I cannot speak on behalf of industry as to where they want to use AI.

MR. SOLITZ: Is there anyone from industry on the call here that could?

MR. BETANCOURT: I think, like, since this is like an NRC call, (inaudible), correct me if I'm
wrong, Brett, the industry cannot answer that question. Like, the NRC is the only one who can provide that answer, unless they want to provide that voluntarily.

MR. KLUKAN: Yeah, Dan, thank you very much for the question. Recognize you're interested in -- you know, from our perspective at the NRC, we're collecting comments at this time. The question you asked, as Luis has indicated, is really one, you know, of, you know, the industry figuring out for itself how it wants to use this AI.

So again, if there are any speakers who would like to, you know, from industry, who would like to address that comment, please feel free, you're welcome to do so during your portion to speak. So but again, Dan, thank you very much for the question.

We'll next actually turn to Jim Slider again. Again, that is Jim Slider, whenever you're ready.

MR. SLIDER: Thanks, Brett. Yeah, this is Jim Slider from NEI. Thank you. Just a follow-up question to some of the previous comments in the last few minutes.

On the question about levels of autonomy, I don't know if this goes to Matt or Luis, but I'm
curious, the table that you showed implies that there's one criterion that goes into defining those levels.

And I think that the answer is more complicated than that, and I'm just wondering if you have any thoughts at this point about how -- how you might be distinguishing, say, levels two and three in more ways than just that description of -- of autonomy versus independence and so forth, as you showed on your chart.

And if you don't have an answer to that question, I'm just curious where in the timeframe of this plan you would expect to flesh out those criteria that distinguish, say, AI applications at level two, for which you might require some regulatory review, from, say, levels -- AI level three applications where it might not be -- might not fall under NRC's aegis.

I'm just curious if you have any thoughts on that or when that would be defined.

MR. DENNIS: Yes, I'll -- this is Matt Dennis again, I'll take -- I will answer that question to the best of my ability. And first say, Jim, I think you make a -- you raise the question that we have ourselves, and that is, again, the point of what we're going to do within the scope of the strategic
plan is to better define the level two to three bleed-over.
2 And I think I said this a couple times in 3
other presentations, and I'll say it again here, I do 4
agree that the -- the difficulty for us is figuring 5
out what that is and it's at level between 6
collaboration and operation. So your question or 7
comment is very pertinent, and I agree with you that 8
that is an area where we don't have an answer at this 9
point. And we will have to figure that out.
I will -- the one comment I will make in response to your question is I do -- I do not think that the table -- and that's a good -- if that is your comment, so maybe I need some clarification on the comment, is that the communication of the table -- is the table communicating that there's a single criterion.
And if that's the intent, or if that's the comment that you're providing, I don't think that that was our intent, and that the table should be more nuanced than that. There is not a single criterion based on the distinction between a human making a decision vice a machine making a decision.
So it is not as clear-cut as that, and there will be multiple components to defining what that tipping point is from level two to level three.
1 So I guess one -- my returning for just a 2
clarification is, is the comment that the table 3
conveys that there's just a singular criterion?
4 MR. SLIDER: That's right, Matt, that 5
fundamentally that's my question. And the answer I 6
hear you saying is you agree it's more complicated 7
than that. And that's fine. The table is very 8
instructive and very helpful as a place to begin the 9
conversation. I just wanted to make sure that you 10 were thinking about it multi-dimensionally.
11 Matt, if I could ask a follow-up question 12 on your slide 10, on the second bullet, where you 13 talked about the interdisciplinary team of AI subject 14 matter experts across the Agency. I'm curious what --
15 what other disciplines besides expertise in AI are 16 important to the success of that team?
17 MR. DENNIS: Thanks for the question.
18 Yeah, the interdisciplinary team of AI subject matter 19 experts across the Agency was engaged with social 20 scientists, human factors, regional representatives 21 such as inspectors, and a variety of others. So other 22 engineering disciplines or other degreed disciplines, 23 so computer scientists, traditional nuclear engineers, 24 data scientists, computer engineers, application.
46 So we have a wide -- we tried to capture 1
-- and people also expert in safeguards and security.
2 So we tried to, in that interdisciplinary team, 3
capture a lot of the subject matter expertise in a 4
variety of areas across the Agency that have some 5
engagement with artificial intelligence so that it 6
wasn't purely just reactor operations or traditional 7
nuclear engineering.
8 So we tried to look across the Agency 9
where AI might be implemented across all of the 10 offices, and then get people involved in that 11 interdisciplinary team that represented a variety of 12 stakeholders internal to the Agency, as well as 13 experienced expertise in a variety of different areas.
14 MR. SLIDER: Okay, thank you. Yeah, it 15 sounds like you really wanted to touch all of the 16 different functional areas across the Agency. And I 17 just wanted to clarify, thank you. Thank you, you 18 answered my question.
I have more questions, but I'm going to yield the floor to Tyler or others who have raised their hands and come back later. Thank you.
22 MR. KLUKAN: Well, thank you, again, Jim.
23 Just a reminder before we go to Tyler is 24 that if you are on the phone, please feel free to 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
47 press *5 to raise your hand, again, that is *5. And 1
again, if you're having any troubles with the Teams 2
app, it can happen, you know, please feel free to let 3
me know in the chat. Again, just for technical 4
difficulties with the software. And then I can know 5
to call on you based upon that.
6 So next again, we will go to Tyler.
7 Whenever you're ready.
8 DR. CODY: Thank you. Yeah, again, Tyler 9
Cody here, research assistant professor with Virginia 10 Tech National Security Institute.
11 I had a comment for the third question 12 here about potential near-term and far-term AI 13 activities. I just want to bring awareness to this 14 group that the systems engineering research community 15 has a growing and active interest in systems 16 engineering methods and best practices for AI and AI-17 enabled systems.
18 And that these lines of research are 19 directly concerned with the engineering process. So 20 like all the way from need analysis through 21 decomposition to components and recomposition to a 22 system, deployment maintenance, retirement, etc. And 23 so in the near term, I just, there is this ongoing 24 work that I think will address a gap in the literature 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
as well as in the research community and, increasingly, the workforce.
2 Because currently there's a lot of focus 3
on the AI algorithms or on those properties like 4
explainability, trustworthiness, privacy, etc. But 5
just not as much research into like what's the engineering process that, if followed, will produce the AI solution, you know, that fits our requirements and our needs.
9 So I think that's a near-term area that is 10 being worked on elsewhere and is very relevant to this 11 community. And in the farther term, the same 12 community is looking at the use of digital models as 13 part of digital engineering and model-based systems 14 engineering activities.
15 And those activities will expand to AI-16 enabled systems.
Which basically means that 17 increasingly there'll be digital processes for 18 verification and validation and accreditation of AI 19 solutions using things like digital twins, and model-20 based systems engineering to, you know, to make use of 21 those digital twins to test the performance, to test 22 against requirements, you know, to vary the conditions 23 in those virtual environments.
24 So I just, yeah, I wanted to mention the 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
49 Department of Defense is funding this research, and 1
they have a growing portfolio of this research. And 2
so with respect to strategic goals 2 and 3, I'd really 3
suggest trying to form some connections with the 4
systems engineering research community.
5 Thank you.
6 MR. DENNIS: This is Matt Dennis again 7
from the NRC, and I know there wasn't necessarily a 8
question in that from Tyler, but I would like to 9
elaborate on an issue that he brings up that I think 10 is very pertinent to the strategic plan, and the 11 components of what we envision going forward. And 12 that is that standards development is quite crucial, 13 and has been at the agency for a long time.
14 The agency does look towards standards 15 development organizations as a cooperative, and 16 collegial way to have applicable standards that the 17 NRC can endorse in certain areas, and we find that 18 very successful. So, as a component of the strategic 19 plan, there is a recognition that we will participate, 20 and potentially endorse standards that are applicable 21 to artificial intelligence, or other areas.
22 So, it is a very pertinent comment, and 23 one that we are very much interested in, and will be 24 a component of the strategic goals. Specifically, 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
50 probably strategic goal number three, that's the 1
closest thing to expanding our partnerships, which do 2
include standards development organizations. And then 3
lastly, real quick, maybe not an answer, but just a 4
little elaboration on the digital twin aspect.
5 The agency does have quite an initiative 6
within its Office of Research on digital twins. We 7
have done a number of workshops on the issue related 8
to digital twins for nuclear applications. And so 9
while that is its own standalone activity within the NRC, there is a nexus, an overlap, because there is an obvious artificial intelligence component to digital twinning, and model-based engineering.
13 So, that is something that we are engaged 14 with. That particular group, I will direct you to go 15 to the NRC website, and find out more information 16 about the digital twin activities that have been 17 happening at the agency.
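As a purely illustrative aside to Dr. Cody's point about testing AI against requirements in virtual environments, the sketch below trains a toy surrogate model on simulated data and then re-checks it against a requirement as the simulated conditions are varied. The "plant" response, the requirement value, and every name here are assumptions made up for the sketch; they do not come from the strategic plan or from the NRC's digital twin work.

```python
# Illustrative sketch only: a toy stand-in for digital-twin-style testing of an
# AI surrogate against a requirement under varied conditions. All names,
# thresholds, and the toy "plant" model are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(0)

def twin_response(power, noise_std, drift):
    """Toy stand-in for a digital-twin simulation: steady-state coolant
    temperature (deg C) as a linear function of reactor power (%), plus
    sensor noise and a slow calibration drift."""
    return 280.0 + 0.5 * power + drift + rng.normal(0.0, noise_std, size=power.shape)

def fit_surrogate(power, temperature):
    """Train a minimal 'AI' surrogate (ordinary least squares) on twin data."""
    X = np.column_stack([np.ones_like(power), power])
    coef, *_ = np.linalg.lstsq(X, temperature, rcond=None)
    return coef  # [intercept, slope]

def predict(coef, power):
    return coef[0] + coef[1] * power

# "Pre-deployment" training under nominal conditions.
power_train = rng.uniform(20, 100, 500)
temp_train = twin_response(power_train, noise_std=1.0, drift=0.0)
coef = fit_surrogate(power_train, temp_train)

# Requirement assumed for illustration: mean absolute error below 2 deg C.
REQUIREMENT_MAE = 2.0

# Vary conditions in the virtual environment and re-test against the requirement.
for noise_std, drift in [(1.0, 0.0), (1.0, 3.0), (4.0, 0.0), (4.0, 5.0)]:
    power_test = rng.uniform(20, 100, 500)
    temp_test = twin_response(power_test, noise_std=noise_std, drift=drift)
    mae = np.mean(np.abs(predict(coef, power_test) - temp_test))
    status = "PASS" if mae <= REQUIREMENT_MAE else "FAIL"
    print(f"noise={noise_std:.1f}, drift={drift:.1f} -> MAE={mae:.2f} deg C [{status}]")
```

Under the noisier and drifted conditions, the same model that passed at training time can fail the requirement, which is the gap that a purely pre-deployment accreditation would miss.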
18 MR. KLUKAN: Thank you Matt, and again, 19 thank you Tyler for your comments, and for just 20 participating in the meeting today. Our next speaker 21 will be Marty Murphy, again, that is Marty Murphy.
22 Whenever you're ready. Please feel free to unmute 23 yourself, state your name, and any affiliation. So, 24 I'll turn it over to you Marty.
51 MR. MURPHY: This is Marty Murphy with 1
Curtiss-Wright Scientech. So, I think this pertains 2
to item number four for potential challenges. And my 3
question really goes to exactly how is the NRC going 4
to consider the generation of new staff positions as 5
it develops, and works through the implementation of 6
the strategic plan? Especially for those applications 7
that don't require specific NRC approval via 8
amendment, or specific review?
9 And what I'm talking about is inspection 10 procedures, I didn't exactly see within the strategic 11 plan where that was considered, or the use of say CRGR 12 was going to be applied to products that the NRC 13 produces that don't typically go through the committee 14 to review generic requirements.
15 MR. KLUKAN: Thank you very much for the 16 question, I'll turn it over to the NRC staff.
MR. BETANCOURT: Hey Marty. I believe we did mention the possibility for us to evaluate our variable framework. Because like you mentioned, there are areas where there are inspection procedures that we may need to update. And I think that's an area that we need to talk about initially first to be able to start identifying -- okay, which guidance is available.
52 It's insufficient at this point, that's 1
one of the reasons that we're engaging with our 2
reactor inspectors, to figure that out. But you bring 3
up a really good point, so I'll take note on that, and 4
see if we can also do better, refine that aspect. But 5
one of the things that we need to do is to do a potential gap analysis of our regulatory framework.
7 Do we need new guidance?
8 Do we need new special procedures? And 9
that's an area that we need to talk over initially.
10 I hope that answered your question Marty.
11 MR. MURPHY: Yeah, thanks very much Luis, 12 I appreciate it. And I think it's great that the NRC 13 is proactively developing this plan, and engaging with 14 its stakeholders to get that kind of feedback.
15 Because like I said, I do think there's the potential, 16 and it would be great to have that openness, and 17 transparency as we move forward with this. So, thank 18 you very much.
MR. BETANCOURT: Yeah, and one thing that I forgot to mention, on goal number three, which I think is the one on cultivating our workforce, one other thing that we want to do is to have our staff be more AI savvy. Because there's not many people at the NRC that are knowledgeable about AI, and one of the
53 things that we need to do is to train our staff, 1
including the resident inspectors. So, that's an area 2
that we can also be talking about eventually.
3 MR. MURPHY: Yeah, thanks very much.
4 MR. KLUKAN: All right, Marty thank you 5
again for the questions, and for participating today.
6 Our next speaker will be Brian Golchert. Brian 7
Golchert, whenever you're ready, please state your 8
name, and any affiliation.
DR. GOLCHERT: Good afternoon, my name is Brian Golchert, I'm with Westinghouse. I have some concerns about the long range of NRC regulation. It's a broad topic, AIML, whatever you want to call it, it covers a lot of things. And from a practical point of view, I'm worried that any future regulations will, I'll use the term, what-if us to death, that it will be cost ineffective for us to use these tools if you say do this model, check that model, do these uncertainty studies.
19 I was just wondering for the strategic 20 vision, if there had been thought to have a more 21 intimate relationship with the potential applicants to 22 have discussions on future regulations, future 23 guidance to say you can't ask us to do this, it'll 24 kill us cost effectively. Has that been considered as 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
54 part of the strategic plan?
MR. BETANCOURT: Yeah Brian, that's an area where we cannot be a barrier, to your point if
industry wants to use these technologies, we need to 4
be able to enable these technologies. To your point, 5
we have been thinking for example, that we need to do 6
confirmatory analysis in order to have industry 7
provide us the data, yes, or no. I don't see AI being 8
different than other types of technology.
9 Where the system framework can be used to 10 get it out the door, or we need to have that 11 discussion with industry to -- what are the things 12 that we really need to require for AI applications?
13 Because this is a new area that is constantly 14 evolving. So, to your point Brian, that's something 15 that is in the back of our minds while we're 16 developing this AI strategy.
17 DR. GOLCHERT: Thank you.
18 MR. BETANCOURT: Good question.
19 MR. KLUKAN: Thank you again for the 20 question. And again, just as a reminder to everyone, 21 if you'd like to pose a comment, or raise a question, 22 raise your hand within the Teams app, or the Teams 23 browser, or press star five on your phone if you're 24 participating via phone. We will next go back to Jim 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
55 Slider.
1 MR. SLIDER: Thanks Brett. Just to follow 2
up on Brian Golchert's last comment Luis, I just want 3
to exhort you, and Matt, and Terry to please engage 4
industry early, and often as this work unfolds. I 5
think what I hear in Brian's comment, and others is a 6
high level of interest in working with the NRC 7
collaboratively to find an appropriate level of 8
regulatory scrutiny that assures public confidence in 9
these tools.
10 But at the same time, doesn't kill them in 11 the crib. Because they do offer the prospect of 12 improving safety in our operating plants, and we very 13 much want to do those things that enhance safety, but 14 it's important that the level of regulation be 15 appropriate, and fitting for the technological 16 challenges that these represent.
17 DR. LALAIN: Agree Jim, and that's why 18 engagements like this to get comments today, the 19 workshops that we had last year, the future workshops, 20 all of those engagements are key, so that we keep the 21 communication going, and with that awareness, can work 22 for that future together. So, point's well taken, 23 thank you.
24 MR. SLIDER: Thank you. A follow up 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
56 question on the five year horizon that's posited at 1
the beginning of this plan. On the one hand I would 2
urge the NRC to be as prompt as possible in getting 3
down the road, and developing the regulatory guidance.
4 Because the feedback I've gotten from my industry 5
stakeholders is that in the absence of clear 6
regulatory guidance, that they're unlikely to tackle 7
AI applications that require NRC review.
8 So, you'll see -- but on the other hand, 9
that's also a choice to help industry build experience with AI applications on business processes, for example. Like AI applied to the corrective action program, and so forth. But we would like, as quickly as we can, to begin to extend the reach of these AI tools for the benefit of safety, and it's going to be important for the NRC to get its regulatory guidance out in a timely way that helps industry to understand what the challenges are -- the regulatory challenges are, and move ahead with those more ambitious applications.
20 MR. BETANCOURT: That's a great question.
21 I think that's where the table on the levels of 22 autonomy will come into play, where you like 23 mentioned, the majority of the AI applications that we 24 have been seeing are being used within the system 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
57 regulatory framework (inaudible) but if industry, if 1
we can know a sector of industry, where do they see 2
the first applications, we can have that conversation 3
just on the levels of autonomy on the table.
4 That can help us to better develop that 5
guidance. So, yeah, that's the reason that we wanted 6
to have this strategy out the door, so we can start 7
having this conversation with industry.
8 MR. SLIDER: Thanks Luis.
9 MR. KLUKAN: There we go. Jim, thank you 10 again for your comments thus far in the meeting. Next 11 we're going to turn to Bradley Fox. Again, that is 12 Bradley Fox, whenever you're ready, state your name, 13 and any affiliation.
MR. FOX: Hey there, thanks for the questions. I'm Brad Fox, I'm the CEO, and co-founder of NuclearN.ai, and we actually have, I guess what would be categorized as level one solutions deployed at operating plants across the U.S., and Canada right now. My only comments here would be to be cautious of the verbiage "could impact." I think it would be good to clarify that a little further.
Kind of looking back to my years in engineering, when we had that word "could," there was a wide variety of what that could mean, and how that was
58 interpreted by the inspectors at the plants, or 1
various individuals, and that led to evaluations, and 2
discussions. So, I think looking at how that would be 3
interpreted up front would be great. I'd also love if 4
you could look into impacts of AI on software quality 5
assurance.
6 And the different ways that might interact 7
with the requirements for SQA at plants. Because we 8
do get a lot of SQA impact questions from the 9
operating facilities, and the folks involved in SQA 10 there. That's it from me.
11 MR. KLUKAN: All right, well thank you.
12 I think the NRC staff, here comes Matt.
13 MR. DENNIS: Yeah, I did have a clarifying 14 question back at Brad just to get a better sense of 15 what the comment was. Because we appreciate the 16 comments, and I want to get a good handle on what 17 exactly, with a little more specificity, if you can 18 provide it, when you're mentioning, are you just 19 talking about the language of the word could in 20 general about what we're discussing?
21 Or specifically something within the text 22 of the strategy at this point? And the only reason I 23 ask is because if you had a specific thing where it's 24 line 14 through 20 of something talks about this, and 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
59 you have a comment about that, or if it's just a 1
general wanting more specificity about applications, 2
or guidance, that would be appreciated, so thanks.
3 MR. FOX: Yeah, thanks Matt. I think it 4
would be the difference between your level one, and 5
your level two definitions in the table you showed a 6
couple slides earlier. I think we have product 7
running in level one right now, and you could 8
theoretically make a case that a lot of systems that 9
plants have could affect safety, from everything from search algorithms that might exist today, to corrective action program AI, things like that.
12 And saying things like could affect plant 13 safety can be carried out, or interpreted very 14 differently by whomever is looking at that, or later 15 evaluating that definition, and that classification so 16 to speak. So, clarity around there is always really 17 good. If there's guidance, in engineering world 18 there's NUREGs, and things that would help us 19 understand what that means.
20 MR. DENNIS: Okay, thank you, that 21 clarifies it for the NRC staff I think, and we 22 appreciate it.
MR. KLUKAN: Thank you again for participating in the meeting, and for raising that
60 comment. Next speaker will be Ross. Ross, whenever 1
you're ready, please feel free to unmute yourself, and 2
state your name, and any affiliation.
3 MR. MOORE: Ross Moore, Oklo. I just want 4
to make a quick comment slash question I guess with 5
regard to the table. When I look at the table, and 6
the levels of notional AI, and autonomy levels, level four is described as machine decision making with no human intervention, which I think potentially blends automation with AI. Where you can have an automated process that controls machine operability without actually integrating, or incorporating AI characteristics.
13 And I'm curious whether that's the intent 14 of the strategic plan, or if perhaps the strategic 15 plan -- or if there is a potential way to delineate 16 within the strategic plan, the difference between AI, 17 and automation for which the NRC already has a 18 regulatory framework surrounding.
MR. DENNIS: I'll take a first crack at the interpretation of this, and I think it is a very good, and appreciated point, which we can go back, and make some improvement on. That the table is not, and rightfully so, should not be conveying that there is a similarity between automatic operation, and
61 autonomous operation. So, there is a distinction, and 1
there is a component that autonomy -- and I think I 2
kind of mentioned this when I was presenting.
3 And that is that while AI enables 4
autonomy, and I think that's what you were mentioning 5
in your comment, is that not all uses of AI are 6
necessarily autonomous, or vice versa. So, the intent 7
of the table is very much, because this is the AI 8
strategic plan, is to not be focused on the 9
traditional automatic operation of systems that we do, 10 as you pointed out, have a framework in place at the 11 NRC for automatic safety systems, and actuation, and 12 control, and logic, and all of that.
13 But this is layering on top of that the AI 14 component, which may have other consequences. So, I 15 think that is a fair comment, and we appreciate the 16 comment to go back, and make some clarifications on 17 the table's intent.
18 MR. BETANCOURT: Okay Ross, good comment.
19 I don't know if you saw under page 1-2 at the end, as 20 well as 1-3, we did recognize that autonomy, and 21 automation are quite different. So, we didn't want, 22 to your point, to reinvent the wheel for everything, 23 that has already been under the system framework. But 24 we'll take a look back at the table, but there's a 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
62 footnote on number three that acknowledges that 1
automation is different than autonomy.
2 MR. KLUKAN: Thank you Ross, again for 3
your questions, and your comment. We will now turn 4
again to Jim Slider, whenever you're ready, feel free 5
to unmute yourself.
6 MR. SLIDER: Thanks Brett. Just, I'm 7
eager for Matt to make all of this scope of work 8
happen immediately, and I say that with tongue in 9
cheek Matt. But whatever the agency can do to 10 accelerate the work that is called for in this plan 11 is, I think important, and beneficial to NRC, and the 12 industry. In that spirit, I would like to ask, in 13 section four the plan mentions working with other 14 federal agencies in the draft to inform the drafting 15 of AI standards, and guidance documents.
16 And I'm curious what other federal 17 regulatory agencies have experience with AI that is 18 relevant to the NRC's mission. I wonder if you might 19 be able to comment on that.
20 DR. LALAIN: Hi Jim, it's Teri. So, NIST 21 has an AI standards working group that has brought 22 together a broad range of participants across the 23 federal government from the implementing agencies like 24 the DOD, to the regulatory agencies, and in that forum 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
63 there's a lot of conversation that is happening on 1
what would the standards look like, how would AI be 2
implemented?
3 And it's an opportunity for all the 4
agencies to connect, and talk about their specific 5
roles, and aspects as AI is used more broadly across 6
the federal government. So, that is one of the forums where we, organizationally, get those engagements.
9 MR. SLIDER: So, Teri, just to pick up on 10 that, for clarification, so if the NIST group, or NIST 11 publishes a standard on say explainability, you would 12 see the NRC as -- the NRC would take that guidance on 13 what explainability means, and how you demonstrate it, 14 et cetera, et cetera, and interpret that in the NRC's 15 domain of safety regulation, is that how that would 16 flow down?
17 DR. LALAIN: So, to be determined. I'm 18 sorry that's not a more direct answer to your 19 question. We're waiting to see what comes out of that 20 forum, because ultimately we need to look how it's 21 going to work for our industry as well, and we've got 22 the breakout process, and everything else that we're 23 going to have to go through. But to the question 24 about getting that cross organizational input, that's 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
64 one of the forums that we have there.
We also have our MOU with DOE, which is another forum for us to have conversations about things that are happening there.
4 MR. SLIDER: All right, thank you.
5 DR. LALAIN: Yeah.
6 MR. KLUKAN: Jim, thank you again. Next 7
up will be Tyler.
8 DR. CODY: Thanks. This time I have a 9
comment on the fourth item about potential challenges, and it relates to the question of where to place regulations, I guess. It's a bit academic, but it's pretty important. Regarding operations, there are many no free lunch theorems of statistical learning theory that suggest no single model can be optimized for all conditions at once.
16 So, the concept of test, and evaluation as 17 a pre-deployment activity, or accreditation as an 18 exclusively pre-deployment activity is in pretty 19 strong conflict with the first principles behind 20 machine learning solutions. So, basically as 21 conditions change, for example between operations, or 22 as platforms degrade, or with changes in use, if there 23 is a material difference in the data that flows 24 through the model, then the performance of the model 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
65 is expected to change.
1 And if it was well fit originally, then by 2
definition it has to be worse fit now, because it was 3
well fit to conditions that are no longer occurring.
4 So, this all goes to suggest that domain adaptation, 5
like the fact that we have to update our models is the 6
rule, not the exception. And that so called universal 7
models, for example general purpose vision models are 8
the exception, not the rule.
9 So, I think when you think about that 10 table, and the criterion for what level of autonomy 11 are we at, there's other things to consider like is 12 this thing going to need to adapt over time? Is this 13 a stationary environment, or not? Meaning is the 14 distribution of the data changing over time, is this 15 thing doing online learning? And all those little 16 check boxes sort of back you into the quadrant of how 17 worried do I need to be about unexpected changes in 18 the behavior of my model?
19 Because some technician tightened 20 something somewhere, does that need to be folded in to 21 how -- do you need to incorporate the policies for how 22 you change the physical system with the policies for 23 the AI? And do those things need to be coupled, et 24 cetera, I think come down to those kinds of analysis.
66 Again, just a comment, but I hope thought provoking.
1 Thank you.
2 MR. DENNIS: I will just -- this is Matt 3
Dennis again from the NRC. I will just confirm that 4
that is a -- thank you for the comment, and I think it 5
goes along with your previous comment on system 6
maintenance, and online learning, which we took note 7
of before, but I think is a good one, and we 8
appreciate it, thank you.
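Purely as an illustration of the monitoring that Dr. Cody's comment points toward, the sketch below compares the data flowing through a deployed model against the data it was trained on, so that a material difference can trigger a review or a model update. The feature, the threshold, and the names are assumptions made for this example; a two-sample Kolmogorov-Smirnov test is only one simple way to make the comparison.

```python
# Illustrative sketch only; the data, threshold, and feature names are
# assumptions made up for this example, not part of the strategic plan.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

# Stand-in for the data a model was originally fit to (stationary conditions).
training_feature = rng.normal(loc=300.0, scale=2.0, size=2000)

def drift_alarm(operational_feature, reference, p_threshold=0.01):
    """Flag a material difference between operational data and the training
    reference using a two-sample Kolmogorov-Smirnov test. A low p-value
    means the distributions differ, so the original fit may no longer hold."""
    result = ks_2samp(reference, operational_feature)
    return result.pvalue < p_threshold, result.pvalue

# Same conditions as training: no alarm expected.
same = rng.normal(loc=300.0, scale=2.0, size=500)
# Degraded or changed conditions: the distribution has shifted.
shifted = rng.normal(loc=303.0, scale=3.0, size=500)

for label, batch in [("nominal batch", same), ("shifted batch", shifted)]:
    alarm, p = drift_alarm(batch, training_feature)
    print(f"{label}: p-value={p:.4f}, drift alarm={alarm}")
```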
9 MR. KLUKAN: Thank you again Tyler. And 10 I apologize for those little pauses, I just want to 11 make sure I give the NRC staff an opportunity to jump 12 in before I thank you myself. So, thank you again 13 Tyler. Next up we'll have Ian Davis. Ian Davis, 14 whenever you're ready, feel free to unmute yourself, 15 and again your comment, or question.
16 MR. DAVIS: Thank you. Ian Davis, again 17 from X-energy. I also wanted to address potential 18 challenges in line item number four there. INPO 19 regularly releases reports talking about operator 20 induced events, and I can foresee a conflict coming 21 down the road where AI may have a goal of reducing 22 operator induced events further by removing the human 23 from the loop to some degree.
24 And that -- to do so, we must figure out 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
67 how we can create trustworthy AI. So, there's going 1
to be a conflict of whether, or not we trust the human 2
more, which we know the human is also infallible, or 3
we trust the AI more, and I think focusing on how to 4
quantify, and evaluate the trustworthiness of both the 5
human, and the AI is going to be really important to 6
make a decision about whether, or not that application 7
is acceptable, thank you.
8 MR. KLUKAN: Well, thank you Ian, for that 9
comment, again, we really appreciate it. Wasn't sure 10 if Luis was trying to jump in, or not. But next we'll 11 turn to Brian Golchert. Brian Golchert whenever 12 you're ready.
13 DR. GOLCHERT: This is Brian Golchert from 14 Westinghouse. And actually to follow up on the trust 15 issue, looking at level one, we're looking at the 16 possibility of using AIML optimization algorithms to 17 design new products, or improve existing products. If 18 we go back to the old fashioned method of using --
19 sorry. If we use the new method of using AIML to 20 design it, but then go back to the old method of 21 testing the final product, are we going to be expected 22 to submit these AIML?
23 I realize this is nothing in the strategic 24 plan, but it is something to consider in the long run 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
68 for regulation. There's a lot of stuff that AIML can 1
do behind the scenes, which may, or may not be needed 2
to be submitted, and a little curious as to how the 3
NRC plans to fold that in.
4 MR. DENNIS: So, this is Matt Dennis from 5
the NRC again, and I will clarify that -- and on the 6
spot, I cannot find the language, and I think we do 7
have it in there. But this is a very good comment 8
that we will go back, and confirm that we do include 9
the language in here about the -- and I think I 10 mentioned this earlier, that the AI strategic plan 11 should not be viewed as purely something that is 12 focused on operational.
13 Meaning human interaction, and autonomy in 14 a control room, but it does -- so, the levels of 15 autonomy is an example, one example application in the 16 broad AI umbrella of things. But we did -- we have 17 given examples, people have given examples in this 18 forum about other business processes that would be 19 heavily reliant on natural language processing, which 20 falls under that AI umbrella.
21 And that is a component of what we are 22 considering. So, to your example that you just gave 23 about using AI, or in particular, machine learning, to 24 maybe do an analysis in a new way that has not been 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
69 the traditional physics based model, or something like 1
that using data -- or using a physics based model 2
based on experimental data, and instead using a 3
machine learning model to design, or qualify a 4
component, or something along those lines as you were 5
mentioning.
6 That is something that is being considered 7
under the AI strategic plan. So, it is a component, 8
and if not explicitly already stated in the strategy, 9
I think it is, I can't go find the exact words right 10 now, but it is a very pertinent comment, that we will 11 go back, and make sure that that is communicated.
12 DR. GOLCHERT: Well, actually I understand 13 that, the point was if I go back, and test the new 14 design, it doesn't matter how I got to that point. In 15 the old days you would just have here's my drawing, 16 here's my test, here's my results submitted to the 17 NRC. Now, if I use AIML to get to the here's my 18 drawing, and then do the tests, do I need to submit 19 the AI?
MR. DENNIS: Okay, thank you for clarifying that, that is a different question, and a good comment. So, I can't answer that for sure, and I think that goes to what we were discussing earlier, mentioning that there is a graded approach to things,
70 and just because it uses AI does not necessarily 1
invalidate our current regulatory framework, and 2
that's what Luis was mentioning earlier.
3 That part of rollout will be doing a gap 4
analysis to assess whether, or not that is applicable 5
in the specific situation. So, I'll turn it over to 6
Luis to elaborate.
7 MR. BETANCOURT: Yeah, thank you for that, 8
and I apologize for my camera, it's not up, it 9
basically goes away. So, good comment, because 10 sometimes AI can be used as a tool to demonstrate, how 11 do you demonstrate regulatory compliance. So, to your 12 point that's something maybe that we need to look at, 13 but great comment, we'll take that back.
14 DR. GOLCHERT: Thank you.
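As an illustration of the workflow Dr. Golchert describes, and not of any NRC position on what would need to be submitted, the sketch below uses a cheap surrogate fit (standing in for an AI/ML design search) to propose a design, and then qualifies that design with the traditional evaluation alone. The toy physics, parameter names, and numbers are all invented for the sketch.

```python
# Illustrative sketch only: a made-up design problem showing the workflow in
# question -- an ML-assisted search proposes a design, and the final design is
# still checked by the traditional evaluation on its own.
import numpy as np

rng = np.random.default_rng(2)

def traditional_evaluation(thickness_mm):
    """Stand-in for the conventional test/analysis of record (toy physics)."""
    return (thickness_mm - 12.0) ** 2 + 3.0  # lower is better; optimum at 12 mm

# Step 1: ML-assisted design search -- fit a cheap quadratic surrogate to a few
# expensive evaluations, then pick the design the surrogate thinks is best.
samples = rng.uniform(5.0, 20.0, 8)
observed = np.array([traditional_evaluation(t) for t in samples])
coeffs = np.polyfit(samples, observed, deg=2)          # the "AI/ML" piece
candidates = np.linspace(5.0, 20.0, 1001)
best_design = candidates[np.argmin(np.polyval(coeffs, candidates))]

# Step 2: the proposed design is qualified by the traditional evaluation alone;
# the surrogate that found it never appears in this final check.
final_result = traditional_evaluation(best_design)
print(f"proposed thickness: {best_design:.2f} mm, verified result: {final_result:.2f}")
```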
15 MR. KLUKAN: Thank you again Brian for 16 your comments, and for participating today. Next 17 we'll turn to Yadav. Whenever you're ready, feel free 18 to unmute yourself, and begin your question, or 19 comment. Again, next we will turn to Yadav, it looks 20 like you may have had your question answered? Go on, 21 go on.
DR. YADAV: Sorry, I was muted. So, this is Vaibhav Yadav from the Nuclear Safety and Regulatory Research Division at INL. Talking about meeting the current regulatory framework, I was wondering if you are specifically considering Part 53, or the intersection of AI-based approaches to meet Part 53 requirements when it comes up? Thank you.
4 MR. DENNIS: I'll take a first crack at 5
this, and say yes. This is not -- the AI strategic plan is not in a vacuum, and we do have
representatives from our Nuclear Reactor Regulation 8
Office, NRR, and so this is -- if anyone else from our 9
group wants to chime in, please feel free to after I 10 mention this. But the 10 CFR 53 rule making is 11 currently in process, as you're aware.
12 And this, the AI strategic plan is -- one, 13 or the other. As an example the autonomy discussion 14 here is linked with that as well. So, it's not 15 completed, in a vacuum, and I actually think Brian has 16 his hand raised, so I want to turn it over to him 17 actually.
18 MR. GREEN: Hi, thanks. So, this is Brian 19 Green, I'm the human factors team lead for NRR. I 20 can't speak for all of Part 53, because the pieces are 21 just all coming together at one point. But within the 22 human factors regulation, the way we are currently 23 handling this is actually under the concept of 24 autonomy. So, if you were to create an autonomous 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
72 system using AI, that's how we had initially conceived 1
of it.
2 And we're having discussions inside to see 3
if adjustments need to be made to align the 4
terminology internally right now. But thus far, we 5
have not precluded it, but we have been hesitant to go 6
too far as to prescribing how it would happen. So, 7
there's kind of a place holder built in that as we get 8
more comfortable with it, we would be able to 9
potentially facilitate it.
10 But we're not speaking directly to it.
11 I'm not sure if that holds true with all the other 12 areas, but that's how human factors is handling it 13 right now.
14 MR. KLUKAN: Brian, thank you for weighing 15 in on that question, and thank you again Yadav for 16 raising the question as well. So, next we will go 17 back to, and thank you to Matt. Next we will go to 18 Jim, whenever you're ready.
19 MR. SLIDER: Thanks Brett, it's Jim Slider 20 from NEI again. Just a final question on process.
21 I'm just wondering Matt, if you could just tell us 22 what we can expect to see on this project over the 23 next say six months, between now, and when the final 24 strategic plan is ready for release next spring. You 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
73 mentioned in your previous slide that you've got an 1
ACRS meeting planned for November. Comments on this 2
draft are due August 19th.
3 And I'm just wondering if you can give us 4
some idea of what we might be seeing in terms of 5
public communications, or public releases of 6
information between now, and next spring.
7 MR. KLUKAN: Matt, I think you're on mute, 8
I'm sorry.
9 MR. DENNIS: I couldn't make it the entire 10 meeting without having one microphone mess up, so I 11 apologize. But thank you Jim, I'm going to reiterate 12 some of the things we already said, and you've just 13 mentioned, and then also maybe clarify a few. So, as 14 you mentioned, the AI strategic plan, the plan is to 15 finalize -- I'm sorry, the comment receipt period 16 closes on August 19th.
17 So, this is again, another reminder to 18 please go submit your formal, written comments on 19 regulations.gov for the docket number which is 20 provided in the public meeting notice. So, the 21 strategic plan is intended to be finalized after 22 receipt of public comments, and consideration of any 23 feedback from the advisory committee on reactor 24 safeguards, who has stood up a joint committee 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
74 specifically looking at this topic as far as 1
artificial intelligence at the NRC, and the nuclear 2
industry.
3 That public meeting is not scheduled yet, 4
it's planned for November of 2022, and will be noticed 5
in both the ACRS public meetings, on their website, 6
and we will try to let everyone know that it is 7
upcoming, and you can participate in the ACRS meeting.
8 Additionally, again, for 2023, which is why we also 9
mentioned that this will be finalized in 2023, we plan 10 to have another follow on workshop.
Which will be a continuation of those data science, and regulatory applications of -- sorry, data science, and AI regulatory applications public workshops. So, we had those three in 2021, and we will have another one that is planned for 2023.
Additionally, this will probably -- most likely, I cannot confirm, but odds are it will be a topic at the NRC's Regulatory Information Conference, which I believe is in March.
20 I don't have the date in front of me, but 21 there will probably be a session at the RIC on 22 artificial intelligence as well. So, those are 23 another couple of opportunities. So, the ACRS 24 meeting, the workshop, RIC, these will all be 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
75 opportunities for us to engage with the public, and 1
update on the current status of the plan, when it's 2
finalized, when we envision to have any action plans 3
to support the strategic goals, the implementation of 4
-- probably not the framework, that's a little later 5
on.
6 But at least working groups, or things 7
that we're putting in place to move forward with the 8
different strategic goals throughout FY '23, and FY 9
'24. So, if I've forgotten any other particular things, Luis, or Teri, please chime in.
11 MR. BETANCOURT: Yeah, I think Jim, that's 12 a good question, because what I have been hearing 13 today is early engagement as soon as possible. And I 14 think once we finalize a strategy, I do foresee us to 15 re-engage the industry in a public setting after the 16 strategy's out in the public domain, the final one.
17 And that's what we will need to hear from industry, 18 what are those topics that you guys see as the near 19 term, or where do we need to emphasize our research?
20 As well as the regulatory framework, so we 21 can start planning. Because as you guys know, our 22 budget is two years in advance, we are budgeted for 23 the next year to start doing some of this framework.
24 If there's really interest from the industry in some 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
76 of this area, that's where we need to engage now, 1
rather than later. But I would consider -- even industry could consider possibly submitting potential white papers.
4 So, that we could have some potential 5
discussions of where do we see gaps, and challenges, 6
as well as areas that we need to consider. Once we 7
are doing some of this initial AI strategy, this will 8
take time. As you guys are aware, AI is a rapidly 9
evolving technology, and we need to meet early with 10 industry on this topic. And that's why we value this 11 conversation. And please submit those type of 12 comments in the FRN so we can consider that.
13 MR. SLIDER: Luis, this is Jim again, 14 thank you. My ears perked up when you said the word 15 white papers, can you elaborate on that?
16 MR. BETANCOURT: Yeah, you know, like at 17 least on the advanced reactor side, there has been a 18 process of submitting a white paper to have a better 19 understanding of what industry wants to pursue 20 regulatory changes, and that's where I was talking 21 about like a white paper.
22 MR. SLIDER: Okay, thank you.
23 MR. BETANCOURT: Hope that answered that 24 question.
77 MR. SLIDER: Got it, thanks very much 1
Luis.
2 MR. DENNIS: This is Matt Dennis again, 3
just one last plug on this topic of what our plans 4
are. As we've said, it is very important to 5
re-emphasize that the public comment for this draft 6
strategic plan concludes on August 19th, and I did put 7
it in the chat, I finally copied it. So, anyone here 8
who is participating in the Teams meeting, feel free 9
to follow that link to regulations.gov.
10 It's docket number -- and that's what I 11 was looking for specifically, it's docket number 12 NRC-2022-0095-0001. And you can submit at that link 13 on regulations.gov, your public comments. Thanks.
14 MR. KLUKAN: And thank you again Jim, and 15 thank you Matt, and Luis. So, at this time, I don't 16 see anyone in the queue, or individual hands raised.
17 So, again, if you're on the phone, please hit star 18 five if you'd like to speak, or to raise your hand.
19 And again, if you're on Teams, use the app, or the 20 browser, the react function to raise your hand. And 21 we have one person who has entered the queue, and that 22 is Sherman Remer, whenever you're ready.
23 MR. REMER: Yes, thank you very much.
24 Jason Remer from Idaho National Laboratory, and a 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
78 question for you just real quick. In the past we've 1
been working a lot in promoting digital systems as 2
replacements for analog systems in our plants. And we 3
worked through a lot of issues with the NRC, and I 4
think we've got a good plan.
5 But I just want to make sure that whatever 6
we do in the AI area, we also just kind of incorporate 7
all the good things we've learned about how we're 8
going to deal with common cause failure, how
redundancy, and diversity affect things. And so that 10 whatever we come up with AI to ML, that it helps us 11 move forward as a whole. We've found sometimes we 12 think we've solved all the problems, and we look 13 around, and something suddenly pops up.
14 So, just a note of consideration, and 15 appreciate the NRC working on this issue, I think it's 16 very important. But I want to make sure that we can 17
-- we kind of need to crawl before we can run, and 18 AIML is a little bit running, and replacing analog 19 with digital is kind of crawling. So, just a note, 20 and again, appreciate you having this.
21 MR. BETANCOURT: Jason, great comment.
22 Actually my background is digital I&C, so I know 23 exactly what you're talking about. We need to keep 24 doing that coordination with our digital I&C folks 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
79 just to make sure that some of the areas that you 1
mentioned, like that the introduction of AI, once it's 2
in conducting operations, will not negatively affect
the safety, and the security of operations, so 4
absolutely.
5 MR. REMER: Yeah, thanks Luis, and you, 6
and I have worked together a little bit already on 7
this, so appreciate your understanding, thank you.
8 MR. BETANCOURT: Yeah.
9 MR. KLUKAN: Okay, thank you again. Matt, 10 did you want to chime in?
MR. DENNIS: I apologize, I was going to say, and Luis pretty much said the same thing I was going to say, that the topic of digital I&C, and lessons learned in the implementation of digital I&C within industry, and the NRC, is not a topic that is lost on us as we consider how we do this. So, there are a lot of lessons learned from implementing a new technology in the nuclear industry that we can carry over into doing this same exact application.
20 And that is something we have discussed as 21 something we need to consider for our purposes going 22 forward, so we don't make the same mistakes. Or at 23 least learn from the processes that were implemented 24 as part of the digital I&C applications. So, it's 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
80 something we have definitely thought about in 1
considering the lessons learned from digital I&C.
2 MR. KLUKAN: Thank you Matt, and again, 3
thank you for the question. We have about 15 minutes 4
left, so if you have any remaining questions, or 5
comments, please feel free to raise your hand. Again, 6
if you're on the phone, press star five. Again, 7
that's star five. And we have a few moments here to 8
see if anyone enters into the queue. Again, if you'd 9
like to make a comment, please feel free to raise your 10 hand.
11 And again, if you're on the phone, press 12 star five. Again, that is star five. Doing a little 13 auctioneer countdown in my head, it doesn't look like 14 we have anyone else. I'm going to turn it over to 15 Teri.
16 DR. LALAIN: Thanks Brett. I just want to 17 thank everyone for the great feedback today, the 18 comments, the questions, that's the purpose, and the 19 goal today, so thank you for helping us achieve that 20 goal of getting the input, really appreciate all the 21 comments. Great dialogue today, so with that Brett, 22 I'll turn it over to you to close us out, thank you.
23 MR. KLUKAN: All right, great. Can we --
24 yeah, there we go, we have the last slide up. So 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
81 again, in closing, what you can see here on the slide 1
is the contact information if you want to reach out to 2
any of the NRC staff you heard during this meeting 3
today beyond, or after this public meeting. Again, I 4
would like to thank all of the meeting participants 5
for taking time out of their day to engage in this 6
very dynamic discussion.
7 It was mentioned at the start of the 8
meeting, no regulatory decisions were made during the course
of this NRC public meeting. With that, in closing, I 10 would reiterate if you would like to have your 11 attendance in the meeting reflected in the meeting 12
summary, please email Matt Dennis at matthew.dennis@nrc.gov, that's M-A-T-T-H-E-W.D-E-N-N-I-S@nrc.gov.
15 His email address is up on the screen.
16 Feedback reports from the meeting can be found on the 17 public meeting -- or feedback on the format of this 18 meeting may also be emailed to Matt as well, or 19 provided through the NRC's public meeting website.
20 Again, I'd like to thank you all for this great 21 discussion today. And with that, I will adjourn the 22 meeting. Have a good rest of your afternoon everyone, 23 thank you again for participating.
24 (Whereupon, the above-entitled matter went 25 NEAL R. GROSS COURT REPORTERS AND TRANSCRIBERS 1716 14th STREET, N.W., SUITE 200 (202) 234-4433 WASHINGTON, D.C. 20009-4309 www.nealrgross.com
82 off the record at 2:49 p.m. and resumed at 2:59 p.m.)
1 MR. DENNIS: Hi Steve, this is Matt Dennis 2
from NRC, I noticed you joined, and unfortunately the 3
meeting, and public comment portion ends in one 4
minute. So, if you have a comment, or anything to 5
say, if not, if it's just a time change problem, let 6
me know. Otherwise we're going to be ending the 7
meeting in about a minute. Hi, for anyone that's just 8
joined the meeting, the public comment portion of the 9
meeting was from 1:30 to 3:00 p.m. Eastern.
10 If you have anything you'd like to say, I 11 can take that real quick, but otherwise I'm going to 12 be ending the meeting, since it is after 3:00 o'clock.
13 I was literally getting ready to end the meeting, but 14 Rafa I saw you, did someone raise their hand? Okay, 15 well, all right, thank you for attending the NRC's 16 public meeting. Hearing no responses, I'm going to 17 close the meeting, as it is 3:02 p.m. Eastern.
18 Thanks.
19 (Whereupon, the above-entitled matter went 20 off the record at 3:02 p.m.)