ADAMS Accession No.: ML21225A713
Issue date: 03/11/2021
From: Office of Nuclear Reactor Regulation
Work Order No.: NRC-1420
Official Transcript of Proceedings

NUCLEAR REGULATORY COMMISSION

Title: 33rd Regulatory Information Conference Technical Session - TH26
Docket Number: (n/a)
Location: Teleconference
Date: Thursday, March 11, 2021
Work Order No.: NRC-1420
Pages: 1-49

NEAL R. GROSS AND CO., INC.
Court Reporters and Transcribers
1323 Rhode Island Avenue, N.W.
Washington, D.C. 20005
(202) 234-4433
UNITED STATES OF AMERICA NUCLEAR REGULATORY COMMISSION
+ + + + +
33RD REGULATORY INFORMATION CONFERENCE (RIC)
+ + + + +
TECHNICAL SESSION - TH26 TECHNOLOGY ADVANCES IN REGULATORY DECISIONMAKING FOR NUCLEAR REACTORS
+ + + + +
THURSDAY, MARCH 11, 2021
+ + + + +
The RIC session convened via Video Teleconference, at 1:30 p.m. EST, Shaun Anderson, Director of EMBARK Venture Studio, Nuclear Reactor Regulation, presiding.
PRESENT:
SHAUN ANDERSON, Director, EMBARK Venture Studio, NRR/NRC
EETU MIKAEL HENRIKKI AHONEN, Program Manager, Radiation and Nuclear Safety Authority, Finland
TAYLOR LAMB, Technical Assistant, EMBARK Venture Studio, NRR/NRC
ROBERT AUSTIN, III, Senior Program Manager, Electric Power Research Institute

P R O C E E D I N G S
1:25 p.m.
MR. ANDERSON: Good afternoon, everyone, and thank you for joining us for one of the last, but hopefully informative, sessions on technology advances in regulatory decisionmaking. I'm Shaun Anderson, Managing Director for EMBARK Venture Studio in the Office of Nuclear Reactor Regulation.
In last year's RIC, we had really hoped to put a heavy focus on data -- harnessing data and discovering more innovative ways of applying it. And although that RIC didn't happen, this year we want to make sure we continue our focus on gathering data and pushing towards its application.
So for this session, we thought it'd be great to talk about some of the applications, but more importantly, how we're trying to leverage the data today. So if you attended the earlier session, you may have heard a lot of the great discussions about data analytics and AI and where we're going, although for this session, I hope to discuss more where we're at today, and our presentations are aligned with that approach -- from the vision to in-depth application to drive data-driven decisionmaking.
So today I'm joined by our panelists. Eetu Ahonen, a Program Manager at the Radiation and Nuclear Safety Authority in Finland, or STUK, will provide us with a perspective on STUK's vision of leveraging data as part of their future model for decisionmaking.
Also Taylor Lamb, Technical Assistant and Product Manager for the Mission Analytics Portal in EMBARK Venture Studio in NRR. She's going to provide us a perspective on some of the tools and how parts of the reactor safety program are leveraging data.
And last but not least, Robert Austin, Senior Program Manager for Plant Modernization and Data-Driven Decisionmaking at the Electric Power Research Institute, or EPRI. He's going to provide us the -- a non-regulatory perspective of how data is leveraged from external stakeholders' perspectives.
So again, I hope that these presentations provide you some -- various perspectives and some applications on how we're working with data today.
And with that, I'll kick it off to Eetu with our first presentation.
MR. AHONEN: All right, thank you, Shaun.
And thank you for this opportunity to be here at the RIC and to be able to share my thoughts on -- about how data enables risk-informed decisionmaking. Next slide, please.
In my presentation there are four sections. First, we'll look into the strategic background behind our approach to use data and increase informed decisionmaking. And secondly, I'll present you with an idea of a licensee performance tool. And thirdly, I'll show you our vision of an inspector and a reviewer information portal. And lastly, I'll talk about some data-related challenges being faced. Next slide please.
So let's --- well, first the strategy background. Next slide, please. In our strategy, we have stated that we want to be a more risk-informed regulator and that we want to emphasize licensee responsibility. To be more risk-informed means, among other things, focusing on the most safety-significant oversight actions.
But how do we really know what's significant? And for that, we need risk measures obviously, preferably data-based.
Secondly, we want to emphasize the licensee responsibility of safety. One part of this is that we do less oversight on well-performing licensees, and to distinguish licensees, we need to have performance visits. Next slide, please.
So we need a safety performance tool.
Let's look into that. Next slide. So safety performance evaluation is based on observations like safety findings, requirements, etcetera. From this we draw conclusions about how licensees are performing, say, for example, in the area of radiation protection or safety culture or what have you.
Obviously, this is nothing new. We've had these -- we've had tools for this already available. But the idea here is to have a systematic and always up-to-date and preferably automatic system. Next slide, please.
Well, then, what kind of a tool are we talking about? Next slide, please. Thank you.
Yeah, so what kind of a tool are we talking about?
The tool should provide a view of safety performance.
It should be always up-to-date with the recent data. It should be shared throughout the organization, meaning that everybody doing oversight in, say, radiation protection, uses the tool and has the same information at their disposal.
It should be fact-based, based on solid data. It should be formed systematically, based on rules. And it should also have an impact on oversight, meaning that based on safety performance, we will either increase or decrease the amount of oversight. I think we are very good at increasing the oversight, but not so good at decreasing oversight. Next slide, please.
Here's the envisioned process for how to run the safety performance evaluation. Obviously this is still -- this is still quite high level, but the idea is there. First, there's the phase of collecting information, findings, and so forth.
Second, there's the rule-based compiling of information. These rules could be, for example, if there are five major findings in an oversight area, a yellow flag is raised. Or if there's ten, a red flag.
And third, the inspectors and reviewers should view the tool and the results from the -- from the process period. And fourth, last phase, the oversight is then adjusted and an oversight focus is changed accordingly.
For example, if there are only positive findings in an oversight area, there will be no inspections in the next, say, a year. If there's a yellow flag, then an inspection should be done in the next -- in the next six months. Or if there's a red flag, as soon as possible.
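A minimal sketch, not STUK's actual implementation, of how the rule-based flags and inspection-interval adjustments just described might be encoded; the thresholds and intervals come from the examples in the talk, and all names and data shapes are hypothetical.

```python
# Hypothetical encoding of the rule-based safety performance flags described above.
# Thresholds and inspection intervals follow the talk's examples; everything else
# is illustrative only.
from dataclasses import dataclass

@dataclass
class Finding:
    oversight_area: str        # e.g. "radiation protection"
    severity: str              # "major", "minor", or "positive"

def flag_for_area(findings: list[Finding], area: str) -> str:
    """Raise a yellow flag at five major findings in an area, a red flag at ten."""
    majors = sum(1 for f in findings
                 if f.oversight_area == area and f.severity == "major")
    if majors >= 10:
        return "red"
    if majors >= 5:
        return "yellow"
    return "green"

def next_inspection(flag: str, all_positive: bool) -> str:
    """Map the flag to an oversight adjustment, per the examples given."""
    if flag == "red":
        return "inspect as soon as possible"
    if flag == "yellow":
        return "inspect within the next six months"
    if all_positive:
        return "no inspection in the next year"
    return "keep the normal inspection schedule"
```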
This way, the licensee is rewarded for good work. Next slide, please. Now moving on to the tools for our staff, I'm going to present you with a vision of a tool -- let's call it a portal of information -- for our inspectors and reviewers.
Next slide, please.
The inspector uses this portal to manage information and then work on the information. The idea is to be able to access all relevant information required during an inspection or during a document review. The portal should have a work queue with tasks organized based on their safety significance.
And the portal would also guide the inspector or reviewer conducting the inspection or document review.
All the information created during the process would be managed in the portal, and it would take care of storing it in databases, instead of the user going to different databases and typing it in manually. Next slide, please.
I'm not going to go through this part, but I just want to highlight here the modularity of the system. The idea is to have all these different modules separate, but the portal would then tie all these together, the different databases, the documents maintenance system, the reports maintenance system, and then so on. Next slide, please.
And the databases I mentioned, they are really the foundation here. All data should be stored -- really everything, especially observations and findings -- in order to be able to analyze it easily later. Databases should be normalized, containing only what they are supposed to, and everything should be stored all in one place.
This is the concept of primary data.
This way we can avoid mismatch of data. Everywhere this data is used should refer to this primary data, thus ensuring up-to-dateness and eliminating the need for a user to input data in several places. Next slide, please.
Data movement between databases and the portal would be enabled by defined interfaces. The user -- the inspector or reviewer -- would then access everything from a single user interface. There might be several systems behind that interface, but the user really does not have to know about those. Next slide, please.
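An illustrative sketch, under assumed names and structure rather than STUK's actual design, of the primary data and defined interfaces idea: each record lives in exactly one backing store, and a single portal interface hides the systems behind it.

```python
# Sketch only: one primary copy of each record, several systems behind one portal.
from abc import ABC, abstractmethod

class DataSource(ABC):
    """Defined interface that every backing system must implement."""
    @abstractmethod
    def get(self, record_id: str) -> dict: ...

class FindingsDatabase(DataSource):
    def __init__(self):
        self._records = {}                     # primary copy; not duplicated elsewhere
    def put(self, record_id: str, record: dict) -> None:
        self._records[record_id] = record
    def get(self, record_id: str) -> dict:
        return self._records[record_id]

class DocumentStore(DataSource):
    def get(self, record_id: str) -> dict:
        return {"id": record_id, "type": "document"}   # stand-in for a real system

class Portal:
    """Single user-facing interface; the user never touches the stores directly."""
    def __init__(self, sources: dict[str, DataSource]):
        self._sources = sources
    def fetch(self, system: str, record_id: str) -> dict:
        # The inspector asks the portal; the portal knows which system holds the data.
        return self._sources[system].get(record_id)
```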
Now then, my last topic, I have some data-related challenges encountered during our journey towards more -- sorry -- towards using more data. Next slide, please.
Earlier I said that we want to prioritize tasks based on their safety significance. But how can we know that if we do not know how to quantify the risk or safety significance? A related problem would be how to visualize risk or safety significance.
Another thing was that we want to be effective in doing the things that improve safety by the greatest amount, whatever that amount is. How do we quantify this effectiveness in terms of safety?
And now for the models -- how to estimate the remaining review time. We obviously want to give our licensees a good estimate of that.
Then on the right -- the right hand side, there are some even more fundamental questions we have been struggling with, for example, how to handle confidential data in cloud services. There was a (audio interference) in the -- also in the earlier systems. And most importantly, how to get good data.
Next slide, please.
Okay, that's it, and thank you for your attention. And I'll be happy to answer your questions in the Q&A session after the presentations.
Thank you.
MR. ANDERSON: Thank you, Eetu, I appreciate it. And I forgot to do some housekeeping.
Just for those who are watching, we're planning to pencil in the Q&As after we go through the presentations. So if you have any questions as the presentations are going on, feel free to use the Q&A module and we'll make sure we have those questions answered right after the presentations are over.
Thanks, Eetu. We're going to now turn it over to Taylor Lamb, and she's going to give us a perspective of what we're currently doing with data analytics. Taylor.
MS. LAMB: Great, thanks, Shaun.
So as Shaun said, my name is Taylor Lamb, I am the Product Manager for the Mission Analytics Portal in NRR's EMBARK Venture Studio. EMBARK was created back in 2019 when the Office of Nuclear Reactor Regulation and the Office of New Reactors merged.
And at that time, staff and management recognized the need to have a more permanent fixture in the organization to promote a culture of innovation and transformation while providing the staff with an outlet to make positive change happen. Next slide, please.
In 2019, NRR, in coordination with the Office of the Chief Information Officer, OCIO, recognized that we needed to use our technology and data better. Some of the things that we could improve upon: resource management, people management, communication in terms of reducing staff burden, as well as regulatory decisionmaking, which is what we're here about today.
To accomplish this, we have -- and we really do still have -- challenges to overcome. So just like the star clusters that you see on the slide here, we have data all over the place, stored in multiple systems and locations. We also had a culture where people felt very strong ownership of their data sets, which sometimes meant that they were hesitant to share them. Next slide, please.
So to tackle these challenges, a Mission Analytics Portal team and vision were established.
The vision is to advance the Agency's ability to leverage data and drive decisionmaking, allowing us to become a more modern, risk-informed regulator, which is obviously a huge theme of this year's RIC.
Ultimately, this will be accomplished by making data more widely available to the staff by providing them with the tools that they need to maximize their ability to turn data into insights that help them become more efficient and effective regulators. Next slide, please.
So for any of the NRC staff who've seen MAP presentations, this slide will look very familiar. We approach this vision in four steps.
First, we partnered with OCIO to create the data analytics environment that contains authoritative data sets that anyone at the NRC can access. We take the data from the NRC systems that the staff already uses. We standardize it, and we centralize it in a data warehouse.
Second step, we develop the data analytics skill sets and partner with the program organizations to ensure that the end users are getting tailored analytical tools, pun intended, otherwise known as user-centered design. Because what might interest a branch chief doesn't necessarily interest a technical reviewer.
Third, we empower the staff by providing them access to the data and the tools that they need to enhance their current positions. For example, this might mean access to Tableau or Power BI, which are the common tools that we use here at the NRC, as well as additional training resources.
An example of this is the NRC dashboard developers community, which actually consists of approximately 150 staff across the Agency. We meet on a biweekly basis to discuss the analytical tools that are being developed across the Agency, and we provide the training based on the community needs.
And finally, the fourth step to this is we'll work to acquire the technologies needed to enable more effective communications and analytics.
Next step please, next slide.
All right, so now that you understand how we accomplish our vision, hopefully these star clusters are starting to come together a little bit more for you. By focusing on those four steps that I just discussed, MAP is able to provide the staff with the capability to analyze data and gain insights for regulatory decisionmaking -- as I previously said, I might sound like a broken record -- and assess performance across all levels of the reactor safety program.
We'll work with the subject matter experts to discuss their needs, and clean up their data, which is really about 80% of the dashboard development process, centralize it in a data warehouse, and turn it into visualizations that would show schedules, trending, and resources, just to name a few.
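A hedged sketch of that standardize-and-centralize step; the source systems, column names, and values below are invented for illustration and are not the actual MAP pipeline.

```python
# Illustration only: pull extracts from two hypothetical source systems, map them
# to a common schema, and stack them into one warehouse table.
import pandas as pd

COMMON_COLUMNS = ["docket", "action_type", "received"]

def standardize(frame: pd.DataFrame, column_map: dict[str, str]) -> pd.DataFrame:
    """Rename source-specific columns to the common schema and coerce types."""
    out = frame.rename(columns=column_map)[COMMON_COLUMNS].copy()
    out["received"] = pd.to_datetime(out["received"])
    return out

# Two hypothetical source extracts with different column names.
system_a = pd.DataFrame({"DocketNo": ["50-123"], "Type": ["LAR"], "RcvdDate": ["2021-01-15"]})
system_b = pd.DataFrame({"docket_id": ["50-456"], "action": ["Relief"], "date_in": ["2021-02-01"]})

warehouse = pd.concat([
    standardize(system_a, {"DocketNo": "docket", "Type": "action_type", "RcvdDate": "received"}),
    standardize(system_b, {"docket_id": "docket", "action": "action_type", "date_in": "received"}),
], ignore_index=True)
```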
So let's take a quick tour of this galaxy. Next slide, please. All right, first I'd like to note that the data has been blurred for sensitivity reasons, as a lot of these tools have been created for internal use. Let's take a look at the quarterly performance reporting as a first example.
Previously the staff would take a significant amount of time to manually gather the data and organize it in reports. But now it's fully automated, which allows us to focus more on other important activities. Next slide, please.
So continuing along the lines of workload management, this particular dashboard is open licensing actions, which provides details on schedule, resources used, and workload by branch and staff, with the ability to drill down into more detail. We've shifted from tables and spreadsheets to having more visual information that allows for the quick identification of outliers.
Now, that's going to be an (audio interference) later on. We no longer need to open individual systems to find information. And this is just one example of the many project management tools that we've created. Next slide.
So even on the oversight side of the house, our inspection team gurus developed the operational experience tools that are available to inspection staff.
Specifically, this dashboard covers findings information across the operating reactor fleet since 2001, but there are other tools that also show OpE trends, such as scram data, generic communications, plant risk information by structure, system, or component, and data from INPO's Industry Reporting Information System database, otherwise known as IRIS. Next slide, and that'll be it.
All right, and finally we have the OpE COVID-19 dashboard to track the impacts on nuclear facilities. Of course we had to throw in a COVID-19 one. Next slide.
Great. So as our data and availability of visualizations continues to come together, we need to take MAP to the next level by making data externally available to our stakeholders. This will mean the development of an externally-facing data warehouse, and the tools to view this information, such as the aforementioned Tableau and Power BI, which will be determined at a later date.
And this will allow us to improve transparency and have better communications. Next slide, please.
So as an example of this, which has been presented at Commission meetings -- and you may have seen it at other public meetings -- Project Rango is an initiative to take data that is normally published on the public website and transform it in a way that allows users to conduct their own assessments, look at trends, and ask deeper questions. We're aiming to have this made publicly available in the coming months. Next slide.
Also along the lines of public interface, during a recent public meeting we demonstrated the new web-based relief request system, otherwise known as WRR, which is a vision that spawned from COVID-19 to improve the submitting, processing, tracking, and approving of relief requests and alternatives submitted by licensees.
While efficiency is just one piece of the puzzle, this will provide raw transactional data to staff, industry, and the public. Next slide.
So ultimately WRR is the foundation to MAP-X, which is an effort to enhance innovation by developing a more intuitive interface for interaction with the NRC data through data analytical tools to empower internal and public stakeholders with system integration and automated workflows.
This portal interface will have the capabilities to submit, retrieve, and interact with NRC data, just to name a few things. Next slide, please.
So while a lot of the tools discussed here today are predominantly for NRR, we work with other offices across the Agency. We work with many offices. So as we continue our data journey, we realize more opportunities to leverage data and we're revisiting a more structured approach for the development of these analytical tools.
And I must add a plug here. All of this could not be done without the amazing team that I get to work with. Big thanks to Michael Lee, the product owner for MAP, Nate Anzalone, Caty Nolan, Andrew Lerch, Gayathri Sastry, Melissa Ash, and Jason Carneal. Team members are extremely important for the accomplishment of these huge tasks.
And thank you for taking the time to listen to me, and I look forward to answering several of your questions. Thank you.
MR. ANDERSON: Thanks, Taylor. Thanks for the overview of what we've been doing at the NRC.
Rob, we're going to close it out with your presentation. I know there's been a lot going on with EPRI and data and data analytics, especially for your decisionmaking. So let's turn it over to you.
MR. AUSTIN: Yeah, thanks, Shaun, I appreciate it. And I thank the NRC for the opportunity to speak at this year's Regulatory Information Conference. And I guess we're bringing it home because we're the last session of the RIC.
So I'm going to talk today -- so I'm Rob Austin, Senior Program Manager at EPRI. I wanted to talk today about our expansion of the data-driven decisionmaking initiative within EPRI. What we're doing here, actually, is we're going to be unifying our data tools to maximize our insights and connect to paperless work, with an ultimate goal of perhaps having autonomous nuclear plants.
What's significant is we want to make this framework not just for ourselves, but make it available for others and hope it can become a standard way to share data and information within the nuclear industry. Go to the next slide.
Just real quick, EPRI is obviously not a regulator. We are a nonprofit research and development organization, so the three key pillars of ours are nonprofit, independent, and collaborative.
But our R&D and collaborative approaches can be available to our members and regulators going forward. Next slide, please.
We've also done a lot of work already in the area of developing analytical tools, somewhat older examples being the preventive maintenance basis database, which many are familiar with, used to set maintenance intervals for equipment in nuclear plants and also generating plants as well.
Our CHECWORKS flow-accelerated corrosion analyses. This is kind of more old-school analytics, but it has been very valuable and has representation in software, of course.
And then of course we're looking ahead, like many others, with our members at artificial intelligence, machine learning, more advanced data analytics. We recently published a technical brief on automated analysis of remote visual inspections of containment buildings. And actually it's independent of the analysis, but the pictures were taken using drones.
We're looking a lot, along with others, at natural language processing and how we can expand that out to a nuclear utility industry language dictionary or model to use in natural language processing going forward. Developing our C-STARR digital assistant, which will bring in image recognition and analysis along with like a digital field guide.
And our event management response tool, which is essentially similar to C-STARR, except it does it for text. And we've actually been working with the US NRC, using just a portion of the ADAMS database along with other sources, to see how natural language processing can enable operating experience and other searches as personnel do their work at plants.
Now, if you're an EPRI member, you can follow that breadcrumb trail or link there and look at our data-driven decisionmaking page -- you know, all US nuclear utilities and many international nuclear utilities are EPRI members and are able to access this.
But a lot of tools are -- we have a lot of tools, but really what we want to do looking ahead with this expansion is put it all together.
So next slide.
Of course we must have a metaphor, but we want to move from a toolbox to machinery, to a processing line, to make this more efficient. Next slide.
We're able to unify the data science and artificial intelligence because once it's realized in software, actually moving from AI application 1 to AI application 2, they have much more in common from a software standpoint than is commonly thought. Even if they deal with radically different domains -- even moving from text-based analysis to image-based analysis -- there's a lot that's the same there.
And we can leverage that to make it more efficient and effective. So you'll hear the term -- we're developing all these common features, you know, what we call Product R, very much a placeholder name. We'll probably be rebranding that into something a little snazzier later. We might get Taylor's help with some of the acronyms. Those were good.
But we want to first unify our messaging standards around the power generation common information model, come up with some common data structures to describe data packets and equipment data coming off machinery, so we can start to enable apps that may have different developers and will always have their own internal messaging, but at least have a common messaging for cross-app communications.
Want to unify the source. And that's just common tools, libraries, essentially apply modern software techniques to unify the source. So we're not redoing the same buttons each time, even though the button has the same function.
We're trying to unify the look, feel, and interface so, you know, everything's always in more or less the same space, similar to the iOS or Windows experience. We're going to do two things with that.
One is to make the development more efficient. So you're going to be able to get AI, machine learning, and other tools more efficiently out in front of people that can actually use them. Move it out of the research phase and into the field deployment space.
But there's also big data. So if you can get this data unified and combine it, you can get greater insight. So rather than getting one set of insights from image analysis, another set of insights from text analysis, and yet another set of insights maybe from process time signals, like from a plant computer, and then having some system engineer or somebody sit down and decide what all that means, we can unify that and bring that together. And in order to do that, first you've got to get everything able to talk to each other. Next slide, please.
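A hedged sketch of the common messaging idea described above: each app keeps its own internal format but translates to a shared envelope when talking across apps. The field names are illustrative, not an actual Common Information Model profile.

```python
# Illustration only: a shared cross-app message envelope.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class CommonMessage:
    source_app: str            # which application produced the data
    equipment_id: str          # the plant component the data describes
    quantity: str              # e.g. "vibration", "temperature"
    value: float
    unit: str
    timestamp: str             # ISO 8601, UTC

def to_wire(msg: CommonMessage) -> str:
    """Serialize to the shared JSON envelope used between apps."""
    return json.dumps(asdict(msg))

# An image-analysis app and a time-series app could both emit the same envelope.
example = CommonMessage(
    source_app="vibration-monitor",
    equipment_id="PUMP-1A",
    quantity="vibration",
    value=0.12,
    unit="in/s",
    timestamp=datetime.now(timezone.utc).isoformat(),
)
wire_format = to_wire(example)
```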
So in addition to doing this, we want to connect it to paperless work. Many plants throughout the world are already using electronic work packages or procedures or some form thereof. They're quickly moving to this tablet-enabled world for work in plants. And they've already seen significant savings and efficiency improvements from adopting that.
What we also want to do is to have the analytics, the models, the AI happen seamlessly as someone steps through the procedure. So instead of taking a picture of something and then remembering, oh yeah, I have to remember to upload this to this app and then go connect to the internet and get the results and then type those results back into my work procedure so I can go to the next step -- use application programming interfaces so that happens seamlessly and the operator is able just to follow on through, and not have to do a bunch of manual window-switching or remember to do something.
That's about developing standard applications programming interfaces, APIs. And that's something that we would like to do, make it available for others to use and incorporate into their work, including regulators.
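A hypothetical sketch of the kind of API call described: an electronic-procedure step sends a captured image to an analysis service and records the result automatically, so the worker never leaves the procedure. The endpoint URL and field names are invented for illustration.

```python
# Sketch only: one procedure step calling an analysis API behind the scenes.
import requests

ANALYSIS_URL = "https://analytics.example.org/api/v1/inspect"   # placeholder endpoint

def complete_inspection_step(step_id: str, image_path: str, work_package: dict) -> dict:
    """Send the photo for automated analysis and write the result into the step."""
    with open(image_path, "rb") as image:
        response = requests.post(
            ANALYSIS_URL,
            files={"image": image},
            data={"step_id": step_id},
            timeout=30,
        )
    response.raise_for_status()
    result = response.json()                  # e.g. {"finding": "no visible degradation"}
    work_package[step_id] = result            # recorded without manual re-typing
    return result
```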
So we are moving ahead. We have finished our roadmapping and product vision of this Product R. We will actually be having a kickoff meeting with external stakeholders at the end of March. And we're really looking forward to beginning to pull all this together and then working with the industry, regulators, and others to unify our approach to data science and software.
Thank you. And I'll take questions at the end, and I guess that's now.
MR. ANDERSON: Yeah, I think we are at the end, awesome. Thank you all for your delightful presentations.
So I do have a couple of questions that we're starting with, and I see several questions in the Q&A module that we'll get to in a second.
First I want to go back to some of the current practices, and I'm going to start with you, Eetu. Considering the data, and since much of the data has become more valuable over the years and definitely more accessible to many of the staff in your organization, what are some of the business impacts -- or processes -- that you believe may be changing as part of your development efforts?
And I will just say for the, you know, Taylor and Rob, if you have anything to add after this, feel free, keep the discussion open.
MR. AHONEN: Thank you, Shaun, for the question. I think that the impacts are going to be huge. I mean, we want to be more risk-informed, and that's one thing we want to move forward with. And that is going to happen.
I want to give an example about how to use -- how does this fit in really. We've had some experiments with natural language processing, because all of our decisions and stuff, most of our data, is in PDF, so we have to use natural language processing to get data from there.
But here we did an interesting experiment and we found out how much of our reviews deal with systems that have a safety class 1, 2, or 3. And it turned out really that in terms of the number of decisions -- decisions when adjusted to the number of different systems in (audio interference).
We focus mainly on the safety class 1 and 2, and that's really what we want to do. So even though we didn't know it, we are already focusing on probably the most safety-significant oversight actions.
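A rough, hypothetical sketch of the kind of experiment described: pull text out of review-decision PDFs and tally how many decisions mention safety class 1, 2, or 3 systems. The library choice (pypdf) and the regex are assumptions for illustration; a real pipeline would need far more careful language handling.

```python
# Illustration only: count decisions by the safety classes they mention.
import re
from collections import Counter
from pathlib import Path

from pypdf import PdfReader

CLASS_PATTERN = re.compile(r"safety\s+class\s*([123])", re.IGNORECASE)

def classes_mentioned(pdf_path: Path) -> set[str]:
    """Return the set of safety classes referenced anywhere in one decision PDF."""
    text = " ".join(page.extract_text() or "" for page in PdfReader(pdf_path).pages)
    return set(CLASS_PATTERN.findall(text))

def tally(decision_dir: Path) -> Counter:
    """Count how many decisions touch each safety class."""
    counts = Counter()
    for pdf in decision_dir.glob("*.pdf"):
        counts.update(classes_mentioned(pdf))
    return counts
```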
MR. ANDERSON: Awesome, so it sounds like you're making sure, at least with your initial tool developments, that you're validating some of the processes you already have, right?
MR. AHONEN: Mm.
MR. ANDERSON: That's really good, to help validate some of your processes.
MR. AHONEN: Yup.
MR. ANDERSON: Rob or Taylor, do you have anything else to add to that?
MS. LAMB: I don't think we could really (audio interference) --
MR. ANDERSON: All right, no worries.
Move on to the next question that we have. So think about some of the challenges, and I'll start with Taylor for the NRC. Leveraging data does not necessarily provide a decision, rather it's, you know, it's just additional insights to support decisionmaking.
Although, to get to that point, I'm sure there's going to be challenges that we have to overcome. What have you faced in implementing your solutions thus far, and what have you learned?
MS. LAMB: Okay, well, as Shaun said, I think that the biggest challenge that we've come across is that most people don't know what they're looking for. (Audio interference).
MR. ANDERSON: -- your audio.
MS. LAMB: And that they're not accustomed to having so much data available to them.
And working with the agile process, which is an iterative form of development, it's allowed us to continue making enhancements realistic.
So one of the next things that we're going to be working on is actually simplifying some of the tools, because, you know, as I stated, it's a lot of information all at once and it can be a little overwhelming.
MR. ANDERSON: Rob, Eetu, anything you want to add to that? Any challenges you may have experienced?
MR. AUSTIN: Yeah, --
MR. AHONEN: Go ahead, Rob.
MR. AUSTIN: One challenge, so yeah, one challenge we've noticed in getting lots of data is -- we've seen this in a couple of different areas -- you get a lot of data that is bad data. And you have to make some decisions up front.
You know, is it signal or is it noise? That's oftentimes dependent upon the problem you're trying to solve. But you know, we'll get plant historian data and there are sensors that are just non-functioning, and you have to make a decision on whether or not you want to use that data.
Scraping, you know, getting text-based data from ADAMS -- you know, there's a lot of content there that is correspondence, just simple correspondence back and forth. It's important for the records that are maintained in ADAMS, but how important is that to building out a technical language model?
And so there's really no substitute for the domain expert and the data scientist going through the data at the start of the project and making sure there are no data quality issues. That's a big deal.
I mean, my background is acoustics and noise control, and I know that many times we thought we had measured something interesting and it was just an artifact of our data sampling and collection technique. And the same sort of thing can happen with AI and machine learning if you don't understand up front what you're looking for and get your data set appropriately.
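A rough sketch, not EPRI's method, of one common screen for bad historian data along the lines described: drop sensor channels that are flat-lined or mostly missing before any modeling. The thresholds are arbitrary placeholders.

```python
# Illustration only: screen out dead or stuck sensor channels from a historian extract.
import pandas as pd

def screen_channels(historian: pd.DataFrame,
                    max_missing_frac: float = 0.2,
                    min_std: float = 1e-6) -> pd.DataFrame:
    """Keep only sensor columns that report data and actually vary."""
    keep = []
    for column in historian.columns:
        series = historian[column]
        too_sparse = series.isna().mean() > max_missing_frac
        flat_lined = series.std(skipna=True) < min_std      # stuck or dead sensor
        if not (too_sparse or flat_lined):
            keep.append(column)
    return historian[keep]
```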
MR. AHONEN: Yeah, I'm going to add some.
I agree with Taylor and Rob. And on some things we've had challenges -- one thing I already mentioned is that our data is often inside a PDF or Word or -- it's a text document basically. For example, a decision or a memorandum or something else.
And it's a challenge to get the data out of a PDF. And then you really need natural language processing and it's hard -- well, it's a bit more tedious to get the data out of it.
Another thing we've learned is that currently most of our observations and findings are not tabulated in a database, or not in that form. It's kind of hard to search for those.
And we have quite a lot of those findings, but they're very hard to summarize. It's always an expert judgment. An expert has to review all those findings and do the magic there.
But if those observations and findings were in a database, it would be much easier to analyze them.
MR. ANDERSON: Yeah, I think everyone has this common challenge. And Rob, you mentioned, you know, taking information from ADAMS. Just a plug for, you know, our OCIO and NRC, where, you know, they've taken all the data that's been on microfiche and scanned that into ADAMS, and I think that's going to be a wealth of information. So hopefully that adds more information to EPRI's efforts there.
And going to our next question, I want to kick it off to you, Rob. You know, I think it's safe to say the industry has always leveraged data, or information in general, to support decisionmaking.
And we have a wealth of experience in this area.
So what areas do you think would benefit the most, at least from unifying some of the data and the data models that you referenced?
MR. AUSTIN: Yeah, I think where we want to focus initially is just that -- you know, it's kind of in the title -- where someone in the plant or an operator is having to take data or information from multiple sources and then make a decision. And that's normally something like, okay, from inspection results do we need to increase our inspection frequencies, or do we need to replace or repair this component.
Those are the types of decisions, you know, that then feed into work management. A lot of effort is spent in plants, you know, on system health and component health to make those decisions. If we can make those decisions more efficiently and then have more effective results, where you're doing the right levels of maintenance and repair -- you're not overdoing it or under-doing it.
MR. ANDERSON: Anybody want to add anything else to that?
MS. LAMB: I can make a general comment on this item. I would say that it's one thing to have data, but it's another thing to wield it in a -
MR. ANDERSON: Taylor, your audio's a little down. Can you double check your mic?
MS. LAMB: Can you hear me better now?
MR. ANDERSON: Think it's better, yeah.
MS. LAMB: Okay, fantastic, I'm sorry about that. As I was saying, I think it's one thing to have data, but it's another to wield it in a way that's useful and provides insights and helps us to become more effective.
Data analytics isn't necessarily about finding new things but uncovering the diamonds that are already there by connecting the data in new ways and presenting that data in a way that allows us to find those new insights.
MR. ANDERSON: Thanks. I'm going to go to some of the Q&A questions I want to make sure we touch on. For -- sorry, a question for Taylor, but everyone may have some perspectives on how to answer this. You spoke about creating the data analytics using data the NRC already has. Is it inclusive or exclusive of proprietary data?
MS. LAMB: That's a good question. So a lot of the tools that we have created internally, some of them do contain proprietary data. Now, I have a feeling that folks may ask whether the external tools are going to contain that proprietary data.
You know, we plan on building an externally facing data warehouse, as I mentioned in my presentation. And we're going to be evaluating all data as to what can and cannot be made publicly available prior to it being put in these publicly-facing dashboards or making certain data available to our stakeholders.
MR. ANDERSON: Rob, any thoughts from a perspective of working with any other utilities?
MR. AUSTIN: Yeah, I think, yeah, similar to Taylor. I mean, EPRI similarly gets handed data that we have to treat as third-party proprietary confidential. I think what we'd like to work towards, and we've already started that, is, you know, we de-identify the data so that we can make a data set that can be made available to others, but not have the third-party proprietary restrictions.
Also, once you take data, even if it is proprietary data -- say it's dam process signal data, you know, it may be proprietary -- but once you feed it through the model, or build a model product, the resulting model -- you can't create the data from the model.
It's essentially a one-way function. So you can take proprietary data and then create a non-proprietary model that then can be shared. And a lot of times in AI people are interested in the model.
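A hedged illustration of that point: a model fitted on sensitive data can be shared as an artifact, coefficients only, without shipping the training records. The library, features, and values are stand-ins, and real de-identification and model-sharing decisions need more care than this sketch implies.

```python
# Illustration only: share the fitted parameters, not the underlying records.
import json
import numpy as np
from sklearn.linear_model import LinearRegression

# Stand-in for proprietary process-signal data.
rng = np.random.default_rng(seed=0)
X = rng.normal(size=(200, 3))                  # three de-identified signal features
y = X @ np.array([0.5, -1.2, 0.3]) + rng.normal(scale=0.1, size=200)

model = LinearRegression().fit(X, y)

# Only the fitted parameters leave the site -- not the training data itself.
shareable_artifact = json.dumps({
    "coef": model.coef_.tolist(),
    "intercept": float(model.intercept_),
})
```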
MR. ANDERSON: Awesome. All right.
Next question we have -- this is about approaches, and another question for Taylor, about how we're developing some of the tools. Please help us understand how these new technology approaches and PM tools are helping the technical reviewers without solid technical experience and knowledge.
MS. LAMB: Right, so these tools, as I mentioned before, were built on user-centered design. So we try to build these tools with a particular end user in mind, whether it's a technical reviewer, a project manager, a branch chief, or a senior executive here at our agency.
So we want to make sure that these tools are usable for everyone across the Agency and that they are built with specific needs in mind. So one of the things that we're going to do more work on is data analytics with technical information.
One of the tools that we've already created internally is for human factors information.
And we're also going to be working with the Office of Research for SPAR model information that provides risk insights.
So ultimately these tools will reduce administrative burden by providing more time for mentoring and training with senior engineers. But ultimately they should be very user-friendly and, once again, built with particular users in mind.
MR. ANDERSON: I might open that up to maybe Eetu and Rob. I think -- thinking about knowledge management there, I thought there might be some good insights you might add.
MR. AHONEN: Yeah, well, yeah, that's a good point. I was thinking about how to create, like, user-friendly applications, user-friendly tools. Obviously you have to have the users in the development, doing the development with the users. That's obvious. And that's what we've been doing also. We want to encourage the users to engage in the development of these tools.
About knowledge management, I think I didn't quite catch your question about knowledge management.
MR. ANDERSON: Don't let me mess you up.
I was more talking about how that played into knowledge management, in terms of how that could be leveraged for the tech reviewers. But I think you answered it appropriately. I just thought that was a good segue.
MR. AHONEN: Yup.
MR. ANDERSON: Right. Next question I have, Rob, you talked about the application, and the question is how secure is the proposed approach and app?
MR. AUSTIN: Well, we're going to be doing it with security as one of the upfront requirements, working to make sure that we have the APIs and secure communication so data in transit is secure.
Data at rest -- I can say that EPRI has been doing a lot of work to, you know, improve and validate our own security posture. That's a bit of a complicated discussion, but I think the short answer is security is being built in from the start.
MR. ANDERSON: All right, thanks. Got another question here. And this may be for the entire group. How do you quantify regulatory effectiveness? How does the panel envision answering -- oh, I see we already answered this question. And what should the NRC do to quantify and continue to sharpen the pencil on assessing regulatory effectiveness?
MS. LAMB: That is quite the complicated question.
MR. ANDERSON: Mm-hmm.
MS. LAMB: I think, you know, I wish I could phone a friend, you know, Shaun here on my panel, since technically he's my boss, see what he says. But I'm not sure that I'm allowed to do that.
I actually would want to give that a little bit more thought before responding on a whim, I think.
MR. AHONEN: I think in general --
MR. ANDERSON: Go ahead.
MR. AHONEN: Well, obviously this was a question I was also asking. And one way to tackle it -- one thing is that you can measure the processes, all the processes that we have in a regulator. You can measure the review approach, the inspection approach, how fast they are.
But is that really effectiveness? I mean, is that the only thing that you want -- you want to just go through that review very fast? Is that effectiveness? I don't think so. It's one part of effectiveness, but it's not everything, yeah.
And that's why you need to have the users, so that you can ask, okay, we want safety, and we are dealing with, let's say, radiation protection or whatever. So what are the most important things here? And try to put it in small pieces and dare to find the things that -- how does the regulatory work change the licensee, for example.
MR. AUSTIN: I think, if I could -- so EPRI's obviously not a regulator, but the broader question, though, of how do you measure the effectiveness of applying AI and data science -- I think you have to decide up front, okay, what performance metrics are we looking for. Is it in fact to see the --
MR. AHONEN: Yes, absolutely.
MR. AUSTIN: Time of work or time to get stuff into the field. Or is it, you know, particular performance indicators -- if you're the plant, you know, we would hope to see, you know, things like reduced outage time, reduced -- sorry, improved INPO metrics and other metrics.
Because there are lots of metrics they can already look at to decide where they're at in terms of regulatory and safety margin. And then looking at business performance, are we reducing our dollars per megawatt, or did we replace five guys working in the field with five IT guys? So maybe that's, you know.
And which could be fine, maybe you improve some other metrics with that.
The important thing, I think, is to decide up front, you know, because you're going to get what you aim for in terms of the goal. So decide up front what the most important metrics are, what we run the design on, and focus on those. And not spend time on things that maybe don't add as much value.
And I think that's a conversation, you know, that we all have to have, because all the stakeholders approach metrics in nuclear power differently. I don't think they have different metrics, but different perspectives on them. That's a conversation I think we all have to have up front on what success looks like.
MS. LAMB: So actually now that I've had a moment to think about it, I think I might not necessarily have a really broad answer for that, but I do have a specific example of what we're doing internally to help answer that question.
So for some of those project manager tools that I previously mentioned, we are trying to create what's known as precedent analysis tools, as well as tools that can view previously submitted applications.
And what that allows the project manager to do is go back and view similar examples of some licensing actions or relief requests or whatever submittal licensees or applicants send in, to be able to compare them over a period of time.
And you can definitely see outliers with the data. Sometimes you can see, you know, with resource estimates it might have been way over, it might have been way under. And I think that by having these tools available, we're able to more finely tune our scheduling practices, we're able to determine where we might have gotten, you know, caught up in an application that might have stumped us.
But you know, at our fingertips we're able to quickly go in, do these searches, and pull up previous applications, rather than going into the Agencywide Documents Access and Management System to manually look up a lot of these previously submitted applications. So it definitely helps streamline things and it definitely shows us some information that was not previously readily available.
MR. ANDERSON: Absolutely, and I think you all covered it well. And you know, every organization has performance metrics; the NRC is the federal regulator. And you know, with most of these tools, we always want to make sure these are complementing, you know, our final or ultimate decisions that we're making.
So, you know, let's make sure that's in our minds: the tools are not going to give us the final answers or decisions, but we already have performance metrics that we want to make sure we achieve.
And another question. What steps are being taken to share information with members of the public? It's to the panel.
MS. LAMB: I guess I can take that question first. Particularly for the externally facing tools, we are holding public meetings. We are presenting to the Commission. So you can actually see some of those recordings now, especially with the virtual environment.
But with MAP-X in particular, Caty Nolan, the product owner for MAP-X, is going to be holding quarterly public meetings at a minimum to make sure that the public and stakeholders are well aware of, you know, what's taking place with that project. As far as Rango goes, that's pretty similar as well, and hopefully you'll be seeing that shortly.
MR. AHONEN: I might add also that, yeah, this thing about open data, that's very important. And we also want to have -- at least I want to have -- much of our data open. And it's the thing that when I say, okay, why don't we open this or that data, could it be open, isn't this just public, the public should know about this -- then there's always the question of, okay, is this really public.
There might be some things where, if you add together all these things that are published, then it's going to give you some proprietary or confidential information. And I think that we really have to push ourselves to be more open. And I think in the US, you are doing a great job in that, and we have much to learn from you there, for example.
MR. ANDERSON: Great, thanks. And Rob, any insights from the information that you all collected and how you plan on, if possible, releasing some of your data to members of the public?
MR. AUSTIN: Yeah, some of the technical briefs and all that I mentioned during my presentation are public. If they're not, of course, again, if you're at a nuclear utility, you can access them at EPRI as a member.
We are definitely looking to -- we have what we call the EPRI 10 data sets that we want to at least make more widely available within the utility industry. Some standardized data sets can be used for training and testing purposes. I don't know if they'll be public. But they're certainly available to EPRI members to use as they build out their own AI and machine learning tools.
MR. ANDERSON: Thank you. Right, going back to questions, just seeing if there's any that we haven't answered. No, I don't think -- I think we answered most of the questions. The second-to-last question I have is about data analytics.
Data analytics, AI, you know, these are terms that have been thrown around, I guess, for the last, I would say, year, year and a half, probably even more, and terms that organizations are not too sure they are going to focus on. Or if they are going to focus on them, there's a huge investment as part of developing it.
So how are you all determining the value that's going to be added to the organization? Anyone want to go first?
MS. LAMB: There's silence while we try to figure out who's going to go first to answer that.
MR. AUSTIN: I'll take first crack at it. I think for us it'll be as, you know, we get feedback from the members -- and others that are using it. I think the key criterion for any of these is, is it actually being used by the community of users that you target it at? Are they actually using it? If not, why not?
And there could be many reasons not to use something, and they may not have anything to do with the AI or machine learning aspects of it. It could be a delivery issue, it could be an interface issue.
So yeah, I think it really is: is it actually being used, and are the users seeing the value coming from it.
As a research organization, you know, we're always like, okay, have we learned anything from this, you know, and how can we use that to feed back in and make the next one even better. But the first one is, are people actually using it.
MR. ANDERSON: Absolutely. Eetu, you were -- I saw you --
MR. AHONEN: Yeah, that's -- I agree with Rob that we have to also get data about whether people use the tool. If I create a tool for our internal use, I want to get usage data out of it, so that I get the, okay, only three people have viewed this tool in, like, a year.
Well, it's probably not that useful then, because nobody looks at it.
But I think that we are still in the early phases, at least here at STUK, of using data analytics and AI. And we have a -- like, for us the approach might be that we want to have good ideas and just try something and then look at if it's useful.
And then that's how we choose which tools to, or which ideas to go forward with.
And obviously we do have, like, project management tools for determining the added value of a project. But I think those might not apply here. So yeah, it's a tricky question.
MR. ANDERSON: Especially when you're trying new things outside of project management and looking into, like you're saying, added value. Sometimes you have to have the ideas and see, you know, which ones work and which ones don't.
Taylor, I saw you off mute, sorry.
MS. LAMB: Yeah, no worries, no. I was going to kind of add to that, because I think this is definitely an important part of the conversation, and that's one of the things we do as a part of the MAP team. We want to make sure that we are adding value to the organization, so one of the things that we do is continuously seek feedback from the staff.
We recently sought additional feedback on some of those PM tools that we created for Office of Nuclear Reactor Regulation to make sure that folks were using them. If they weren't, why not. And where we could be focusing our efforts. Because we don't want to focus our efforts on something or spend money on something that ultimately folks aren't interested in using.
So we have gleaned some insights, like I said before, that there might be too much data in some of these tools. So we're hoping to streamline some of them a little bit more.
Also, as we're in the fiscal year 2023 budget cycle, we've also sought feedback on what are the current cost savings or even projected cost savings with the development of some of these tools.
And we want to make sure that we're properly budgeting for MAP in the future, and once again, focusing our resources where they're needed most.
MR. ANDERSON: Thanks, Taylor. Anyone have anything -- Eetu, you want to add to that, or do you have any other questions?
MR. AHONEN: Yeah, I have one thing for Taylor. You mentioned in one of your answers, you said you had a precedent analysis tool, was that correct?
MS. LAMB: Yeah, that's correct.
MR. AHONEN: It sounds very cool. I mean, I've heard this kind of a request or an idea for a precedent analysis too. Could you elaborate a bit more?
MS. LAMB: Yeah, I'd be glad to. So this is actually one of those tools that we're trying to get off the ground a little bit more. And one of the missing elements right now that'll really bolster this tool is natural language processing and machine learning. Because we just have so many documents that we would have to scrub through in order to be able to make it the most effective.
So right now the precedent analysis tool is just taking a bunch of documents and scanning --
not scanning, because we don't have that technology just yet. But it's looking at really the high level data for scheduling implications and the types of reviews that they were.
So project managers, or really anybody at the Agency that is interested, can go into this tool, filter on a number of parameters, and seek similar reviews to be able to determine the best approach for, you know, an incoming submittal, for instance.
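A hypothetical sketch of that precedent-style filtering: given a table of past submittals, filter on a few parameters and compare review durations. The column names and values are invented for illustration; the actual tool and data differ.

```python
# Illustration only: find prior reviews similar to an incoming submittal.
import pandas as pd

past_reviews = pd.DataFrame({
    "action_type": ["LAR", "Relief Request", "LAR", "LAR"],
    "topic": ["fire protection", "ISI", "fire protection", "EQ"],
    "days_to_complete": [310, 95, 270, 400],
})

def find_precedents(df: pd.DataFrame, action_type: str, topic: str) -> pd.DataFrame:
    """Return prior reviews matching the incoming submittal's parameters."""
    match = (df["action_type"] == action_type) & (df["topic"] == topic)
    return df[match]

similar = find_precedents(past_reviews, "LAR", "fire protection")
typical_duration = similar["days_to_complete"].median()   # rough schedule estimate
```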
MR. ANDERSON: And I'll just put in a plug --
MR. AHONEN: Okay, cool, thanks.
MR. ANDERSON: And I'll just put in a plug for the Office of Research, which has been leading another ongoing effort, similar to that, to digest a lot of the information that's in ADAMS and improve the precedents tool that Taylor and her team initially developed, so that it could be a more effective tool just with the documents we currently have.
And also there's hopefully a workshop coming soon on specific activities we're doing with the NRC and research. More to come.
All right, any other questions? I know it's the last session of the day, but I don't see any more questions coming in on the chat.
MS. LAMB: We're just that efficient.
MR. ANDERSON: Yeah.
MR. AHONEN: Now we can quantify that.
MR. ANDERSON: There you go.
MR. AHONEN: Two hours' work --
(Simultaneous speaking.)
MR. ANDERSON: Time saved.
MR. AHONEN: Yeah, time saved.
MR. ANDERSON: All right, if there are no other questions -- I don't see any more questions in the chat -- I will thank my panelists, Eetu, Rob, and Taylor, for coming today and presenting in our session. It was the last session of the RIC, but I thought this was, you know, a great discussion.
Hopefully we can have more collaboration and share some of the activities that we're doing across everyone's organizations as we start, you know, using data and data analytics to support decisionmaking.
With that, thank you for joining.
(Whereupon, the above-entitled matter went off the record at 2:34 p.m.)