Smart people I get to work with on a day-to-day basis, applying technology to solve problems. That spans a pretty big mission space. If you look at the day-to-day, there is a lot of science that goes on in there, from the atmosphere to the ocean and everything in between, and a lot of the AI work going on now touches all of it. We look at things like fisheries, identifying fish species; we can do more with augmenting scientists in the field. There are also interesting projects on understanding the marine animal population from data we already have available in the normal course of business. There is a lot more that we do with AI, and we are moving into the forecasting realm with it. Operations today are deterministic, not AI. There are projects on precipitation prediction using things like AI, which is interesting, but there is a lot of work that can still be done on aiding the forecast. In the news lately there have been a lot of extreme weather events, hurricanes now, and a lot of those events have uncertainty with them. The forecaster has to do a lot of work to deal with that uncertainty. What we do now is run multiple copies of the model to understand where the points of uncertainty may be: 10 or 100 copies, depending on what they are looking for. AI may let us reduce that number. The other side of it is data assimilation, which basically prepares observations for use in the models. We can get data into the forecasting pipeline at the right time; the longer you can delay that cutoff, the better the product you have. We could automate those processes to help augment a lot of the folks in the field, in data centers and offices around the country, to really get the job done faster and better. The other side of it is extreme weather events, or weather events in general: how do we augment people near a wildfire or a hurricane?
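The "run multiple copies of the model" approach the speaker describes is ensemble forecasting: each member starts from slightly perturbed initial conditions, and the spread of the members shows where the forecast is uncertain. The sketch below is a toy illustration of that idea only; the model, the variable being forecast, and all numbers are invented, not anything NOAA runs:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_forecast(initial_temp, steps=24):
    """A stand-in 'model': advance a temperature with a fixed trend
    plus random noise mimicking chaotic growth of small errors."""
    temps = [initial_temp]
    for _ in range(steps):
        temps.append(temps[-1] + 0.1 + rng.normal(scale=0.3))
    return np.array(temps)

# Run 100 ensemble members from slightly perturbed initial conditions.
n_members = 100
members = np.stack([toy_forecast(20.0 + rng.normal(scale=0.5))
                    for _ in range(n_members)])

# The ensemble mean is the best-guess forecast; the member-to-member
# spread (standard deviation) quantifies uncertainty at each hour.
mean_forecast = members.mean(axis=0)
spread = members.std(axis=0)
print(f"hour 24: {mean_forecast[-1]:.1f} deg +/- {spread[-1]:.1f}")
```

AI enters where the speaker mentions reducing the number of copies: if a learned emulator is cheap enough to run, fewer or cheaper members can estimate the same spread.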
AI can be added to those devices, drones, sensors, whatever they are, so you can get a lot more data in more situations. The temperature inside a hurricane is something you cannot easily measure; the result for a human observer is probably bad, but with a drone the risk is significantly less and you get better measurements. So there are a lot of applications across the agency; it is one of the places where we see a lot of benefit. Pushing it further, what can we do in the future? This is speculative, but I think it can help us work with the public more. Weather data is complicated. I am not a scientist; I rely on my friends who are scientists to interpret that data. But there are things we do on a day-to-day basis that are affected by the weather: travel, or, if you are going to buy a house, what is the flood risk? What are the other risks from weather events? Being able to interact with that data in a better way, to have a conversation with it, sounds very futuristic, but these are the kinds of technologies that will allow us as consumers to interact with the data and understand it better: to better understand the flood hazard to your house or neighborhood, and to make it a more digestible product for everybody. A lot of the goals within NOAA are about how we make things more understandable. You talk about products on timescales from seconds to minutes to hours, whether that is a tornado warning or a hurricane watch or whatever it may be, and people need to understand them to take action when they need to. I think these tools will be part of that in the future; there is a lot of potential. Currently, the goal is to make that data more accessible and usable, not just by scientists but by the public and the people we serve.

That was a very comprehensive overview of NOAA's larger mission regarding data and technology. I would love to drill in a little more. You briefly mentioned more innovative ways to collect the meteorological data NOAA relies on: drone technology and, I believe you called it, edge technology.
Does NOAA have any other emerging, novel, innovative ways to collect more enhanced and precise data? Every AI model that NOAA would like to train for more predictive capacity will need a huge variety of data to learn from.

There are lots of opportunities, and I think it spans the mission space. It comes down to the ocean, and there are lots of places where you need instrumentation to take observations. That is one of those things. The other side of it is that there are scientists throughout the world, right? Getting to them is hard because those areas are isolated. Now, with the technologies that are available, we can break those things down into something they can have in the field and start the processing out there. Start your processing at the edge, and by the time it gets to the middle, you have a richer data set to work with. AI is part of that, and part of it is the architecture they run it on, working toward that edge-to-core flow; I do not want to sound too much like marketing. How do we get the data processed and turned into a product faster? On the other side of that, it is about having smarter applications that really make the data engageable: not just a forecast or a data set you can download, but tools for folks to interact with it. I think the goal is to make sure it is usable, not just for the folks who are meteorologists, but for everyday citizens.

I am glad you brought that up. That interoperability of data, if I recall correctly, is something the Biden administration has wanted to bring into the modernization of the government: having a lot of the quantitative material be accessible to a large audience. NOAA is carving out its own little territory within that. In that vein, you mentioned we are witnessing a slew of very severe weather phenomena.
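The "start your processing at the edge" idea can be sketched simply. Everything below (the readings, field names, and window size) is a made-up illustration, not a NOAA interface: rather than shipping every raw sensor reading over a constrained link, the edge device reduces a window of readings to a compact summary and forwards only that.

```python
import json
import statistics

def summarize_window(readings):
    """Edge-side reduction: collapse a window of raw sensor readings
    into a small summary record suitable for a low-bandwidth link."""
    return {
        "n": len(readings),
        "mean": round(statistics.fmean(readings), 2),
        "min": min(readings),
        "max": max(readings),
    }

# One minute of hypothetical 1 Hz temperature readings from a field sensor.
raw = [21.0 + 0.1 * (i % 5) for i in range(60)]
raw[30] = 35.0  # one anomalous spike worth flagging

summary = summarize_window(raw)
payload = json.dumps(summary)
print(payload, f"({len(payload)} bytes vs {len(json.dumps(raw))} bytes raw)")
```

By the time the data reaches the core it is already a denser, pre-screened product; a real deployment would presumably also retain the raw window whenever the summary looks anomalous.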
I do not have to tell you about the Maui wildfires; Canada has reported several devastating ongoing fires in addition to the ones that we saw before, and one of the main Canary Islands is suffering very eerie burns, parallel to what is going on in Maui. We read about these incidents so much in the headlines, and you are talking about a more interactive way to communicate the data, which brings me to a bit of a general question. Does NOAA have any plans for a more upgraded and enhanced type of user experience related to these disasters or other predictive weather analyses?

That is a good question. I will start with the prediction part. I think one of the drivers for NOAA is the social science: how do we get people not just to interact, but to take action? Producing, observing, processing: in an event like a wildfire, that matters. The preparation period is not very long, unfortunately; fires can move very rapidly and catch people off guard, and you have to find out where the fire started. A caveat to this: it is just a technical practitioner's view. The event itself and the downrange weather are complicated things to forecast with a model. That makes the presentation of the data more complex, and it involves a greater number of people; there are other agencies. I think the drive on the social science side, and you can see it in the media and other places, is about how we change the language to get people out of harm's way. That is part of the data part: how we interact, how we make it understandable. That takes more than language; it takes data science and visualization science. There are initiatives to bring that to the forefront. We do that to a certain extent with everything, having things where you can work with the data and play with it to understand it better. There are ways we can do that. Showing an animation of the storm or a chart, the things you see on your weather report, is important.
But really understanding, especially before the event: how do I prepare myself? How do I make sure my family is in the right place? Buying a house, what are the risks? Those are the kinds of things that will help folks, and those are the things we can get to with things like generative AI and other technologies. They are not really in operations yet, but I think those are the directions people will go. It has a significant impact.

I imagine perhaps a final product goal for your team would be something that would be able to catch, and maybe we can talk about it in terms of a hurricane or a tornado, catch something on the radar, then use historical data to understand: here are the areas likely to be hit in a climate event like this. Yes? So I imagine, as you look at so many of the cataclysmic and oftentimes unexpected or maybe underestimated events we have seen, like the storms in Texas from a few years ago that felt very out of left field, very unanticipated, a lot of what NOAA would like from a more predictive capacity would be to warn people, perhaps in a certain area code: you should probably evacuate; there is a chance you are going to see two inches of rainfall over the next two days, or something like that.

The computing that is out there, and what we can deliver to folks in the field right now, supports the sorts of things you are talking about: saying we have to evacuate, severe thunderstorms. There is a local forecaster responsible for that. Getting those tools, things like AI and cloud and the other technology we talk about, gives us ways to get more capacity for those folks to make better predictions, then work with the social science folks on how we communicate it the right way. Part of the progression will be how much AI gets into forecasting. That is yet to be seen, and it is one of the things a lot of scientists are looking at. We will see how we can move from the deterministic forecasting we do today to more of a statistical, AI-based approach.
There is a lot of opportunity; that is one of the areas a lot of folks are working on.

Perfect. You brought up a point I want to clarify for a potential audience member, because I know I needed the clarification. Would you mind giving a brief overview of the deterministic-versus-statistical distinction in AI?

Deterministic is the traditional approach: if you are going to describe a system, you are doing the physics calculations. You are taking in the observations; it is a simulation. Statistical means using the data that you have: run it through your data model, your AI model, however you want to say it. The goal is the same, but they are different approaches to the problem. There is a mix of both in forecasting. The forecast reports that you see, and the models we use, are still on the simulation side. Dealing with uncertainty is part of it; the statistical approach changes the toolset we have.

Speaking of those toolsets, you talked quite a bit about how important the local weather stations, the people in the field and on the ground, the NOAA affiliates, are to cultivating the data that will hopefully train more predictive AI and automated algorithms. One fear I hear discussed quite a bit in the emerging technology field, and in industry broadly, is who is able to get access to emerging systems. We have seen that already with AI, to some degree. A page in the Biden administration playbook has been to democratize access to a lot of technology, which brings me to my question for you. Does NOAA have any agenda items that involve helping equip more local and potentially underfunded meteorological stations or research areas across the U.S. with more advanced and potentially automated technology? I imagine it would help NOAA in the long run if we are talking about gathering a wealth of nationwide and global data.

In general, and let us start generally and work down, that is certainly an initiative. We want to make sure we are doing our work equitably.
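The deterministic-versus-statistical split described above can be shown with a deliberately tiny example. Everything here is hypothetical (a cooling object stands in for the atmosphere): the deterministic path steps a physical law forward from an observed starting point, while the statistical path fits a model to past data and rolls it forward.

```python
import numpy as np

# Deterministic: simulate Newtonian cooling, dT/dt = -k (T - T_env),
# by stepping the physics forward from an observed initial condition.
def simulate_cooling(t_initial, t_env=20.0, k=0.1, steps=50, dt=0.5):
    temps = [t_initial]
    for _ in range(steps):
        temps.append(temps[-1] - k * (temps[-1] - t_env) * dt)
    return np.array(temps)

# Statistical: no physics. Fit a regression T[n+1] = a*T[n] + b from
# historical (T[n], T[n+1]) pairs, then roll the fitted rule forward.
history = simulate_cooling(90.0)  # pretend these are past observations
a, b = np.polyfit(history[:-1], history[1:], deg=1)

def statistical_forecast(t_initial, steps=50):
    temps = [t_initial]
    for _ in range(steps):
        temps.append(a * temps[-1] + b)
    return np.array(temps)

physics = simulate_cooling(70.0)
learned = statistical_forecast(70.0)
print(f"max disagreement: {np.abs(physics - learned).max():.6f} degrees")
```

Here the learned model matches the physics almost exactly because the toy dynamics are linear; real weather is nonlinear and chaotic, which is why "how much AI gets into forecasting" remains an open question.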
There are certainly areas, any place, where there would still be centers, but for the local folks who do forecasting, I think we have good coverage, and certainly within the agency, equitable access is important. As we move toward more AI, and I will put this in here as well, it is about having access to the model and having the things you need around it: how do I trust this thing? How do I explain or understand how it works? From the technology side, the reason we operate the way we do is that we build a lot of trust in what we deploy, and that has to happen on the AI side too. There are specific programs, maybe not specifically for AI, but it is part of them. In my office, their job is to interact with the public, whether academia, industry, or scientists, to work on the next operational suite of models. They are in the process of open-sourcing it, really making it accessible. It takes a lot of work to get a model up and running; someone could download this thing, run it in the cloud or wherever that may be, and understand how the forecast is being generated and processed. The point is to make sure we get not only the agency view or the academic view or the industry view, but everybody's view. From my perspective, it gives you a nice baseline for understanding. There are other programs whose purpose is to get data out to the public so you can interact with it: radar data and other operational data sets. You can download them, or go to whatever cloud provider, get the data for free, and interact with it the way you want. There are a couple of ways to do that, and they are important because they not only make our products better, but build understanding.

Touching on the community element, and I know we are nearly out of time, so if you cannot really speak to this topic, just ignore me. The Biden administration, and I think the industry as well, is going to be prioritizing trustworthy AI systems in terms of what we adopt and, hopefully, develop.
Can you briefly talk about what NOAA would like to see from potential vendors, or how you plan to incorporate more trustworthy and accountable AI? You don't have too much time, so if you want to give me the CliffsNotes version, go for it.

From a technology perspective, I do not know how far off I would go, but what I think we all need to look at is that these things are complex. They are going to be layered on top of each other, so you have to have an understanding of not just the system, but the data. There are biases in data; I do not know if we can ever get rid of that. Do I care about this bias or not? If I care, how will I counteract it or work around it? Understand your data; understand the system. Do I understand the service? Do I understand what it is introducing? It becomes part of the design and development of a product. You have to have a conversation, whether with a vendor or with the community of interest, upfront, to understand how it may potentially impact what you are trying to get to. That is the start of it. As things progress, the more you use something, the less you ask questions: do I trust this? Of course I do. Have the conversations upfront so you are thinking about it throughout the lifecycle.

A couple of very important pillars will be quite present as we keep moving through generative AI: how are we caring for data, and how are we monitoring it and looking out for the biases, developing that alongside the technology as well. It is good to hear that NOAA is on top of that. Normally at this point, when we are dwindling in terms of how much time we have, I like to ask anyone I am interviewing: what are you working on within your agency or office right now that you are excited about and would love to share with the public?

There is a lot of fun stuff going on right now. I think the most exciting thing going on right now, and it is not just NOAA, probably other agencies and companies too.
The generative AI part: how are we going to use it, what can we do with it, how does it help? Going back to the trustworthiness piece, how do we get it off the ground? The important thing we are focused on is getting the tools so we can start trying things out, and we get to see all the different things that come out of it. That is the fun part for me. We are going down that path. I get to talk about AI a lot, and all the things that folks are putting together, all of the work that we do, those are really fun things, and I am looking forward to seeing what comes of it.

I am sure everyone agrees: as we enter a priority era for climate change and the environment, a precarious situation, the work is going to be incredibly important in the years to come. Unfortunately, I think that might be all the time we have for a very productive conversation, but I enjoyed talking to you; thank you for taking the time to talk to me. For everyone watching, stay tuned for the next segment to learn more about channeling the power of AI.

Welcome back. I am the senior manager of event operations, and it is my pleasure to introduce an NVIDIA data scientist for the second part of today's conversation. Thank you for joining us today.

Thank you for having me.

To kick things off, can you start by telling us a little bit about your role?

I would be happy to. I am a data scientist. I have been with NVIDIA for a couple of years; I have been a fan since the '90s and followed the company's journey early on. I have worked in the federal space for 11.5 years, so I have a lot of expertise in that domain as well, and I know how hard it is for federal government folks to work with AI. On top of that, I am a PhD student in computer science at the University of Maryland, Baltimore County. I used to teach as an adjunct, but not right now while I am working on my PhD; I have to sleep sometimes.

Thank you for the excellent introduction. What are experts anticipating?
NVIDIA is a key player in the AI industry because we have worked hard to build hardware and software for accelerated computing. Our founder recognized the importance of accelerated computing and the application of the GPU to achieve it; he recognized the potential and went all in on developing hardware and software to enable the ecosystem. The target audience is mostly developers. We look at every aspect of the AI pipeline, not just hardware. We do a lot with hardware that you can purchase for machine learning models, for large language models. We listen to the industry, we listen to partners and customers, and we listen to the internal development teams. Recognizing that every piece of AI development is complicated, any way we can accelerate it, we do. It is not just the hardware piece, which enables specialized support for the matrix operations that are the foundation of a lot of the models used in generative AI; there is also the whole software ecosystem. Whether you are talking generative AI or computer vision, we have software pipelines tha