Transcripts For LINKTV Earth Focus 20221027 : vimarsana.com

Today, scientists are blazing a trail to this very future. The fact that we're enabling the system to make its own decisions, I don't even know where to begin with that. [shivani] I want to know what breakthroughs are being made. It's talking, and it's having this dynamic conversation with you. That's the wonder. Machines can be self-aware in ways that we can't. [shivani] That will shape the future. Oh my gosh, it's looking at me. Hyper intelligence. bright orchestrated music My name is Shivani Bigler. As an engineer and neuroscientist in training, I'm obsessed with artificial intelligence. As a kid, my father took me to tech and robotics trade shows, where I became dazzled by science. Every year, the inventions were smarter and smarter. Artificial intelligence has come a very long way in the last couple of years. Most AI technologies are programmed to think for themselves, to learn from examples, kind of like simulating human intelligence in a way that it learns from past experience. But how does AI actually work? In the future, will AI achieve human traits like emotion, consciousness, or even free will? And how will humans and robots work together? humming music Today, the clearest road to the future is the self-driving car. Unlike a regular car, which is just a machine, a self-driving car is a robot that can make decisions. In the future, will every car on the road become driverless? To find out, I've come to a hotbed of self-driving car research: Pittsburgh, Pennsylvania. Every single person has started to have conversations about self-driving cars, because essentially, they're the future. But in order to understand it, we have to look under the hood. Making decisions on the fly, even simple ones like these, does not come easy for computers. To discover the inner workings, I'm meeting a true pioneer in the field. Please get in. Thank you. Dr. Raj Rajkumar of Carnegie Mellon University. Carnegie Mellon is the birthplace of self-driving car technology.
Thanks in large part to the work of Raj and his colleagues. They've been the leading innovators in this field for more than 25 years. So how does his self-driving car make decisions to safely navigate the world like a human driver? Should we get started? Pretty eager. Yes, we can. Since Raj is distracted by our conversation, for safety reasons, the state of Pennsylvania requires another driver in the front seat to monitor the road. This is so cool. I'm nervous but excited. What's the longest you've ever driven a vehicle autonomously? We have gone hundreds of miles. [shivani] Awesome. I'm going to go auto by pressing this button. Oh my gosh laughs. It really is driving itself. While most self-driving cars are built from the ground up, Raj just bought a regular used car and hacked it with powerful onboard computer systems, making it more adaptable than other regular cars. We've installed a bunch of sensors in there. It is able to shift the transmission gear. It is able to turn the steering wheel, apply the brake pedal and the gas pedal. There's really a software that runs on the computer that makes this capability really practical. And there are some very key, fundamental artificial intelligence layers that try to mimic what we humans do. [shivani] To mimic human decision making, most self-driving cars use a combination of cameras and advanced radar to see their surroundings. The AI software compares external objects to an internal 3D map of streets, signs, and transportation infrastructure. [raj] A map is something that is static in nature; traffic, people, and objects are dynamic in nature. The dynamic information it figures out on the fly. [shivani] Comprehending dynamic information allows it to understand where it is heading in space and react to changes in traffic signals. Uh-huh, it recognizes the stop sign. Yes. We have a pedestrian. We definitely should not be running into that person.
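The split Raj describes, a static pre-built map versus dynamic objects figured out on the fly, can be sketched in a few lines. This is purely illustrative (the landmark names, coordinates, and tolerance are invented, not from any real autonomy stack): a detection that lines up with a known map feature is treated as static; anything else is assumed to be dynamic.

```python
# Hypothetical sketch: separating static map features from dynamic obstacles.
# A real autonomy stack fuses lidar/radar/camera tracks against an HD map;
# here we just flag any detection that is not near a known map feature.

import math

MAP_FEATURES = [  # static landmarks from the pre-built 3D map (x, y in meters)
    ("stop_sign", 12.0, 3.5),
    ("lane_marker", 5.0, 0.0),
]

def classify(detections, tolerance=1.0):
    """Label each detection 'static' if it matches a map feature, else 'dynamic'."""
    labels = []
    for name, x, y in detections:
        near_map = any(
            math.hypot(x - fx, y - fy) <= tolerance for _, fx, fy in MAP_FEATURES
        )
        labels.append((name, "static" if near_map else "dynamic"))
    return labels

print(classify([("sign", 12.1, 3.4), ("pedestrian", 8.0, 2.0)]))
# the sign sits on a mapped landmark; the pedestrian matches nothing, so it
# must be handled dynamically
```

The point of the split is efficiency: the map is computed once offline, so the car's onboard compute is spent only on the things that move.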
shivani laughs The AI challenge to make a vehicle drive itself is not an easy task. Safety is priority number one, and priority number two and number three as well. But what happens when the AI system doesn't understand specific objects in its surroundings? [narrator] A pedestrian in Tempe, Arizona was killed last night by a self-driving taxi. It's believed to be the first fatality caused by an autonomous vehicle. [shivani] A tragic accident happened because a self-driving vehicle didn't recognize something in its environment: a jaywalker. In the future, advanced self-driving cars will have to make life and death decisions on the fly. If avoiding the jaywalker means crashing head-on with another car, potentially killing the driver, what should it choose? How will scientists address monumental problems like these? The first wave of artificially intelligent robots were programmed by engineers with static sets of rules to achieve their goals. These rules are called algorithms, but not all rules work in all situations. This approach is very inflexible, requiring new programming to accomplish even the smallest changes in any given task. A new approach called machine learning has changed everything. With machine learning, computers can absorb and use information from their interactions with the world to rewrite their own programming, becoming smarter on their own. To see machine learning in action, I'm meeting another Carnegie Mellon team at an abandoned coal mine. Dr. Matt Travers leads a group that won a challenging subterranean navigation competition held by the Department of Defense's research agency, DARPA. They're affectionately known as R1, R2, and R stands for robot. all laugh [shivani] These robot twins are designed for search and rescue missions too dangerous for humans. And unlike the self-driving car, they operate without a map. To achieve this, they have to learn to identify every single object they encounter on the fly.
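The contrast drawn here, hand-coded rules versus rules derived from experience, can be shown with a deliberately tiny example. Everything below is invented for illustration (the distances, labels, and the midpoint "learning" rule are a minimal stand-in for machine learning, not any production system): the static rule is fixed by an engineer, while the learned rule fits its threshold from labeled examples.

```python
# Hypothetical contrast between a fixed rule and a simple learned rule.
# The "static" controller uses a hand-coded threshold; the "learned" one
# fits its threshold from labeled examples.

def static_rule(obstacle_distance_m):
    # Hand-coded by an engineer; changing behavior means reprogramming.
    return "brake" if obstacle_distance_m < 5.0 else "drive"

def learn_threshold(examples):
    """Fit a braking threshold as the midpoint between the farthest
    'brake' case and the closest 'drive' case seen in the examples."""
    brake = [d for d, action in examples if action == "brake"]
    drive = [d for d, action in examples if action == "drive"]
    return (max(brake) + min(drive)) / 2.0

# Invented training data: (distance in meters, correct action).
examples = [(2.0, "brake"), (4.0, "brake"), (9.0, "drive"), (12.0, "drive")]
threshold = learn_threshold(examples)  # midpoint of 4.0 and 9.0

def learned_rule(obstacle_distance_m):
    return "brake" if obstacle_distance_m < threshold else "drive"
```

Feed the learner different examples and its behavior changes with no reprogramming; the static rule stays frozen until an engineer rewrites it. That, in miniature, is the shift the transcript describes.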
They are programmed to go out and act fully autonomously, and they will be making 100% of their own decisions. So they're recognizing objects, they're making the decision of where to go next, where to explore. [shivani] To see this in action, the R2 robot is starting on a simulated search and rescue mission to find a stranded human dummy in the mine. Imagine having a map of a collapsed mine before you sent a team of people to go rescue someone in that mine; it's a game changer. [shivani] How the robot discerns elements in this environment parallels how an infant learns about her environment. A three-month-old uses her senses to cognitively map out her environment and learn to recognize her parents. She ultimately uses this map to interact with everything in her world, just like this robot. Okay, so we're ready to roll. bright chiming music [shivani] Artificial intelligence makes this learning curve possible, but how does it create its own map and identify a human on its own, and without an external mapping system, like the internet? Test engineer Steve Willits shows me how the R2 robot can detect a stranded person. [steve] When you're in a search and rescue scenario, that's the kind of situation where you'd want to deploy one of these. [shivani] As it explores and maps the cave, it drops devices called signal relays to create a wifi network trail. It drops those just like bread crumbs along the path. [shivani] Using this network, the robot sends data back to home base to create a map. At the same time, the robot must look at every single object to identify the stranded human. So the lidar system is giving a full laser scanning. [shivani] Lidar stands for light detection and ranging. Similar to its cousin radar, which uses radio waves, lidar systems send out laser pulses of light and calculate the time it takes to hit a solid object and bounce back.
This process creates a 3D representation of the objects in the environment, which the onboard computer can then identify. This process is similar to how the eye feeds visual data to the brain, which then recognizes objects by tapping into our preexisting knowledge of what things look like. By fully understanding its environment, R2 can then make better decisions about where to go and where not to go. [steve] What our robot's doing right now is exploring. So the robot came to a junction, and off to the left it could see that it couldn't get past. [shivani] Right. [shivani] So it saw the opening to the right, and that's where it went. dramatic music It kind of looks like it's making decisions about whether or not to climb over these planks and obstacles all up in this area, right? [steve] That's exactly what it's doing at this point. Just like a baby, R2 learns through trial and error. It's like a little dog wagging its tail. But there's no one here to rescue, so it moves on. bright dramatic music As R2 continues to map out the mine, oh my god, a human! It stumbles upon its intended target laughs. That is Randy, Rescue Randy. Hello, Rescue Randy laughs, scared me. With the discovery of Rescue Randy, the R2 robot can not only alert emergency personnel, but also give them a map of how to find him. That is incredible laughs. It just knows what it's doing. These incredible rescue robots are clearly paving the path to the future of hyper intelligence. [narrator] In the future, autonomous exploration vehicles perform search and rescue missions in every conceivable disaster zone, even in avalanches atop Mount Everest. Incredibly intelligent off-road vehicles are also mapping deep cave systems previously unknown to science, discovering a vast supply of rare earth elements essential for modern technology. [shivani] Artificial intelligence will clearly save human lives in the future, but there's a lot of terrain on earth that's too difficult to navigate on wheels.
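The time-of-flight principle described above reduces to one formula: the pulse travels out and back at the speed of light, so range is speed times round-trip time, divided by two. A minimal sketch (the 200-nanosecond example is invented for illustration):

```python
# Sketch of the lidar time-of-flight calculation described above.
# Range = (speed of light x round-trip time) / 2, because the pulse
# covers the distance to the object twice: out and back.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_range_m(round_trip_seconds):
    """Range in meters to the reflecting surface, from a pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0

# A pulse that returns after 200 nanoseconds hit something roughly 30 m away.
print(round(lidar_range_m(200e-9), 2))
```

Radar works the same way with radio waves; light's much shorter wavelength is what gives lidar the fine spatial detail needed to build the 3D representation the robot navigates by.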
How will intelligent robots make their way over rainforests, bodies of water, or even mountain tunnels? In Philadelphia, Jason Derenick of Exyn Technologies is working to overcome this problem. What we focus on is autonomous aerial robotics, to enable drones to safely navigate in unknown or unexplored spaces. [shivani] Jason's team has built the first industrial drone that can fly itself anywhere. Incredibly, these autonomous robots navigate without GPS and map their environment as they fly. [derenick] We focus on all aspects of autonomy, which includes perception, orientation during flight, motion planning, and then finally control. [shivani] But going from two dimensions to three dimensions requires an increase in artificial intelligence processes. The mission for their drone is to fly independently through a three dimensional path from one end of the warehouse to the other. Starting mission: 3, 2, 1. dramatic music [shivani] Now, to mess with its computer mind, Jason's team places new and unexpected obstacles in its path. Will the drone recognize these unexpected changes? Will it get lost? Will it crash? Essentially, we have a gimbaled lidar system that allows the vehicle to paint a full 360-degree sphere around it in order to sense its environment. [shivani] Like the robot in the mine, this aerial robot uses lidar to see. It actually generates a virtualized representation of the space, which you see here. And for each one of these cubes in the space, it's trying to determine whether that cube is occupied or whether it's free space. drone engine humming [shivani] The robot's onboard computer makes real time decisions of where to go based on its visual input. drone engine roars Incredibly, the drone recognizes the whiteboards and flies around them. chiming music One of the things about this system that makes it particularly special is that it's actually being used in the real world to keep people out of harm's way.
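The "cubes" Jason describes are what roboticists call an occupancy grid. A minimal sketch of the idea, with invented coordinates and voxel size, and no relation to Exyn's actual mapping pipeline: dice space into fixed-size voxels, mark each voxel that receives a lidar return as occupied, and treat everything else as free space to plan through.

```python
# Minimal occupancy-grid sketch of the "cubes" idea described above:
# space is diced into half-meter voxels, and each lidar return marks
# the voxel it lands in as occupied.

VOXEL_SIZE_M = 0.5

def voxel_of(point):
    """Map an (x, y, z) point in meters to its integer voxel index."""
    return tuple(int(c // VOXEL_SIZE_M) for c in point)

def build_occupancy(lidar_points):
    """Return the set of voxels containing at least one lidar return."""
    return {voxel_of(p) for p in lidar_points}

def is_free(occupied, point):
    """True if the voxel containing this point has no lidar returns."""
    return voxel_of(point) not in occupied

# Two returns off the same obstacle fall into one occupied voxel.
occupied = build_occupancy([(1.2, 0.3, 2.0), (1.4, 0.4, 2.1)])
print(is_free(occupied, (5.0, 5.0, 1.0)))  # a far corner is free space
```

Coarser voxels mean less memory and faster queries but blur small gaps; finer voxels capture doorway-sized openings at the cost of onboard compute, which is exactly the trade-off a self-contained drone has to balance.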
[shivani] Autonomous drones like these are already at work in hazardous industries like mining, construction, and oil exploration. They safely conduct inspections in dangerous locations and create detailed maps of rough terrain. From a technological perspective, the fact that we're able to do everything that we're doing on board, self-contained, and enabling the system to make its own decisions, I don't even know where to begin with that. dramatic music [shivani] Self-flying robots like these will revolutionize search and rescue and disaster response. They could also transform how packages are delivered, but there are limits to what large single drones can do. More complex tasks will require teams of small, nimble autonomous robots. Dr. Vijay Kumar at the University of Pennsylvania is working with swarms of drones to perform tasks, like playing music with instruments, cooperatively. He's also developing technologies to tackle some very big problems, including world hunger. In a couple of decades, we'll have over 9 billion people to feed on this planet; of course, that's a big challenge. [shivani] To take on a task this big, he's building an army of small flying robots with the ability to synchronize. We think about technologies that can be mounted on small flying robots that can then be directed in different ways, like a flock of birds reacting to a predator, or a school of fish. You have coordination, collaboration, and it all happens very organically. [shivani] Using AI to get robots to work as a coordinated collective group is a daunting task. Three to five years ago, most of our robots relied on GPS-like sensors. Today, we have the equivalent of smartphones embedded in our robots, and they sense how fast they're going by just looking at the world, integrating that with the inertial measurement unit information, and then getting an estimate of where they are in the world and how fast they're traveling. [shivani] This, I gotta see, and I'm gonna check it out virtually as a robot.
upbeat music I'm at UPenn, remotely, in Vijay Kumar's lab. Sample my surroundings. Oh, I hit something. Hello? Hi. [shivani] Vijay's apprentice Dinesh Thakur is my guide. Today, we are gonna show robots flying in a formation. [shivani] Great, can we see how that works? Sure, yeah. [shivani] The first step Dinesh takes in coordinating the drones is to provide them with a common point of reference, in this case, a visual tag, similar to a basic QR code. Using only the onboard camera, these drones reference the code on the tag and visualize where they are in space. Using sophisticated bio-inspired algorithms, the drones can figure out where each other drone is within the collective swarm. These drones are communicating with one another, right? [dinesh] Yeah, right now they're communicating over wifi. [shivani] So cool giggles. Future versions of these drones will create their own localized wireless network to communicate, but for now, this swarm is a proof of concept. You've defined a formation, and then they're assuming that formation? Yeah, I just say, I want to form a line, and the drones themselves figure out where they should go. [shivani] Once they figure out where they are in relationship to each other, they can then work together to accomplish a shared goal, like ants working as a collective entity. Once they can coordinate between each other, we can send them out to do specific missions. [shivani] That's really cool laughs. Swarms of flying robots have their advantages. Unlike a single drone, self-coordinating swarms can perform complex operations like mapping much faster by working in parallel and combining their data, and losing one drone in a swarm doesn't doom the whole operation. Vijay imagines employing his advanced swarm technology to work on farms. This precision agriculture will help feed the world's growing population.
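Dinesh's "I want to form a line, and the drones themselves figure out where they should go" is a slot-assignment problem. A toy sketch of one way to do it, with invented drone names and positions, greedy matching chosen purely for brevity, and no relation to the lab's actual algorithms: positions are expressed relative to the shared visual tag, the line is divided into evenly spaced slots, and each slot grabs the nearest unassigned drone.

```python
# Hypothetical sketch of the "form a line" behavior: drones know their
# position relative to the shared tag; slots on the line are matched
# greedily to the closest remaining drone.

def line_slots(n, spacing=1.0):
    """Target positions: n slots along the x-axis, centered on the tag."""
    offset = (n - 1) * spacing / 2.0
    return [(i * spacing - offset, 0.0) for i in range(n)]

def assign(drones, slots):
    """Greedily match each slot to the closest remaining drone."""
    remaining = dict(drones)  # name -> (x, y) relative to the tag
    plan = {}
    for slot in slots:
        name = min(
            remaining,
            key=lambda d: (remaining[d][0] - slot[0]) ** 2
            + (remaining[d][1] - slot[1]) ** 2,
        )
        plan[name] = slot
        del remaining[name]
    return plan

# Invented starting positions; each drone is sent to its nearest slot.
drones = [("a", (-1.2, 0.1)), ("b", (0.2, -0.3)), ("c", (1.1, 0.4))]
print(assign(drones, line_slots(3)))
```

Greedy matching keeps the sketch short but can give poor pairings when drones start far from the line; real multi-robot systems typically solve the assignment optimally (e.g. with the Hungarian algorithm) to minimize total travel.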
We'd like robots to be able to roam farms and provide precise information about individual plants, which then could be used to increase the efficiency of food production. That would be a huge impact in the world. This is our duty as responsible citizens and as responsible engineers. [shivani] This high-flying approach towards resolving the problems of the future is definitely a path I can get on. whooshing music [narrator] In the future, artificial intelligence coordinates flocks of drones to protect the environment and boost the food supply. To combat the negative effects of climate change on agricultural crops, robotic bees assist with pollination in orchards and on farms, making them more sustainable and productive. Fish-shaped underwater robots automatically deploy at the first sight of an oil spill. These drones create a barricade to rapidly contain spills, saving marine life and oceans across the world. [shivani] Modern society has a long history of building robots to do work that's dangerous, difficult, or too repetitive for humans. AI is poised to automate all kinds of human work, ranging from factory work, to taxi driving, to customer service. While some are worried that smart robots will replace human labor, that's not necessarily the case. As a sector, artificial intelligence is expected to generate 58 million new types of jobs in just a few years. So what will the future of human-robot interaction mean for our work and livelihoods? I'm at the Massachusetts Institute of Technology to meet Dr. Julie Shah. She's leading groundbreaking research in human-robot collaboration. My lab works to develop robots that are effective teammates with people. Julie and her team are creating software that helps robots learn from humans, even giving them insight into different human behaviors. By being aware of real people, robots can directly work and interact with them. How do you teach these robots or machines to do these human-like tasks?
The first step, as it would be for any person: the first thing they do is become immersed in the environment and observe, and then we need an active learning process. The robot needs to be able to communicate or show back to the person what it's learned. We don't want the robot to learn the direct sequence of actions; we want the robot to learn this more general understanding. That's ultimately, like, our challenge. [shivani] But getting a robot to grasp the bigger-picture concept, in order to understand the basics of its task in the first place, requires a lot of observation and, well, hand-holding. My research is focusing on trying to make robot programming easier by trying to teach robots how to do tasks by demonstrating them. Julie's colleague, Ankit Shah, shows me how this robot is learning to set a table. So this is all the silverware and the plates, the bowls, the cups, and this is the table that it has to set. Yes, that is good. Okay.
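Julie's distinction, learning a "general understanding" rather than "the direct sequence of actions", can be illustrated with a toy learner. Everything below is invented for illustration (the item names, the two demonstrations, and the pairwise-precedence rule are a minimal stand-in, not the MIT lab's actual method): instead of memorizing one demonstrated order, it keeps only the ordering constraints that hold across every demonstration, leaving the rest flexible.

```python
# Hypothetical sketch of learning general ordering constraints from
# demonstrations of setting a table, rather than one fixed sequence.
# A pair (a, b) becomes a learned rule only if a precedes b in EVERY demo.

from itertools import permutations

def learn_constraints(demos):
    """Return the set of pairs (a, b) such that a comes before b in every demo."""
    items = set(demos[0])
    constraints = set()
    for a, b in permutations(items, 2):
        if all(demo.index(a) < demo.index(b) for demo in demos):
            constraints.add((a, b))
    return constraints

demos = [
    ["plate", "bowl", "fork", "cup"],
    ["plate", "fork", "bowl", "cup"],
]
print(learn_constraints(demos))
# 'plate' always comes first and 'cup' always comes last, so those rules
# are kept; 'bowl' vs 'fork' varies between demos, so their order is left free.
```

A robot following these constraints can set the table in any order that respects them, which is exactly the flexibility a memorized action sequence would lack.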
