Today, scientists are blazing a trail to this very future. The fact that we're enabling the system to make its own decisions, I don't even know where to begin with that. [shivani] I want to know what breakthroughs are being made. It's talking, and it's having this dynamic conversation with you. That's the wonder. Machines can be self-aware in ways that we can't. [shivani] That will forge the future. Oh my gosh, it's looking at me. Hyper intelligence. [bright orchestrated music] My name is Shivani Bigler. As an engineer and neuroscientist in training, I'm obsessed with artificial intelligence. As a kid, my father took me to tech and robotics trade shows, where I became dazzled by science. Every year, the inventions were smarter and smarter. Artificial intelligence has come a very long way in the last couple of years. Most AI technologies are programmed to think for themselves, to learn from examples, kind of like simulating human intelligence in a way that it learns from past experience. But how does AI actually work? In the future, will AI achieve human traits like emotion, consciousness, or even free will? And how will humans and robots work together? [humming music] Today, the clearest road to the future is the self-driving car. Unlike a regular car, which is just a machine, a self-driving car is a robot that can make decisions. In the future, will every car on the road become driverless? To find out, I've come to a hotbed of self-driving car research: Pittsburgh, Pennsylvania. Every single person has started to have conversations about self-driving cars, because essentially, they're the future. But in order to understand it, we have to look under the hood. Making decisions on the fly, even simple ones like these, does not come easy for computers. To discover the inner workings, I'm meeting a true pioneer in the field. Please get in. Thank you. Dr. Raj Rajkumar of Carnegie Mellon University. Carnegie Mellon is the birthplace of self-driving car technology, thanks in large part to the work of Raj and his colleagues. They've been the leading innovators in this field for more than 25 years. So how does his self-driving car make decisions to safely navigate the world like a human driver? Should we get started? Pretty eager. Yes, we can. Since Raj is distracted by our conversation, for safety reasons, the state of Pennsylvania requires another driver in the front seat to monitor the road. This is so cool. I'm nervous but excited. What's the longest you've ever driven a vehicle autonomously? We have gone hundreds of miles. [shivani] Awesome. I'm going to go auto by pressing this button. Oh my gosh [laughs]. It really is driving itself. While most self-driving cars are built from the ground up, Raj just bought a regular used car and hacked it with powerful onboard computer systems, making it more adaptable than other regular cars. We've installed a bunch of sensors in there. It is able to shift the transmission gear. It is able to turn the steering wheel, apply the brake pedal and the gas pedal. There's really software that runs on the computer that makes this capability practical. And there are some very key, fundamental artificial intelligence layers that try to mimic what we humans do. [shivani] To mimic human decision making, most self-driving cars use a combination of cameras and advanced radar to see their surroundings. The AI software compares external objects to an internal 3D map of streets, signs, and transportation infrastructure.
[raj] A map is something that is static in nature; traffic, people, and objects are dynamic in nature. The dynamic information it figures out on the fly. [shivani] Comprehending dynamic information allows it to understand where it is heading in space and react to changes in traffic signals. Uh-huh, it recognized the stop sign. Yes. We have a pedestrian. We definitely should not be running into that person. [shivani laughs] The AI challenge to make a vehicle drive itself is not an easy task. Safety is priority number one, and priority number two and number three as well. But what happens when the AI system doesn't understand specific objects in its surroundings? [narrator] A pedestrian in Tempe, Arizona was killed last night by a self-driving taxi. It's believed to be the first fatality caused by an autonomous vehicle. [shivani] A tragic accident happened because a self-driving vehicle didn't recognize something in its environment: a jaywalker. In the future, advanced self-driving cars will have to make life-and-death decisions on the fly. If avoiding the jaywalker means crashing head-on with another car, potentially killing the driver, what should it choose? How will scientists address monumental problems like these? The first wave of artificially intelligent robots were programmed by engineers with static sets of rules to achieve their goals. These rules are called algorithms, but not all rules work in all situations. This approach is very inflexible, requiring new programming to accomplish even the smallest changes in any given task. A new approach called machine learning has changed everything. With machine learning, computers can absorb and use information from their interactions with the world to rewrite their own programming, becoming smarter on their own. To see machine learning in action, I'm meeting another Carnegie Mellon team at an abandoned coal mine. Dr. Matt Travers leads a group that won a challenging subterranean navigation competition held by the Department of Defense's research agency, DARPA. They're affectionately known as R1 and R2, and R stands for robot. [all laugh] [shivani] These robot twins are designed for search and rescue missions too dangerous for humans. And unlike the self-driving car, they operate without a map. To achieve this, they have to learn to identify every single object they encounter on the fly. They are programmed to go out and act fully autonomously, and they will be making 100% of their own decisions. So they're recognizing objects, they're making the decision of where to go next, where to explore. [shivani] To see this in action, the R2 robot is starting on a simulated search and rescue mission to find a stranded human dummy in the mine. Imagine having a map of a collapsed mine before you sent a team of people in to rescue someone in that mine; it's a game changer. [shivani] How the robot discerns elements in this environment parallels how an infant learns about her environment. A three-month-old uses her senses to cognitively map out her environment and learn to recognize her parents. She ultimately uses this map to interact with everything in her world, just like this robot. Okay, so we're ready to roll. [bright chiming music] [shivani] Artificial intelligence makes this learning curve possible, but how does it create its own map and identify a human on its own? And without an internal mapping system, like the internet? Test engineer Steve Willits shows me how the R2 robot can detect a stranded person.
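To make the contrast between fixed rules and machine learning concrete, here is a minimal Python sketch, entirely my own invention rather than anything from Carnegie Mellon: the scenario, function names, and thresholds are hypothetical. It shows a hand-written rule next to a parameter a program adjusts for itself from labeled experience.

```python
# 1) Rule-based: an engineer hard-codes the threshold. If conditions
#    change (wet roads, heavier vehicle), the code must be rewritten.
def should_brake_rule(distance_m: float) -> bool:
    return distance_m < 20.0  # fixed rule chosen by a programmer

# 2) Machine learning: the threshold is *learned* from experience,
#    here pairs of (distance when we braked, did a collision happen).
def learn_threshold(examples: list[tuple[float, bool]]) -> float:
    # Start with a guess and nudge it after each observed outcome,
    # a bare-bones version of "rewriting its own programming".
    threshold = 10.0
    for distance, collision in examples:
        if collision and distance >= threshold:
            threshold += 1.0   # braked too late: be more cautious
        elif not collision and distance < threshold:
            threshold -= 0.1   # braked needlessly: relax slightly
    return threshold

experience = [(18.0, True), (25.0, False), (22.0, True), (30.0, False)]
learned = learn_threshold(experience)
print(f"learned braking threshold: {learned:.1f} m")

def should_brake_learned(distance_m: float) -> bool:
    return distance_m < learned
```

The learned version keeps improving as experience accumulates; the rule-based version only changes when an engineer edits it.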
[steve] When you're in a search and rescue scenario, that's the kind of situation where you'd want to deploy one of these. [shivani] As it explores and maps the cave, it drops devices called signal relays to create a wifi network trail. It drops those just like breadcrumbs along the path. [shivani] Using this network, the robot sends data back to home base to create a map. At the same time, the robot must look at every single object to identify the stranded human. So the lidar system is giving a full laser scan. [shivani] Lidar stands for light detection and ranging. Similar to its cousin radar, which uses radio waves, lidar systems send out laser pulses of light and calculate the time it takes for each pulse to hit a solid object and bounce back. This process creates a 3D representation of the objects in the environment, which the onboard computer can then identify. This process is similar to how the eye feeds visual data to the brain, which then recognizes objects by tapping into our preexisting knowledge of what things look like. By fully understanding its environment, R2 can then make better decisions about where to go and where not to go. [steve] What our robot's doing right now is exploring. So the robot came to a junction, and off to the left it could see that it couldn't get past. [shivani] Right. [shivani] So it saw the opening to the right, and that's where it went. [dramatic music] It kind of looks like it's making decisions about whether or not to climb over these planks and obstacles all up in this area, right? [steve] That's exactly what it's doing at this point. Just like a baby, R2 learns through trial and error. It's like a little dog wagging its tail. But there's no one here to rescue, so it moves on. [bright dramatic music] As R2 continues to map out the mine, oh my god, a human! It stumbles upon its intended target [laughs]. That is Randy, Rescue Randy. Hello, Rescue Randy [laughs], you scared me. With the discovery of Rescue Randy, the R2 robot can not only alert emergency personnel, but also give them a map of how to find him. That is incredible [laughs]. It just knows what it's doing. These incredible rescue robots are clearly paving the path to the future of hyper intelligence. [narrator] In the future, autonomous exploration vehicles perform search and rescue missions in every conceivable disaster zone, even in avalanches atop Mount Everest. Incredibly, intelligent off-road vehicles are also mapping deep cave systems previously unknown to science, discovering a vast supply of rare earth elements essential for modern technology. [shivani] Artificial intelligence will clearly save human lives in the future, but there's a lot of terrain on earth that's too difficult to navigate on wheels. How will intelligent robots make their way over rainforests, bodies of water, or even mountain tunnels? In Philadelphia, Jason Derenick of Exyn Technologies is working to overcome this problem. What we focus on is autonomous aerial robotics, to enable drones to safely navigate in unknown or unexplored spaces. [shivani] Jason's team has built the first industrial drone that can fly itself anywhere. Incredibly, these autonomous robots navigate without GPS and map their environment as they go. [derenick] We focus on all aspects of autonomy, which includes perception, orientation during flight, motion planning, and then finally control. [shivani] But going from two dimensions to three dimensions requires an increase in artificial intelligence processes.
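The narration compresses the ranging math, so here is a minimal sketch of the time-of-flight calculation it describes. The numbers and function names are illustrative choices of mine, not anything from the CMU robot's software: the distance is simply the speed of light times the round-trip time, divided by two.

```python
# Minimal lidar time-of-flight sketch: a pulse's round-trip time
# becomes a range, and a sweep of (angle, range) pairs becomes the
# points of a map.
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_pulse(round_trip_s: float) -> float:
    # Out and back: one-way distance is half of c * elapsed time.
    return C * round_trip_s / 2.0

def point_from_measurement(angle_rad: float, range_m: float) -> tuple[float, float]:
    # One 2D point of the cloud; real lidars sweep in 3D.
    return (range_m * math.cos(angle_rad), range_m * math.sin(angle_rad))

# A return after 200 nanoseconds puts the obstacle ~30 m away.
print(f"{range_from_pulse(200e-9):.2f} m")        # 29.98 m
print(point_from_measurement(math.pi / 4, 30.0))  # (~21.2, ~21.2)
```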
The mission for their drone is to fly independently through a three-dimensional path from one end of the warehouse to the other. Starting mission: 3, 2, 1. [dramatic music] [shivani] Now, to mess with its computer mind, Jason's team places new and unexpected obstacles in its path. Will the drone recognize these unexpected changes? Will it get lost? Will it crash? Essentially, we have a gimbaled lidar system that allows the vehicle to paint a full 360-degree sphere around it in order to sense its environment. [shivani] Like the robot in the mine, this aerial robot uses lidar to see. It actually generates a virtualized representation of the space, which you see here. And for each one of these cubes in the space, it's trying to determine whether that cube is occupied or whether it's free space. [drone engine humming] [shivani] The robot's onboard computer makes real-time decisions about where to go based on its visual input, kind of like a human's. [drone engine roars] Incredibly, the drone recognizes the whiteboards and flies around them. [chiming music] One of the things about this system that makes it particularly special is that it's actually being used in the real world to keep people out of harm's way. [shivani] Autonomous drones like these are already at work in hazardous industries like mining, construction, and oil exploration. They safely conduct inspections in dangerous locations and create detailed maps of rough terrain. From a technological perspective, the fact that we're able to do everything that we're doing on board, self-contained, and enabling the system to make its own decisions, I don't even know where to begin with that. [dramatic music] [shivani] Self-flying robots like these will revolutionize search and rescue and disaster response. They could also transform how packages are delivered, but there are limits to what large single drones can do. More complex tasks will require teams of small, nimble autonomous robots. Dr. Vijay Kumar at the University of Pennsylvania is working with swarms of drones to perform tasks like playing music and building structures cooperatively. He's also developing technologies to tackle some very big problems, including world hunger. In a couple of decades, we'll have over 9 billion people to feed on this planet; of course, that's a big challenge. [shivani] To take on a task this big, he's building an army of small flying robots with the ability to synchronize. We think about technologies that can be mounted on small flying robots that can then be directed in different ways, like a flock of birds reacting to a predator, or a school of fish. You have coordination, collaboration, and it all happens very organically. [shivani] Using AI to get robots to work as a coordinated collective group is a daunting task. Three to five years ago, most of our robots relied on GPS-like sensors. Today, we have the equivalent of smartphones embedded in our robots, and they sense how fast they're going by just looking at the world, integrating that with the inertial measurement unit information, and then getting an estimate of where they are in the world and how fast they're traveling. [shivani] This, I gotta see, and I'm gonna check it out virtually, as a robot. [upbeat music] I'm at UPenn, remotely, in Vijay Kumar's lab. I can sample my surroundings. Oh, I hit something. Hello? Hi. [shivani] Vijay's apprentice Dinesh Thakur is my guide. Today, we are gonna show robots flying in a formation. [shivani] Great, can we see how that works? Sure, yeah.
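The cube bookkeeping the engineer describes is a data structure known as an occupancy grid. Here is a minimal sketch under my own assumptions (the voxel size and points are made up, and this is not Exyn's software; real systems also track the free space a beam passes through, not just where it lands):

```python
# A toy occupancy grid: space is diced into cubes (voxels), each lidar
# return marks its cube as occupied, and everything else is treated as
# free space the drone may fly through.
VOXEL = 0.5  # cube edge length in meters (illustrative value)

def voxel_of(x: float, y: float, z: float) -> tuple[int, int, int]:
    # Map a 3D point to the integer index of the cube containing it.
    return (int(x // VOXEL), int(y // VOXEL), int(z // VOXEL))

occupied: set[tuple[int, int, int]] = set()

# Each lidar hit claims one cube.
for point in [(1.2, 0.4, 2.0), (1.3, 0.4, 2.1), (4.9, 3.3, 0.7)]:
    occupied.add(voxel_of(*point))

def is_free(x: float, y: float, z: float) -> bool:
    # The motion planner only routes through cubes not yet occupied.
    return voxel_of(x, y, z) not in occupied

print(is_free(1.25, 0.45, 2.05))  # False: an obstacle lives here
print(is_free(0.0, 0.0, 0.0))     # True: open air
```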
[shivani] The first step Dinesh takes in coordinating the drones is to provide them with a common point of reference, in this case a visual tag, similar to a basic QR code. Using only the onboard camera, these drones reference the code on the tag and visualize where they are in space. Using sophisticated bio-inspired algorithms, the drones can figure out where each other drone is within the collective swarm. These drones are communicating with one another, right? [dinesh] Yeah, right now they're communicating over wifi. [shivani] So cool [giggles]. Future versions of these drones will create their own localized wireless network to communicate, but for now, this swarm is a proof of concept. You've defined a formation, and then they're assuming that formation? Yeah, I just say I want to form a line, and the drones themselves figure out where they should go. [shivani] Once they figure out where they are in relationship to each other, they can then work together to accomplish a shared goal, like ants working as a collective entity. Once they can coordinate between each other, we can send them out on specific missions. [shivani] That's really cool [laughs]. Swarms of flying robots have their advantages. Unlike a single drone, self-coordinating swarms can perform complex operations like mapping much faster, by working in parallel and combining their data, and losing one drone in a swarm doesn't doom the whole operation. Vijay imagines employing his advanced swarm technology to work on farms. This precision agriculture will help feed the world's growing population. We'd like robots to be able to roam farms and provide precise information about individual plants, which then could be used to increase the efficiency of food production. That would be a huge impact on the world. This is our duty as responsible citizens and as responsible engineers. [shivani] This high-flying approach towards resolving the problems of the future is definitely a path I can get on. [whooshing music] [narrator] In the future, artificial intelligence coordinates flocks of drones to protect the environment and boost the food supply. To combat the negative effects of climate change on agricultural crops, robotic bees assist with pollination in orchards and on farms, making them more sustainable and productive. Fish-shaped underwater robots automatically deploy at the first sight of an oil spill. These drones create a barricade to rapidly contain spills, saving marine life and oceans across the world. [shivani] Modern society has a long history of building robots to do work that's dangerous, difficult, or too repetitive for humans. AI is poised to automate all kinds of human work, ranging from factory work, to taxi driving, to customer service. While some are worried that smart robots will replace human labor, that's not necessarily the case. As a sector, artificial intelligence is expected to generate 58 million new types of jobs in just a few years. So what will the future of human-robot interaction mean for our work and livelihoods? I'm at the Massachusetts Institute of Technology to meet Dr. Julie Shah. She's leading groundbreaking research in human-robot collaboration. My lab works to develop robots that are effective teammates with people. Julie and her team are creating software that helps robots learn from humans, even giving them insight into different human behaviors. By being aware of real people, robots can directly work and interact with them. How do you teach these robots or machines to do these human-like tasks?
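As a rough sketch of that "form a line" behavior, here is my own toy version in Python; the positions and spacing are invented, and real swarms solve this slot-assignment problem optimally (for instance with the Hungarian algorithm) rather than greedily as below.

```python
# Given current drone positions and target slots on a line, repeatedly
# match the globally closest (drone, slot) pair until every drone has
# claimed a slot.
import math

drones = {"d1": (0.0, 2.0), "d2": (3.0, 1.0), "d3": (1.0, 4.0)}
# Target formation: a line of slots spaced 1.5 m apart at y = 0.
slots = [(i * 1.5, 0.0) for i in range(len(drones))]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

assignment = {}
free_slots = list(slots)
unassigned = set(drones)
while unassigned:
    name, slot = min(
        ((n, s) for n in unassigned for s in free_slots),
        key=lambda pair: dist(drones[pair[0]], pair[1]),
    )
    assignment[name] = slot
    unassigned.remove(name)
    free_slots.remove(slot)

for name, slot in assignment.items():
    print(f"{name} flies from {drones[name]} to slot {slot}")
```

Because every drone computes the same assignment from shared positions, no central controller has to tell each one where to go, which is the "they figure it out themselves" behavior Dinesh describes.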
The first step, as it would be for any person: the first thing they do is become immersed in the environment and observe, and then we need an active learning process. The robot needs to be able to communicate, or show back to the person, what it's learned. We don't want the robot to learn the direct sequence of actions; we want the robot to learn a more general understanding. That's ultimately our challenge. [shivani] But getting a robot to grasp the bigger-picture concept, in order to understand the basics of its task in the first place, requires a lot of observation and, well, hand-holding. My research is focusing on trying to make robot programming easier by teaching robots how to do tasks by demonstrating them. Julie's colleague, Ankit Shah, shows me how this robot is learning to set a table. So this is all the silverware and the plates, the bowls, the cups, and this is the table that it has to set. Yes, that is good. Okay. As any parent knows, the first step in helping a child to learn is to model the desired behavior. It's the same with machine learning. In this case, the AI robot recognizes the objects with a visual tag similar to a QR code. And for two weeks, it observes Ankit setting a table. So did you pick up an item and then place it on the dinner table? That's basically what we did. And based on that, the robot learns what it means to set a dinner table. [shivani] Dynamic tasks like setting a table or doing laundry are easy for humans, but incredibly hard for robots. The software has difficulty with so many variables, and even subtle changes in their environment can throw them off. One of the things which I like to do is to actually hide some of the objects. So it's not going to see the spoon. And the reason we do this is we want to show that the robot is robust to some of the disturbances in the task. [shivani] The robot software has learned what each object is and where it goes. Now, let's see if it's learned the concept and can think dynamically to set the table. [ankit] So you can just pick up the card. Here we go. I've revealed the spoon. Incredibly, the robot recognizes the spoon and instantly places it next to the bowl. This reveals that the robot has learned the concept and executes the right action dynamically. In the process, the software is continuously writing and revising its own computer code. Basically, it's learning. [humming music] If, like humans, robots can grasp the bigger-picture context and not just the mathematical tasks, will AI-driven robots of the future spell the end of having to work? The key aspect is not developing AI to replace or supplant human work; we're really interested in how we fit them together, like puzzle pieces. People work in teams to build cars, to build planes, and robots need to be effective team members. It's real teamwork, as if you're in a basketball game. You have your goal, right? And [chuckles] you have to think spatially: who am I gonna pass the ball to, and at what time do you do that, so that everything matches up? The analogy of a basketball team is outstanding, because we actually need to know spatially where they're going to be. And the timing is of critical importance. And so we need to develop the AI for a robot to then work with us. [chiming music] [shivani] One of the most difficult aspects of creating hyper intelligence is actually something that even we humans sometimes get wrong, and that is anticipation.
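One way to picture learning a concept rather than a motion sequence is this toy sketch, with invented demonstrations and not the MIT lab's actual method: tally where each object ends up across demonstrations and keep a per-object rule that works in any order.

```python
# Each demonstration is a list of (object, placement) pairs, with
# placements described relative to other objects.
from collections import Counter, defaultdict

demos = [
    [("plate", "center"), ("bowl", "on plate"), ("spoon", "right of bowl")],
    [("plate", "center"), ("spoon", "right of bowl"), ("bowl", "on plate")],
    [("bowl", "on plate"), ("plate", "center"), ("spoon", "right of bowl")],
]

# Tally where each object ended up across demos; the most common
# placement becomes the learned rule, independent of action order.
placements: dict[str, Counter] = defaultdict(Counter)
for demo in demos:
    for obj, where in demo:
        placements[obj][where] += 1

rules = {obj: counts.most_common(1)[0][0] for obj, counts in placements.items()}
print(rules)  # {'plate': 'center', 'bowl': 'on plate', 'spoon': 'right of bowl'}

# Because the rule is per-object, a hidden spoon revealed later can
# still be placed correctly, whatever step the robot is on.
def place(obj: str) -> str:
    return rules.get(obj, "unknown object")

print(place("spoon"))  # -> right of bowl
```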
Anticipating what a teammate or coworker might do requires understanding contextual information on a much more sophisticated level, and predicting what will happen next. Can robots make predictions as accurately as we can? Abby's our industrial robot. [shivani] Pem Lasota is giving this Abby machine the intelligence necessary to help it anticipate a human coworker's actions. This is our simulated manufacturing task that we have set up to simulate some sort of task that a person and a robot could feasibly work on together. [shivani] For safety reasons, actual human-robot interaction is, at present, fairly minimal. [pem] Typically, in a factory, you would see these guys behind a metal cage, and you wouldn't have people working with them. So what we're trying to do is make something that a person could safely interact with. What are the human and robot supposed to do together in this task? In this task, a person's placing fasteners in the surface of a plane, and the robot's applying a sealant over them to seal it. Okay, can we see it happen? [pem] Sure. In order to work together, the robot must first be able to see and recognize the actions of its human counterpart, and adjust to the person's every move. Ooh, I feel like I'm in a superhero movie. So the cameras in the room can see these lights and track your hand, so that your hand doesn't get cut off by the robot. That's right, yeah. So the cameras and the lights basically work as eyes for the robot. So that's how the robot knows where I am. [shivani] The monitor shows the visual representation of the room that's inside the robot's mind. So this is what the robot might be doing if, you know, I'm not in its way. And the robot's just sealing, and I'm not supposed to be here. [shivani] Pem does something the robot has no way of expecting. [pem] If I put my hands in the robot's way, oh wow. By quickly understanding this human action, the AI software reacts accordingly by stopping. It's important to be able to share the workspace. [shivani] Building on this sense of teamwork, Pem's next step is helping Abby anticipate where he will move next, based on subtle contextual cues. So in this case, the robot will not only track which actions I've done so far, but also anticipate which portion of the space I'm going to be using. And when it's planning its own motions, it'll avoid those locations so that we can work together more seamlessly. Oh, okay. So, what you'll see now is, after I place this bolt, the robot's gonna predict I'm gonna go to this one next. So what you'll see is it'll behave in a different way. So now that I've placed this bolt, the robot takes a more roundabout path that allows me and the robot to work more closely together. And I don't have to worry about it crashing into me, because I can see that it's trying to avoid me. Similarly, on the other side, I place this bolt, and I see the robot takes a more roundabout path. [shivani] Yeah, because you're gonna go there next. [pem] And it gets slowed down, because it's close to me. [shivani] Right. We work together at the same time. So not only is the interaction more efficient, in that the robot's not spending too much time standing still; it's safer, because the robot's not constantly almost hitting me, and it also feels nicer for the person working with the robot. I really love this theme of teamwork. Programming robots to coordinate with us and anticipate where we will move won't only revolutionize the workplace; it will also change society at large.
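Here is a bare-bones sketch of that anticipate-and-avoid loop; the task order, zone coordinates, and function names are made up for illustration and are not the MIT lab's software. The robot predicts the human's next fastener from the known task sequence and routes its own sealing work away from that spot.

```python
BOLT_SEQUENCE = ["A", "B", "C", "D"]                 # known task structure
ZONE = {"A": (0, 0), "B": (1, 0), "C": (2, 0), "D": (3, 0)}

def predict_next(completed: list[str]) -> str | None:
    # Simplest possible predictor: the first unfinished step.
    for bolt in BOLT_SEQUENCE:
        if bolt not in completed:
            return bolt
    return None

def plan_sealing_route(completed: list[str]) -> list[tuple[int, int]]:
    # Seal finished bolts, starting with the one farthest from where
    # the human is predicted to go, so the two work apart in parallel.
    nxt = predict_next(completed)
    hx, hy = ZONE[nxt] if nxt else (0, 0)
    return sorted(
        (ZONE[b] for b in completed),
        key=lambda p: -((p[0] - hx) ** 2 + (p[1] - hy) ** 2),
    )

done = ["A", "B"]
print("human likely heads to:", predict_next(done))   # -> C
print("robot seals at:", plan_sealing_route(done))    # -> [(0, 0), (1, 0)]
```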
[chiming music] [narrator] In the future, the coordination of man and machine is so advanced that this collaboration increases productivity and accuracy in most industries. AI robots now accompany surgeons in hospitals across the globe. They anticipate the doctors' needs and hand them the appropriate medical tool just before it's needed. This dramatically reduces surgery times and human error. [dramatic music] [shivani] As machines become smarter in their interactions with humans, will they ever develop consciousness? And will artificial intelligence actually surpass human intelligence? While some machines have exceeded human ability in games like trivia, "Now we come to Watson," [shivani] and chess, these AI systems were designed to master just a single skill. These programs use brute-force computer processing power and specially tailored software to beat their human opponents. To achieve the holy grail of hyper intelligence, scientists must develop systems with flexible, human-like abilities to both learn and think. This form of smarts is called artificial general intelligence. I'm back in New York City, on my own campus at Columbia University, to meet with Dr. Hod Lipson. Hod's lab is developing creative robots that paint original artworks, self-assembling robots, and even robots that learn about their world without human assistance. But his ultimate goal is even more ambitious. Can a machine think about itself? Can it have free will? I believe that, in fact, machines can be sentient, can be self-aware in ways that we can't. [shivani] As a neuroscientist, I know we've only scratched the surface of our scientific understanding of how consciousness works in humans. How could one possibly use computer code to put this transcendent feature into a robot? Our hypothesis is actually very simple. It is that self-awareness is nothing but the ability to simulate one's self. To model one's self, to have a self-image. [shivani] The first step towards creating robotic consciousness is to teach the software to build an image of its physical, mechanical self inside its computer mind. We humans take consciousness like this for granted, even in simple moments, like understanding our own image reflected in a mirror. Humans start to develop awareness of their own emotions and thoughts around the age of one. This helps babies understand their self-image in their minds, and it helps them to learn about their environment and their role in it. When a robot learns what it is, it can use that self-image to plan new tasks. [shivani] In both humans and robots, awareness about the physical self is called proprioception. Neuroscientists sometimes call this self-awareness of our bodies a sixth sense. We use the same test that a baby does in its crib. A baby moves around, flails around, moves its arm in ways that look random to us,
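A toy version of that self-simulation hypothesis might look like the sketch below, using a one-joint arm and numbers I made up rather than the lab's deep-learning setup: the robot babbles randomly, fits a model of its own body from what it observes, and then plans with that self-image instead of trial and error.

```python
# "Motor babbling" toward a self-model, in miniature.
import math, random

ARM_LENGTH = 0.8  # meters; the hidden "true body" the robot must discover

def true_hand_height(joint_angle: float) -> float:
    # The physical arm, which the robot cannot inspect directly.
    return ARM_LENGTH * math.sin(joint_angle)

# 1) Babble: try random joint angles, observe where the hand ends up.
data = []
for _ in range(200):
    a = random.uniform(0, math.pi / 2)
    data.append((math.sin(a), true_hand_height(a)))

# 2) Self-model: fit hand_height ~ L * sin(angle) by least squares,
#    i.e., estimate its own arm length from experience.
num = sum(x * y for x, y in data)
den = sum(x * x for x, y in data)
estimated_length = num / den
print(f"learned arm length: {estimated_length:.3f} m")  # ~0.800

# 3) Plan with the self-image instead of more trial and error.
target = 0.5
angle = math.asin(target / estimated_length)
print(f"to reach {target} m, command angle {math.degrees(angle):.1f} deg")
```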