That will be something that can also take place, and there will be the money to fund it. And we do know there has been confirmation already from the Department of Defense that the transition has already begun, in order to ensure that US national security remains on track, something that was a grave concern to many of the top security officials in the United States, given the fact that Donald Trump was dragging his heels and essentially blocking this transition from taking place, putting, they argued, US national security at risk. Elsewhere, the International Air Transport Association is predicting the industry is on track to lose more than $150bn, off the back of a fall in passenger numbers from 4.5 billion in 2019 to fewer than 2 billion this year. The Ethiopian government says Tigray regional forces have started to surrender, but Tigray's leadership says they are fighting back; they claim to have destroyed an Ethiopian military division, including a helicopter and two tanks. At least seven people have died in two separate car bombings in northwestern Syria. The first was in al-Bab in Aleppo province; monitoring groups say most of those killed were police officers. A second happened in nearby Afrin. And at least 13 people have been killed in two explosions at a market in central Afghanistan; the interior ministry says more than 40 others were injured in the blasts in Bamiyan city. Well, those are your headlines. The news continues here on Al Jazeera with All Hail the Algorithm. There is a huge group of people at work behind our screens. They're called behavior architects, persuasive designers or user experience specialists, and the power they have is massive. That urge to keep swiping through your Twitter feed? That's designed. The way we all click "I agree" to the terms and conditions? That's designed. Swiping left or right on Tinder?
That's designed too. We live in an online world of someone else's making, and most of us never even give it a second thought. And actually, that's designed as well. San Francisco: the mecca for tech designers. Silicon Valley. This place pioneered the art of constructing, optimizing and enhancing a lot of the technology we use every day. It's turbocharged the speed at which we use the internet and made navigating the web more alluring. But it's also given us a false sense of security. I've lost count of the number of times I've clicked "I agree" to get into a website. We all have to do it. As we speed around the internet, we face hundreds of these annoying pop-ups and consent forms, all demanding a response from us. And yes, I call them annoying because that is exactly what they are. They may look like they're there to provide us with control, but the reality is far from it. When users click "I agree" to the terms and conditions, when they see a privacy policy and they click accept, they may think that they're actually being given control of their personal data: what to collect, how it's used. And it's advantageous to companies for them to think that, because if something bad happens, the tech company can then say: you actually agreed to this. Nobody ever reads the terms of service. No one should ever be expected to. If we actually tried to read every terms and conditions agreement that we came across, it's probably the only thing we would do; it would have to be our day job, because they're so long and we come across so many. It may have the veneer of giving control to data subjects, but ultimately it's window dressing. Woodrow Hartzog is what you'd call a scholar of design. He's been studying the psychology, ethics and legal implications of new technologies. One area he specializes in is data protection and privacy. Before we get into this, there's a key term you need to know: informed consent. This is a principle that comes up a lot in discussions of our rights online.
But it's easier to explain in the context of medical surgery. A doctor explains potential risks and worst-case scenarios to the patient. Once you're fully informed, you have the option to consent to surgery or not. In the online world, informed consent is what everyone says is the ideal. But is it even possible? Consent only works under a very narrow set of conditions. That's when the decision is infrequent, like with surgery: we don't have surgery all the time. It's when the risks are visceral, things that we can easily conjure up in our minds. And finally, when the harm is possibly great: if things go wrong with surgery, you could get sick or you could die, so we've got an incredible incentive to take that decision seriously. But of course, none of those things are present in the data ecosystem. We make decisions quite frequently, 10, 100 times a day. The harms are not visceral at all; they are incredibly opaque. And finally, the harm is not even that great, because modern privacy harms aren't huge. They're death by a thousand cuts. The spin from Silicon Valley is that they're on our side when it comes to how we control our data. Long privacy policies are very confusing, and if you make it long and spell out all the details, then you're probably going to reduce the percent of people who read it. However, take a closer look at the design of the buttons and the pop-ups that we click, and it's clear that the tech companies have the upper hand. In the data game, design is power. Every single design decision makes a certain reality more or less likely. And what tech companies and psychologists and other people have known for years is that defaults are notoriously sticky. So if you design the interface so that all the defaults are set to maximize exposure, then you're going to get a lot more data in the aggregate than you would if you set all the defaults to privacy-protective. Because people don't go in and change them.
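The "sticky defaults" effect described here can be shown with a toy simulation. The numbers (100,000 users, a 5% chance that a user ever opens the settings page and flips the switch) are illustrative assumptions, not figures from the program; the point is only that the default, not user choice, ends up determining how much data is collected in aggregate.

```python
import random

def users_sharing(n_users, default_on, change_rate=0.05, seed=1):
    """Count how many of n_users end up sharing their data.

    Every user starts on the default setting; only a small fraction
    (change_rate) ever opens the settings page and flips the switch.
    """
    rng = random.Random(seed)
    sharing = 0
    for _ in range(n_users):
        shares = default_on
        if rng.random() < change_rate:  # the rare user who changes the default
            shares = not shares
        sharing += shares
    return sharing

n = 100_000
print(users_sharing(n, default_on=True))   # roughly 95,000 users share
print(users_sharing(n, default_on=False))  # roughly 5,000 users share
```

Flipping one boolean in the designer's favor moves the aggregate from about 5% of users sharing to about 95%, which is exactly why defaults are worth fighting over.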
So until we fundamentally change the incentives, we're still going to see companies manipulating the design of these buttons and these technologies to ensure that you keep disclosing data and that they keep getting what is the lifeblood of their business. Most of us assume that when we go on a website and click "I agree", the site simply collects information that we voluntarily choose to share. In reality, there are many routes to data collection, and the mechanics of it are invisible, hidden by design. For a start, it isn't just the website you are on that's monitoring you. There are so-called third parties: advertisers, marketers and analytics agencies tracking you using tiny bits of software, beacons, pixel tags. They scoop up incredibly detailed information, everything from the computer you're using to how long you hover over a page. Honestly, it's a bit mind-boggling. And all you really did was click "I agree". Informed consent is a fundamentally broken regulatory mechanism for algorithmic accountability. It allows companies to continue to throw risk back onto the user and say: here's a pop-up banner that tells you about cookies, that no one reads and nobody really even cares about. Yet we push forward this regime as though it matters to people. As though, if someone clicks "I agree", then they're magically OK with all the data processing that's going to come afterwards. Which is a little bit of a joke, and a little bit of a legal fiction. Yet it's a major component of almost every data protection framework around the world. Once you've crossed the "I agree" hurdle, you're now in the clutches of the website. And this is where design takes on a whole new level of importance. The job is to keep you there. One of the most successful innovations in mobile and website design is something called infinite scroll. We all use it every single day. A little scroll and you'll see more, and more, without even needing to click.
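The core trick of infinite scroll can be sketched in a few lines: the moment one page of posts is exhausted, the next page is requested automatically, so the feed never presents a natural stopping point. This is an illustrative sketch, not code from any real app; `fetch_page` is an invented stand-in for a network request.

```python
from itertools import islice

def fetch_page(page_number, page_size=10):
    """Stand-in for a network request: always returns another full page."""
    start = page_number * page_size
    return [f"post {i}" for i in range(start, start + page_size)]

def infinite_feed(fetch):
    """Yield posts forever. When one page runs out, the next is fetched
    automatically: the user never has to click, and never hits an end."""
    page = 0
    while True:
        for post in fetch(page):
            yield post
        page += 1

# The feed happily serves any amount you ask for; the only limit is you.
first_25 = list(islice(infinite_feed(fetch_page), 25))
print(first_25[0], "...", first_25[-1])  # post 0 ... post 24
```

Contrast this with a paginated design, where reaching the bottom of a page forces a deliberate click on "next": that click is exactly the stopping point infinite scroll was built to remove.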
I'm on my way now to meet the creator of this function. His name is Aza Raskin. He no longer works inside big tech; in early 2018, he co-founded the Center for Humane Technology. All of our apps, all of these compelling companies, are competing for our attention. And because it's such a cutthroat game trying to get our attention, worth, you know, tens of billions of dollars, they have to increasingly point more powerful computers at our heads to try to frack us for that attention. And we've all had that experience of going to YouTube and you think, I'm going to watch one video, and then somehow you shake your head and an hour has passed. And you're like: what, why, how did that happen? The technology has hypnotized us. This hypnosis is key to what's called the attention economy. Our attention is a finite currency in the online world, and it's only as long as websites and apps have our attention that they have a business. The attention economy is just this: if we are not paying for the product, the companies have to make money somehow. How do they do it? They do it by selling our attention to advertisers, or to other groups that want us to do something. They're trying to make these systems as effective as possible at influencing your decisions. They gather as much information about you as they can, like who your friends are, how you spend your time, all of it merged with how you spend your money. They take all of this data to build a model of you. Imagine a little simulator of you that lives in a Facebook server, and then they can put things in front of it. Like: are you more likely to click this, this or this? Or, if we want to get you to hate immigration, what kind of message would work? Are you going to resonate with this message, or this message? And you can see how this begins: this race for your attention ends up becoming an entire economy's worth of pressure, with the very smartest minds in engineering and the biggest supercomputers trying to make a model of you, to be able to influence the kinds of decisions you're going to make. A few years ago, YouTube set a company-wide objective to reach one billion hours of viewing a day. Netflix founder Reed Hastings has also said multiple times that the company's biggest competitor isn't another website; it's sleep. So what happens when you give algorithms the goal of maximizing our attention and time online? They find our weaknesses and exploit them. In 2017, Sean Parker, a founding member of Facebook and its first president, literally confessed to this at an event: How do we consume as much of your time and conscious attention as possible? That means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content, a social validation feedback loop. It's exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology. It's not as though Silicon Valley pioneered the tricks and tactics of addictive, persuasive design. Many tech designers openly admit using insights from behavioral scientists of the early 20th century. Take the concept of randomly scheduled rewards, developed by the American psychologist B. F. Skinner. In the 1950s, he created what's become known as the Skinner box, a simple contraption he used to study pigeons and even rats. At the start of the process, a pigeon is given food every time it pecks when the word "peck" appears, or turns a full circle when the word "turn" appears. As the experiment proceeds, the rewards become less frequent and take place at random, but by then the behavior has been established.
The pigeon keeps pecking or turning, not knowing when it might get food, but in anticipation that a reward could come. The box was pivotal in demonstrating how design had the power to modify behavior. And if randomly scheduled rewards work for pigeons, why not humans? Skinner's concept is, in fact, at the heart of a lot of addictive design, from casino machines such as slot machines to social media. Smartphones are unnervingly similar to slot machines. Think about your Facebook, Instagram or Twitter feeds. We all swipe down, pause, and then wait to see what will appear. Back to those randomly scheduled rewards again: your swipe could result in a new comment on a photo, a new like, a piece of spam, a software update. You don't really know, and it's that unpredictability that makes it so addictive. Natasha Dow Schüll is a cultural anthropologist who has spent more than 15 years studying the algorithms behind this kind of design. Just like on a slot machine, when you're texting or when you're looking through the news feed, you really never know what's coming down the pike. You never know when you're going to sort of hit that jackpot, so to speak, when it's coming and how much it'll be. So the randomness is very important to keep you hooked. And I think the fact that we're not gambling money, as in a casino, isn't really that different from what we're doing, because in both cases what we're really gambling is our attention and our time. And across the board, we're sitting there sort of hoping for some little reward, never knowing when it's going to come, and in all cases we're sitting alone with the machine. There's no natural stopping point. I think the similarities are quite striking. We check our phones over 150 times a day. It's like: pick up, pick up. It's the first thing we look at when we wake up, the last thing we look at before we go to sleep. We're glued to it, and that's by design. We now have over two billion Skinner boxes in people's pockets. We are running the largest psychological experiment the world has ever seen, by orders of magnitude. One out of every four human beings on Earth has a Skinner box which is learning how to uniquely target them. That's super fascinating in the abstract, but sort of terrifying when you think about what it's doing. The research is being done, and the evidence is clear, that digital technology is being designed with the intention to make us addicts. It's not a secret in the tech industry that two of the biggest tech bosses, Bill Gates and the late Steve Jobs, admitted they consciously limited the amount of time their children were allowed to engage with the products they helped create. The problem is real, but the media can often sensationalize the issue of tech addiction and make it harder for us to understand. I think the reason people reach for the metaphor of crack cocaine is because they see that as a sort of high-speed, hyper-intensive form of addiction. And while I don't use that metaphor myself, I do think that within the spectrum of media addictions there are certain ones that are more high-potency, you could say. If we're going to use the language of addiction, they have a higher event frequency. Think about a horse race, right? You go to the track and you've got to really wait for that event to happen. If you are rapidly engaged in an activity, such as a Twitter thread, that is more high-potency: it has a higher event frequency, which means each event has an opportunity to draw you in more, to reinforce that behavior more. So I think we really can apply the language of addiction to these different media. I have an ongoing frustration, which is that whenever I am still for a second, I have this impulse to reach into my pocket and pull out my phone. And then I get angry at myself, because I say: that's not right, just enjoy this moment. Just be with yourself for a second.
And then I get angry at myself that my phone has that much power over me, right? And I'm angry that I'm subject to the design of a technology in such a way that I have difficulty resisting its allure. But of course, everything about these technologies is built to create that impulse, to make it feel as though it's irresistible. There's such emphasis put on free choice, on being a consumer who makes decisions in the marketplace about what you want to do, because you have free will. But at the same time, the very people who are promoting that notion of the person as a sovereign consumer are operating and designing their technology with a very different human subject in mind. It's somebody who can, like a rat or a pigeon or any other animal, be incentivized and motivated and hooked, and have their attention redirected. And that's really interesting to me, because it is a kind of return to Skinner. I think you wouldn't have heard that in the eighties or nineties; it would have even been creepy to think about someone designing your behavior. But now it's become accepted that you can be a behavior designer. And behavior design was one part of what Aza used to do in a previous life. However, he is now one of a growing number of industry insiders who are taking a more critical stance toward Silicon Valley. Talking to him made me wonder: does he regret his part in all this? I want to be humble about it. If I hadn't invented it, it would have been invented. I just happened to be in the right place at the right time, thinking about the right kind of thing. But yes, I do regret it. And I do think it speaks to the naivety of being like: oh, here's just a cool feature, and making it.
And even if it's great for the user, without thinking about the effects it'll have when you scale it up to 100 million people, or a billion people. This little thing that I went around to Twitter and Google and all these other companies and got them to adopt has now wasted, quite literally, hundreds of millions of human hours. I'm sure all of us have had someone say to us: stop looking at your phone, or, why are you so addicted to social media? Before I started this series, I thought maybe there was something wrong with me. I just had no idea how deliberately designed our online experience is, and how these designs and algorithms are made to hook us. So I asked everyone I spoke to: how do we change this? Can we change how online design works? Regulation cannot be expected to happen on its own within these corporations, right, who are profiting from this, because there is just too deep a conflict of interest. And so the only viable kind of regulation is going to have to come from the outside. We have to have a public conversation about what is actually going on in these products. How are they working? And once we understand that as a collective, what do we want to limit and constrain? So if