mediate in the conflict in northern Ethiopia. Fighting has triggered what the UN is calling a full-scale humanitarian crisis. Those who fled across the border into Sudan are afraid of going hungry and say they are desperate for help.

There has been outrage across Brazil after the death of a Black man outside a supermarket. Protesters have been demonstrating at branches of the French supermarket chain Carrefour. Two security guards in Brazil are being investigated over the killing; one of them is an off-duty police officer.

Being Black in Brazil means you have your humanity stolen. You have all your rights stolen, and you don't have the opportunity to come and go in peace. You have a security system designed to have you accused, even when you were the victim and you were the target of any bullet that circulates in the city. There is no stray bullet when it hits a Black body. What we saw in Porto Alegre is the most despicable expression of structural racism, institutional racism, and of how much Brazil still inherits from the legacy of slavery in the Americas.

Police are clashing with anti-government protesters in Guatemala after hundreds stormed the Congress building and set it on fire. Police fired tear gas to clear the demonstration after it turned violent. It was sparked by anger at the new budget.

And hundreds of refugees and migrants have been transferred to a makeshift camp on a military site in Spain's Canary Islands. It follows a record surge in people arriving by sea from Africa, which has overwhelmed the local government.

Those are the headlines. From myself and the team here in London, we will see you tomorrow. All Hail the Algorithm is the program coming up next.

When parents are imprisoned, the government doesn't have any plans for the children left behind. They need shelter; they are searching for love. My passion is to see that the children of prisoners are also given another chance to live, because they are not party to the crimes committed by their parents. When I finally get to that place to build a home for these children and see them become somebody useful in society, fending for themselves, it will give me satisfaction.

I care about how the US engages with the rest of the world. We're really interested in taking you into a place you might not visit otherwise and feel that you were there.

There is a huge group of people at work behind our screens. They're called behavior architects, persuasive designers, or user experience specialists, and the power they have is massive. That urge to keep swiping through Twitter feeds? That's design. The way we all click "I agree" to the terms and conditions? That's design. Swiping left or right on Tinder? That's design. We live in an online world of someone else's making, and most of us never even give it a second thought.

This is San Francisco, the mecca for tech designers: Silicon Valley. This place pioneered the art of constructing, optimizing and enhancing a lot of the technology we use every day. It's turbocharged the speed at which we use the internet and made navigating the web easier. But it's also given us a false sense of security. I've lost count of the number of times I've clicked "I agree" to get into a website. We all have to do it.
As we speed around the internet, we face hundreds of these annoying pop-ups and consent forms, all demanding a response from us. And yes, I call them annoying because that is exactly what they are. They may look like they're there to provide us with control, but the reality is far from it.

When users click on "I agree" to the terms and conditions, or they see a privacy policy and they click on it, they may think that they're actually being given control of their personal data: what is collected, how it is used. And it's advantageous to companies for them to think that, because if something bad happens, the tech company can then say: you actually agreed to this. Nobody ever reads the terms of service. No one should ever be expected to. If we actually tried to read every terms and conditions agreement that we came across, it's probably the only thing we would do. It would have to be our day job, because they're so long and we come across so many. It may have the veneer of giving control to data subjects, but ultimately it's window dressing.

Woodrow Hartzog is what you'd call a scholar of design. He's been studying the psychology, ethics and legal implications of new technologies. One area he specializes in is data protection and privacy.

Now, before we get into this, there's a key term you need to know: informed consent. This is a principle that comes up a lot in discussions of our rights online, but it's easier to explain in the context of medical surgery. A doctor explains potential risks and worst-case scenarios to the patient. Once you're fully informed, you have the option to consent to surgery, or not. In the online world, informed consent is what everyone says is the ideal. But is it even possible?

Consent only works under a very narrow set of conditions. And that's when the decision is infrequent, like with surgery; we don't have surgery all the time. It's when the risks are visceral, things that we can easily conjure up in our minds. And then, finally, when the potential harm is great. If things go wrong with surgery, you could get sick or you could die, so we've got an incredible incentive to take that decision seriously. But of course, none of those things are present in the data ecosystem. We make decisions quite frequently, 10, 100 times a day. The harms are not visceral at all; they are incredibly opaque. And finally, the harm is not even that great, because modern privacy harms aren't huge. They're death by a thousand cuts.

The spin from Silicon Valley is that they're on our side when it comes to how we control our data: long privacy policies are very confusing, and if you make them long and spell out all the details, then you're probably going to reduce the percentage of people who read them. However, take a closer look at the design of the buttons and the pop-ups that we click, and it's clear that the tech companies have the upper hand. In the battle over data, design is power.

Every single design decision makes a certain reality more or less likely. And what tech companies and psychologists and other people have known for years is that defaults are notoriously sticky. And so if you design the interface so that all the defaults are set to maximize exposure, then you're going to get a lot more data in the aggregate than you would if you set all the defaults to privacy-protective, because people don't go in and change them. So until we fundamentally change the incentives, we're still going to see companies manipulating the design of these buttons and these technologies to ensure that you still keep disclosing data and that they still keep getting what is the lifeblood of their business.
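Hartzog's point about sticky defaults can be made concrete with a little arithmetic. Below is a minimal, hypothetical TypeScript sketch; the 5% figure for how many users ever open a settings page, and the even split among those who do, are assumptions invented for the example, not measured data.

```typescript
// Hypothetical sketch of "sticky defaults". The 5% of users assumed to ever
// change a setting, and the 50/50 split among those who do, are assumptions.

const USERS_WHO_CHANGE_DEFAULTS = 0.05;

// Expected share of users who end up with data sharing enabled, given the default.
function expectedSharingRate(sharingOnByDefault: boolean): number {
  const passive = 1 - USERS_WHO_CHANGE_DEFAULTS;  // never touch settings, inherit the default
  const active = USERS_WHO_CHANGE_DEFAULTS * 0.5; // assumed to split evenly when they do choose
  return (sharingOnByDefault ? passive : 0) + active;
}

console.log(`Share-by-default:   ${(expectedSharingRate(true) * 100).toFixed(1)}% tracked`);
console.log(`Private-by-default: ${(expectedSharingRate(false) * 100).toFixed(1)}% tracked`);
```

Under those assumptions, flipping a single default swings the share of tracked users from roughly 2.5% to roughly 97.5%, without anyone's stated preferences changing.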
Most of us assume that when we go on a website and click "I agree", the site simply collects information that we voluntarily choose to share. In reality, there are many layers to data collection, and the mechanics of it are invisible, hidden by design. For a start, it isn't just the website you are on that's mining your information. There are so-called third parties, advertisers, marketers and analytics agencies, also tracking you. Using tiny bits of software such as cookies, beacons and pixel tags, they scoop up incredibly detailed information, everything from the computer you're using to how long you hover over a page. Honestly, it's a bit mind-boggling. And all you really did was click "I agree".

Informed consent is a fundamentally broken regulatory mechanism for algorithmic accountability. It allows companies to continue to throw risk back onto the user and say, you know, here's a pop-up banner that tells you about cookies, which no one reads and nobody really even cares about. Yet we push forward this regime as though it matters to people, as though, if someone clicks "I agree", then they're magically OK with all the data processing that's going to come afterwards. Which is a little bit of a joke, and a little bit of a legal fiction. Yet it's a major component of almost every data protection framework around the world.

Once you've crossed the "I agree" hurdle, you're now in the clutches of the website, and this is where design takes on a whole new level of importance. The job is to keep you hooked. One of the most successful innovations in modern website design is something called infinite scroll. We all use it every single day: think of the feeds you scroll through endlessly, with more content loading without you even needing to click. I'm on my way now to meet the creator of this function. His name is Aza Raskin. In early 2018, he co-founded the Center for Humane Technology.
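For a sense of how little machinery infinite scroll actually needs, here is a minimal sketch. It assumes a browser page with a <div id="feed"> container; fetchNextPage() is a hypothetical stand-in for a real API call. It uses the standard IntersectionObserver browser API to load more posts whenever the reader nears the bottom, so the feed never ends.

```typescript
// Minimal sketch of an infinite-scroll loop. Assumes a browser page with a
// <div id="feed"> container; fetchNextPage() is a hypothetical stand-in for
// a real API call that returns the next batch of post HTML.

async function fetchNextPage(): Promise<string[]> {
  return ['<p>post A</p>', '<p>post B</p>', '<p>post C</p>'];
}

const feed = document.querySelector('#feed') as HTMLElement;
const sentinel = document.createElement('div'); // invisible marker kept at the bottom of the feed
feed.appendChild(sentinel);

const observer = new IntersectionObserver(async (entries) => {
  // When the sentinel scrolls into view, append more posts; the sentinel
  // ends up back below them, so there is never a natural stopping point.
  if (entries.some((entry) => entry.isIntersecting)) {
    for (const html of await fetchNextPage()) {
      const item = document.createElement('article');
      item.innerHTML = html;
      feed.insertBefore(item, sentinel);
    }
  }
});

observer.observe(sentinel);
```

The design choice that matters is the loop itself: each new batch pushes the sentinel back down, so there is no page boundary where a reader would naturally stop.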
All of our apps, all of these companies, are competing for our attention. And because it's such a cutthroat game of trying to get our attention, worth tens of billions of dollars, they have to increasingly point more powerful computers at our heads to try to frack us for that attention. And we've all had that experience of going to YouTube and you think, I'm going to go watch one video, and then somehow you shake your head and an hour has passed. I'm like, what? Why? How is that?

The technology has hypnotized us. This hypnotization is key to what's called the attention economy. Our attention is a finite currency in the online world, and it's only as long as websites and apps have our attention that they have a business. The attention economy is just this: if we are not paying for the product, well, the company has to make money somehow. How do they do it? They do it by selling our attention to advertisers or to other groups that want us to do something. They're trying to make these systems as effective as possible at influencing your decisions.

They acquire as much information about you as they can, like who your friends are, how you spend your time, often combined with how you spend your money. They take all of this data to build a model of you. Imagine, like, a little simulator of you that lives in the Facebook server, and then they can put things in front of it: are you more likely to click this, this or this? Or, if we want to get you to hate immigration, what kind of message do you think you're going to resonate with, this message or this message? And you can see how this race for your attention ends up becoming an entire economy's worth of pressure, with the very smartest minds in engineering and the biggest supercomputers trying to make a model of you to be able to influence the kinds of decisions you're going to make.

A few years ago, YouTube set a company-wide objective to reach one billion hours of viewing a day. Netflix founder Reed Hastings has also said multiple times that the company's biggest competitor isn't another website; it's sleep. So what happens when you give algorithms the goal of maximizing our attention and time online? They find our weaknesses and exploit them. In 2017, Sean Parker, Facebook's first president, literally confessed to this at an event.

How do we consume as much of your time and conscious attention as possible? And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content. It's a social validation feedback loop. It's exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology.

Now, it's not as though Silicon Valley pioneered the tricks and tactics of addictive, persuasive design. Many tech designers openly admit using insights from behavioral scientists of the early 20th century. Take the concept of randomly scheduled rewards, studied and developed by American psychologist B.F. Skinner. In the 1950s, he created what's become known as the Skinner box, a simple contraption he used to study pigeons and even rats. At the start of the process, a pigeon is given food every time it pecks when the word "peck" appears, or turns a full circle when the word "turn" appears. As the experiment proceeds, the rewards become less frequent and take place at random. But by the time the behavior has been established, the pigeon keeps pecking or turning, not knowing when it might get food, but in anticipation that a reward could be coming. Skinner boxes were pivotal in demonstrating how design had the power to modify behavior. And if randomly scheduled rewards worked for pigeons, why not humans?

Skinner's concept is, in fact, at the heart of a lot of addictive design, from casino machines such as slot machines to social media. Smartphones are unnervingly similar to slot machines. Think about your Facebook, Instagram or Twitter feeds: we all swipe down, pause and then wait to see what will appear. We're back to those randomly scheduled rewards again. Your swipe could result in a new comment on a photo you liked, or a piece of spam, or a software update. We don't really know, and it's that unpredictability that makes it so addictive.
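That unpredictability, a variable-ratio reward schedule in Skinner's terms, is easy to mimic in software. The sketch below simulates a pull-to-refresh gesture where a payoff arrives at random; the 30% probability and the list of rewards are illustrative assumptions, not figures from any real app.

```typescript
// Sketch of a variable-ratio reward schedule, the pattern Skinner studied:
// a payoff arrives after an unpredictable number of actions, so the next
// swipe always might be the one that pays off. The 30% probability and the
// reward list are illustrative assumptions, not figures from any real feed.

function pullToRefresh(rewardProbability = 0.3): string {
  // Most refreshes return nothing interesting; occasionally one "hits".
  if (Math.random() < rewardProbability) {
    const rewards = ['new like on your photo', 'new follower', 'a friend commented'];
    return rewards[Math.floor(Math.random() * rewards.length)];
  }
  return 'nothing new';
}

// Simulate a user compulsively refreshing: the payoffs are unpredictable,
// and that unpredictability is what keeps the behavior going.
for (let swipe = 1; swipe <= 10; swipe++) {
  console.log(`swipe ${swipe}: ${pullToRefresh()}`);
}
```

Each refresh might be the one that pays off, which is exactly the anticipation the pigeon experiments relied on.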
Natasha Dow Schüll is a cultural anthropologist who has spent more than 15 years studying the algorithms behind this style of design.

Just like on a slot machine, when you're texting or when you are looking through the news feed, you really never know what's coming down the pike. You never know when you're going to sort of hit that jackpot, so to speak, when it's coming and how much it'll be. So the randomness is very important to keep you hooked. And I think the fact that we're gambling money in a casino isn't really that different from what we're doing, because in both cases, what we're really gambling is our attention and our time. Right across the board, we're sitting there sort of hoping for some little reward, never knowing when it's going to come. In all cases, we're sort of sitting alone with the machine; there's no natural stopping point. I think that the similarities are quite striking.

We check our phones over 150 times a day. It's like, pick up, pick up, pick up. It's the first thing we look at when we wake up, the last thing we look at before we go to sleep. It's like we're glued to it, and that's by design. We now have, like, over two billion Skinner boxes in people's pockets. We are running the largest psychological experiment that the world has ever seen, by orders of magnitude. One out of every four human beings on Earth has a Skinner box which is learning how to uniquely target them. That's super fascinating in the abstract, but sort of terrifying when you think about what it's doing.

The research is being done, and the evidence is clear that digital technology is being designed with the intention to make us addicts. It's not a secret in the tech industry: two of the biggest tech bosses, Bill Gates and the late Steve Jobs, admitted they consciously limited the amount of time their children were allowed to engage with the products they helped create. The problem is real, but the media can often sensationalize the issue of tech addiction and make it harder for us to understand.

I think that the reason people maybe reach for the metaphor of crack cocaine is because they see that as a sort of high-speed, hyper-intensive form of addiction. And, you know, while I don't use that metaphor myself, I do think that within the spectrum of media addictions, there are certain ones that are more high-potency, you could say. And those are, you know, if we're going to use the language of addiction, the ones with a higher event frequency. Think about a horse race, right? You go to the track and you've got to really wait for that event to happen. If you are rapidly engaged in an activity such as a Twitter feed, that is more high-potency; it has a higher event frequency, which means each event has an opportunity to draw you in more, to reinforce that behavior more. So I think we really can apply the language of addiction to these different media.

I have an ongoing frustration, which is that whenever I am still for a second, I have this impulse to reach into my pocket and pull out my phone. And then I get angry at myself, because I say, that's not right, just enjoy this moment, just be with yourself for a second. And then I get angry at myself that my phone has that much power over me, right? I'm angry that I'm subject to the design of a technology in such a way that I have difficulty sort of resisting its allure. But of course, everything about these technologies is built to create that impulse, to make it feel as though it's irresistible. There's such emphasis put on free choice and being able to be a consumer, and you make decisions in the marketplace about what you want to do because you have free will.
But at the same time, the very people who are promoting that notion of the person as a consumer sovereign are operating and designing their technology with a very different human subject in mind. It's somebody who can, like a rat or a pigeon or any other animal, be incentivized and motivated and hooked, and have their attention redirected. And that's really interesting to me, because it is a kind of return to Skinner. I think you wouldn't have heard that in the eighties or nineties; it would have been creepy then to think about someone designing your behavior. But now it's become accepted that you can be a behavior designer.

And behavior design was one part of what Aza used to do in a previous life. However, he is now one of a growing number of industry insiders who are taking a more critical stance towards Silicon Valley. Just talking to him made me wonder: does he regret his part in it?

I want to be, like, humble about it. If I hadn't invented it, it would have been invented; I just happened to be in the right place at the right time thinking about the right kind of thing. But yes, I do regret it. I do think it speaks to, like, the naivete of being like, oh, here's just a cool feature, and making it, even if it's great for the user, without thinking about the effects it'll have if you scale it up to a hundred million people or a billion people, where this little thing, that I went around to, like, Twitter and Google and all the other companies to get them to adopt, has now wasted quite literally hundreds of millions of human hours.

I'm sure all of us have had someone say to us, stop looking at your phone, or why are you so