Let's take a look at the main stories. Israel's Prime Minister Benjamin Netanyahu says his response will be strong and swift, following a series of attacks in occupied East Jerusalem. A second cabinet meeting on the issue is now going to be held on Sunday. Israeli police and several ministers are urging every citizen who has a licensed weapon to carry it and use it to defend themselves. On Saturday, two Israelis were injured in a gun attack in occupied East Jerusalem, just hours after a Palestinian gunman killed seven Israeli settlers near a synagogue on Friday. The latest escalation began when Israeli forces killed nine Palestinians in a raid on Jenin. Netanyahu said: "I would like to praise the police and the security forces for their determined and quick action, as well as the resourcefulness and bravery shown by ordinary citizens who hit the terrorists and saved lives. Our response will be strong, fast and accurate. Whoever tries to hurt us, we will hurt him, and anyone who helps him."

Meanwhile, protests against Israel's government have entered their fourth week. Thousands of people are demonstrating in the city of Tel Aviv against Prime Minister Netanyahu's proposals to reform the judiciary. These changes would limit the powers of the Supreme Court, giving more authority to the government. Critics say the move is anti-democratic.

Demonstrations have been under way across the US after police in Memphis, Tennessee released video showing officers beating an unarmed Black man. Tyre Nichols later died in hospital. Activists are rallying in the city of Memphis, calling for police reforms. Footage from police body-worn and dashboard cameras was posted on Friday evening, a day after the officers were charged with second-degree murder.

Ukrainian authorities say at least three people have been killed after a Russian missile strike hit the eastern city of Kostiantynivka.
The regional governor of Donetsk says two others were injured in the attack, which targeted a residential neighborhood. Ukrainian President Volodymyr Zelensky says Moscow has been stepping up its offensive in the east. Those are the headlines. All Hail the Algorithm is the program coming up next. We'll see you tomorrow.

In the laboratory of this pharmacy in Paris, medicines are being prepared; in this case, it's the common antibiotic amoxicillin. Fiona here says that since people stopped wearing masks, they are catching more infections again, so the demand for simple medication is higher and the major pharmaceutical companies cannot keep up. The problem is not just affecting simple painkillers and antibiotics. Across the continent, the European Medicines Agency says a number of key drugs are currently in short supply. The shortage of medicine applies across Europe; at this Berlin pharmacy, staff are increasingly having to find alternatives to the drugs doctors prescribed in order to help their patients. "Basically, we have a lot more work to do to supply the population with medicine. So far we have still somehow found a solution for people, but we often had to improvise. Sometimes we dispense the active ingredient in a different strength than was prescribed, and then we have to adjust the dosage."

There is a huge group of people at work behind our screens. They're called behavior architects, website designers and user-experience specialists, and the power they have is massive. That urge to keep swiping through your Twitter feed? That's design. The way we all click "I agree" to the terms and conditions? That's design. Swiping left or right on Tinder? That's design too. We live in an online world someone else is making, and most of us never even give it a second thought. And actually, that's design as well. San Francisco is the mecca of tech designers, home to Silicon Valley. This place pioneered the art of constructing, optimizing and enhancing
a lot of the technology we use every day. It's turbocharged the speed at which we use the internet and made navigating the web more intuitive. But it's also given us a false sense of security. I've lost count of the number of times I've clicked "I agree" to get onto a website. We all have to do it. As we speed around the internet, we face hundreds of these annoying pop-ups and consent forms, all demanding a response from us. And yes, I call them annoying because that is exactly what they are. They may look like they're there to provide us with control, but the reality is far from it.

"When users click 'I agree' to the terms and conditions, or they see a privacy policy and they click on it, they may think that they're actually being given control of their personal data, what's collected and how it's used. And it's advantageous to companies for them to think that, because if something bad happens, the tech company can say: you actually agreed to this. Nobody ever reads the terms of service. No one should ever be expected to. If we actually tried to read every terms-and-conditions agreement that we came across, it's probably the only thing we would do; it would have to be our day job, because they are so long and we come into contact with so many. It may have the veneer of giving control to data subjects, but ultimately it's window dressing."

Woodrow Hartzog is what you'd call a scholar of design. He's been studying the psychology, ethics and legal implications of new technologies. One area he specializes in is data protection and privacy. Now, before we get into this, there's a key term you need to know: informed consent. This is a principle that comes up a lot in discussions of our rights online, but it's easier to explain in the context of medical surgery. A doctor explains potential risks and worst-case scenarios to the patient. Once you're fully informed, you have the option to consent to surgery or not. In the online world, informed consent is what everyone says
is the ideal, but is that even possible?

"Consent only works under a very narrow set of conditions: when the decision is infrequent, like with surgery (we don't have surgery all the time); when the risks are visceral, things we can easily conjure up in our minds; and finally, when the harm is possibly great. If things go wrong with surgery, you could get sick or you could die, so we've got an incredible incentive to take that decision seriously. But of course, none of those things are present in the data ecosystem. We make these decisions quite frequently, ten or a hundred times a day. The harms are not visceral at all; they're incredibly opaque. And finally, each individual harm is not even that great, because modern privacy harms aren't huge on their own: it's death by a thousand cuts."

The spin from Silicon Valley is that they're on our side when it comes to how we control our data: "Long privacy policies are very confusing, and if you make it long and spell out all the detail, then you're probably going to reduce the percent of people who read it." However, take a quick look at the design of the buttons and the pop-ups we're made to click, and it's clear that the tech companies have the upper hand in the data battle.

"Design is power. Every design decision makes a certain reality more or less likely. And what tech companies and psychologists and other people have known for years is that defaults are notoriously sticky. So if you design the interface so that all the defaults are set to maximize exposure, then you're going to get a lot more data in the aggregate than you would if you set all the defaults to privacy-protective, because people don't go in and change them. Until we fundamentally change the incentives, we're still going to see companies manipulating the design of these buttons and these technologies to ensure that you keep disclosing data and they keep getting what is the lifeblood of their business."
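Hartzog's point about sticky defaults can be illustrated with a toy simulation. Everything here is an assumption made for illustration: the 5% rate of users who ever open the settings page is invented, not a measured figure. The point is only how lopsided the aggregate becomes when defaults do the deciding.

```python
import random

def aggregate_data_collected(n_users, default_on, change_rate=0.05, seed=0):
    """Count how many users end up sharing data, assuming only a small
    fraction (change_rate) of users ever changes the default setting."""
    rng = random.Random(seed)
    sharing = 0
    for _ in range(n_users):
        changed = rng.random() < change_rate   # the rare user who opens settings
        opted_in = (not default_on) if changed else default_on
        if opted_in:
            sharing += 1
    return sharing

# Same population, same preferences; only the default differs.
on_default = aggregate_data_collected(100_000, default_on=True)
off_default = aggregate_data_collected(100_000, default_on=False)
```

With sharing on by default, roughly 95% of this simulated population ends up disclosing data; with it off by default, roughly 5% does. The design choice, not the users, determines the outcome.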
Most of us assume that when we go on a website and click that "I agree" button, the site simply collects information that we voluntarily choose to share. In reality, there are many layers to data collection, and the mechanics of it are invisible, hidden by design. For starters, it isn't just the website you are on that's mining information. There are so-called third parties, advertisers, marketing and analytics agencies, also tracking you. Using tiny bits of software (cookies, beacons, pixel tags), they scoop up incredibly detailed information: everything from what computer you're using to how long you hover over a link. Honestly, it's a bit mind-boggling. And all you really did is click a button.

"Informed consent is a fundamentally broken regulatory mechanism for algorithmic accountability. It allows companies to continue to throw risk back onto the user and say, here's a pop-up banner that tells you about cookies, which no one reads and nobody really even cares about. Yet we push forward this regime as though it matters to people, as though if someone clicks 'I agree' then they're magically okay with all the data processing that's going to come afterwards, which is a little bit of a joke and a little bit of a legal fiction. Yet it's a major component of almost every data protection framework around the world."

Once you've crossed the "I agree" hurdle, you're now in the clutches of the website, and this is where design takes on a whole new level of importance. The job is to keep you there. One of the most successful innovations in mobile and website design is something called infinite scrolling. We all use it every single day: the ability to scroll endlessly through your feeds without having to click. I'm on my way now to meet the creator of this function. His name is Aza Raskin. He no longer works inside big tech corporations; in early 2018, he co-founded the Center for Humane Technology.
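The pixel tags mentioned earlier are conceptually simple: the page embeds a 1x1 image hosted by the tracker, and the image URL itself carries the visitor's details. This is a hypothetical sketch; the tracker domain and parameter names are invented for illustration, not any real service's API.

```python
from urllib.parse import urlencode

def tracking_pixel_url(page, user_id, screen, hover_ms):
    """Build the URL of a hypothetical 1x1 'pixel' image. Requesting the
    image is what delivers the visitor's details to the third party."""
    params = {
        "uid": user_id,      # cookie-based identifier set on an earlier visit
        "page": page,        # which article or product the visitor is viewing
        "screen": screen,    # screen resolution, a fingerprinting input
        "hover": hover_ms,   # how long the cursor lingered over an element
    }
    return "https://tracker.example/pixel.gif?" + urlencode(params)

url = tracking_pixel_url("news/article-42", "abc123", "1920x1080", 740)
```

The browser fetches the image automatically, so the data leaves the page without the visitor doing anything beyond loading it.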
"All of our apps, all of our Silicon Valley companies are competing for our attention, and because it's such a cut-throat game trying to get our attention, worth tens of billions of dollars, they have to point increasingly powerful computers at our heads to try to frack us for that attention. And we've all had that experience of going to YouTube thinking, I'm just going to watch one video, and then somehow you shake your head and an hour has passed. Why? How is it that technology has hypnotized us?"

This tech hypnotization is key to what's called the attention economy. Our attention is a finite currency in the online world, and it's only as long as websites and apps have our attention that they have our business and our data.

"The attention economy is just this: it says, if we are not paying for the product, well, the company has to make money somehow. How do they do it? They do it by selling our attention to advertisers, or to other groups that want us to do something. They're trying to make the systems as effective as possible at influencing your decisions, so they collect as much information about you as they can: who your friends are, how you spend your time online, maybe how you spend your money. They take all of this data to build a model of you. Imagine a little simulated version of you that lives in a Facebook server; they can then put things in front of it: are you more likely to click this, this or this? Or, if we want to get you to hate immigration, which message are you going to resonate with, this one or this one? And you can see how what begins as just this race for your attention ends up becoming an entire economy's worth of pressure, with the very smartest minds in engineering and the biggest supercomputers trying to make a model of you, to be able to influence the kinds of decisions you're going to make."
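Raskin's "little simulator of you" can be caricatured in a few lines. This is a deliberately toy sketch, not any platform's actual system: the features (interest overlap), the weights, and the logistic scoring are all assumptions chosen only to show the mechanic of ranking candidate messages by predicted click probability.

```python
import math

def click_probability(user, message, weights):
    """Toy stand-in for the 'model of you': estimate how likely this user
    is to click this message, via a logistic score over shared interests."""
    overlap = len(user["interests"] & message["topics"])
    z = weights["bias"] + weights["overlap"] * overlap
    return 1 / (1 + math.exp(-z))

def pick_message(user, candidates, weights):
    """The platform shows whichever candidate the model scores highest."""
    return max(candidates, key=lambda m: click_probability(user, m, weights))

user = {"interests": {"football", "cooking"}}
candidates = [
    {"id": "a", "topics": {"politics"}},
    {"id": "b", "topics": {"cooking", "recipes"}},
]
weights = {"bias": -2.0, "overlap": 1.5}
best = pick_message(user, candidates, weights)
```

Real systems use vastly more features and learned weights, but the loop is the same: simulate the reaction, then show whatever the simulation says will hold you.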
A few years ago, YouTube set a company-wide objective to reach one billion hours of viewing a day. Netflix co-founder Reed Hastings has also said multiple times that the company's biggest competitor isn't another website; it's sleep. So what happens when you give algorithms the goal of maximizing our attention and time online? They find our weaknesses and exploit them. In 2017, Sean Parker, a founding member of Facebook and its first president, literally confessed to this at an event: "... about how do we consume as much of your time and conscious attention as possible. And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content. It's a social-validation feedback loop. It's exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."

Now, it's not as though Silicon Valley pioneered the tricks and tactics of addiction or persuasive design. Many tech designers openly admit using insights from behavioral scientists of the early 20th century, like the concept of randomly scheduled rewards, studied and developed by the American psychologist B.F. Skinner. In the 1950s he created what's become known as the Skinner box, a simple contraption he used to study pigeons and even rats. At the start of the process, a pigeon is given a food reward every time it pecks when the word "peck" appears, or turns a full circle when the word "turn" appears. As the experiment proceeds, the rewards become less frequent and take place at random. But by that time the behavior has been established: the pigeon keeps pecking or turning, not knowing when it might get the reward, but in anticipation that a reward could be coming. Skinner boxes were pivotal in demonstrating how design had the power to modify behavior.
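The randomly scheduled rewards Skinner studied, known in behavioral psychology as a variable-ratio schedule, are easy to simulate. A minimal sketch (the 30% reward rate is an arbitrary assumption; a feed's real payoff rate is tuned, not fixed):

```python
import random

def variable_ratio_feed(n_swipes, reward_rate=0.3, seed=1):
    """Simulate a feed on a variable-ratio schedule: each swipe pays off
    at random (a like, a comment), so the next swipe always might."""
    rng = random.Random(seed)
    rewards = []
    for swipe in range(n_swipes):
        if rng.random() < reward_rate:   # unpredictable payoff
            rewards.append(swipe)
    return rewards

hits = variable_ratio_feed(20)
# The gaps between payoffs are irregular; that unpredictability,
# not the size of any single reward, is what sustains the pecking.
gaps = [b - a for a, b in zip(hits, hits[1:])]
```

Because no swipe reveals anything about when the next reward comes, there is no point at which stopping feels natural, which is exactly the slot-machine property described below.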
And if randomly scheduled rewards worked for pigeons, why not humans? Skinner's concept is in fact at the heart of a lot of addictive design, from casino machines such as slots to social media apps. Smartphones are unnervingly similar to slot machines. Think about your Facebook, Instagram or Pinterest feeds: we all swipe down, pause, and then wait to see what will appear. We're back to those randomly scheduled rewards again. Your swipe could result in a new comment on a photo, a new like, a piece of spam or a software update. We don't really know, and it's that unpredictability that makes it so addictive. Natasha Dow Schüll is a cultural anthropologist who has spent more than 15 years studying the algorithms behind persuasive design.

"Just like on a slot machine, when you're texting or when you're looking through the news feed, you really never know what's coming down the pipe. You never know when you're going to hit that jackpot, so to speak, when it's coming or how much it will be. So the randomness is very important to keep you hooked in. I think the fact that we're gambling money in a casino isn't really that different from what we're doing online, because in both cases what we're really gambling is our attention and our time. Across the board, we're sitting there hoping for some little reward to come, never knowing when it's going to come. In all cases, we're sitting alone with the machine; there's no natural stopping point. I think the similarities are quite striking."

We check our phones over 150 times a day. It's the first thing we look at when we wake up, the last thing we look at before we go to sleep. It's like we're glued to them, and that's by design. We now have over two billion Skinner boxes in people's pockets. We are running the largest psychological experiment the world has ever seen, by orders of magnitude.
One out of every four human beings on Earth has a Skinner box which is learning how to uniquely target them. Super fascinating in the abstract, but sort of terrifying when you think about what it's doing.

The research has been done; the evidence is clear that digital technology is being designed with the intention of making us addicts. It's not a secret in the tech industry either. Two of the biggest tech figures, Bill Gates and the late Steve Jobs, admitted they consciously limited the amount of time their children were allowed to engage with the products they helped create. The problem is real, but the media can often sensationalize the issue of tech addiction and make it harder for us to understand.

I think the reason people maybe reach for the metaphor of crack cocaine is because they see that as a sort of high-speed, hyper-intensive form of addiction. And while I don't use that metaphor myself, I do think that within the spectrum of media addictions there are certain ones that are more high-potency, you could say. If we're going to use the language of addiction, they have a higher event frequency. Think about a horse race: you go to the track, and you've got to really wait for that event to happen. If you're rapidly engaged in an activity, a Twitter thread, say, that is more high-potency; it has a higher event frequency, which means each event has an opportunity to draw us in more, to reinforce that behavior more. So I think we really can apply the language of addiction to these different media.

I have an ongoing frustration, which is that whenever I'm idle for a second, I have this impulse to reach into my pocket and pull out my phone, and then I get angry at myself because I say, that's not right, just enjoy the moment, just be with yourself for a second. And then I get angry that my phone has that much power over me.
I'm angry that I'm subject to the design of a technology in such a way that I have difficulty resisting its allure. But of course, everything about the technology is built to create that impulse, to make it feel as though it's irresistible.

There's such an emphasis put on free choice, on being a consumer who makes decisions in the marketplace about what you want to do because you have free will. But at the same time, the very people promoting that notion of the person as a sovereign consumer are operating and designing their technology with a very different human subject in mind: somebody who can, like a rat or a pigeon or any other animal, be incentivized and motivated and hooked, and have their attention redirected. And that's really interesting to me, because it is a kind of return to Skinner. I think you wouldn't have heard that in the eighties or nineties; it would even have been creepy to think about someone designing your behavior. But now it's become accepted that you can be a behavior designer.

Behavior design was one part of what Aza Raskin used to do in a previous life. He is now one of a growing number of industry insiders taking a more critical stance towards the industry. I really enjoyed talking to him, and it left me wondering: does he regret his part in inventing infinite scroll? To be humble about it: if I hadn't invented it, it would have been invented. I just happened to be in the right place at the right time, thinking about the right kind of thing. But yes, I do regret it. I think it speaks to the naivety of, oh, here's a cool feature, and making it
and even if it's great for the user, without thinking about what will happen when you scale it up to a hundred million or a billion people, when this little thing that I went around to Twitter and Google and all these other companies saying, hey, you should adopt this, has now wasted quite literally hundreds of millions of human hours.

I'm sure all of us have had someone or other say: stop looking at your phone, or, why are you so addicted to social media? Before I started this series, I thought maybe there was something wrong with me. I just had no idea how deliberately designed our online experiences are, and how these design algorithms...