A team that reflects the fact that America is back, ready to lead the world, not retreat from it. Once again sit at the head of the table, ready to confront our adversaries and not reject our allies, ready to stand up for our values.

US President Donald Trump is taking credit for a record day in the markets. The Dow Jones index reached a historic high on Tuesday following the transition developments and progress on coronavirus vaccines.

Ethiopia's government says Tigray regional forces have started to surrender. The government gave the rebels until Wednesday to lay down their arms or risk a final assault on the main city, Mekelle, warning people to leave the area. The Tigray leadership denies surrendering and claims an Ethiopian military division has been destroyed.

Fourteen people have been killed by two explosions at a market in central Afghanistan. It happened in the historic city of Bamiyan, home to the mainly Shia Hazara minority. The attack happened as the UN Secretary-General called for an immediate and unconditional ceasefire in Afghanistan. He spoke at a donors' conference in Geneva, where nations pledged $12bn in aid over the next four years.

Italy has reported its highest daily death toll from coronavirus since March: 853 fatalities were confirmed on Tuesday. The ministry also recorded more than 23,000 new infections.

The aviation industry is reporting larger than expected losses due to the coronavirus pandemic. The International Air Transport Association had predicted a $100bn loss by the end of 2021. It now says that figure will be closer to $150bn.

More news here on Al Jazeera after All Hail the Algorithm.

There is a huge group of people at work behind our screens. They're called behavior architects, persuasive designers or user experience specialists, and the power they have is massive. That urge to keep swiping through a Twitter feed? That's designed. The way we all click "I agree" to the terms and conditions? That's designed. Swiping left or right on Tinder?
That's designed too. We live in an online world of someone else's making, and most of us never even give it a second thought. And actually, that's designed as well.

San Francisco: it's the mecca for tech designers. Silicon Valley. This place pioneered the art of constructing, optimizing and enhancing a lot of the technology we use every day. It's turbocharged the speed at which we use the internet and made navigating the web more alluring. But it's also given us a false sense of security.

I've lost count of the number of times I've clicked "I agree" to get into a website. We all have to do it. As we speed around the internet, we face hundreds of these annoying pop-ups and consent forms, all demanding a response from us. And yes, I call them annoying because that is exactly what they are. They may look like they're there to provide us with control, but the reality is far from it.

When users click on "I agree" to the terms and conditions, or they see a privacy policy and they click "accept", they may think that they're actually being given control of their personal data: what is collected, how it is used. And it's advantageous to companies for them to think that, because if something bad happens, the tech company can say, you actually agreed to this. Nobody ever reads the terms of service. No one should ever be expected to. If we actually tried to read every terms and conditions agreement that we came across, it's probably the only thing we would do. It would have to be our day job, because they're so long and we come across so many. It may have the veneer of giving control to data subjects, but ultimately it's window dressing.

Woodrow Hartzog is what you'd call a scholar of design. He's been studying the psychology, ethics and legal implications of new technologies. One area he specializes in is data protection and privacy. And before we get into this, there's a key term you need to know: informed consent. This is a principle that comes up a lot in discussions of our rights online.
But it's easier to explain in the context of medical surgery. A doctor explains potential risks and worst-case scenarios to the patient. Once you're fully informed, you have the option to consent to surgery or not. In the online world, informed consent is what everyone says is the ideal. But is it even possible?

Consent only works under a very narrow set of conditions, and that's when the decision is infrequent, like with surgery: we don't have surgery all the time. It's when the risks are visceral, things that we can easily conjure up in our minds. And then, finally, when the harm is possibly great. So if things go wrong with surgery, you could get sick or you could die. So we've got an incredible incentive to take that decision seriously. But of course, none of those things are present in the data ecosystem. We make decisions quite frequently, 10, 100 times a day. Harms are not visceral at all; they are incredibly opaque. And finally, the harm is not even that great, because modern privacy harms aren't huge. They're death by a thousand cuts.

The spin from Silicon Valley is that they're on our side when it comes to how we control our data. Long privacy policies are very confusing, and if you make it long and spell out all the details, then you're probably going to reduce the percentage of people who read it. However, take a closer look at the design of the buttons and the pop-ups that we all click, and it's clear that the tech companies have the upper hand.

Design is power, and every single design decision makes a certain reality more or less likely. What tech companies and psychologists and other people have known for years is that defaults are notoriously sticky. And so if you design the interface so that all the defaults are set to maximize exposure, then you're going to get a lot more data in the aggregate than you would if you set all the defaults to be privacy-protective. Because people don't go in and change them.
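The stickiness of defaults that Hartzog describes can be illustrated with a toy simulation. This is a minimal sketch, not a measurement: the function name and the assumption that only about 5 percent of users ever open their settings are both hypothetical, chosen only to show how the designer's default, not user choice, drives the aggregate outcome.

```python
import random

def total_data_shared(n_users, default_on, p_change=0.05, seed=0):
    """Toy model: each user shares data iff their privacy toggle is on.
    Only a small fraction (p_change) ever flips the default setting."""
    rng = random.Random(seed)
    shared = 0
    for _ in range(n_users):
        setting = default_on
        if rng.random() < p_change:   # the rare user who opens settings
            setting = not setting
        shared += setting             # True counts as 1 user sharing data
    return shared

n = 100_000
print(total_data_shared(n, default_on=True))   # default maximizes exposure
print(total_data_shared(n, default_on=False))  # privacy-protective default
```

Under these assumed numbers, flipping the default swings the share of users disclosing data from roughly 95 percent to roughly 5 percent, even though every individual "had a choice".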
So until we fundamentally change the incentives, we're still going to see companies manipulating the design of these buttons and these technologies to ensure that you still keep disclosing data, and that they still keep getting what's the lifeblood of their business.

Most of us assume that when we go on a website and click "I agree", the site simply collects information that we voluntarily choose to share. In reality, there is much more to data collection, and the mechanics of it are invisible, hidden by design. For a start, it isn't just the website you are on that's mining your information. There are so-called third parties, advertisers and analytics agencies, also tracking you, using tiny bits of software: cookies, beacons, pixel tags. They scoop up incredibly detailed information, everything from the computer you're using to how long you linger on a page. Honestly, it's a bit mind-boggling. And all you really did was click "I agree".

Informed consent is a fundamentally broken regulatory mechanism for algorithmic accountability. It allows companies to continue to throw risk back onto the user and say, you know, here's a pop-up banner that tells you about cookies, that no one reads and nobody really even cares about. Yet we push forward this regime as though it matters to people. As though, if someone clicks "I agree", then they're magically OK with all the data processing that's going to come afterwards. Which is a little bit of a joke, and it's a little bit of a legal fiction. Yet it's a major component of almost every data protection framework around the world.

Once you've crossed the "I agree" hurdle, we're now in the clutches of the website. And this is where design takes on a whole new level of importance. The job is to keep you there. One of the most successful innovations in mobile website design is something called infinite scroll. Nearly all of us use it every single day: endlessly scrolling through your feed without even needing to click.
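The "tiny bits of software" mentioned above, pixel tags in particular, are conceptually very simple. A minimal sketch of the idea, assuming a made-up endpoint and field names (no real ad network's API is shown here): a third party embeds a one-pixel image whose URL smuggles out details of your visit as query parameters, so merely loading the page reports on you.

```python
from urllib.parse import urlencode

def pixel_url(page, user_agent, screen, referrer, visitor_id):
    """Build the URL a hypothetical tracking pixel would request.
    Fetching the invisible 'image' delivers all of this to the tracker."""
    params = {
        "page": page,        # the article you are reading
        "ua": user_agent,    # browser and operating system
        "scr": screen,       # screen resolution
        "ref": referrer,     # the site you came from
        "vid": visitor_id,   # cookie ID tying your visits together
    }
    return "https://tracker.example/p.gif?" + urlencode(params)

url = pixel_url("news/article-42", "Mozilla/5.0 (X11; Linux)",
                "1920x1080", "search.example", "abc123")
print(url)
```

The point of the sketch is that nothing here requires your consent to a form: the request fires as a side effect of rendering the page.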
I'm on my way now to meet the creator of infinite scroll. His name is Aza Raskin, and in early 2018 he co-founded the Center for Humane Technology.

All of our apps, all of Silicon Valley's companies, are competing for our attention. And because it's such a cutthroat game trying to get our attention, worth, you know, tens of billions of dollars, they have to increasingly point more powerful computers at our heads to try to frack us for that attention. And we've all had that experience of going to YouTube and you think, I'm going to watch one video, and then somehow, like, you shake your head and an hour has passed. And you're like, what, why, how is that? The technology has hypnotized us.

This hypnotized state is key to what's called the attention economy. Our attention is a finite currency in the online world, and it's only as long as websites and apps have our attention that they have a business. The attention economy is just this: it says, if we are not paying for a product, well, the company has to make money somehow. How do they do it? They do it by selling our attention to advertisers or to other groups that want us to do something.

They're trying to make these systems as effective as possible at influencing your decisions. They acquire as much information about you as they can: who your friends are, how you spend your time, often merged with how you spend your money. They take all of this data to build a model of you. Imagine, like, a little simulator of you that lives in a Facebook server. And then they can put things in front of it and be like, are you more likely to click this, this or this? Or, if we want to get you to hate immigration, what kind of message would work? Are you going to resonate with this message or this message? And you can see how this begins: it's just this race for your attention.
That race ends up becoming an entire economy's worth of pressure, with the very smartest minds in engineering and the biggest supercomputers trying to make a model of you to be able to influence the kinds of decisions you're going to make.

A few years ago, YouTube set a company-wide objective to reach 1 billion hours of viewing a day. Netflix co-founder Reed Hastings has also said multiple times that the company's biggest competitor isn't another website, it's sleep. So what happens when you give algorithms the goal of maximizing our attention and time online? They find our weaknesses and exploit them. In 2017, Sean Parker, a founding member of Facebook and its first president, literally confessed to this at an event.

How do we consume as much of your time and conscious attention as possible? And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content: a social validation feedback loop. It's exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology.

Now, it's not as though Silicon Valley pioneered the tricks and tactics of addictive, persuasive design. Many tech designers openly admit using insights from behavioral scientists of the early 20th century. Take the concept of randomly scheduled rewards, studied and developed by American psychologist B. F. Skinner. In the 1950s, he created what's become known as the Skinner box, a simple contraption he used to study pigeons and even raccoons. At the start of the process, a pigeon is given food every time it pecks the word "peck" or turns a full circle when the word "turn" appears. As the experiment proceeds, the rewards become less frequent, and they take place at random. But by that time the behavior has been established: the pigeon keeps pecking or turning,
not knowing when it might get food, but in anticipation that a reward could come. Skinner boxes were pivotal in demonstrating how design had the power to modify behavior. And if randomly scheduled rewards worked for pigeons, why not for humans?

This concept is in fact at the heart of a lot of addictive design, from casino machines such as slot machines to social media. Smartphones are unnervingly similar to slot machines. Think about your Facebook, Instagram or Tinder feeds. We all swipe down, pause, and then wait to see what will appear. We're back to those randomly scheduled rewards again. Your swipe could result in a new comment on a photo, a new like, a piece of spam or a software update. We don't really know, and it's that unpredictability that makes it so addictive.

Natasha Dow Schüll is a cultural anthropologist who has spent more than 15 years studying the algorithms behind this kind of design.

Just like on a slot machine, when you're texting or when you are looking through the news feed, you really never know what's coming down the pike. You never know when you're going to sort of hit that jackpot, so to speak, when it's coming and how much it will be. So the randomness is very important to keep you hooked. And I think the fact that we're not gambling money doesn't make it that different from what we're doing in a casino, because in both cases, what we're really gambling is our attention and our time. And across the board, we're sitting there sort of hoping for some little reward, never knowing when it's going to come. In all cases, we're sort of sitting alone with the machine. There's no natural stopping point. I think that the similarities are quite striking.

We check our phones over 150 times a day. It's the first thing we look at when we wake up, the last thing we look at before we go to sleep. It's like we're glued to it, and that's by design. We now have, like, over 2 billion Skinner boxes in people's pockets.
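The schedule Schüll describes, what behavioral scientists call a variable-ratio reinforcement schedule, is easy to simulate. A minimal sketch, assuming an arbitrary 1-in-10 reward probability: each "swipe" pays off independently at random, so the gaps between rewards are irregular and there is never a natural stopping point.

```python
import random

def swipe_session(n_swipes, p_reward=0.1, seed=42):
    """Simulate a variable-ratio schedule: every swipe has the same fixed
    probability of a reward, so the gaps between rewards are unpredictable."""
    rng = random.Random(seed)
    rewarded = [i for i in range(n_swipes) if rng.random() < p_reward]
    gaps = [b - a for a, b in zip(rewarded, rewarded[1:])]
    return rewarded, gaps

rewarded, gaps = swipe_session(100)
print(f"{len(rewarded)} rewards in 100 swipes; gaps between rewards: {gaps}")
```

The irregular gap lengths are the whole trick: because the next reward could always be one swipe away, stopping at any point feels like walking away from a payout.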
We are running the largest psychological experiment that the world has ever seen, by orders of magnitude. One out of every four human beings on Earth has a Skinner box which is learning how to uniquely target them. That's super fascinating in the abstract, but sort of terrifying when you think about what it's doing.

The research has been done and the evidence is clear that digital technology is being designed with the intention to make us addicts. It's not a secret in the tech industry either. Two of the biggest tech gurus, Bill Gates and the late Steve Jobs, admitted they consciously limited the amount of time their children were allowed to engage with the products they helped create. The problem is real, but the media can often sensationalize the issue of tech addiction and make it harder for us to understand.

I think the reason people maybe reach for the metaphor of crack cocaine is because they see that as a sort of high-speed, hyper-intensive form of addiction. And while I don't use that metaphor myself, I do think that within the spectrum of media addictions, there are certain ones that are more high-potency, you could say. If we're going to use the language of addiction, they have a higher event frequency. Think about a horse race, right? You go to the track and you've got to really wait for that event to happen. If you are rapidly engaged in an activity, such as a Twitter feed, that is more high-potency: it has a higher event frequency, which means each event has an opportunity to draw you in more, to reinforce that behavior more. So I think we really can apply the language of addiction to these different media.

I have an ongoing frustration, which is that whenever I'm still for a second, I have this impulse to reach into my pocket and pull out my phone. And then I get angry at myself, because I say, that's not right. Just enjoy this moment, right?
Just be with yourself for a second. And then I get angry at myself that my phone has that much power over me, right? And I'm angry that I'm subject to the design of a technology in such a way that I have difficulty resisting its allure. But of course, everything about these technologies is built to create that impulse, to make it feel as though it's irresistible.

There's such emphasis put on free choice, on being a consumer who makes decisions in the marketplace about what you want to do, because you have free will. But at the same time, the very people who are promoting that notion of the person as a consumer sovereign are operating and designing their technology with a very different human subject in mind. It's somebody who can, like a rat or a pigeon or any other animal, be incentivized and motivated and hooked and have their attention redirected. And that's really interesting to me, because it is a kind of return to Skinner. I think you wouldn't have heard that in the '80s or '90s; it would have even been creepy to think about someone designing your behavior. But now it's become accepted that you can be a behavior designer.

And behavior design was one part of what Aza used to do in a previous life. However, he is now one of a growing number of industry insiders who are taking a more critical stance toward Silicon Valley. Just talking to him made me wonder: does he regret his part in all this?

I want to be humble about it. If I hadn't invented it, it would have been invented; I just happened to be in the right place at the right time, thinking about the right kind of thing. Yes, I do regret it. But I do think it speaks to, like, the naivety of being like, oh, here's just a cool feature, and making it,
even if it's great for the user, without thinking about the effect it will have when you scale it up to 100 million people or a billion people. This little thing that I went around pitching to Twitter and Google and all these other companies, saying, hey, you should adopt this, has now wasted quite literally hundreds of millions of human hours.

I'm sure all of us have had someone say to us, stop looking at your phone, or, why are you so addicted to social media? And before I started this series, I thought maybe there was something wrong with me. I just had no idea how deliberately designed our online experience is, and how these designs and algorithms are made to keep us hooked. I asked everyone I spoke to: how do we change this? Can we change how online design works?

Regulation cannot be expected to happen on its own within these corporations, right? They are profiting from this, so there is just too deep a conflict of interest. And so the only viable kind of regulation is going to have to come from the outside. We have to have a public conversation about what is actually going on in these products. How are they working? And once we understand that as a collective, what do we want to limit and constrain?

So if you start with the assumption that people are never going to fully understand the risks of algorithms and the way in which they interact with their data, then we have to think about what else might work in its place. Data protection regimes like the General Data Protection Regulation are a great foundation. And one of the ways in which we could really improve upon that is to embrace a trust basis. So instead of putting all of the risk on the user, the requirement of protecting people would fall on the big company that is using the algorithms, that is using the data.
I think there is going to be a huge shift from just human-centered design, as we call it in our field, which is, like, put the human at the center, which was a big movement, to thinking about human-protective design. We see that the tools that we're building are so powerful that they cause real damage to us: individually, to our mental health, to our relationships, to our children, and to us as a society, to our democracy, to our means of having civil discourse. And that move to human-protective design, I think, is super hopeful, because then we can actually have technology which does what it was supposed to in the first place, which is extend our best selves.

So if you're concerned about the ways in which data is being collected and algorithms are being used to affect your life, there are three things I think you can do. One: use the tools that are given to you. Use privacy dashboards; use two-factor authentication, which is really valuable. Two: be more deliberate and critical about the ways in which companies are asking for your information, about the devices that you adopt and the services you participate in. Understand that companies are trying to tell you things through design. They're trying to make things easier or harder, so think about whether you want things to be easier, and what the costs are of making things easier. And understand that companies are trying to get your consent because their entire business depends on it. So think about that as you go forward. And finally, three: design and consent and privacy and algorithms need to be a political issue. If someone's running for office, ask them what their stance is on algorithmic accountability. Ask them what their stance is on privacy and data collection. Because if we get better rules about how data is collected and algorithms are used, then we might all be better off.
We're at this inflection point where our technology is beginning to make predictions about us that are better than the predictions we make about ourselves. And one of the best ways of inoculating ourselves against it is by learning about ourselves more. Just, like, stop and really ask yourself before you post something to Facebook or Twitter: what, why am I doing this? What are my motivations? If you sort of slow down your thought process, often I've found that it'll be like, oh, I am pulling this app out because I'm a little bit bored, or everyone else has pulled out their phones and I'm feeling a little socially awkward. Oh, that's curious. Or maybe, like, I'm having this experience and I sort of want to show off just a little bit. Just stopping and thinking about, like, what are my motivations for doing things, I've found to be an inoculation against spending my time in ways that I wish I hadn't.

I'll just say that I recommend having a general attitude shift, where you understand yourself as an organism, as a creature that can, like any number of other creatures and animals, be turned in certain directions, have your attention swayed, caught, captured, hooked. I find it liberating to recognize all of the forces that are exerting these different powers on me to turn my attention one way or another. Somehow realizing that helps me to disconnect a lot, because it makes me a little bit angry. And I don't feel so bad about myself; I feel bad about what's being done to me. And then I'm more able to disconnect.

Dissecting the headlines in the midst of a pandemic. Let's start with some of the new ground realities affecting the news coverage: what's the lay of the land? Stripping away the spin about presidential corruption. It is real reporting, challenging assumptions and the official line.
The Listening Post, on Al Jazeera.

December on Al Jazeera: it's 10 years since the revolution in Tunisia ignited the Arab Spring. Al Jazeera looks back at the uprising and asks what really changed across the Middle East. The Stream is where Al Jazeera's global audience becomes a global community. A year after the first coronavirus case in China, we'll examine the devastation caused by the virus and the efforts made to eliminate COVID-19. People in Power is back with more investigative documentaries and in-depth stories. Climate leaders will gather online to press ahead with a new stage of the Paris climate agreement and examine possible global solutions. December on Al Jazeera.

At the intersection of reality and comedy in post-revolution Tunisia: a mission to entertain, educate and provoke debate through satire, a weapon of choice. A look at what inspires one of Tunisia's most popular comedians to make people laugh, on Al Jazeera.

Among the lower castes, Dalits, also known as untouchables, sit at the very bottom of the Hindu hierarchy. They perform the lowliest tasks, such as manually cleaning sewers, often with no protective clothing or breathing apparatus. A lawyer is taking us to meet some members of the community, some of whom he represents, including a family of cleaners. The mother's husband was poisoned in a sewer four years ago; two of her three sons now do the same work. Despite protections under the constitution, Dalits are treated as outcasts. Many Dalits have escaped their predicament by converting to other faiths, such as Islam or Christianity. The BJP-controlled state of Uttar Pradesh has prepared an anti-conversion law under which religious conversion would require permission from a state official. Human rights groups say the law is aimed at keeping Dalits in their place.

This team will make us proud to be American.
US President-elect Joe Biden unveils his team and promises to restore America's status as a world leader.

I'm Sohail Rahman. You're watching Al Jazeera, live from our headquarters here in Doha. Also coming up: concerns of possible war crimes in Ethiopia's northern Tigray region, as a government deadline looms for regional forces to surrender. Also, a deadly bombing underscores the challenges in Afghanistan as international donors meet to commit billions of dollars in aid