Transcripts For ALJAZ All Hail The Algorithm Read Me Or Just Tap I Agree 20230202




belief this truce is only temporary, because once the security forces are gone it's just a matter of time before protesters block the road again. Mariana Sanchez, Al Jazeera.

Hello, this is Al Jazeera and these are the top stories. The Pentagon says it's tracking a suspected Chinese spy balloon spotted in US airspace on Wednesday. According to reports, military leaders even considered shooting it down, but decided against it because of the potential safety risk from falling debris. Our Pentagon correspondent Patty Culhane has more from Washington, DC. What they are alleging is that there is this high-altitude balloon over the western US. They say it's China. Again, they haven't provided any evidence that it is in fact a Chinese balloon, or that its mission is to do surveillance, but they say they have been tracking it for several days. They say it's at such a high altitude that it's not a risk to any sort of air travel at all. But one of the reports we're seeing is that they presented options to the president on what to do about this. They decided, according to these reports, not to shoot it down because that could potentially have consequences on the ground.

European Commission President Ursula von der Leyen says a new sanctions package, including price caps on Russian oil, will be ready for the first anniversary of Russia's invasion. She made the comments soon after arriving in Kyiv ahead of a summit on Ukraine's push to join the EU. A detainee who spent 16 years in Guantanamo Bay has been released and transferred to Belize. Pakistani national Majid Khan was arrested by US forces in 2003. Israel's Prime Minister Benjamin Netanyahu has attended the opening of Chad's embassy near Tel Aviv. It is a major step in normalizing relations between the two countries. Chad severed ties with Israel in 1972 in solidarity with Palestinians. Meanwhile, Israel and Sudan have agreed to normalize relations.
Israel's Foreign Minister Eli Cohen has just returned from Khartoum, where he met Sudan's army chief Abdel Fattah al-Burhan. Those are the headlines; the news continues here on Al Jazeera after All Hail the Algorithm.

Rhinos and tigers pushed to the brink of extinction: 101 East discovers how their fortunes are being turned around. A year on from Russia's invasion of Ukraine, Al Jazeera looks at the impact of the war and where events might lead from here. Rigorous debate, unflinching questions: UpFront cuts through the headlines to challenge conventional wisdom. Nigerians vote in what's likely to be the most closely contested election in the country's history. From those who wield power to those who confront it, People & Power investigates the use and abuse of power around the world. February on Al Jazeera. In-depth analysis of the day's headlines from around the world; informed opinions, frank assessments. Inside Story, on Al Jazeera.

There is a huge group of people at work behind our screens. They're called behavior architects, persuasive designers and user experience specialists, and the power they have is massive. That urge to keep swiping through your Twitter feed? That's design. The way we all click 'I agree' to the terms and conditions? That's design. Swiping left or right on Tinder? That's design too. We live in an online world someone else is making, and most of us never even give it a second thought. And actually, that's design as well.

San Francisco: it's the mecca of tech design, home to Silicon Valley. This place pioneered the art of constructing, optimizing and enhancing a lot of the technology we use every day.
It's turbocharged the speed at which we use the internet and made navigating the web more intuitive. But it's also given us a false sense of security.

I've lost count of the number of times I've clicked 'I agree' to get into a website. We all have to do it. As we speed around the internet we face hundreds of these annoying pop-ups and consent forms, all demanding a response from us. And yes, I call them annoying because that is exactly what they are. They may look like they're there to provide us with control, but the reality is far from it.

When users click 'I agree' to the terms and conditions, or they see a privacy policy and they click on it, they may think that they're actually being given control of their personal data, what's collected and how it's used. And it's advantageous to companies for them to think that, because if something bad happens, the tech company can say: you actually agreed to this. Nobody ever reads the terms of service. No one should ever be expected to. If we actually tried to read every terms and conditions agreement that we came across, it's probably the only thing we would do; it would be half of our day job, because they're so long and we come into contact with so many. It may have the veneer of giving control to data subjects, but ultimately it's window dressing.

Woodrow Hartzog is what you'd call a scholar of design. He's been studying the psychology, ethics and legal implications of new technologies. One area he specializes in is data protection and privacy. Now, before we get into this, there's a key term you need to know: informed consent. This is a principle that comes up a lot in discussions of our rights online, but it's easier to explain in the context of medical surgery. A doctor explains potential risks and worst-case scenarios to the patient. Once you're fully informed, you have the option to consent to surgery or not. In the online world, informed consent is what everyone says is the ideal. But is that even possible?
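Hartzog's "half of our day job" point is easy to sanity-check with a back-of-envelope calculation. A small Python sketch; the policy length, reading speed and monthly site count below are illustrative assumptions, not figures from the programme:

```python
# Back-of-envelope estimate of time needed to read every privacy policy
# we encounter. All numbers are illustrative assumptions, not measured data.

AVG_POLICY_WORDS = 2500        # assumed length of one typical policy
READING_SPEED_WPM = 250        # assumed adult reading speed, words/minute
SITES_PER_MONTH = 100          # assumed distinct sites visited per month

minutes_per_policy = AVG_POLICY_WORDS / READING_SPEED_WPM
hours_per_month = SITES_PER_MONTH * minutes_per_policy / 60

print(f"{minutes_per_policy:.0f} minutes per policy")       # 10 minutes
print(f"{hours_per_month:.1f} hours per month of reading")  # 16.7 hours
```

Even with these conservative assumptions, conscientious reading would consume a couple of full working days every month.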
That only works under a very narrow set of conditions, and that's when the decision is infrequent, like with surgery; we don't have surgery all the time. It's when the risks are visceral, things that we can easily conjure up in our minds. And then finally, the harm is possibly great: if things go wrong with surgery, you can get sick or you can die. So we've got an incredible incentive to take that decision seriously. But of course, none of those things are present in the data ecosystem. We make these decisions quite frequently, ten to a hundred times a day. The harms are not visceral at all; they're incredibly opaque. And finally, each harm is not even that great, because modern privacy harms aren't huge. They're death by a thousand cuts.

The spin from Silicon Valley is that they're on our side when it comes to how we control our data. Long privacy policies are very confusing, and if you make it long and spell out all the detail, then you're probably going to reduce the percent of people who read it. However, take a closer look at the design of the buttons and the pop-ups that we're made to click, and it's clear that the tech companies have the upper hand in the data battle.

Design is power, and every single design decision makes a certain reality more or less likely. What tech companies and psychologists and other people have known for years is that defaults are notoriously sticky. And so if you design the interface so that all the defaults are set to maximize exposure, then you're going to get a lot more data in the aggregate than you would if you set all the defaults to privacy-protective, because people don't go in and change them. So until we fundamentally change the incentives, we're still going to see companies manipulating the design of these buttons and these technologies to ensure that you keep disclosing data and they keep getting what is the lifeblood of their business.

Most of us assume that when we go on a website and click that 'I agree' button,
the site simply collects information that we voluntarily choose to share. In reality, there are many layers to data collection, and the mechanics of it are invisible, hidden by design. As studies show, it isn't just the website you are on that's mining information. There are so-called third-party advertising, marketing and analytics agencies also tracking you. Using tiny bits of software (cookies, beacons, pixel tags) they scoop up incredibly detailed information: everything from what computer you're using to how long you hover over a link. Honestly, it's been mind-boggling. And all you really do is click a button.

Informed consent is a fundamentally broken regulatory mechanism for algorithmic accountability. It allows companies to continue to throw risk back onto the user and say: here's a pop-up banner that tells you about cookies, which no one reads and nobody really even cares about. Yet we push forward this regime as though it matters to people, as though if someone clicks 'I agree' then they're magically OK with all the data processing that's going to come afterwards. It's a little bit of a joke and it's a little bit of a legal fiction, yet it's a major component of almost every data protection framework around the world.

Once you've crossed the 'I agree' hurdle, you're now in the clutches of the website. And this is where design takes on a whole new level of importance: the job is to keep you hooked. One of the most successful innovations in mobile and website design is something called infinite scroll. Chances are we all use it every single day. It's what lets you scroll endlessly through your feed without having to click. I'm on my way now to meet the creator of this function. His name is Aza Raskin. He no longer works inside big tech corporations; in early 2018 he co-founded the Center for Humane Technology.
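The mechanism behind infinite scroll is simple: as you near the bottom of the page, the client quietly asks the server for one more batch of items, and the server never runs out. A minimal Python sketch of that server-side property (the post format is invented; a real feed would rank fresh items for each request):

```python
from itertools import count, islice

def infinite_feed():
    """Generator behind a hypothetical endless feed.

    The crucial design property is the absence of any end: every
    request for more content succeeds, so the client never has a
    natural stopping point to show the user.
    """
    for i in count():
        yield f"post-{i}"

def next_page(feed, page_size=10):
    """What the client requests each time you near the bottom."""
    return list(islice(feed, page_size))

feed = infinite_feed()
page1 = next_page(feed)   # first batch the user scrolls through
page2 = next_page(feed)   # fetched automatically, no click required
print(page1[0], page2[0])  # post-0 post-10
```

The user never sees a page boundary; each batch arrives before the previous one is exhausted, which is exactly the "no stopping point" property discussed later in the programme.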
All of our apps, all of our Silicon Valley companies, are competing for our attention, and because it's such a cut-throat game, worth tens of billions of dollars, they have to point increasingly powerful computers at our heads to try to frack us for that attention. We've all had that experience of going to YouTube thinking, I'm just going to watch one video, and then somehow you shake your head and an hour has passed. Why? How is it that the technology has hypnotized us?

This tech hypnotization is key to what's called the attention economy. Our attention is a finite currency in the online world, and it's only as long as websites and apps have our attention that they have our business, and our data.

The attention economy is just this: if we are not paying for a product, well, the company has to make its money somehow. How do they do it? They do it by selling our attention to advertisers, or to other groups that want us to do something, and they're trying to make the systems as effective as possible at influencing your decisions. They collect as much information about you as they can, who your friends are, how you spend your time; often they'll merge it in with how you spend your money. They take all of this data to build a model, imagine a little simulated version of you that lives in a Facebook server, and then they can put things in front of it: are you more likely to click this, this or this? Or, if we want to get you to hate immigration, what kind of message is going to resonate with you, this message or that message? And you can see how what begins as just this race for your attention ends up becoming an entire economy's worth of pressure, with the very smartest minds in engineering and the biggest supercomputers trying to make a model of you, to be able to influence the kinds of decisions you are going to make.
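The "put messages in front of a model of you" idea can be sketched, in a much-simplified form, as an epsilon-greedy selection loop: show variants, record clicks, and increasingly favor whichever variant this user responds to. This is a toy illustration, not Facebook's actual system; the per-variant click probabilities are invented:

```python
import random

def pick_message(click_counts, show_counts, epsilon=0.1, rng=random):
    """Epsilon-greedy choice among message variants: usually show the
    variant with the best observed click rate, occasionally explore."""
    if rng.random() < epsilon:                      # explore a random variant
        return rng.randrange(len(show_counts))
    rates = [c / s if s else 0.0
             for c, s in zip(click_counts, show_counts)]
    return max(range(len(rates)), key=rates.__getitem__)  # exploit the best

# Simulated user whose true appeal per variant is hidden from the system.
rng = random.Random(0)
true_click_prob = [0.01, 0.20, 0.05]   # invented ground truth
clicks, shows = [0, 0, 0], [0, 0, 0]

for _ in range(5000):
    i = pick_message(clicks, shows, rng=rng)
    shows[i] += 1
    if rng.random() < true_click_prob[i]:
        clicks[i] += 1

print(shows)  # impressions tend to concentrate on the most clickable variant
```

Real recommendation and ad systems are vastly more sophisticated, but the feedback loop is the same shape: whatever you respond to, you are shown more of.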
A few years ago, YouTube set a company-wide objective of reaching a billion hours of viewing a day. Netflix CEO Reed Hastings has also said multiple times that the company's biggest competitor isn't another website: it's sleep. So what happens when you give algorithms the goal of maximizing our attention and time online? They find our weaknesses and exploit them. In 2017, Sean Parker, a founding member of Facebook and its first president, literally confessed to this at an event.

How do we consume as much of your time and conscious attention as possible? That means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever, and that's going to get you to contribute more content. It's a social validation feedback loop. I mean, it's exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology.

Now, it's not as though Silicon Valley pioneered the tricks and tactics of addiction or persuasive design. Many tech designers openly admit to using insights from behavioral scientists of the early 20th century, like the concept of randomly scheduled rewards, studied and developed by American psychologist B.F. Skinner. In the 1950s he created what's become known as the Skinner box, a simple contraption he used to study pigeons and even rats. At the start of the process, a pigeon is given a food reward every time it pecks at the word 'peck', or turns a full circle when the word 'turn' appears. As the experiment proceeds, the rewards become less frequent and take place at random. But by that time the behavior has been established: the pigeon keeps pecking or turning, not knowing when it might get the reward, but in anticipation that a reward could be coming. Skinner's boxes were pivotal in demonstrating how design has the power to modify behavior.
And if randomly scheduled rewards worked for pigeons, why not humans? Skinner's concept is in fact at the heart of a lot of addictive design, from casino machines such as slot machines to social media apps. Smartphones are eerily similar to slot machines. Think about your Facebook, Instagram or Pinterest feeds: we all swipe down, pause, and then wait to see what will appear. We're back to those randomly scheduled rewards again. Your swipe could result in a new comment on a photo, a new like, a piece of spam or a software update. We don't really know, and it's that unpredictability that makes it so addictive. Natasha Dow Schüll is a cultural anthropologist who has spent more than 15 years studying the algorithms behind persuasive design.

Just like on a slot machine, when you're texting or when you are looking through the news feed, you really never know what's coming down the pipe. You never know when you're going to sort of hit that jackpot, so to speak, when it's coming and how much it will be. So the randomness is very important to keep you hooked in. I think the fact is that gambling money in a casino isn't really that different from what we're doing, because in both cases what we're really gambling is our attention and our time. And across the board, we're sitting there sort of hoping for some little reward to come, never knowing when it's going to come. And in all cases we're sitting alone with the machine; there's no natural stopping point. I think the similarities are quite striking.

We check our phones over 150 times a day. We just pull them out, again and again. It's the first thing we look at when we wake up and the last thing we look at before we go to sleep. We're glued to them, and that's by design. We now have over two billion Skinner boxes in people's pockets.
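Skinner's randomly scheduled rewards are what behavioral psychology calls a variable-ratio reinforcement schedule: the payoff arrives after an unpredictable number of responses, unlike a fixed schedule where, say, every tenth response pays out. A toy Python simulation of a feed on such a schedule (the one-in-ten reward rate is an invented example):

```python
import random

def variable_ratio_feed(n_swipes, reward_prob=0.1, seed=7):
    """Simulate swiping a feed on a variable-ratio schedule.

    Each swipe independently yields a 'reward' (a new like, comment or
    interesting post) with probability reward_prob, so rewards arrive
    after an unpredictable number of swipes, like a slot machine.
    """
    rng = random.Random(seed)
    return [rng.random() < reward_prob for _ in range(n_swipes)]

outcomes = variable_ratio_feed(1000)

# Measure the gaps between consecutive rewards: the unpredictability
# of these gaps is what sustains the swiping behavior.
gaps, last = [], -1
for i, rewarded in enumerate(outcomes):
    if rewarded:
        gaps.append(i - last)
        last = i

print(f"{sum(outcomes)} rewards in 1000 swipes")
print(f"gap between rewards ranged from {min(gaps)} to {max(gaps)} swipes")
```

On a fixed schedule the subject learns exactly when to stop; on a variable one, the very next swipe always might pay out, which is the property Schüll compares to slot machines.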
We are running the largest psychological experiment the world has ever seen, by orders of magnitude. One out of every four human beings on Earth has a Skinner box which is learning how to uniquely target them. Super fascinating in the abstract, but sort of terrifying when you think about what it's doing.

The research has been done; the evidence is clear that digital technology is being designed with the intention of making us addicts. It's not a secret in the tech industry either. Two of the biggest tech figures, Bill Gates and the late Steve Jobs, admitted they consciously limited the amount of time their children were allowed to engage with the products they helped create. The problem is real, but the media can often sensationalize the issue of tech addiction and make it harder for us to understand.

I think the reason people reach for the metaphor of crack cocaine is that they see it as a sort of high-speed, hyper-intensive form of addiction. And while I don't use that metaphor myself, I do think that within the spectrum of media addictions there are certain ones that are more high-potency, you could say. Those are, if we're going to use the language of addiction, the ones with a higher event frequency. Think about a horse race: you go to the track and you've got to really wait for that event to happen. If you are rapidly engaged in an activity, a Twitter thread, that is more high-potency; it has a higher event frequency, which means each event has an opportunity to draw you in more, to reinforce that behavior more. So I think we really can apply the language of addiction to the different media.

I have an ongoing frustration, which is that whenever I'm idle for a second I have this impulse to reach into my pocket and pull out my phone. And then I get angry at myself, because I say: that's not right, just enjoy the moment.
Just be with yourself for a second. And then I get angry at myself that my phone has that much power over me. I'm angry that I'm subject to the design of a technology in such a way that I have difficulty resisting its lure. But of course, everything about the technology is built to create that impulse, to make it feel as though it's irresistible.

There's such emphasis put on free choice, on being a consumer who makes decisions in the marketplace about what you want to do, as if you have free will. But at the same time, the very people who are promoting that notion of the person as a sovereign consumer are operating and designing their technology with a very different human subject in mind: somebody who can, like a rat or a pigeon or any other animal, be incentivized and motivated and hooked in and have their attention redirected. And that's really interesting to me, because it is a kind of return to Skinner. I think you wouldn't have heard that in the eighties or nineties; it would have been creepy to think about someone designing your behavior. But now it's become accepted that you can be a behavior designer.

And behavior design was one part of what Aza used to do in a previous life. Now, however, he's one of a growing number of industry insiders taking a more critical stance towards tech. I really enjoyed talking to him, and it left me wondering: does he regret his part in inventing infinite scroll? To be humble about it, if I hadn't invented it, it would have been invented; I just happened to be in the right place at the right time, thinking about the right kind of thing. But yes, I do regret it. I do think it speaks to the naivety of saying, oh, here's a cool feature, and making it,
and even giving it away for free to users, without thinking about the effects it will have when you scale it up to a hundred million people or a billion people. This little thing that I went around to Twitter and Google and all these other companies urging them to adopt has now wasted quite literally hundreds of millions of human hours.

I'm sure all of us have had someone or another say to us: stop looking at your phone, or, why are you so addicted to social media? Before I started this series, I thought maybe there was something wrong with me. I just had no idea how deliberately designed our online experiences are, and how these design algorithms are made to pull us in and keep us hooked. I asked everyone I spoke to: how do we change this? Can we change how

