

There was a high-security presence as he was buried at a cemetery in Cairo, rather than in his hometown as his family had requested. Mourners still gathered in his home province of Sharqia to pay their respects; hundreds of residents prayed amid tight security. A spokesman for Hamas in Gaza praised Morsi's support for the Palestinians during his life in politics: "The movement expresses its condolences to the family of the former Egyptian president Mohamed Morsi, who passed away yesterday. Our people appreciated the honourable Arab role that President Morsi played on the Palestinian cause, as a member of parliament and as president of Egypt, in supporting Palestinian rights and opposing the Israeli offensive on the Gaza Strip, especially in 2012." Islamic scholars, a former Hamas leader and members of the Egyptian diaspora attended a ceremony for the former Egyptian president in Qatar, where our reporter was. "Absentee funeral prayers have just concluded here in Doha, with hundreds of people gathering to pay their respects to the former president Morsi, the crowds overflowing from inside the mosque onto the car park outside and beyond, and with significant figures participating, including the former leader of the political wing of the Palestinian resistance movement Hamas. The significance of this gathering is that the people here want to send a message to the Egyptian authorities: despite attempts by the government in Egypt to prevent any similar prayers taking place there, and despite the state media barely acknowledging the circumstances of his death, the legacy of Egypt's only ever democratically elected president is one that goes beyond Egypt, one that will continue and will not be forgotten."
"As far as these people are concerned, Mohamed Morsi was significant not only because he was the president of Egypt, but because he was its only democratically elected leader, one who represented the aspirations of self-determination and freedom, having been the president chosen after the January 25th revolution. The presence of people like this is also a testament that many here saw Morsi as a champion of other causes in the struggle for freedom, for Palestine, for the Syrian revolution and so forth. So prayers like these taking place in Doha, as well as in Malaysia, Turkey and other countries, are a testament that his legacy lives on not only among the small community of Egyptians around the world but among the other Arab and Muslim communities who see Mohamed Morsi as a champion of their cause." Gunmen have killed at least 41 people in separate attacks in central Mali. The attacks took place in two different villages. Ethnic violence has surged in the area in recent months; the victims were mostly ethnic Dogon. Hundreds of people from the Dogon and Fulani communities have been killed in fighting over land and water. US Secretary of State Mike Pompeo has stopped Saudi Arabia from being added to a list of countries that recruit child soldiers. The decision goes against the State Department's recommendation; based on news reports and rights groups' assessments, they say the kingdom hired Sudanese children to fight in Yemen. Canada has approved a plan to expand a controversial oil pipeline; the $7bn project will link Canada's landlocked oil fields to a major port. It has faced opposition from environmentalists. Those are the headlines; The Big Picture is up next. The world is slowly waking up to the growing impact of AI. This exhibition in London is all about our relationship to computer technology, but AI remains in its infancy, still struggling through glitches in the system. Yet we are handing more and more decision-making over to the algorithm.
I'm Cori Crider. I'm a human rights lawyer. I investigated a drone strike in Yemen that killed my client Faisal bin Ali Jaber's family. What I found was a semi-automated targeting system used to decide who was and wasn't a threat. But as The Big Picture showed previously on The World According to AI, artificial intelligence isn't just used in military targeting but in everyday policing as well. In this episode we'll delve deeper into the power, potential and prejudice of AI as people seek to use it to shape our world. One of the best-established blind spots, where the police have been seeking to use algorithmic material to accelerate and inform their practice, is predictive policing. The way it works is this: let's say you've got a historically over-policed community, a poor community of color. The algorithm takes that data and basically turns around and says, you went there before, you probably want to go there again. And what everybody found, and study after study has shown this, is that it amplifies and accelerates the process of over-policing communities who are already over-policed. In the United States, the history of law enforcement has an unfortunate link to racism, going right back to the era of slavery. In the 18th century, groups of white men on horseback, "paddy rollers" as they were known, would go out on patrol looking for enslaved people who had fled to freedom. If caught, those who had run away would be horrifically beaten, mutilated or killed. The paddy rollers would be assigned an area to patrol; it was known as their beat. There is no question that policing in the US context has grown out of slave catching. So let's go back to the original origins: people who tried to escape being enslaved were often hunted down and brought back into some type of system of control. When slavery was abolished, the paddy rollers evolved into what became formalized police units, and they didn't just abandon their old beats.
Just as before, African Americans continued to bear the brunt of repressive police attention. With every crime recorded, every stop and search, every arrest, every sentence, a criminal history has been recorded. Data has been generated, data that in the 21st century has been used to make predictions about the future of crime. We are now in the era of predictive policing. Predictive policing got its start in Los Angeles with the LAPD around 2010-2011. It was actually the brainchild of an anthropologist and a mathematician. Jeff Brantingham and George Mohler were two social scientists who were studying crime patterns, and they realized that certain kinds of crimes are in a sense viral, almost contagious: if there's one burglary in an area, it's actually statistically more likely that there will be other burglaries in that area. So they came up with an algorithm and pitched it to the LAPD, and the LAPD went ahead with their first experiments to see if it worked. For them it worked well enough that it took off. The research that went into Jeff Brantingham's algorithm was for the most part funded by the Pentagon. His initial work used military data to forecast insurgent attacks in places the US had invaded, such as Iraq and Afghanistan. But his focus shifted from battlefields abroad to supposed enemies at home. He created a company called PredPol that uses data from US police departments to tell US police departments where to look for criminality. What happens is that when vendors show these tools to the police, the police say: that's exactly what we were going to do.
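The "contagion" idea described above is usually formalized as a self-exciting point process, in which each recorded crime temporarily raises the predicted rate of further crimes nearby. The following is a minimal sketch of that idea, not PredPol's actual model; all parameter values are illustrative assumptions.

```python
import math

def intensity(t, events, mu=0.5, alpha=0.3, beta=1.0):
    """Self-exciting (Hawkes-style) predicted crime rate for one area.

    mu    -- baseline rate of events per day
    alpha -- how much each past event boosts the rate
    beta  -- how quickly that boost decays over time
    All values here are illustrative, not taken from any real system.
    """
    boost = sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)
    return mu + boost

# An area with a burglary at day 0 looks "hotter" on day 1
# than an otherwise identical area with no recorded history.
quiet = intensity(1.0, [])     # 0.5, the baseline
hot = intensity(1.0, [0.0])    # baseline plus a decaying boost
print(quiet, hot)
```

The key property, matching the quote above, is that one recorded burglary raises the predicted rate of the next one in the same area until the boost decays away.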
And sometimes we describe this as selection bias meets confirmation bias, and they live happily ever after. The idea that we use historical data to train machines and algorithms to predict into the future is very important. For example, if you live in a community that has historically been over-policed, your algorithms are more likely to point right back to those communities that have been over-policed. If you have people who have been over-arrested, like African Americans, Latinos and indigenous people, then your AI is going to point to those people. Once you feed in that data and say, OK, I see there's more crime here, let me go find some more, well, guess what you're going to find: if you go look for more crime, you are going to find more crime where you look for it, and not where you don't look for it. But then they added in this extra element: every time there is a police contact, you get an extra point. So in some ways they've created an almost self-fulfilling prophecy, because police are directed to go find the people with the most points; then they find the person who has the most points, and that person gets extra points, so tomorrow, where are they going to go? This guy has extra points. The consequence is that the database starts accumulating a much higher fraction of the crimes committed by Black people than of those committed by white people, so you turn an algorithm loose on that and it will say: wow, Black people are really dangerous, but we can ignore white people. So what happens is that the algorithm calcifies, or embodies, the bias in the policing. Hamid Khan is one of the lead organizers of the Stop LAPD Spying Coalition, a collective that campaigns against what it believes to be growing police surveillance and criminalization of the local community. In 2018 the coalition took the Los Angeles Police Department to court, forcing it to release the details of its predictive policing program. There are two layers to predictive policing. One is community- and location-based, where algorithms are used; the company PredPol developed that algorithm, and it was owned by Jeffrey Brantingham, who was a professor of anthropology. The thing has a long history itself: it was created on the battlefields of Afghanistan and Iraq, coming directly from the war zones. The other piece is Operation LASER, which is a person- and location-based policing program. LASER stands for Los Angeles Strategic Extraction and Restoration. The reason it's called LASER is that its creators said they wanted to go into the community with medical-type precision and extract tumors out of the community, the way a laser removes a lesion; that's where they came up with the acronym, as if people were tumors. What PredPol and LASER claim to offer is a one-stop crime-prediction shop: the pitch is to tell police not just where crime will occur but also who might commit crimes in the future. The LAPD was using these technologies to decide where to deploy their police patrols, focusing resources on so-called crime hotspots flagged by these algorithms.
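The feedback loop the speakers describe, patrols sent where past recorded crime is highest, with crime only recorded where police look, can be shown with a toy simulation. Everything here is invented for illustration: two areas with identical true crime rates, where one merely starts with more historical records.

```python
import random

random.seed(42)
TRUE_RATE = 0.3              # identical underlying daily crime rate in both areas
recorded = {"A": 5, "B": 1}  # area A starts with a few more historical records

for day in range(1000):
    # the "algorithm": patrol the area with the most recorded crime
    patrolled = max(recorded, key=recorded.get)
    for area in ("A", "B"):
        if random.random() < TRUE_RATE and area == patrolled:
            recorded[area] += 1  # crime is only recorded where police look

print(recorded)  # area A's head start snowballs; area B's crime goes unseen
```

Despite equal true rates, area A's count grows every time a crime happens under patrol, which sends patrols back to A, while area B's count never moves. That is the self-fulfilling prophecy in miniature.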
So these are all the hotspots for a particular time period. Hotspots are created by the algorithm, by PredPol, using long-term crime history or short-term crime history, and then they create these 500-by-500-square-foot hotspots. On what basis? How are they deciding this? To put it very bluntly, there's a lot of pseudoscience, and now it's being presented as: these computers are really neutral, and they can predict where crime may happen. But predictive policing doesn't just flag up a place; with LASER it also sticks to a person. The LAPD maintained something called a chronic offender bulletin: undisclosed reports on so-called persons of interest, people the police believe to be likely to break the law. This risk is calculated using a points-based formula drawing on data from police records, field interviews and arrest reports. It is pulled together and scored by algorithmic software created by Palantir, a company with close ties to the US military. So how do you get yourself onto the LASER system? There are these identifiers that flag these risks. If you're stopped and a field interview card is filled in, that's one point; so if the police stop you, you get a point immediately. This individual was stopped three times in the same day, so that's three points right there. If there had been a previous arrest with a handgun, five points; any violent crime, five points; parole and probation, five points; identified as gang-affiliated, five points. When it comes to the chronic offender bulletin, points can mean prison. But it's not just about locking people up. For Hamid, the data suggests increased police attention at the borders of a historically deprived area called Skid Row, which helps keep the poor contained away from the more affluent neighborhoods nearby.
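The points formula just described can be sketched as a simple tally. The category weights below come straight from the figures quoted above (one point per field-interview stop, five each for a prior gun arrest, a violent crime, parole or probation status, or an alleged gang affiliation); the function name and data structure are hypothetical reconstructions, not LAPD's actual code.

```python
# Hypothetical reconstruction of a LASER-style "chronic offender" score,
# using only the point values quoted in the transcript above.
POINTS = {
    "field_interview": 1,   # each police stop where a field-interview card is filled in
    "gun_arrest": 5,        # prior arrest involving a handgun
    "violent_crime": 5,     # any prior violent crime
    "parole_probation": 5,  # currently on parole or probation
    "gang_affiliated": 5,   # labeled gang-affiliated by police
}

def chronic_offender_score(history):
    """history: dict mapping category name -> number of occurrences."""
    return sum(POINTS[cat] * n for cat, n in history.items())

# The individual described above, stopped three times in one day:
print(chronic_offender_score({"field_interview": 3}))  # 3
# Add a prior gun arrest and a gang label, and the score jumps:
print(chronic_offender_score({"field_interview": 3,
                              "gun_arrest": 1,
                              "gang_affiliated": 1}))  # 13
```

Notice that the one-point category is the only one the police generate themselves simply by stopping someone, which is exactly the feedback mechanism criticized earlier.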
So this is like a beachhead: think of it as the defense of the financial district from poor people. When we talk about hotspots, you will see the "dirty divide," how the proximity of extreme wealth and extreme poverty coexists right here, about two blocks away. I meet Steve Richardson, who goes by his street name General Dogon, a former prisoner and Skid Row resident who now works with the coalition, campaigning for greater protection for the local community. You guys have been doing work on this predictive policing stuff, right? What does that look like out here on the street, to the people who live here? So predictive policing rolls up in a lot of ways, because Skid Row is ground zero for all the experiments that happen. This is poor folks, of course, so all the pilot programs, the LAPD spy programs, everything they come out with, is first tested right here. Skid Row may be the most patrolled community not only in America but second in the world, after Baghdad. We've got all kinds of patrols on Skid Row: cops on motorcycles, regular cars, detail cops; there's a roughly 15-block area with cops on horses right smack in the middle of the hood. And new things continue to come out here all the time. About 80 percent of the people here are Black, and about 80 percent suffer from some disability, maybe physical, maybe mental. The most arrested person on Skid Row was a woman, Ann Moody, arrested 108 times for violating 41.18(d), the municipal code that says you can't sit, sleep or lie on a public sidewalk. Her only crime was that she was homeless, had nowhere to go, and was forced to sleep in public space. She got arrested 108
times, over and over, just for being in public space, and it was all based on a lot of predictive stuff like that. And the point you, Dogon, and you, Hamid, are making is that this is a practice that goes way back, right? Over-policing in this community goes back decades, and then the information from that gets fed into the computer, and the computer turns around and says: go back and do some more of the same thing. Right, and even before the information goes in, the algorithm is designed for policing, so the algorithm creates the outcomes the agency wants to achieve. And this is really the key point: the outcome the agency wants to achieve in this community is cleansing and banishment. As we walk further along, the tents begin to thin out, as do the local residents gathered on the sidewalk; it's obvious we're approaching the outer limits of Skid Row, the hotspot boundary Hamid had pointed out earlier. This is like a storm of hotspots that a person from Skid Row would be walking into, and this is where you will have more policing waiting for them: waiting to give people tickets, waiting to throw them against the wall, to intimidate and harass, and to make them leave the neighborhood. A few weeks after we left Skid Row, the LAPD announced that it was canceling the LASER program. The pushback had worked; police admitted the data was inconsistent. But the LAPD says the predictive policing tool PredPol is still in operation. So let's think about the incentive structures with some of the predictive policing tools we've been talking about. What does it say about the incentives, and the problems we're going to have with these tools, that you've got counterinsurgency software essentially being used for law enforcement purposes? I hate to have such a sinister
interpretation, but I think it's about opening up new markets to sell this software to, and law enforcement, as you know, has been a great market for lots of military technologies. Quite frankly, I think there's actually the opposite of an incentive to get it right: they have an incentive to get it wrong. Predictive policing software vendors have an incentive to make the sale with police, so their incentive is to make predictions that are as close as possible to what the police already believe is correct. Given that it's really hard to know whether an AI has been trained on representative data or not, if we have real reason to suspect, for example, that there might be bias, then isn't there a question about whether the system should be used at all? Well, I think that's the fundamental issue: we're seeing the deployment of all kinds of automated decision-making systems, or AI, however we want to characterize it, and we don't know the effects until after the fact. After the damage has been done is primarily how we're learning, quite frankly, about what doesn't work. And I think it goes far beyond bias. We're talking about aggregating data about us, building data profiles that foreclose certain types of opportunities to us. What's more dangerous, I think, in the digital age is this: in the 1950s,
if you were Black and tried to get a mortgage at a bank and were discriminated against, you were very clear about what was happening; that discrimination was not opaque. When it moves into a software modeling system, what you have instead is a banker who says, you know, I'm sorry, Dr. Noble, you just can't have it, and I don't really know why. That lack of transparency is one of the things I think we're trying to contend with here, and it just becomes a wholly normalized process we don't understand. The models for actuarial science, for determining whether you're going to pay more for insurance, for example, because you live in a particular zip code, don't even account for these histories of racial segregation, housing covenants, real estate covenants. Just looking at the zip code doesn't tell us about this long history of discrimination that has sequestered people into particular zip codes. Those are the kinds of things that I feel will become harder and harder to see over time. I think one of the things that I find
worrisome is that when we talk about data being collected for these kinds of systems, for the most part it was collected for some completely different purpose; it just happens to be there. In policing, data is created by the police doing what they do: they're driving around, they're stopping people, they're occasionally arresting people, and so forth. That data gets produced and then is used in a predictive policing model. It's not collected for the predictive policing model; that's a second-order effect, it's used because the data is already there. And it turns out to be a terrible way to predict where future crime will be, because what police collect is not a random sample of all crime; they collect the data they can see. This is true in most of the places where people are applying AI. I do think AI can be useful to detect where bias is happening, and simulation can be important; that I think is true. However, it doesn't necessarily allow people to have, again, this conversation of "I have been discriminated against"; it just leaves it to expert analysis to make that discovery, when in fact there are a whole bunch of people who wanted to be homeowners, or wanted to move house, and they don't really understand why these decisions are happening. So as a data scientist, what's your take on this? How do we build a kind of test for when it's appropriate to use machine learning at all, and when it's not? The question should be: who bears the cost when a system is wrong? So if we unpack a particular system and we say, OK, we're building a machine learning system to serve ads, and this customer is searching for sneakers but the ad we served is a boots ad: oh dear, we were wrong, and no one cares. That's a meaningless problem; the consumer couldn't care less, we get wrong ads all the time, we're trained to ignore them. Let's compare that to a system which makes a prediction about whether or not someone should get credit.
In a credit-based system, if we're wrong, the consumer who should have gotten credit doesn't get it, or the consumer who should not have gotten credit does get it. In both cases, and in particular in the case where someone who should have gotten credit does not get it, the consumer bears the cost of the error: she doesn't get whatever it was she needed the credit for, to buy a house or a car or something else. The company that failed to offer the loan may bear a small cost, but it has a lot of customers, so it doesn't really bear much of a cost. And when the customer bears the harm, we can predict that the harms will be greater, because the people deploying such systems have little incentive to get it right. We know that if people of color are over-policed, or poor people are over-policed and over-arrested, they are also likely to be over-sentenced. Machine learning isn't just used to predict crime; it's also used to decide whether a person should be given bail, or how long a sentence a prisoner serves. Criminal courts in the state of Florida have used a predictive sentencing program called the Correctional Offender Management Profiling for Alternative Sanctions, or COMPAS. In 2016, journalists at the US news outlet ProPublica investigated COMPAS and discovered an apparent racial bias at the heart of its algorithm. One of the things they found in their hand review of all the records was that African Americans were twice as likely to be wrongly predicted to commit future crime. I found incredibly interesting, for example, the story one of the reporters told of a young Black woman who had taken a neighbor's bike and ridden it around. The person who owned the bike said, bring that bike back, and she did, but a neighbor called the police on her, and she spent 10 days in jail, and the COMPAS software gave her a score of 8 out of 10, saying she was likely to commit a crime again.
And they compared that with a white man who had a history of violent crime, a history of being in and out of jail, and the software gave him a 3, rating him less likely to reoffend. Once again, the bias in society was revealing itself in the machine.
The risks of bias baked into machine learning aren't just confined to law and order. Upon release, prisoners must reintegrate into a world that is increasingly automated. Today, for them as for you and me, opaque computerized systems help decide their access to state welfare, to private finance and to housing. Take credit scores: these are shorthand for a person's financial trustworthiness. In many ways, credit scores are the gatekeepers to opportunity, and increasingly they are produced by algorithms fed on data that is blind to context and history. If that credit report comes back with a low score, that means this individual is supposedly a high risk, so you begin to just go around in a circle: low credit score, criminal background, can't get housing; because you don't have housing, you can't get a job, because the job you're applying for requires a permanent residence. You are therefore stuck in this cycle of non-opportunity, at the whim of a machine-driven system that decides on the basis of criteria that are unannounced to you. This is one of the darkest topics of all. There are human biases in targeting on the battlefield, human biases in who gets loans, human biases in who is subject to arrest, and these human biases are horrible. Couldn't we fix it with algorithms that wouldn't be biased? But then it turns out the algorithms are perhaps worse: the algorithms have refined the worst of human cognition rather than the best, because we don't know how to characterize the best. I went to the Work Rebooted conference in the heart of the tech industry, San Francisco, California, to see if AI could be used to bring out the best in human endeavor; some people are going to do well, some people less well. I met Ben Pring, who heads the Center for the Future of Work at Cognizant, a multinational corporation specializing in IT services.
I know a lot of people are anxious about the whole notion of bias within the algorithm, and so one of the jobs we've speculated is going to be created is what we call an algorithm bias auditor, which could be a sort of morphing of the traditional kind of auditor role: making sure there is no unconscious bias within the algorithms going into production environments within big business, so that people can reverse-engineer decisions made by software. You do see job opportunities opening up, but you've also said that you anticipate some job losses in certain areas. Yeah. There is a class of new software, there's been a lot of motion around it in the last couple of years in the industry; it's called robotic process automation. And you can get a team of 500 people down to 50 people. That's the reality of what's going to happen in big business: a lot of that kind of white-collar, skilled, semi-skilled, mid-level, mid-skill work is going to be replaced by this kind of software, and there's no denying that some people will be left behind in that transition. So what other jobs do you think AI might open up in five or ten years' time? We came up with this job we call a "walker-talker," which is this idea that, in a lot of towns around the world, certainly where I live, a lot of seniors are very isolated. What if there was an Airbnb-style platform where people in the neighborhood could log on: I've got a spare hour on a Tuesday afternoon or a Saturday morning, I could go and walk and talk with a senior in my neighborhood. So people living in the kind of gig economy, living a kind of portfolio-style set of jobs: they maybe drive
for Lyft, they maybe rent their house through Airbnb, they maybe do things through TaskRabbit; what if they could literally monetize the spare time they have to go and walk and talk with a senior? That doesn't sound like a technology-based job, but it would all reside on an AI-infused platform, in the same way. Most of the people who do care work are women, and women of color, and guess who has been taking care of other people's kids since they were enslaved and brought to North America: Black women. This idea that somehow these historically oppressed, suppressed communities are now in some type of better situation because there's an app interface between them and the people who want that work done, and we call it a fascinating new gigging opportunity, I think is just complete nonsense. The experience of marginalized people basically foretells what's to come for the entire population: a degree of control, a lessening of autonomy, a real difficulty in confronting and sometimes resisting these systems. Some say if you want to know what's to come with AI, you need to look to China. The Chinese want to be the primary innovation center for AI. It is both a potential driver of more social instability, but at the same time the Chinese state thinks it can use this tool to impose social control. China is home to 1.4 billion people. Its capital, Beijing, has more surveillance cameras than any other city in the world. Facial recognition technology is woven into everyday life: getting you into a bank or your residence, checking you out at a shopping till. With 800 million internet users and weak data protection laws, the Chinese state has access to colossal amounts of data, and China's credit scoring system aims to go far beyond finance. There is this ambitious goal to have a national, unified social credit system that would assign a score to citizens, to judge whether their behavior was politically acceptable, socially desirable.
The plan is for all Chinese citizens to be brought into the social credit scoring system in 2020. It uses data on everything from financial records and traffic violations to use of birth control, and processes that data through algorithmic software to give people a score for their overall trustworthiness. A high social credit score could mean better access to jobs, loans, travel and even online dating opportunities; a low score can mean being denied some of the modern benefits of citizenship. Probably the most troubling aspect of social credit is not necessarily the social credit system itself, but the application of some of these facial recognition technologies to expand the surveillance state and to check the behavior of citizens in the western region of Xinjiang, where the ethnic minority Uighurs have been disproportionately targeted: their location tracked 24/7, whether they're going to mosques, which areas they're travelling to. And that has been powered, or is in the process of being powered, by facial recognition algorithms connected through security integrators. The Xinjiang autonomous region is home to China's Uighur population, an ethnic Muslim minority that has faced systematic forced assimilation. A small fraction of the Uighur resistance to this oppression has turned to violence, including attacks on civilians, leading President Xi Jinping to embark on a so-called People's War on Terror, aimed at stamping out Uighur separatism and imposing a secular ideology. New AI-led technologies, particularly facial recognition, may be the latest weapon in Xi Jinping's crackdown. Some reports have indicated that there was a database that tracked 2.6 million residents of Xinjiang, tracked where they were going, with labels for sensitive locations, like whether someone was going to a mosque or travelling to a particular area, and it was updated
On a 24 hour basis and that database had i believe more than 6000000 records so it showed it was tracking these people real time. Are now in reeducation camps. So thats a pretty significant departure from normal life where youre forced to study in a camp and repeat party monstrous. Its a stark picture of how Artificial Intelligence can go wrong the Chinese Government deploying ai to track and suppress its own minority populations. Facial recognition checkpoints engine junk use deep learning technology to identify individual leaders cross checking them with Data Collected from smartphones to flag anyone not conforming to communist party as unsafe a threat to state security. Has become a test bed for authoritarian. This harsh system of control may seem a world apart from the west but systems like social credit actually have some parallels. In some ways if you think about the origin of some of the signs the social credit coming from some of the major private businesses in china how different is it really from a kind of experience or an equifax or one of these set of Credit Rating agencies that actually do collect also very granular data on westerners and that data is then shared with all kinds of other entities and used to make consequential decisions in current operation i would say that there are different i think the difference will be when its not just your Financial Behavior one its also your social and your Political Behavior that gets observed and oftentimes the social Credit System becomes a projection of our own fears about what is happening in our societies in the west where its not necessarily what is object really happening in china that is important but its about using whats happening in china as a way to project. What were fraid of. 
When i think about chinas millions of wiggers being tracked 24 sevenths by a and potentially put into reeducation camps i think about the black community in the United States i think about predictive policing and i think kind of east and west one of the problems and worries ive got with ai is the way that it gets road tested on communities of color in the court. The way the district knowledge start being developed is not empowering people its empowering corporations they are in the hands of the people who hold the data and that data is being fed into algorithms that we dont really get to see or understand that are opaque even to the people who wrote the program and theyre being used against us rather than for us. Theres this incredible informational imbalance isnt there that even as a handful of companies are acquiring more and more and more detailed information about each of our intimate lives weve got in some ways less and less information about them in the way that they operate its nonsense when you think about the way in which fraud and corruption are words that get pointed at poor people who get tracked into these high highly surveilled systems. And if they dont participate they actually have no other option you dont get food if you dont participate you dont get to go to school if youre not in the system properly tracked and so i think these kinds of things are. Are the questions again and that also might need to be regulated you know beyond kind of the technical regulations one of the limits there is that so much of the pressure or focus in those movements is about perfecting the technology instead of thinking of more broadly about like what are the values that were trying to implement and. 
Who are they in service of now see to you worked a little bit with the Obama Administration didnt you on trying to determine how we make some of these Automated Systems more accountable to us did you find that that was a useful exercise how did that go i think there was a genuine interest in and thinking about what might be harmful what might be helpful what should we think about it now in order to forestall or prevent particular outcomes that we cant undo thats right further down the line and so there was a lot of interest i think whats happened since then is there has been this increasing crescendo from industry saying these technologies are inevitable whether or not you like it theyre coming. And what that creates for members of the community for citizens consumers is. Increasingly a sense of despair or resignation well we might not be able to do anything about it and given that increasingly it looks like governments are actually punting to Corporate Governance structures or cut Corporate Governance bodies it can create a sense of despondence id like to pick up a slightly different but i think related part of that to do with a im the kind of supply chain and actually the labor that is involved with some of this Artificial Intelligence because i feel like the phrase ai sometimes kind of hides like a lot of human labor thats used to make a given system work right so youve got people in kenya whove got a label images to train software for self driving cars or people in phoenix right on very little wages looking at videos that would come up on you tube looking basically all day every day at a stabbing or a beheading so that that stuff can be taken off and you and i dont see it on our social media feeds is that one of the kind of problems we dont see a hidden problem of some of the Artificial Intelligence economy that theres a lot of human labor that is required to prop it up surely you dont see the data janitors who pay the day right theyre not the ones that are. 
You know in our line of sight as things like silicon beach expands we dont see the sort of dis aggregated geographically dispersed nature of these ai companies and who all is involved in making and cleaning data right and i think thats thats highly problematic because it is contributing to this sort of magical or that surrounds a i can do all of these things efficiently and instantly and yet theres this whole kind of body of people that contribute to that and the fact that in many cases their labor rights are being disrespected i think is also a cause for concern. Yeah i mean i think we know now for example from researchers i think of my colleague at u. C. L. A. Zarb roberts whos done all this work around commercial content moderators bringing them out of the shadows so that we actually understand that there are huge. Dispersed Global Networks call center like environments where people are doing this kind of moderation that you talk about. You know one of the reasons why i think we previously didnt know about them is because theres such a deep investment by you know the sector and thinking at least in the us context of the internet as a free speech zone for example and that anything goes but of course we know that anything doesnt go i find it always interesting when i hear the Machine Learning experts talk about how how crude in many ways things like kind of visual mapping is like you know is a table a table is the cat a cat right still trying to figure out these really rudimentary kinds of questions and yet when we see tech leaders in front of congress they say things like were going to take down you know damaging content of violent content content you know live murders live suicides with i sed to protect workers and i think you know thats really interesting because ai is not there. Corey right to me. The role of ai in medicine is to. Make better predictions but to make the doctors lives better if you just look at the camera and smile. 
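What "better predictions" can mean in practice is often very simple: watch a measurement over time and flag when its trend points toward trouble. The sketch below is a hedged toy, the biomarker, its threshold and the projection horizon are all invented, but it shows the trajectory-watching idea in its most basic form, a linear trend fitted to repeat visits.

```python
# Toy sketch of trajectory-based health prediction. The marker values,
# threshold and horizon are invented; real systems use far richer models.

def trend(readings):
    """Least-squares slope of readings per visit (simple linear fit)."""
    n = len(readings)
    mean_x = (n - 1) / 2
    mean_y = sum(readings) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(readings))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def risk_flag(readings, threshold=7.0, horizon=5):
    """Extrapolate the trend 'horizon' visits ahead; flag if it crosses threshold."""
    projected = readings[-1] + trend(readings) * horizon
    return "flag: review with doctor" if projected >= threshold else "healthy trajectory"

stable   = [5.1, 5.0, 5.2, 5.1, 5.0]   # a glucose-like marker holding steady
drifting = [5.1, 5.4, 5.6, 5.9, 6.2]   # same marker, creeping upward

print(risk_flag(stable))    # trend is flat, projection stays below threshold
print(risk_flag(drifting))  # trend projects past the threshold within 5 visits
```

The design point is that both patients look fine on their latest reading alone; only the trajectory separates them, which is exactly the healthy-to-disease transition tracking described in the programme.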
But there are some fields where AI is already there, and has the potential to do great good. It could well transform the way we practice medicine. A lot of what we're trying to do with machine learning, or big data, in health care is to predict these healthy-to-disease transitions, so really tracking your trajectory over time and using those trajectories.

Here at the Lab 100 clinic at Mount Sinai Hospital in New York City, I'm told to sit back and relax as I go through an AI-driven health check that generates a heap of data and, once I hear the results, a more complete understanding of my physical well-being. With access to this kind of information, doctors could save lives, and potentially millions of dollars along the way.

One of the most mature areas in medicine is the application of AI to imaging data, and actually deep learning came from image analysis and video analysis; it was really well tuned for that type of thing. So, for example, looking at radiology images and diagnosing a tumor, or finding a hip fracture: the tools were already well tuned for that task. I mean, finding cats in videos. But I think what's clear is that the AI is at least as good as, and in many cases equivalent to, a human radiologist.

Radiologists might become more like airline pilots, in a way. Airline pilots are there for take-off and landing, and the plane flies itself for the most part. What radiologists are going to basically be doing is looking at the radiology image and rubber-stamping it, for legal purposes really, until we solve that problem with AI.

AI that prevents disease: what could be better? But in a world where people have to pay for health care, what wouldn't be so great is if private companies used your AI health profile to charge you more. The future of AI and health doesn't just depend on the tech, it depends on our values. Is health care a human right?
Should a person predisposed to heart disease or cancer, because of their low income or ethnic background, have worse care than those better off? The aim should be decent standards for all, not a two-tiered system.

There are so many positive potential applications of artificial intelligence that would change the world for the better. One, very obviously, is the pattern recognition that AI is good at: it has proven incredibly good at spotting malignant tumors, an incredibly powerful and inspiring medical advance that I've seen some papers on just in the past year. But the technology is going to shortly underpin all aspects of our daily lives. Very shortly, some form of machine learning artificial intelligence will determine whether somebody gets a loan, whether somebody gets a mortgage, whether somebody gets bail, whether somebody gets paroled, and, as we've seen, it may well determine matters of life or death in a military context. So the stakes could not be higher. The quality of your decision making absolutely depends on the quality of the material coming into it, and we have seen, in other uncertain human contexts such as policing, that when fed past data, machine learning algorithms have a distressing tendency to replicate and accelerate all of our pre-existing human biases.

If you have a whole technological culture, infrastructure, language that emphasizes the lack of human responsibility, and instead emphasizes a system of artificial agents, we pretend they have agency, but what's really going on is that we're using them to trick and manipulate each other. Then there will be more and more trickery and manipulation, and if we want to reduce that, we have to emphasize human responsibility.

The drone attacks that killed my client's family showed just how much responsibility we're handing over to technology.
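The "replicate and accelerate pre-existing biases" tendency mentioned above has a simple mechanical core that can be simulated in a few lines. Everything below is invented for illustration: two districts with identical true crime rates, but a skewed arrest history; patrols are allocated "predictively" from recorded arrests, and you can only record crime where you patrol.

```python
# Toy feedback-loop simulation of biased predictive policing. Districts,
# rates and patrol rules are all made up; only the dynamic is the point.
import random

random.seed(1)

TRUE_CRIME_RATE = {"district_a": 0.10, "district_b": 0.10}   # identical by design
arrests = {"district_a": 60, "district_b": 40}               # biased past records

def patrol_allocation(recorded, total_patrols=100):
    """'Predictive' step: patrol in proportion to recorded arrests."""
    total = sum(recorded.values())
    return {d: round(total_patrols * n / total) for d, n in recorded.items()}

for year in range(10):
    patrols = patrol_allocation(arrests)
    for d, n_patrols in patrols.items():
        # Crime is only recorded where officers look, so records scale
        # with patrols, not with the (equal) underlying rates.
        found = sum(random.random() < TRUE_CRIME_RATE[d]
                    for _ in range(n_patrols * 10))
        arrests[d] += found

print(patrol_allocation(arrests))  # the historical 60/40 skew persists
```

Even with identical true rates, the initial skew in the records is fed back through the allocation step year after year, so the data keeps "confirming" that district A is the problem. That self-reinforcement, not any single wrong prediction, is the worry.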
But we can take that responsibility back. Our curiosity and drive to innovate has been pushing the bounds of what we can do with artificial intelligence for decades. As AI is used to make more and more decisions about us, from targeting to policing to social welfare, it raises huge questions. Will AI be used to target minorities, or to clean up our air? Will it destroy our privacy, or treat disease? Will it make us more unequal, or fight climate change? These are not questions that should be decided in the boardroom of a software company: what happens with AI is everyone's business. The world according to AI is our world, and it's up to all of us to make sure it's a just one.

Hello again. Across the United States we're still dealing with a flood threat anywhere from parts of Pennsylvania all the way back down across parts of Tennessee. The big problem is the storms that keep training over the same locations day by day, and because of that we see a lot of rain in the same places. As you can see on our forecast map, there is plenty of rain across much of the east as well as the south-east, coming back towards the Central Plains. We're also going to be seeing a lot of thunderstorms in this area, not as severe as what we saw just a few weeks ago, but still a problem across much of the region, and some of those showers are going to be quite heavy, so not really a break as we go towards the end of the week. Towards the west, though, it's going to be dry, and a little bit cooler in San Francisco with a temperature of about 18 degrees. We're going to see plenty of rain across much of central and southern parts of the region, and a lot of rain across Panama as well. Up towards the north, the problem has been a lot of rain across the Bahamas, and things are going to be getting better by the time we get towards Thursday: a beautiful day in Havana with a temperature of about 31. And then, very quickly, across parts of Argentina, things are getting better over the next few days. The one system that was bringing the rain is making its way towards the Atlantic, and we're going to be a little bit cooler here, with a temperature of 14.

Bottles in Cameroon's rivers, on England's streets: plastic is everywhere. But if bottles can become fishing boats, and bubble gum, wellington boots, what more can be done with this plague of polymers? Earthrise: reimagining plastic, on Al Jazeera.

I've always been fascinated by space, but the story of the space race isn't just about the men who risked their lives to travel into the unknown; it's also about the ones who held those lives in their hands. My grandfather and his colleagues worked on the spacesuits. Apollo 11 was his triumph, and the perfectly designed spacesuits his legacy. Putting man on the moon, on Al Jazeera.

This is Al Jazeera. You're watching the News Hour, live from our headquarters in Doha. Coming up in the next 60 minutes: "I stand before you to officially launch my campaign for a second term." Donald Trump kicks off his re-election campaign at a rally in Florida. The UN calls for an investigation into the death of Egypt's former president Mohamed Morsi in court, as mourners pay their respects in his hometown.
