The fighting could get more intense if federal forces push into the mountainous areas surrounding the capital of the Tigray region, Mekelle, where the Tigray People's Liberation Front leadership, who are engaged in the conflict with the government, say they are ready to defend their positions to the last man. The chief executives of Facebook and Twitter are appearing before members of the Senate Judiciary Committee to face questions about social media and the presidential elections; it is their second hearing in two months. The head of NATO has warned against removing US forces from Afghanistan too quickly. Reports suggest the Trump administration is looking to halve the number of personnel by January, which would leave around 2,500 troops in Afghanistan; there would also be a reduction in Iraq. Meanwhile, Iran says any attack on its nuclear facilities would face a crushing response, following a report from the New York Times that President Trump had sought options for an attack. In new criticism, Oxfam says G20 member countries have sold $17bn worth of arms to Saudi Arabia since it intervened in Yemen's war in 2015; the charity says that is three times what has gone to Yemen in aid. At least 80 refugees have been killed and dozens injured when the engine of their boat exploded off the coast; around 150 people, many of them from Senegal and The Gambia, were thought to be on board. Protesters in Thailand have rallied in front of the parliament in Bangkok as lawmakers consider changes to the constitution; demonstrations calling for the resignation of Prime Minister Prayut Chan-o-cha, a former general, have been ongoing since July. And German police have arrested three suspects over last year's brazen billion-euro jewellery heist in Dresden, whose Green Vault contained Europe's largest collection of treasures. All Hail the Algorithm is up next.

I can unlock my phone with my face. You can access your bank account with your voice. And fingerprints are often the key information on a national ID card. All of this, face, voice, fingerprints, these are biometrics: unique algorithmic measurements of us that are revolutionising the process of identification. But biometrics are far from perfect. Their convenience and seeming infallibility come at a cost, most crucially to our privacy.

Biometrics are individual and unique, so much so that they have long served as a gold standard for identification, with really high levels of accuracy and strong security. Fingerprint and DNA databases have been a mainstay for police investigators for decades, and across many parts of the world people who are illiterate use thumbprints in place of a written signature. Stephanie Hare has been researching the growing use of biometrics.

There's also your face now, which is being recorded; that's just your facial points, and that's called facial recognition technology. Your voice is biometric data. There's also something called gait analysis, which is how you walk. So those are ways that they can identify you. And another way is behavioural biometrics, which might be your online behaviour: how you use your mouse, where you click on things as you go through the internet, even how regularly you post on Facebook. There's a lot that you can get just from people's ordinary lives. And that's why it's so important to have this debate and find out if we are all giving our consent, whether or not we want to, and if so, under what circumstances and with what regulatory checks.
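As a rough, purely illustrative sketch of the kind of "behavioural biometrics" described above, the short Python snippet below turns mouse movements and posting times into simple statistical features. The event format and feature names are assumptions made for this example, not anything referenced in the programme.

```python
# Illustrative sketch only: the feature names and event format are assumptions,
# not taken from the programme. It shows the kind of signals "behavioural
# biometrics" can be built from: mouse timing, movement speed, posting rhythm.
from statistics import mean, pstdev

def mouse_features(events):
    """events: list of (timestamp_seconds, x, y) mouse samples."""
    gaps = [b[0] - a[0] for a, b in zip(events, events[1:])]
    dists = [((b[1] - a[1]) ** 2 + (b[2] - a[2]) ** 2) ** 0.5
             for a, b in zip(events, events[1:])]
    speeds = [d / g for d, g in zip(dists, gaps) if g > 0]
    return {
        "mean_speed_px_s": mean(speeds),
        "speed_jitter": pstdev(speeds),
        "mean_pause_s": mean(gaps),
    }

def posting_regularity(post_times):
    """post_times: sorted timestamps (in hours) of social-media posts."""
    gaps = [b - a for a, b in zip(post_times, post_times[1:])]
    return {"mean_gap_h": mean(gaps), "gap_jitter": pstdev(gaps)}

# Toy data: even such crude statistics differ from person to person,
# which is what makes them usable (and worrying) as an identifier.
print(mouse_features([(0.0, 10, 10), (0.1, 30, 25), (0.25, 80, 60), (0.4, 90, 62)]))
print(posting_regularity([1, 9, 17, 25, 34]))
```

Real systems combine many more such signals, but the principle is the same: ordinary behaviour, measured finely enough, becomes a fingerprint.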
The world is on a mission: a mission to give everybody a legal identity by 2030. That was a target set by the United Nations as part of its Sustainable Development Goals campaign. A key segment of the population the UN is focusing on is the more than one billion people who currently have no way to prove their identity. The unverified include millions of refugees, trafficked children, homeless people and others who never get the chance to establish documents and create the digital footprint that is so essential for modern life.

Here at a refugee camp in Jordan, the United Nations World Food Programme is using biometric technology, iris scans, to provide aid to the camp's 75,000 Syrian residents. Refugees can shop for their groceries with the blink of an eye; no need for a bank card or registration papers. The system is quite aptly named EyePay. When a shopper has their iris scanned, the World Food Programme system verifies the person's identity against a biometric database held by the UN High Commissioner for Refugees. UNHCR then checks the account, confirms the purchase and prints an EyePay receipt. All of this happens in seconds, and according to the World Food Programme it not only makes transactions quicker, but more secure.
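The checkout flow just described (iris scanned, identity verified against the UNHCR-held database, account checked, purchase confirmed, receipt printed) can be sketched in simplified Python. Every class, function and field name here is hypothetical and greatly simplified; real iris matching compares templates with a similarity score rather than exact equality.

```python
# Minimal sketch of the checkout flow described above, under heavy assumptions:
# every name here (IrisRegistry, Account, checkout, etc.) is hypothetical and
# not from WFP or UNHCR systems.
from dataclasses import dataclass

@dataclass
class Account:
    holder: str
    balance: float

class IrisRegistry:
    """Stands in for the UNHCR-held biometric database."""
    def __init__(self):
        self._templates = {}   # iris template -> person id
        self._accounts = {}    # person id -> Account

    def enrol(self, template, person_id, account):
        self._templates[template] = person_id
        self._accounts[person_id] = account

    def identify(self, template):
        return self._templates.get(template)

    def account_for(self, person_id):
        return self._accounts[person_id]

def checkout(registry, scanned_template, basket_total):
    # 1. Verify identity against the registry.
    person_id = registry.identify(scanned_template)
    if person_id is None:
        return "identity not verified - no transaction"
    # 2. Check the account and confirm the purchase.
    account = registry.account_for(person_id)
    if account.balance < basket_total:
        return "insufficient entitlement"
    account.balance -= basket_total
    # 3. Print a receipt.
    return f"receipt: {person_id} paid {basket_total:.2f}, remaining {account.balance:.2f}"

registry = IrisRegistry()
registry.enrol(template="iris-template-001", person_id="P-1234",
               account=Account(holder="P-1234", balance=32.00))
print(checkout(registry, "iris-template-001", 11.50))
```

The convenience is obvious from the sketch; so is the dependency it creates, since the shopper's only credential is their body and the database that holds it.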
Here in Jordan, we use biometric authentication for two main reasons. One, 100 percent accountability on the identity of the person purchasing and using the assistance that we provide. And second, to facilitate the adoption process for the beneficiaries, by not using a card, by not using a PIN. In camps, which is an environment where beneficiaries tend to go to the supermarket often, going with their own iris is easier for them than going with a card.

Watching the iris-enabled shopping process is both fascinating and troubling. This is a super high-tech system that has been rolled out in what you could call a low-rights environment. Sure, people here are under the protection of the United Nations and have more rights than they would in the war zones of the countries they fled, such as Syria. However, they also have little choice when it comes to giving up their biometrics.

Taking somebody's biometric data from them is about the most personal data that you could take. These are not people who necessarily are in a position to ask for legal representation, to have this explained to them, or to say if they don't want it. What is the alternative that they can exercise instead? Are they using behavioural psychology, something called nudge theory, to make it where it's just easier to hand over your data, and then you get your food and your clothes and your money faster? Because that would be unethical. We're testing out an extremely experimental, really invasive technology on people who potentially have some of the fewest rights protections of anyone. Would a middle-class person living in France or Germany or the United States, the United Kingdom or Sweden consent to use their iris to pay for things or to transact? Probably not.

It's easy to see the immense potential of the iris system: to track aid disbursement, smooth out payments and reduce the chances of corruption. The World Food Programme says the benefits go even further: they are able to monitor shopping habits and nutritional intake, and there's a possibility in the future that the credit histories of the refugees could help them get bank accounts down the line. They also think they've got the security bit covered, through a data-sharing agreement that regulates the management of the data.

Through the agreement, we are able to access the sensitive data. We are confident that it is being used for the public good; that is the reason why we are doing this. We carried out a privacy impact assessment on the project, to go out and see if there were any new threats, so that we were able to identify them and address them before they become a problem.

UNHCR remains fully committed to its biometric registration programme, so much so that it is rapidly expanding it, with the aim of being active in 75 countries by 2020. But there remain lots of problematic questions that are yet to be fully answered, such as: is the tech foolproof? Who has access? And how can anyone plan for the unforeseen issues still to come?

These are the kinds of questions that have made other aid organisations pause before jumping on board with biometric technology. In 2015, Oxfam voluntarily imposed a moratorium on the use of biometrics in its work. It stated: given the number of unknowns around the most effective operation and governance models, and the risks of this incredibly sensitive data falling into the wrong hands, we felt it was best not to become an early adopter.

One field in which biometrics has long been used is security and surveillance, and facial recognition is one of the most popular applications right now. In China, there has been an exponential increase in the use of facial tracking and artificial intelligence to monitor citizens. The United States currently operates one of the largest facial recognition systems in the world, with a database of 117 million Americans, with photos typically drawn from driver's licences. And in the UK, police forces have been trialling live facial recognition since 2016, in public spaces such as shopping centres, football matches, protests, music events and crowded city spots.

This green van behind me here in central London is part of the facial recognition technology trial being run by the Metropolitan Police. What it's doing is basically scanning people's faces as they walk past and then comparing them against a database of wanted offenders and suspects. The Met Police say facial recognition could enable them to more easily protect people, prevent offences and bring offenders to justice. However, privacy groups such as Big Brother Watch say the technology is authoritarian and lawless. The group's legal and policy officer, Griff Ferris, even goes so far as to say that facial recognition is possibly the most dangerous surveillance mechanism that has ever been invented.

This facial recognition technology can capture up to 300 faces a second, which could be around 18,000 faces in a minute. It's a vast, vast number of people whom the police can identify and check against police databases, whether that's police or immigration. So what we're seeing is police being able to identify people in seconds. But it puts so much power in the hands of the state and the police, which I think is fundamentally wrong. It's not democratically accountable, because there's no legal basis for this. So this is an intense, intrusive and authoritarian surveillance technology.

While advocates for facial recognition would debate some of those assertions, one thing is undeniable: the technology currently being used by UK police is dangerously inaccurate. The latest figures show that 96 percent of the Met Police's so-called matches were misidentifications.
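A back-of-the-envelope calculation helps show how a 96 percent misidentification figure can arise even from a system that sounds accurate: when very few people in a crowd are actually on a watchlist, false alerts swamp true ones. Only the 300-faces-a-second figure comes from the programme; every other number below is assumed for illustration, and the false match rate has been tuned so the toy example lands near the quoted 96 percent.

```python
# Back-of-the-envelope sketch of why "96 percent of matches were
# misidentifications" is plausible. All numbers except the 300-faces-a-second
# figure quoted above are made up for illustration.
faces_per_second = 300
faces_per_minute = faces_per_second * 60          # ~18,000, as quoted
print(f"faces scanned per minute: {faces_per_minute:,}")

# Hypothetical crowd: very few people present are actually on the watchlist.
crowd = 100_000
on_watchlist = 20            # assumed base rate
false_match_rate = 0.0043    # assumed: share of innocent faces that trigger an alert
true_match_rate = 0.90       # assumed: share of watchlisted faces that trigger an alert

false_alerts = (crowd - on_watchlist) * false_match_rate
true_alerts = on_watchlist * true_match_rate
precision = true_alerts / (true_alerts + false_alerts)

print(f"false alerts: {false_alerts:.0f}, true alerts: {true_alerts:.0f}")
print(f"share of alerts that are misidentifications: {1 - precision:.0%}")
```

The point is not the exact numbers but the structure: a tiny error rate applied to a huge crowd still produces far more false alerts than genuine ones.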
And there is research showing that many facial recognition algorithms disproportionately misidentify darker skin tones and women. The reasons are numerous and varied, ranging from poor-quality CCTV images to the fact that the algorithms are trained, so to speak, using faces that are mostly white and male.

This technology looks like a really nice, quick fix to the fact that we have not got as much money to pay for human intelligence operations. So it sounds great in theory. The problem is, it doesn't work very well on people who are not white men, which is quite a lot of the population on the planet. Being arrested wrongfully means that you get put into a predictive policing algorithm. So the more often you're having contact with law enforcement, the more you are at risk of being stopped again, even erroneously, and so are people in your network, because they build the network out; it's not just about you.

Proponents of facial recognition in the UK will argue that issues with accuracy can be fixed, and they aren't wrong: technology can always be improved on. What's a bigger concern is that currently there are no laws governing the use of facial recognition technology in the country, whether it's the state using it or even private companies.

I think what's really troubling at the moment is that the technology is being rolled out without legislation and empowered regulators. This is not a technology that has a very good track record of being accountable, so that I can find out who is using it, under what circumstances, what they've done with the data, where it's stored, and what the track record of cybersecurity is on keeping that data protected. All of these things, we have no idea; it has just been rolled out. When people feel that they're being observed all the time, that has a really chilling effect on things like your right to protest, your right to go to a job interview, to hang out with some friends, to go to church. These are things that perhaps the state doesn't have a right to keep an eye on.

The Met Police have defended the trials, saying they, quote, informed members of the public through posters and leaflets. But at the trial I was at, informed would not be the word I'd use. There were literally hordes of people rushing through the space, and the chances of seeing the tiny signs, reading the leaflets, or even understanding what the unmarked van was being used for, were minimal. I stopped a few people to see what they thought of the trial.

I don't love the level of invasion of privacy, but then that's the world we live in now, in my opinion. I think it's a good thing to have facial recognition, because as long as you're not doing anything bad, it also helps the police track people down. To be honest, the way technology is going at the moment, this will be the norm all around the world, so I think we just need to get used to it. If you've done nothing wrong, there is no issue.

I think if you really believe that the state has never done anything wrong to its citizens, then you have nothing to fear from this technology. But as we know, no state has a perfect track record, and we should not be putting so much power into the hands of the state and the police. Take a look around you in the world: the technology is already being used by certain countries. All you have to do is pick up a newspaper and see people who are being incarcerated in concentration camps in China right now. Biometric data is part of that; that's how they're monitoring those people and tracking them, and anyone who comes into contact with them. So there's your proof of concept of what could be done.

Now it's really easy to say that would never happen here, but your government can always change, right? History is full of examples, even in the broadest democracies, where in times of war, in times of economic difficulty, people get voted into power who change things. So you have to think about how a system is being built and what it could be used for years down the road, when there's a very different political flavour.
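One way to make the bias findings above concrete is the kind of audit critics say is missing: measuring false match rates separately for each demographic group rather than quoting a single overall accuracy. The records below are synthetic and the group labels are placeholders; a real audit would use properly labelled evaluation data.

```python
# Sketch of a per-group error audit. The records are synthetic and purely
# illustrative; the point is that one overall accuracy number can hide very
# different error rates for different groups.
from collections import defaultdict

# Each record: (group, was_flagged_as_match, is_actually_on_watchlist)
results = [
    ("group_a", True,  False), ("group_a", False, False), ("group_a", True, True),
    ("group_b", True,  False), ("group_b", True,  False), ("group_b", False, False),
    ("group_b", True,  True),  ("group_a", False, False), ("group_b", True, False),
]

false_matches = defaultdict(int)
innocents = defaultdict(int)
for group, flagged, on_list in results:
    if not on_list:                 # only people NOT on the watchlist can be falsely matched
        innocents[group] += 1
        if flagged:
            false_matches[group] += 1

for group in sorted(innocents):
    rate = false_matches[group] / innocents[group]
    print(f"{group}: false match rate {rate:.0%} ({false_matches[group]}/{innocents[group]})")
```

On this toy data one group is falsely matched more than twice as often as the other, which is exactly the kind of disparity the research on skin tone and gender describes.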
The UK collects biometrics from another key segment of the population, one that many wouldn't even consider: children. Few are aware that schools have been recording the biometrics of children for the past 20 years. It is estimated that since 1999 approximately 70 to 80 percent of children in the UK have interacted with some sort of biometric device in school. Pippa King is a parent, a campaigner for children's rights and the creator of the Biometrics in Schools blog.

I think companies are putting the tech into a school setting because you've got a compliant population in school. Children might not question being surveilled a little bit more than the general population would, simply because they don't know any better. The concern I have with biometrics in schools is that way back in 1999, and throughout the whole of the next decade, we as an adult population weren't using biometrics at all, not even on phones, and suddenly we had children as young as three and four using their fingerprints to get in and out of school systems.

The growth of affordable biometric technology means that fingerprints, iris scans, facial recognition and infrared palm scanning have been used to speed up access to canteens, libraries, registration, payments and lockers. A big selling point, of course, has been security: biometric-enabled access is seen as a foolproof way of keeping school buildings safe. However, a big concern is how robust the systems are: who has access to the biometric data, is there a process for deletion, and what happens if the system is compromised?

I also sent freedom of information requests a few years ago asking: have they checked the software? Have they checked the encryption standards? Is it adhering to international standards? Is the hardware secure? Nobody could answer. No, we've never tested the system; no, we haven't checked it against national standards. It just seems to have been swept under the carpet, and nobody really is aware of what's in schools, what's being sold to schools, or whether or not there have been any biometric data breaches.

For entire generations of British schoolchildren, questions of consent around their biometrics have been bypassed to a great extent. It was only in 2012 that a law was enacted putting in place processes for consent to be given or withheld. The overall effect of biometrics in schools, however, is that the sharing and use of very personal data, and the implications of surveillance, become normalised.

That's millions of British children who've been taught to understand that it's no big deal to hand over your body data in order to get a service or a product. They don't understand how it can be abused, and there's no reason that they should understand it, because nobody is helping them to understand it. We haven't had a public discussion about it. And the issue isn't necessarily the tech, because we've already accepted the tech. But if you go into schools and you desensitise children and normalise surveillance technology, when the "smart" things out there are already pretty good at watching us, then I think there's a good argument for us all to be a little bit wary of the word smart, especially when it's loaded into smart cities or smart motorways, because in essence it means surveillance.
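The freedom of information questions above (is the stored data checked and encrypted, can it be deleted, is consent recorded) can be made concrete with a toy sketch. This is not how real fingerprint systems work, since they need fuzzy template matching rather than exact hashes, so treat it only as an illustration of the consent, storage and deletion questions; every name in it is invented for the example.

```python
# Illustrative sketch only: real fingerprint systems store fuzzy-matchable
# templates, not exact hashes, so this is not a working matcher. It is here to
# make concrete the questions raised above: is the stored form reversible,
# is consent recorded, and can a record actually be deleted?
import hashlib, os

class SchoolBiometricStore:
    def __init__(self):
        self._records = {}   # pupil_id -> (salt, hashed_template, consent_given)

    def enrol(self, pupil_id, raw_template: bytes, consent_given: bool):
        if not consent_given:
            # The transcript notes a 2012 UK law requiring consent processes.
            raise ValueError("enrolment requires recorded consent")
        salt = os.urandom(16)
        digest = hashlib.sha256(salt + raw_template).hexdigest()
        self._records[pupil_id] = (salt, digest, consent_given)

    def verify(self, pupil_id, raw_template: bytes) -> bool:
        salt, digest, _ = self._records[pupil_id]
        return hashlib.sha256(salt + raw_template).hexdigest() == digest

    def delete(self, pupil_id):
        """A deletion path - one of the things the FOI requests asked about."""
        self._records.pop(pupil_id, None)

store = SchoolBiometricStore()
store.enrol("pupil-42", b"fingerprint-template-bytes", consent_given=True)
print(store.verify("pupil-42", b"fingerprint-template-bytes"))   # True
store.delete("pupil-42")
print("pupil-42" in store._records)                              # False: record gone
```

The campaigner's point is that, for many school systems, nobody could even say whether questions this basic had been asked, let alone answered.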
It would be one thing if extensive biometric systems were only used by governments or state-funded organisations like the UN. Not that that would make the lack of accountability, the inaccuracy or the outdated security protocols any easier to live with, but at least across many countries governments can be questioned and pressured to give answers of some form. The reality, however, is that biometrics are increasingly being used by private companies: shopping malls, recruitment agencies, online DNA and ancestry services, and even private security companies, all of them taking and using our biometrics. And finding out how the technology is being used, what data is being stored and with whom it is being shared, not just today but also in the future, involves a lot of probing, because these aren't transparent systems.

Even the most seemingly benign, data-driven perks can pose a threat. A lot of people, for instance, are really interested in finding out about their family history, so they're handing over their DNA to companies advertising the unique combination of the world's largest DNA and family tree databases, which can show you a more precise picture of your origins.

You as a citizen have fewer rights over your DNA with a private company than you do with law enforcement. Let that sit in your head for just a moment and think all of the implications through. Your DNA could be used to reveal all sorts of things, for instance predispositions to health problems that you might have. And in countries where there isn't national health insurance, and you have to pay to be insured for your health, that could be used against you, and you would never know, maybe, even how they got the data, because this is all potentially being traded by third-party brokers, because it's not illegal yet, because no one has regulated it.

Biometrics are a really powerful dataset, and they're being used not just to ID you and to more reliably track you, but also to judge you and to make assessments about your personality and your behaviour. There are companies that offer this exact tech. HireVue, for example, says on its website that it leverages AI and video to provide comprehensive candidate insights: when a candidate takes a video interview, they're creating thousands of unique points of data, and a candidate's verbal and non-verbal cues give insight into their emotional engagement, thinking and problem-solving style. According to HireVue, its services are already being used by big employers like Unilever, Vodafone and Dunkin' Donuts.

This is something that's increasingly used in the real world, just the sheer range of things we can do and the ways that we can use data to affect people's lives. So, for example, machine learning systems are used to recommend adverts and shopping items, but they're also used to assess people for jail sentences. And so if those algorithms have got problems, whether they be technical inaccuracies or biases within the algorithm, we need to start addressing that range of issues.
Ryan Kelly is a researcher in computing and information systems at the University of Melbourne. He has been involved in an elaborate biometric experiment to raise awareness about the potential and the limitations of biometric analysis. It's called Biometric Mirror, and I gave it a go. Using nothing but an image of my face, the system produces a detailed report: its assessment of my age, race, level of attractiveness, and even aspects of my personality, ranging from happiness and weirdness to aggressiveness and responsibility. To teach the algorithm to do this, the researchers asked human volunteers to judge thousands of photos for the same characteristics.

Now, it's easy to laugh at the results or shrug them off as just a bit of fun, but there's more to it than that.

One of the reasons it's important to teach people about the limitations of artificial intelligence and these kinds of analyses is because people might assume that, because it's done by a computer, it's objective and correct. What they might not realise is that an application like Biometric Mirror actually draws on a dataset of faces that have been rated by people, and so those ratings contain human biases. One example in the dataset of Biometric Mirror is that anybody with a beard is classified as aggressive. So of course I am classified as an aggressive person by Biometric Mirror, even though I don't think I am; I hope I'm not. So if an application like this is deployed in the real world, immediately people are classified, perhaps unfairly, and in ways that aren't accurate. For example, you can imagine a scenario where you have a set of job applicants and you want to make it easy for people to filter them based on, say, responsibility. Somebody who is responsible for that task might say, oh, I can use Biometric Mirror to identify people who are high in responsibility, without really realising that it's not an accurate thing to do.
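A toy sketch, not the actual Biometric Mirror code and using entirely made-up numbers, shows how a model trained on human ratings simply reproduces the raters' bias, such as the beard-equals-aggressive association described above.

```python
# Toy sketch, not the actual Biometric Mirror code: a "model" that simply
# learns the association between a visible feature (a beard) and the
# "aggressive" label assigned by human raters. If the raters are biased,
# the model faithfully reproduces that bias for every new bearded face.
from collections import Counter

# Synthetic crowd ratings: (has_beard, rated_aggressive_by_humans)
ratings = ([(True, True)] * 80 + [(True, False)] * 20 +
           [(False, True)] * 15 + [(False, False)] * 85)

counts = Counter(ratings)

def p_aggressive(has_beard: bool) -> float:
    hits = counts[(has_beard, True)]
    total = hits + counts[(has_beard, False)]
    return hits / total

print(f"P(labelled aggressive | beard)    = {p_aggressive(True):.2f}")   # 0.80
print(f"P(labelled aggressive | no beard) = {p_aggressive(False):.2f}")  # 0.15

# A new, perfectly calm bearded interviewee is scored as "aggressive" simply
# because the training labels said bearded people are: the bias in the data
# becomes the bias of the system.
print("flagged as aggressive:", p_aggressive(True) > 0.5)
```

Swap the beard for skin tone, gender or an accent and the same mechanism explains why "the computer said so" is no guarantee of objectivity.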
There are various problems associated with these technologies, but regardless of those problems, biometric technology is being developed and used at a rate that far outstrips the pace at which regulation is being created. In many senses, it feels as though we're sitting on a ticking time bomb.

We don't even have an established field of ethics for technology. There are voluntary codes by companies, but these are not legally enforceable; you as a citizen or a consumer cannot use them to protect you in any way, so derive no comfort from that. I think we're entering a really interesting space in terms of what it means to be human, because as we become a more quantified world, there's going to be such a temptation to take all the data about you and reduce you to zeroes and ones. That is what is coming, and whether or not you want that to happen has to be something that's discussed now. We're rolling technology out and saying that these things are going to change the way we work and live within the next 5, 10, 20 years. To me, that's really worrying. We need to elevate ethics for technology right to the top of the agenda.

My advice would be to know that no data is 100 percent secure. You should always be able to know who is taking your data, how it's being used, what rights you have to correct or amend it, and whether or not you can delete it. And that goes for law enforcement or any government branch in your country, but also any private company that you might interact with. If children are using technology, who owns the technology in question, and are they sharing the data with anybody? And I think generally being prudent and keeping your digital footprint to a minimum is a good thing to do as well. So do use technology, enjoy its amazing innovations, but just be aware that it's your data, and data is power.

There is a huge group of people at work behind our screens, and the power they have is massive: that urge to keep swiping through a Twitter feed, that's just not the way we all click. "I agree to the terms and conditions": most of us never even give it a second thought, and actually that's designed as well. Ali Rae explores how designers are manipulating our behaviour in the final episode of All Hail the Algorithm, on Al Jazeera.

Frank assessments and in-depth analysis of the day's global headlines: Inside Story, on Al Jazeera. Jump into The Stream and join the global community. Biodiversity is biosecurity, essential for our species to survive. Be part of the debate; you have ideas and you can be part of this conversation, where no topic is off the table. The police are not neutral in all of these cases; the goal here is to terrorise, and here's the other part of this: there's no consequence. The Stream, on Al Jazeera.

The health of humanity is at stake. A global pandemic requires a global response. WHO is the guardian of global health, delivering life-saving tools, supplies and training to help the world's most vulnerable people, uniting across borders to speed up the development of tests and treatments, and keeping you up to date with what's happening on the ground and in the lab. Now, more than ever, the world needs WHO, making a healthier world for you, for everyone.

A deadline for rebels to put down their arms passes as Ethiopia's military swiftly carries out air strikes near Tigray's capital. This is Al Jazeera, live from Doha. Also coming up: social media executives in Washington, on Capitol Hill, answering tough questions from Congress about censorship and the presidential elections. And French MPs debate a controversial new bill to protect police, which critics say could harm human rights and severely limit free speech.