We invite celebrities, influencers and experts to talk — all that and more in the new season of the podcast. Make sure to tune in wherever you get your podcasts, and join the conversation.

Artificial intelligence: the race is on, and the whole world is watching. AI can paint, it can code, and it can even help paralyzed people walk again. But there's more to it: it can manipulate, and it can amplify disinformation. AI and disinformation, today on Shift.

Hey! AI technology is pretty advanced. It can write academic essays, identify cancer cells, or even replace people like me. But AI is a new gateway for misuse as well. AI crime is on the rise. One tool available on the darknet, for example, can automate the perfect phishing attack. But that's not all. Pope Francis in a puffer jacket: the image went viral earlier this year and was generated by AI. While that could be seen as a joke, so-called deepfakes can cause great harm. Just recently, the Republican National Committee released this AI-generated election ad. The clip warns of a dystopian future should the current President, Joe Biden, make it to the White House again. "I am not Morgan Freeman. What you see is not real." What you see is a deepfake, a technology driven by AI. A deepfake scam conned a Chinese national out of 600,000 US dollars: the scammer used AI software to impersonate the victim's close friend on a video call, manipulating him into revealing his bank details. And that's not the only way AI can be used to manipulate us. "What's even more alarming is the potential use case of large language models in providing disinformation, that is, messages deliberately intended to be misleading. And here I worry that large language models might enable propaganda at a scale, and in a way that's tailored, that we've never seen before."
The news agency Reuters, for example, tested the Chinese AI chatbot Ernie and found that it was reluctant to answer questions about Chinese President Xi Jinping or about more recent events in China's history. It's unclear whether this was intentional or a programming flaw, but it shows just how easily AI can amplify misinformation.

AI has also been known to discriminate against people. "Artificial intelligence, machine learning, all these, let's say, complex algorithms and data sets, often replicate the biases that humans have." AI recruitment systems, for example, have been shown to be biased against women. And it's been proven that facial recognition technology is less accurate for people of color, with people being arrested for crimes they never committed.

Experts and politicians in many parts of the world want to regulate the use of AI. In July 2023, the UN Security Council met for the first time to discuss the risks. Even the head of the company behind ChatGPT, Sam Altman, is proposing more regulation for AI. At a US Senate hearing in May 2023 he pointed out the dangers of the technology, though he also says the opportunities outweigh the risks. Others are calling for AI development to be stopped completely until regulators catch up. Google's top executive Sundar Pichai also wants AI to be regulated globally. That's because he claims the technology is as dangerous as nuclear weapons, and he seems to be serious: the executive wants to make certain features unavailable for users. Google's text-to-video AI Phenaki will not generate clips depicting people, for example. Hundreds of researchers and entrepreneurs are calling for a pause in AI development, among them Tesla boss Elon Musk. An open letter calling for such a moratorium has been signed by more than 33,000 people. But the call could also be an elaborate corporate move for big players to catch up with the top developers.
After all, Musk's new company xAI is working on its own chatbot, TruthGPT. Even Google's chatbot Bard can't keep up with ChatGPT just yet. So one reason for the recent calls for a pause could be economic rivalry.

A look at China proves there's more to it than just economic interest: the government there has realized the potential AI has to consolidate its power. Chatbots in China seem to be reluctant to say anything critical about the regime, and since 2019 the government has been using AI to monitor people and evaluate their behavior.

The EU, on the other hand, is looking to ban AI surveillance. Legal standards for AI use are being drafted, but the so-called AI Act still needs to be given the green light. Was this image created by artificial intelligence? It's not always easy to tell. Content and products generated by AI are set to be labeled in the future, one of the pillars of the EU's new AI Act. "We ask for a degree of transparency. We want people to know, even when they are interacting with, let's say, a chatbot, that this is not a person, it's a chatbot." The priority is the regulation of applications that might interfere with people's fundamental rights. "We have to try to bridge these two approaches: on the one hand, fundamental rights and protections, and on the other, the need to sustain innovation and the development of AI." AI will be regulated to varying degrees depending on where it's applied. For low-risk tools, such as intelligent spam filters, there will be fewer requirements. Stricter regulation is in the works for applications that could more seriously impact users' lives, for example AI tools that pre-select job applicants or calculate customers' credit ratings. AI systems deemed too dangerous will be banned altogether, such as biometric surveillance systems or social scoring applications. Companies that use AI will have to register it in an EU-wide database listing AI systems and how they operate.
It'll be open to the public to ensure maximum transparency. Is the EU leading the way toward a future with AI, or is it putting the brakes on innovation? The impact of these regulations is already showing, for example in India, one of the world's largest tech markets. "Many companies that are working with European companies have to worry about the regulations, because Europe is sort of taking the lead in terms of regulating." For many people in India, AI is already playing a central role in their work. "Both in terms of developing new algorithms and also using AI systems for various applications, from dating to finance to agriculture, you know, across the board. So that's the very exciting part about it." But for a long time, India's government wanted nothing to do with regulation, despite experts' warnings. Then the European Parliament passed its version of the AI Act, setting the tone: AI applications are to be regulated, much like the blockchain-based Web3. The discussion is well under way as to how the laws should look. "One is, it needs to be bias-free. You know, it could be gender bias, it could be community bias; it should avoid these things. The second is accountability. I can build an AI system and use it in many applications, but then if it fails, who is accountable? Is it me, the vendor of that system, or the person who deployed it?" For now, the debate in India is focused on the protection of user data more than on the ethical aspects of AI usage. Data protection is hotly debated in the field of AI worldwide, largely because AIs are trained with vast amounts of our data. Companies like OpenAI, Google and Meta say they won't sell our data, but ultimately users have little to no control over what really happens with it.
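Since users have so little control over what happens to their data once it is submitted, one practical precaution is to strip obvious personal identifiers from text before it ever reaches a third-party chatbot. Here is a minimal sketch in Python; the patterns and the `redact` helper are illustrative assumptions, not part of any real chatbot API, and real PII detection would need a dedicated library:

```python
import re

# Illustrative regex patterns only; they will not catch every identifier.
# IPV4 is checked before PHONE because the loose phone pattern would
# otherwise also match dotted digit groups like 192.168.0.1.
PATTERNS = [
    ("EMAIL", re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")),
    ("IPV4", re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")),
    ("PHONE", re.compile(r"\+?\d[\d\s().-]{7,}\d")),
]

def redact(prompt: str) -> str:
    """Replace common personal identifiers with placeholders before
    the text is sent off to a third-party AI service."""
    for label, pattern in PATTERNS:
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

print(redact("Contact me at jane.doe@example.com or +49 30 1234567."))
# prints: Contact me at [EMAIL] or [PHONE].
```

This only addresses what leaves the user's machine; as the program notes, anything that does reach the model may be retained and used for training.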
ChatGPT collects and stores all entered requests, even if users delete their conversations with the bot. Next to that data and the user profile, the chatbot can store information about the device used, such as the IP address and the precise geolocation. Ideally, a chat with an AI should feel like a conversation between real people, but this can lead us to reveal much more than intended. Companies handle user data to improve their services, and many companies use the input to train the AI itself. "The thing is, once the system has seen that data, it is in its memory. You know, just like us humans: I see a person, and then you say, 'delete this person from your memory', it's not happening. It's the same with the AI: if it gets our data, it keeps it in its memory." So what does that mean for us users? In any case, we can expect much more personalized advertising. Microsoft, for example, recently started using ChatGPT in its search engine Bing, and the information gained could be used for tailored advertising. We should be mindful of what we tell AI systems. For instance, employees at companies like Amazon aren't allowed to use chatbots for work, for fear of leaking trade secrets.

Many countries are working on AI laws right now. In Latin America, however, there are different concerns. Argentina, for example, uses AI to monitor its citizens. Still, regulation isn't the top priority for Latin American governments, says one researcher from the region. "The North-South divide in the field is broad. Many countries in Latin America are more than capable of training people, but individuals educated in our universities are quickly drafted away from the region, either by working remotely for foreign companies or by moving abroad. It is impossible to compete with the incentives and research funding that exist elsewhere."
On the flip side, companies in rich countries outsource a lot of the human labor behind AI. "Why not? There are many simple jobs here, like paperwork or training the AI itself, click jobs, so you don't need much schooling for them." Poorly paid, precarious work is outsourced to the Global South, while specialists trained at local universities end up working for the Global North. International corporations have been using this strategy for decades. One-size-fits-all rules for the industry may change that in the future. Singapore's government, meanwhile, is relying on voluntary self-regulation by companies when it comes to AI. This could make Singapore a global hotspot for innovative AI development, but it could just as easily pave the way for misuse. Which of the two it will be will only become apparent in the future. So: is regulation harmful to innovation, or should AI be more tightly controlled? Let us know what you think. That's it from me. See you next time!

Eco Africa: you can look good and be sustainable. That's the message from a young fashion designer, coming up on Eco Africa.

The 77 Percent: West and Central Africa have the highest rates of child marriage in the world, a violation of human rights. "At 15, at this stage, would you want to be married?" "No. Being a girl child, I believe there is more to life than getting married at a very young age." More is being done to fight this cruel practice. The 77 Percent, in 60 minutes on DW.

We are on the scene and we're watching closely. We aim to bring you the story behind the news, with unbiased information.
We all know that it is important to make a good impression, and our clothes say a lot about what sort of person you are. Today on Eco Africa...