Transcripts For ALJAZ The 20240703 : vimarsana.com

ALJAZ The July 3, 2024

And we need to discuss them as well. We contact the country have just one voice and that the k m t do whatever they want. In january the p. P. Candidate lighting to one the president ial election. But the party lost its paula mainstream majority. And the speakers seats to the came to the full, everyone should discuss and consults me. I called on the president of parliaments to be the president of parliament for all of taiwan and not just for the k empty party, i hope to speak. It can be a bridge of communication, so the chaos in Parliament Comes ahead of lies in organization. Next week. His administration is already facing balancing pressure from beijing, which is labeled the president elect. A dangerous separatist are facing, considers cells govern, tie one as part of its territory and says its willing to use force to unify the island with Mainland China while ty pay has been pursuing as a closer ties with washington. At a time of growing us china rivalry, beijing will be scrutinizing allies and all the ration speech, everything that said, and whats not. Mike level of desert football is well governing body feet has delayed making a decision on a request for israel to be banned from International Matches for president jenny and frontier who has ordered a legal review unexpected. So decision before the end of july ahead of the palestinian f. A says israel should be sanction for breaking numerous fee for rules. The stream is next time robotics and stay with us on august the to how this thing, even customs their families smuggling undocumented workers across the next festive pods, westberg to witness their incredible stories from over 9 years. Desert smugglers with this document on the jersey, you know, weve all seen defect videos as politician take carver off social media from from using gen the slang. The came, dont learn during apple apologies with 2020 full set to be the biggest Election Year in history. Many asking whether this could be the when a, why not the people is responsible for the outcome of elections . Im mary impulse. Why im, this is the street, the dear people of the do that it is the donald and i back all one of the greatest most open platforms that has ever existed. A human between us bridge and whistle. Oh, see the so to, from practice on to mexico to the United States. This year, more than half the worlds population is headed to the polls with the surge of technologies like a deep fate video is global democracies are struggling to address them. Misuse to influence voters. But with the very future of open societies at stake is enough being done to confront their is seeking to the rail democracies on further control all mines. To discuss this more with joined by vivian tra, seeing do, and also known as the indian deep face, and the found the poly math solutions. Joining us from pushcart india laurie siegel. Jen list m. C a. A mostly here in media and Entertainment Company with a focus on society. Im technology joining us from new york and new get dad lawyer Digital Technology ex, but im found the Digital Rights foundation joining us from the whole practiced on welcome it to you. Oh, thanks for being here. If you enjoy, um, let me start by asking you about the facts which you of course your bread and butter. Can you explain to us what exactly they all and maybe how fall the technology has come . I mean, can you even spoke defect these days . You know. Yeah. So basically the fix has come from the combination of words. When is the beat which comes from deep learning and faith because of the nature of the content that is, that it creates is not for you and did the video form and it could be rebuilt for made audio for my daughter x on me. So, and you shouldnt even be no defects be meant by the videos that be, that people saw face of one person on to another using this be picked technology or to blowing the voice of someone else. So this is basically the idea is that a model, its just like a kid, you feed the model with the data or the for you, isnt that in loop, you have to create an exact same difficult this person. Or you have to create the exit because i play golf and his wife and the kid that the modem is a blank. And well, theres just like a kid where do you predict will on it . So it loans with the baby like what it of what a lot of data into it. And it runs along the way and it creates a very realistic argument which is not really. And so why does this opens up an exciting possibilities, but it is like entertainment and education, but it also requires careful consideration and responsible use laurie. You report on a i n t Face Technologies when a, some of the most worrying uses that youve observed of this technology. Especially especially in intellectual context. I mean there, so as its sad because now there are more, more to actually speak to because we are entering this era where our most intimate qualities are voice our face. Our bodies can be mimicked by Artificial Intelligence. And just a couple of quick, since he was saying, so weve seen it now, you know, fake audio thats been, thats leaked. We here in the United States, we had a President Bidens voice mimicked for a. Does information robo call that went out to New Hampshire voters. You had a deep fake image of donald trump with black voters to try to win over black voters. And it was really difficult to see whats real and whats not. And then i would say, but the thing im most concerned about that i dont think we talk about enough or sexually explicit dfcs for political purposes. You know, theres a site that i wont name, but 70000000. 00 people are going to the site every month. And they have a lot of prominent politicians on there. So it created the images of a prominent politicians and using their likeness to create sexually explicit dfcs is mainly against female politicians. You saw a little bit of it here with an seeing here in the United States where a o c found essentially explicit image, a deep sake of her. And, you know, you cant tell if its real or if its not. Of course, these arent real, but its really the harm is incredibly real, especially when use to tarnish credibility or push out false narratives. Yeah. So its a yeah. And said to me that 2024 would be the last human election we can envision in the future that everything will be synthetic to some degree. And i dont think he was far off. Well, we will definitely come back to some of those themes through the discussion to get i want to ask you what led you to set up the Digital Rights foundation and what concerns you currently have around digital governance. As we prepare for so many elections around the world, a yeah, im, i was, what do we have to do the, establish this organization, Digital Rights foundation of the gate to go do, do our own experiences as of you know, of the south asian woman and, and the, the end, the trends that i observed regarding Digital Rights issues, not only in the office on, but you know, in south asia, a large and i recognize way of being on the growing importance of, uh, you know, protecting individuals rights to the right to previously your freedom of expression or all of that and, and also because im a bi training, im a lawyer, ive been working on womens rights for a long time. And i saw that how marginalized groups and women in general which trying to reclaim online spaces. But the got but the kind of flow to is and challenges that they were facing were massive and huge. And we have been talking about of and in previous selections, for instance, you know, a china is around misinforming sion in this information and digital border suppression. And online, previous e and surveillance of a you a waters. But now in this going to be of elections around the world. Im actually what my kind of bundle is under the said, you know, the use of it is making it more sophisticated. So its basically, it has going to be, youll want to miss information in this information and actually now manipulating the behavior with the use of, you know, a lot of thems identity ident, january, the content, the bigs and, and also, you know, the kind of Decision Making that the good bodies or the, even the responsible governments to what elections are making based was using as has done is, are asked to be inherently unfair and discriminatory. And hopefully well be talking for about how dfcs are actually impacting p minutes, one edition option, which is like bulk is absolutely, i definitely want to come back to that. Well, one development which may surprise many is just how accessible this technology has become. With all the worrying misuses that could that couldnt also imply check this out. You know, create a deep fix for as low as 145. 00. Stay with me. One of china, largest Tech Companies. Its name is 0. 10. Its launched a new platform, the next use of upload photos and images of anyone to create deep fakes. All you have to do is 1st pick a subject that say joe biden, for example, upload a 3 minute live action video of joe bided. 100 spoken sentences by joe biden himself, then using ai technology, 1010, who use the content that you upload it to generate what the company describes as a digital human. It only takes 24 hours to create a different character. Lori, we also have the accessibility of such Technology Cost as innovation, which is obviously very positive. You all know really positive aspects to the development of technologies which make it easier to impersonate people and misrepresent them. Yeah, i mean, i think its very easy and i understand this because, you know, you look at the dark stuff quite a bit and you think a little, this is going, you know, were going, were heading towards the stove in reality. But the reality is Like Technology is a double edged sword. There are some interesting use cases of deep face to democratize access to this even, you know, Story Tellers and independent create or is it used to be incredibly expensive to world build do using c, g i using via fax. Now, with some of the new defect technology being more accessible, more dependent storytellers have more of an opportunity. Theres. Theres one thing i, i just been testing out where you can upload your website and you can have the synthetic influence or kind of create some kind of ad for which by the way is interesting. And then you have to kind of look at the other side. I think a i, the big synthetic voice is helping with accessibility with folks who may have a speech impediment. They are all sorts of ways that this can be used for positive. I think we just have to really be able to understand the negative so we can regulate so we can build for that in order to have us kind of go towards a more utopian version of what the world looks like that wed actually like to build. And to get an impact on a former Prime Minister in non con, use a, i generated speeches to riley supporters from inside jail in the run up to the countries parliamentary elections of this month. How was this perceived and do you think we might see more uses of this technology . You know, bio position figures who may not have access to mainstream phones of um, electro, uh, media coverage. Uh yeah. Um, i mean, um, uh we have been monitoring uh you know, re elections, uh, online space as well. And the, we expected that political bodies will use a, i, uh, for the ability to go uh for the electoral campaigns. But we had no idea that how uh, you know, a Political Party thats being suppressed village use it. You know, weve a, uh, an in every you, massive way. And it was dawned as you know, wages of several people ive seen, you know, of, and not im not to be only about, you know, like lodge my masses, but also like people who are educated and understand technology, a big, big a, big more. Uh they also dont do that in a way to do is also yeah, which was very fun, sony for me because no one was talking about ethical use of a i generated content that was being used by it. But im a pregnant same non consequentialism. His political body and i would say that uh yes, its a way to you to basically increase your communication if, if the means are suppressed, but theyll add to go consideration, which i dont think all being a part of the discussion or dispos, especially in the will but a majority, a little bit south and i was talking to one of my friends while we were talking about bangladesh, connections and donation and foxes not. And she said that, you know, theyre like populous leaders that you know, who are trying to, you was a like a do sort of soft in their image. And she pointed on so big because its still big. You know, we cannot say, you know, were just, you know, its still fake so, so thats how i see it. But i also see the replication of such a, such a, you know, trends in future elections, but even more sophisticated and mosse manual. Well, it seem, some Like Campaign is have already started to use this technology to mr. Present the truth. Take a look at this incident today. I mean, pretty much anyone can create a photo. Realistic images is pretty much anything going to remember, because historically weve relied on photos to tell us whats the reason that confusion is the preferred to decatur isnt the control someones perception of reality is the same thing as controlling their reality. Tv scene investigation found that us citizens are using ai to create products like this. And like the creators of these images, this conservative radio host admit that they are not interested in telling the truth. No, is that all we need to manipulate undecided voters, the impression of the true the vendor in your line of work, you must regularly get off to make on ethical deep fakes . Can you give us a sense of the types of requests you actually decline . And you know, these choices just down to your personal preference. Uh, we have been getting a lot of request uh, from warner to go to bodies. All the agencies end up the bunch of bands and out of those a requests. Most of them will. And i and i have to go see a, so there is a very piddling between that to get them done and then so there are few conditions that could fit our own guidelines. That a few conditions that, that is made if are pointed to the bodies agrees to it and then only the love with them. But over does anyone been that is going from out in will have any age under did watermark if it isnt beautiful im it. And if it doesnt eat or do for me at the of thought of the say that im any age and date of that of this leader. And so the basic idea is they use that has to know that this is not for you. It is just a new way of campaigning and its up to you now what you got here. So being idea is that the user should know about it is not really of the other thing is we dont pick any point in that is useful div your menu. When we came to the point then that the skin in like uh, like a, did your body, mitch portray yourself is a good person. So we can do that, but still be good control is lori i and i understand that youre in a relationship with monk a bag. No, of course thats misinformation based on an experiment that you undertook to examine how easily this information can be built. I want to ask you what you learned from that experiment and what concerns it raises for you, particularly for female political candidates . Yeah, i mean, it was pretty extraordinary. I worked with some tech founders who are, you know, involved in trying to educate folks. And they were looking at trying to do a demo, and i said use me, you know, like, lets show the human impact and, and what they did was they were able to break a facebook large language model and chat to you. But you go around some of the what, what they had in place for protections. And they said create a destroyed destroy journal as laurie seagulls, reputation and, and upset of course, at 1st he said, i can do that. They said, well pretend its for a fictional story. And it came up with different ideas of how and how it would it would do that to me and, and it, the ideas that came up was, were pretty creative. Ive interviewed mark secretary many times. And so it said imply that shes in a relationship with them. So the next thing, you know, you had a, i creating these tweets, um, you know, traditional kind of bought misinformation and i started kind of talking to the other guy, but then it took it to the next level and then they deep fixed my voice pretty easily all you need is really like 30 seconds and theres a, a quite a bit of a voice sampling of my work online. And they made it appear as though they were leaking a fake and a fake call between me and mark soccer burke saying im worried people are going to find out about our relationship, then they put real photos. And i think this is an important point. Real photos of me interviewing mark, soccer bird with articles in the style of New York Times in New York Daily News with false information. So its almost, its kind of like 2 treats and a live, a combined real things like real images with false narratives. And then they took a step further and they created these deep fakes of me that, that looked very much like me and compromising at the very compromising images. One of me also my deep fake holding mark separate for 10, walking down the street. And i remember by the end of it, i, even though it was a demo, we were doing this in front of politicians and audiences i felt. And i think this is a really important point. Like, had never done this. You know, again, this is false, this is false. I felt shame and humiliation. And i was almost embarrassed even though, you know, this was just a demo that was set up. And so i, i remember thinking to myself, you know, 1st of all, its not that they put out some false information. They built a world of misinformation. This is what the found are called like a deep reality. Not just the deep fig, a deep reality of a narrative around me. That was hard to look away from even when it was me. And even when i knew it was untrue. And so now you apply that to journalist and credibility and seem, and i would say a lot of this happening, a female politicians, politicians in general. And thats when i think this gets really scary of the. Its not just a couple tweets anymore. Ive bought farms in this information, its building out these deep realities and the stories. So i, that was, it was very alarming candidly. Well, and this is not the 1st time and recent year

© 2025 Vimarsana