
The walkout has shown little sign of a breakthrough. We want the government to show the will to talk, to unify its channels of dialogue, and to immediately stop insulting trainee doctors and treating them like criminals. This former trainee doctor has tendered his resignation, which, like thousands of others, has been left unprocessed. Most of his striking peers, amounting to about 73 percent of South Korea's junior doctors, have kept their silence, turning to their seniors at the main doctors' lobby group to fight on their behalf. The KMA's emergency committee doesn't want the government to cross the Rubicon. If we fail to seize this opportunity, we will lose the Korean medical system, considered one of the best in the world, and neither doctors nor the people want to see the result of that failure. The reason for the strike is unchanged: doctors are against a medical policy package and a plan to significantly raise the number of new medical school entries by 2,000 next year, to get to 10,000 additional doctors within a decade, something the government maintains is non-negotiable. But patient groups are aggrieved, saying that each delay feels like a death sentence and that doctors need to return to their posts with no conditions attached. Meanwhile, President Yoon Suk-yeol's approval ratings have gone up, with a key election less than two months away. Al Jazeera, Seoul.

All right, that's it for me and the team for this hour. As always, our website, aljazeera.com, has the latest on all of our top stories. Up next on Al Jazeera, it's Studio B.

Tens of thousands of Chinese asylum seekers are risking their lives on a dangerous route to the US. We're in America. But what happens once they make it? Part two of a special investigation: 101 East, Chinese migrants, the American Dream is worth the risk and the sacrifice, on Al Jazeera.

Artificial intelligence is invisibly running our lives and is expected to bring far more profound changes to humanity. In part one of this discussion, Meredith Whittaker and Camille François challenged many of the notions that we have about AI. These are not technologies of the many; we are the subjects of AI, not the users of AI, in most cases. When you get to the bigger models that are trained on more data, biases and stereotypes get worse. In this final episode, these amazing women discuss big tech and how to navigate the world of AI. So how do we make AI less discriminatory? How do we resolve its issues with privacy? And how do we tackle the surveillance business model?

Well, hello. Hello again, Meredith, wonderful to be here. Camille, wonderful to be here with you. Last time we chatted, we talked about risks and AI. Why are some people worried about AI taking over the world and destroying humanity? What is this thing they call existential risk? Where is this coming from? How do you feel about that?

Oh wow. Well, existential risk is a thrilling story. At a, you know, emotional level, it's very activating to think of doom and conflict and these sort of great-power scenarios, and it's kind of catnip to a lot of powerful men: the idea that AI, this, you know, scraped data, big compute, big models, is going to somehow find the escape velocity to become sentient and superhuman, and we had better hold on, because we either need to control that powerful, powerful AI or we're going to be superseded by it.
There's no evidence that existential risk is going to happen. I think there are a lot of questions around why it caught on so powerfully now, and I think part of the answer to that question, and I know, Camille, you think about this as well, is that, while there are some true believers for whom this is very meaningful, and I don't want to take that away from them, this is also an extraordinarily good advertisement for these technologies. Because what military, what government, what multinational doesn't want access to this hyper, hyper-powerful AI, doesn't want to be the one who's sort of controlling it, doesn't want to imagine themselves at the helm of the Death Star? And this advertisement also serves to distract from the fact that these systems continue to be discriminatory, and that discriminatory capacity continues to accelerate; from the fact that the systems are used by the powerful on those with less power, in ways that often obscure accountability for harmful decisions; from the fact that we're talking about a technology that is built on the basis of concentrated surveillance power like the world has never seen. Right? But we can erase all of that by being like, look over there, the Terminator is coming.

You talked about it; it's not exactly a new idea, right? Nick Bostrom wrote Superintelligence now ten years ago, a book that focuses on the idea that AI will accelerate to a point where it can no longer be controlled by humans and will pose an existential risk. The fact that today this concept dominates some of our conversations on safety is meaningful, and the reason is that we're at a pivotal moment where we have governments, for instance, for the first time saying, hey, we would like to organize and discuss what safety means in the context of AI. And so we have governments coming to the table. We saw it with the AI Safety Summit; we saw a series of first declarations, of first regulations. There's the White House executive order in the US, the Hiroshima process coming out of the G7. And so there is this urgency to define: what is it that we're worried about, and that we want our elected representatives to protect us from and to focus on, when we talk about the safety of AI? So I think you're right. It doesn't mean that everybody should be laser-focused on avoiding Terminator scenarios. It also means that we need to focus on the very immediate harms to society: the biases, the discrimination, and the surveillance implications, which we haven't talked about just yet. I see your surveillance eyes.

Oh yeah. Yes, I was sure you'd get there. You've been at the forefront of concerns over surveillance and privacy.

Yeah. I think we were around Google, and I think 2014 or so we met. But that was the post-Snowden era, right? So we came out of the nineties in the US with a regulatory framework that had no guardrails on private surveillance. So a private company could surveil anything, right? And they could surveil it in the name of advertising, right? And so after the nineties and this sort of, you know, permissionless surveillance, you see a lot of very cozy partnerships between the US and other governments and these private surveillance actors. Right? So, you know, getting data from them in certain ways, brokering relationships, convincing them to create backdoors in their systems. And this is documented in the Snowden archives, which of course happened in 2013.
Take a moment to define: what's a backdoor?

A backdoor is a, you know, generally intentional flaw in a secure system that allows a third party access to content or communications. So if we're using an encrypted system, you and I are texting and we think that it's secure, but in fact the code is allowing a government or a third party to access it and to surveil our communication. So backdoor is sort of, you know, the colloquial term for a flaw in the system that allows that kind of access. I think here the critical security concept is this idea that you can have a backdoor that's only for the good guys. If there's a hole in your system, there's a hole in your system. I think that's why we care so much about strong end-to-end encryption and making sure that when we say a system is secure, it's secure for everybody and from everybody. Yeah, it either works for everyone, and that means I can't see it, that means the UK government can't see it, that means Putin can't see it, that means XYZ hackers can't see it; or it's broken and we can all see it. (A minimal code sketch of this end-to-end property follows this exchange.)

So you were saying 2013: big moment of reckoning in Silicon Valley over privacy. Yeah. And those concepts of surveillance. Yeah.

And that was, you know, kind of the world I lived in, right? Watching this privatized surveillance apparatus at Google that had been justified on, you know, hey, we have a duty to our customers, and we're just giving people more useful ads and more useful services. But Snowden kind of broke that open, right? And since then, there's been a kind of uneasy situation where, you know, encryption has been added to some things, but the pipeline of data and data collection and data creation continues, because, again, monetizing surveillance is the economic engine of the tech industry. And then what happened in 2012: there was a recognition that this surveillance data could also be used to train AI, and that these AI systems were incredibly good at conducting surveillance. Think about facial recognition, think about productivity monitoring. So I think that we have to read AI as almost a surveillance derivative, right? It pulls from the surveillance business model; it requires the surveillance data and the infrastructures that are constructed to process and store this data; and it produces more data as it heightens the surveillance ecosystem that we all live in.
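To make the end-to-end encryption point above concrete, here is a minimal sketch in Python using the PyNaCl library (bindings to libsodium). It is an illustrative toy, not Signal's actual protocol, which layers forward secrecy and much more on top; all it demonstrates is the property described above: the private keys live only at the two endpoints, so a relay in the middle sees opaque bytes, and there is no mathematical notion of a hole that only good guys can use.

# Minimal end-to-end encryption sketch (illustrative only, not Signal's protocol).
# Requires: pip install pynacl
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# Any server relaying the ciphertext sees only opaque bytes. Bob decrypts
# with his private key and Alice's public key; nobody else holds a key that works.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"

# A backdoor would be a deliberate extra path to the plaintext; once such a
# hole exists, it is a hole for any third party, not just the intended one.

The design point is the asymmetry: because decryption requires a private key that never leaves the device, "only available to you" can be literally true even when the operator relays every message.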
You know, it's also what I observed working on disinformation and on troll farms. In 2017 I was doing some field work. You know, before that, around 2015, I was working with journalists and human rights activists, including Maria, who were so often targeted by governments. Their phones were being hacked. We were very concerned about making sure that they had secure software; we could secure their phones, secure their computers. They were very much under heavy surveillance. I remember they were the first ones to say, hey, there's something a bit off that's happening on social media, and we think it's harmful, we think it's violent, and we think it's related to the hacking. We should take it seriously and we should try to uncover what's really going on. And we should really apply the same rigor and tools that we had in our work on cybersecurity and say: we can analyze this, we can do forensics, we might even be able to attribute it. If we see networks of fake accounts that are deployed against a journalist or a human rights activist with the sole purpose of silencing them, threatening them, we might be able to hold a few people accountable in this process. We were, of course, sort of slow to act on this as an industry, and that created the sort of great reckoning of 2017, right? What it took for Silicon Valley to care about that is really the US presidential election of 2016, and the fact that Russia was able to use what we now call troll farms, right, networks of fake accounts, to run a campaign against those presidential elections. And what followed after is a full year of technology executives having to go to Congress and justify why they had missed it. So that, I think, was also a really pivotal moment where some new foundations were established: OK, maybe we now live in a world where, as a society, we feel that technology companies have a responsibility to protect democracies, and we feel technology companies have a responsibility to tackle disinformation and to think about how their technologies can be abused to manipulate elections. That is also something that's coming up for us in AI in a really interesting way.

But what I am concerned about, in addition to those very real, very pernicious problems that happen when you mass-scale a global information and social platform, you know, again incentivized for sort of click engagement and profits and surveillance and advertising, is that the solution space, in my view, seems not to go far enough. So you have something like the UK's Online Safety Act, which is this massive omnibus bill that was catalyzed through these very real concerns, right? What do we do about these problems? But they rarely look at that business model, and they take as a given these mass social platforms. And then the solutions often look a lot like extending surveillance and control to governments: expanding the surveillance apparatus of large tech companies to, you know, government-chosen NGOs or government actors, who will then have a hand in determining what is acceptable speech, what is acceptable content, but not actually looking at, you know, how do we attack the surveillance business model that is at the heart of this engine. And so, you know, this is very real for me, and we're in a bad place in the US. We now have, you know, books being banned in certain states. We have, you know, reproductive health care, or health care in general, unavailable to many people in states where reproductive health care has been criminalized. So, you know, I really worry about these problems with platforms, about the way they exacerbate hate and allow trolling and disinformation. And I also really worry about the solution space, when that is handing a key to governments that would lock up a woman and her daughter for accessing health care, that would ban books, and that, you know, across the world are trending toward the authoritarian.

Absolutely. And so what we need to think about is also a diversity of these platforms, right? Platforms that are not tied to this surveillance capitalism business model, platforms that can put security and privacy first, that can operate in the public interest. And I think that's what we're doing with Signal: redeeming it a little bit ourselves from the sins of the nineties.
So let's talk a little bit about what's happening with Signal. I was very excited to see that you published a piece about how much it takes to run Signal. Yeah. And you said it costs $50 million a year to actually operate this technology globally. Why did you do that, and how are you using $50 million a year to make Signal work at scale?

Well, we did that in part because we are a nonprofit, a rare nonprofit, not a fake nonprofit like OpenAI, an actual nonprofit that operates in a tech space, again, dominated by this business model. So we, one, wanted to be accountable to the people who rely on Signal, all the tens and tens of millions of people across the globe who use this as critical infrastructure, who donate to keep us running. And we wanted to offer a cross-section of just how expensive it is to develop and maintain highly available global communications infrastructure, the sort of, you know, free products and services, and to shed light on how profitable this industry is, and how significant the monetization of surveillance is as a revenue generator. We're a nonprofit because the engine of profit is invading privacy, and our function, our sole focus, is creating a truly private communication app where we don't have the data. You know, the cops, or Facebook, or anyone: they don't have the data, because it's only available to you. But then the question is, OK, without the data to create the revenue, how do we cover $50 million a year? And by the way, $50 million was very cheap. So how are we going to guarantee privacy while supporting, you know, what it takes to actually produce an app that works for everyone? And, you know, I think that question is way, way bigger than Signal. I think it's one we need to be asking of every company out here: where's the money?

And that sort of is a nice nod to how we started this conversation, which is making sure that the money goes to tackling those very risks: to safety, to moderation, to privacy, right? Making sure that the investments are also keeping in line with, you know, detecting and managing those societal and technical harms. That is a good segue for us to take a few questions, either on the infrastructure or on inventing new resourcing models.

So we spoke a lot about corporations and governments, but I think it's almost embarrassing how easily individuals, everybody in this room, hit the consent button when we want to read a thing on a web page, and we lose all of that data. How do we change the mindset, so that individuals recognize the need to protect their own data versus the easy access to information I'm giving it up for?

I don't think this is a matter of individual choice or individual blame. We can't function in this world without using these services, right? You know, we have to do this to get a job, to function in the workplace, to go to school, to have a robust social life, in a world where so many of our, you know, public spaces and ways of communicating with each other have been hollowed out by these platforms. I actually think it can be really dangerous to make this a, you know, an issue of individual will or intellect or consciousness. I think, you know, what we're talking about is a deeper, collective issue, where our lives are shaped by the necessity to use these systems.
And, like, look: Facebook creates ghost profiles for people who are not on Facebook in order to fill in your social graph. Data tells you something about the people who aren't represented in the data the same way it tells you about the people who are. So I'm encouraged by new frameworks that are emerging that are maybe helping us think a little bit more collectively about our data. For instance, in the United States a lot of people are working on this idea of data trusts: this idea that you have data rights, and you can also work with organizations who can represent your data rights and make it easier for people to collectively say, yes, I will entrust a nonprofit that is trusted to make sure that I can exercise my rights. And these entities, for instance, can also collectively bargain to enforce that, again, collective right of being represented. I think that we're heading toward new frameworks, new governance mechanisms, new regulations, where we think a little bit more collectively about our data.

Let's take another question. So far our conversation, or your conversation, has been pretty US-centric, and, you know, rightfully so. But what do you think about the, essentially, AI arms race between the US and China, and what it means for the relationship between the two countries, as well as the impact on the rest of the world?

You know, there are very valid concerns about the potential for the misuse of this technology; I'm not going to dismiss those. But for me, this is an economic arms race: which pole, the US or China, is going to engage as much of the world as possible as kind of client states, right? Provide the infrastructure, provide the APIs, provide the sort of, you know, affordances, so that they can both extract the data and, sort of, you know, revenue from various countries, and maintain control through these companies. So I think there's a lot more to say about that framework. You know, when we talk about a race, we really need to be asking: where are we racing to? Is this a race to the bottom, toward, sort of, you know, two poles of an economic surveillance state that are exercising massive social control over the rest of the world? And is that a race we want to win?

When I think about, you know, governments rushing to make those investments, and us talking about this arms race, of course I also think about the fact that we have little agreement on what are the legitimate ways to deploy AI in military contexts, in conflict contexts; how does AI shape the laws of war? And also, I mean, in Gaza we have investigative reporting that targeting has been done by AI, that there's a massive AI apparatus, and that, you know, we're witnessing significant, unspeakable civilian casualties. In that context, I think these are very important questions. We are at an inflection point, in a world where we have multiple wars and conflicts, and seeing governments accelerate the race toward building and deploying those types of new technologies in conflict contexts must give us pause, and make us ask: what are the rules of the road for the deployment of these technologies in these contexts? So, you know, when we talk about an arms race, this is sort of first where my mind goes.

Sure. So we'll take one last question. There are all these information vacuums right now, which are good breeding grounds for disinformation.
And what do we do when, you know, you have this pushback on the Online Safety Act and surveillance powers, and you see government and big tech censorship powers fusing? Is the solution to break up big tech?

I, like you, am very concerned about this sort of metastasis of surveillance and censorship powers, again, in the hands of, you know, governments and corporations that don't always reflect the community norms or, you know, the social benefit or the interests of the marginalized, etc., etc. So I don't have a solution, like the one weird trick to do to solve it. But I think it's going to require social movements, because, again, you're looking at sort of entrenched power, and a kind of government that is willing to weaponize the language of accountability and the language of reducing big tech harm in contexts where that expands sort of the big tech model or the authority of governments, right? But what we haven't seen is sort of bold regulation. There is not political will to use the regulatory framework. So I think there needs to be much more demand. And I think about it almost as, like, you know, a kind of dignified stance, right? Like, we don't want to live in this world, and we should have the imagination. And I think, you know, like the deep optimism, right, that is willing to recognize a world in trouble, in danger, in, you know, terrible peril, isn't looking away from that with Pollyanna eyes, and then is demanding changes to that, with a clear map of just how bad it is.

That's a very elegant phrasing, to say that we should, and we are able to, have alternative futures, alternative models, and to say, indeed, this is not how we want to live with technology. It's not being a Luddite to say these are not models that should continue, right? Let's invent alternative futures that are more rights-preserving, that are better for society, that are better for the planet. There are also huge climate implications in everything that you just said around surveillance capitalism that we don't talk nearly enough about.

Yeah. I mean, there is a history of computation that actually traces it back to plantation management techniques that were used to discipline, control, and surveil enslaved African people as part of the transatlantic slave trade. And I've written on this sort of history of computation as taking templates from those labor-control mechanisms at the sort of birth of industrialization, what paradigms they were reflecting. And, you know, that doesn't mean we throw them away. That means we're mindful, and, just like in that kind of punk rock spirit, we demand more of them.

I love that. I think that is the perfect ending. Let's embrace that punk rock spirit, let's demand more, let's invent better futures. Thank you so much for this conversation.

Across the episodes of this special series on AI, we've gone beyond the headlines and hype, spoken to scientists and industry leaders working to align profit motives with safety, and examined the coded bias that is already impacting our world. When real life is now stranger than fiction, we need to step back, look at the history of AI, how it's impacting economies around the world, how it is affecting violence and warfare, and what steps we can take now to make AI safe and ethical for us all.

Starvation is being used as a weapon of war. The only way to stop the starvation is to allow aid in to the people who are surrounded.
We mix animal feed with the flour; it's hard to bake it, and it shows the level of difficulty because of the shortage of humanitarian aid. We have no blankets, no food, and no fuel. I prefer death to this humiliation. War has not killed us yet; it seems that we will die of hunger and can't find food. People say that they're stuck between death, starvation, and displacement in the Gaza Strip, as Israel's war continues.

There's a deliberate omission of Palestinian humanity in Western media, and it needs to be questioned: sustained coverage that actively humanizes Israelis and actively dehumanizes Palestinians. Covering the way the news is covered, tracking those stories, examining the journalism and the effect that news coverage can have on democracies everywhere, here at The Listening Post.

Expo 2023: the world, the fascination of joining. Let's discover a better world. Expo 2023 Doha.

Killed while waiting for food: Israeli snipers and tanks open fire on crowds of civilians just west of Gaza City. Hello, good to have you with us. This is Al Jazeera, live from Doha. Also coming up: the Israeli army insists its soldiers opened fire because they felt threatened by the crowds, yet provided no evidence to back up those claims. Nearly five months into Israel's war on Gaza...
