
A symposium that I am cosponsoring with PEN America and Stanford's Global Digital Policy Incubator. Disinformation has become a newly potent threat in the digital era. It weaponizes our most cherished freedoms to sow discord and undermine democracy worldwide. Here in the U.S., we saw Russia deploy disinformation as one of its tools when, in the words of the Mueller report, the Russian government interfered in the 2016 presidential election in a sweeping and systematic fashion. Alarms have been sounded throughout the American intelligence community, the Departments of Homeland Security and Justice, and by many of the experts who have generously taken time to speak to you today. They have warned that these attacks are ongoing, that they are accelerating and that they are unlikely to be waged by Russia alone going forward. The House Oversight Subcommittee on National Security held a hearing on securing America's election infrastructure and political discourse, and I was pleased to be called on to testify. I spoke then about how the infrastructure of our elections is not just the physical election apparatus but, more fundamentally, the stake that American citizens have in our elections. I spoke about how that stake has been under malicious attack from our foreign foes through disinformation campaigns. When the Russian government spent millions of dollars to interfere in our election, that directly implicated the ban on foreign spending in our elections that is part of the Federal Election Campaign Act administered by the FEC. When they hid the source of that spending, they violated the core disclosure mission of the agency. But the problem is so much larger than that, and it cannot be meaningfully addressed if each of us views it only through our own narrow lens. Disinformation attacks the united character of our United States. Disinformation exacerbates our divisions and distorts the truths that bind our common experience. The grandmaster and human rights activist, a close student of Russia's long history with disinformation, puts it this way: the point of modern propaganda is not only to misinform or push an agenda, it is to exhaust your critical thinking, to annihilate truth. Well, today we will prove that our critical thinking is not nearly so exhausted, and that truth has not nearly been annihilated. I am gratified that this event has drawn one of the largest gatherings I have seen in my time as a commissioner. We have overflow space, standing room only. It is really, really terrific to see you all here. We have current and former officeholders, scholars, journalists and advocates. We are fortunate to be joined by luminaries, including Representative Stephanie Murphy of Florida and a former Homeland Security secretary. A number of representatives from the tech community are also in attendance, including folks from Google, Microsoft, Facebook and Twitter. Some of them are on panels, and all of them will have opportunities to ask and answer questions and make comments. I thank you for coming and I welcome your participation. My cosponsors and I seek to launch a conversation. We have brought to this room many people with many different perspectives, and we have invited many others. This is a beginning. We have a wealth of expertise in this room, and I intend to do far more listening than talking. Some of us here today, like me, swore an oath to protect and defend the Constitution from enemies foreign and domestic. It is fitting that we are gathered on Constitution Day, 232 years after the Constitution was signed.
We must work together to defend the system of government created by the Constitution, which is indeed under attack. The task is large and challenging. We will not solve all of our problems today, and we definitely will not solve them by working in isolation. We have to work together and we have to see the big picture. Our cosponsors, PEN America and Stanford University's Global Digital Policy Incubator, are defenders of democracy. PEN America has been a strong and consistent champion of democratic values and a leader in spotlighting the risks to free speech and open discourse posed by disinformation. I want to thank them for the vision, focus and energy that they have brought to this joint project. Stanford University's Global Digital Policy Incubator is another leader and innovator, bringing together governments, tech companies and civil society to develop technologies that advance human rights and democratic values. Thank you to the executive director of GDPi, Eileen, for all of your insights. I will not take the time to fully expand on the many accomplishments, but one really does bear noting: working together, they were instrumental to the passage of the first-ever UN resolution on internet freedom in 2012, which laid down the principle that human rights must be protected online just as they are offline.

A little housekeeping to get out of the way. My administrative staff asked me to point out to you the two exits to this room, right over here and way in the back. Doing my flight attendant routine: please exit in an orderly fashion and head to the staircase by the water fountains, which is in between the two doors. I hope fire warnings are never necessary, and I certainly hope they are not needed today. I want this event to bring light, not heat, to the discourse. In that spirit, I am very pleased to bring to you one of the country's leading lights in the fight against disinformation, Senator Mark Warner, vice chairman of the Senate Select Committee on Intelligence. Senator Warner has been at the forefront of the committee's ongoing counterintelligence investigation into Russia's 2016 election interference. In a town increasingly torn by bitter partisanship, Senator Warner has worked hard with the chair of the committee to keep the committee's efforts bipartisan. Senator Warner is recognized as one of the preeminent voices in the ongoing public debate surrounding social media and user privacy. He is the cofounder of the Senate Cybersecurity Caucus. One of his bills would set tough new data breach protection standards. Another addresses social media use of dark patterns, deceptive tactics that trick consumers into handing over their personal data. If it is digital, Senator Warner is all over it. I am delighted to have him here today. Please join me in giving him a warm welcome as he delivers our opening keynote. [applause]

Thank you very much. Thank you for those kind comments. I will also agree that hopefully we will not be rushing for the doors. If you happen to see a posting on your device, rush for the doors. I want to try to get this right where I can actually put on my glasses. I thank the chairwoman and Ambassador Donahoe for hosting me today and for putting together this very thoughtful and timely event. I will try to do this without glasses here. I want to thank a longstanding defender of free expression and public discourse. Unfortunately, like so many institutions today, that discourse is under assault. It is under assault from dark money. In each of these cases and others, we see openness, diversity and accountability being undermined.
The internet optimism of the 1990s and early 2000s obscured this trend for many of us. There was a bipartisan consensus that the internet was a democratizing force, and we based major portions of our foreign policy on that underlying optimistic assumption, from our stance on China to our response to popular revolutions abroad. As a result, we now face serious policy challenges. Today, and I say this somewhat reluctantly, we need to rethink that optimism. We need to come up with a new set of foreign policies that are based on a less optimistic, or at least more realistic, notion of technology and the internet. To be sure, this is a somewhat surprising place for me to come from. I have kind of been known as a tech guy in Congress. Like many policymakers, I shared the consensus that these technologies, and the companies behind them, were largely positive forces. But we have seen how the misuse of technology threatens our democratic systems, our economy and, increasingly, our national security. Russia's attack on our democracy has been awakening a lot of people to this truth. We know that the United States faces serious threats in the cyber domain from both state and nonstate actors, not to mention the threat of misinformation and disinformation efforts, increasingly waged by those who have copied that playbook. As a result of this recognition, we are finally beginning to have some overdue conversations on privacy, data transparency and other critical issues related to social media. We also must confront the way that domestic actors have exploited these technologies. More broadly, our position as a global leader on technology issues has been weakened by the retreat of the United States from the global stage, as well as by Congress's unwillingness or inability to formulate policy responses to the challenges we face. Frankly, I worry that this administration's haphazard approach to trade may end up exporting and internationalizing some of our worst policies. And while I am encouraged that governments around the world, including the EU, have begun to fill this void, the need for U.S. leadership and pragmatic, technology-savvy policy has never been greater.

As vice chairman of the Senate Intelligence Committee, I have spent years on the only bipartisan investigation into Russia's attack on our democracy. The truth is, the United States was caught flat-footed in 2016. Our social media companies failed to anticipate how their platforms could be manipulated and misused by Russian operatives. Frankly, we should have seen it coming. Many of the techniques relied upon by the Russian IRA were not new. The exploitation of recommendation algorithms and other techniques were longstanding tactics. We even saw an early warning sign in the domestic context with something called Gamergate back in 2014. You will recall that it was a concerted and coordinated harassment campaign waged against women in the videogame industry. It foreshadowed how bad actors could use a variety of online platforms to spread conspiracy theories and weaponize Facebook groups and other seemingly, you know what I mean [laughter], online communities. We also missed the warning signs in the international context. Over the last few decades, adversary nations like Russia have developed a different conception of information security, one that blends cyber warfare and information operations into a single domain. I feel that we have entered a new era of nation-state conflict, one in which nations project strength less through traditional military means and more through cyber and information warfare.
Our adversaries are not necessarily using highly sophisticated tools. They do not need to. They are using phishing techniques and rattling unlocked doors. In many ways, we brought this on ourselves. We live in a society that is becoming more and more dependent on products and networks, yet the level of security and integrity we accept in commercial technology is shockingly low. I think about the fact that we still do not have basic security standards for IoT-connected devices. As a society, we continue to place entirely too much trust in technologies that bad actors have begun to exploit. While some in the private sector have begun to grapple with these challenges, many remain resistant to the changes in regulation that are needed. And let's face it, it may come as a shock, but Congress does not have its act together either. It is not enough to simply improve the security of our own infrastructure and computer systems. We also need, in a coordinated way, to deal with the adversaries and bad actors who use technologies to attack our institutions. Internationally, we need to develop new rules and norms for the use of cyber and information operations, and the ability to better enforce those norms on cyber attacks. We also need to bring information operations into that debate. In addition, we need to build international support for addressing the internet's potential for censorship and oppression. We need to present our own alternative that explicitly embraces a free and open internet, and we need that responsibility to extend not only to governments but to the private sector as well. Companies that help authoritarian regimes build their own walled-off versions of the internet are just as big a threat to a free and open internet as government actors.

We need to realize that the status quo just is not working. For over two decades, the United States has maintained and promoted a completely hands-off approach. Today, the large technology platforms are in fact the only major part of our economy without a sector-specific regulator. For years, we told the world that any tweaks around the edges would undermine innovation or create a slippery slope toward a censored system. Instead, our failure to act has backfired. Many countries have gravitated toward the Chinese model, in part because we have not offered pragmatic, values-based alternatives. And we have seen how laws originally intended to promote good behavior, like Section 230, which was meant to incentivize effective moderation, have been used by large platforms as a shield to do nothing. Just last year, Americans were defrauded to the tune of 360 million dollars by identity thieves posing as military service members. As the New York Times reported, these are not sophisticated state actors using fancy marketing tools. They are pretty basic scammers in internet cafes in West Africa. The truth is, Facebook faces no meaningful pressure to do anything about it, not from the defrauded Americans, nor from the impersonated service members. Making matters worse, Facebook faces no competitive pressures. Section 230 was born out of an era when more vibrant competition appeared to be on the way. It rests on the now-incorrect assumption that sites would pursue moderation because, or at least the assumption was, users would flock to other providers if their sites became dangerous or abusive places. Obviously, that restraint has not worked. This is just one example of the internet governance regime that we have convinced ourselves, and tried to convince the rest of the world, is working just fine. Obviously, that is not the case, at least in my mind.
Instead of dealing with the misuse of their platforms, these large companies have externalized the responsibility of identifying harmful and illegal activity to journalists, academics and independent activists. Rather than promoting pragmatic rules of the road, the United States continues to promote a hands-off approach to technology, whether that is refusing to sign on to international efforts or including new platform safe harbors in trade agreements. Last summer I put forward a white paper, and let me acknowledge my staff, who many of you know, that lays out a number of policy proposals addressing these challenges. They intersect with a number of the issues that will be discussed today, and I hope this will get the conversation started.

We can start with greater transparency. I think folks have the right to know if the information they are receiving is coming from a human being or a bot. I have introduced legislation that would require greater transparency and disclosure for online political ads. Companies should also have a duty to identify inauthentic accounts. If someone says they are Mark from Alexandria but they are actually Boris from St. Petersburg, I think folks have a right to know that. If a large Facebook group claims to be about Texas pride, but its administrators are actually logging in from other places, again, the users following that group should know that information as well. We also, and I want to lay this out beyond the proposals I have already put forward, need to put in place some consequences for social media platforms that continue to propagate truly harmful content. We saw Facebook get flat-footed in the face of rudimentary audiovisual manipulation of a video of Speaker Pelosi. This does not bode well for how social media will deal with more sophisticated manipulation techniques as we see that era come upon us. Platforms should also grant greater access to academics and other analysts studying social trends like disinformation. Instead, we have seen a number of cases where platforms have worked to close down efforts by journalists and academics to track misuse of their products. The white paper also discusses a number of other ideas around privacy, price transparency, data portability and interoperability. It is my hope that the companies will collaborate and be part of the solution. The Wild West days of social media are coming to an end.

To conclude, obviously, there is a lot of work to be done. Democracies have been at the forefront of technological innovation. In order for democracies to continue that leadership and preserve our own social discourse, we need to question some of the outdated assumptions about these technologies and put forth policies that are truly consistent with our values. I applaud you and your partners for putting together this forum today. I think it will be an important part of this debate. I actually believe there still remains an opportunity to make this an area where there is no such thing as a Republican or Democratic solution. Making sure we not only preserve our democracy in 2020, but also do not allow these platforms to become purveyors of hate and abuse, has to be one of the most important issues we face today. I appreciate the opportunity to be here. Thank you all very much. [applause]

Questions, comments, suggestions, criticism. [inaudible] Thank you for the talk. It has become much harder for academics to get access to the data they need to build the models that catch, in real time, the start of something like the "Kamala Harris destroyed" campaign.
Can you help us restore that access by creating safe places to work on this data?

We engage with all the platforms, and when the details come out, we see, for example in the campaign ad space, that where there was supposed to be an archive to place campaign ads so they could be looked at, it is difficult to navigate. In the short term we can continue to work with you on putting pressure on the platforms. I am hoping persuasion can be used. Ultimately, if not, we have looked at legislative solutions as well. I hope you will get with my staff.

Senator, my name is Catherine Fitzpatrick. How would you redraft Section 230, which has been given a very broad interpretation? How would you go about fixing that?

Sometimes people confuse Section 230. Section 230 was kind of an exemption. They basically said, let's consider these platforms as just being a collection of pipes; they have no responsibility at all for the content. That might have been the right answer in the late 90s. In 2019, when 65 percent of Americans get some or all of their news on these platforms, I think it needs to be reexamined. We have seen the platforms hide behind Section 230, as I said, and we have already seen areas where we have created legislative exemptions: trafficking, child pornography, bomb making. As a matter of fact, this is one of the things that I would hope would come out of these discussions today: Section 230, and how we might rethink it, whether it really is a vector toward promoting moderation on platforms, as I would argue it was meant to be. If you get tired of platform X, you could equally move to platform Y. Data portability is that kind of tool. It used to be really hard to move from one provider to another. I am looking for concrete ideas of what it might look like. To my mind, and this is a bit more controversial, I think you have seen Australia and the UK move in this area, at least indirectly, related to identity validation. If you are worried about the questions of abuse and misuse of information, identity is another approach. In America, that may not have as many negative consequences, but if you are a political organizer in Egypt, then validation is a huge issue. I actually see 230 and identity as linked, and it may be a matter of figuring out what the right balance may be. I think that is really where part of this debate ought to head, and I would welcome thoughtful approaches on how we can focus on that. On a technical basis, interoperability would be a great addition as well. That would go back to the original premise we had: if you do not like the kind of information you are receiving on platform A, you can move to platform B. You cannot do that today. I could go crazy detail-wise on this. If we do data portability and interoperability, and in another piece of legislation, I find it remarkable that we do not even know what the value of our data is. All of these things fit together. [laughter]

My question has to do with the private sector. At the Election Commission, just in the news in the last few weeks, we are seeing private-sector security firms dealing with misinformation. Could a private-sector security response then springboard and enable a response to election security when it comes to misinformation?

Great question. First of all, how in the heck did we come to 2019 and think that the protection of our election security should be a partisan issue? Would we ever think the protection of our electric grid or financial system should be a partisan issue? It is crazy. I would argue, and if I could get this across: every piece of legislation I have got in this area is bipartisan. Let me make this pitch and then I will answer your more specific question. Maybe there are some people downtown who do not understand this.
If a foreign government tries to intervene, the obligation is not to say thank you. It should be to tell the FBI. Second, we ought to make sure we improve the security of our voting systems. At the end of the day, as the chairwoman knows, the system relies on people's confidence. What are the things we can do to improve people's confidence? Make sure there is a paper ballot backup. Make sure we know who is advertising, with the same disclosure requirements that you have on television and radio. Number four, let's go through some of these issues around identity, Section 230 and basic rules of the road for the social media platforms so, again, we have better protections and recognize that our elections ought to be decided by Americans trying to influence each other. If there are outside forces, they at least need to be identified.

On your question, this is an area where we have been extraordinarily lax, not just in the short term, but in the fact that there has never even been the notion of a liability regime around software. I think it is pretty interesting commentary, and I have bipartisan legislation on this, that the United States government is buying annually billions of dollars' worth of Internet of Things connected devices, and we have not put in place a minimum security standard for those devices. The cost of ripping those out if you find a vulnerability downstream is extraordinary. I do think, and this goes to the private sector well beyond elections, I am not sure the United States alone can set cyber norms. I think we need an international framework, basically the same thing we did around chemical weapons or landmines. There are certain cyber weapons whose use should also be deemed inappropriate, and if nation-states or others use them, the attribution requirements and the ability to punch back should be elevated. I think there really needs to be an engagement with the private sector. This is an area that for too long has been an afterthought, and the potential vulnerability that we have is enormous. I keep thinking of the nuclear power plant where we are going to spend millions of dollars, or tens of millions of dollars, protecting the security against cyber attack, and the bad guys now come through the microwave in the staff kitchen. The community needs to be more engaged. This is much more fun than what I am doing on the normal floor of the Senate these days. [laughter] Two more and then I will get out of your hair.

I'm sorry. Nice to see you again. Thank you for working on these issues. Two questions. Should the Information Quality Act apply to government webpages and government social media, since we have an issue, apparently, with disinformation on government accounts? Second, what do you think should be done with respect to political campaigns using these techniques in this next cycle? Could that be accomplished through the FEC, an agency that handles complaints about campaigns, going through and pursuing some of the technical violations?

Good question. I am not sure I fully know what the Information Quality Act is. Let me actually have a little knowledge before I try to bluff the answer. On the second, I would hope, in terms of standards of behavior within the political process, we could start with an agreed-upon code of conduct that campaigns would all voluntarily agree to. I think in this area, starting with, I hate to say industry self-regulation, but some form of voluntary agreement up front, with the FTC policing deceptive business practices, would be a start. So far, at least, I am not sure all the Democratic campaigns have signed on.
A few of them have, others have not. I think it is pretty remarkable that there has not been agreement on what we would all acknowledge to be malicious behavior. The question we all have to ask here is: it is one thing to use these tools to try to generate interest and followers. How do we draw the line between, you know, using sophisticated techniques in a legitimate way versus absolute misuse? That line needs to be a little more clearly drawn. Do you want to pick the last one, Ellen?

My name is Michelle. I am the executive director of a national association for media literacy education. First, I wanted to say thank you for having this discussion. I would say we saw this coming, a lot of educators, a lot of scholars, and we are so happy that these conversations are happening at this level. I am taken aback by discussion of disinformation that does not include education. I do not know how we solve this problem without taking a very hard look at our education system and how we are preparing our students to succeed in this world. I do not want to be negative, but I think we are doing a very bad job at it. I just want to get your take on how we have a more in-depth conversation about how we are educating the next generation.

First, I want to commend your work. Two, I think a basic component of K-12 education ought to be digital literacy and some basic ideas of how you spot false content and inappropriate content. I think there is much we can learn here. A country like Sweden, for example, rather than trying to put legislation in place, focused on educating their country ahead of their elections because they were in fear of Russian intervention. I do not know how well they fully did, but I think there are lessons there. They are much further along in terms of identity validation. There is a NATO-backed center in this space and, I believe, and Gus would probably know this because he knows everything about the budget, my sense was there were tens, if not hundreds, of millions of dollars dedicated to that center that were part of the money grab taken for the wall. I am not sure that is making us necessarily safer, taking the commitment away from that international collaboration. I think we have got to get our priorities right. I do think, just as we see civics education and other things happen at the state level, and as a former governor, I think the idea of you guys or other groups advocating digital literacy in a K-12 curriculum is increasingly important.

Finally, I would argue our failure to act on these issues around disinformation and misinformation is indicative, unfortunately, of a larger retreat by our country, and not just during this administration, from America being willing to not only do the innovation but also set the standards and protocols. I would argue that since Sputnik, every major technological innovation, if not invented in America, was invented in the West, and we still set the standards. In the case of social media, we have seen the EU step up, and the UK and Australia on content and privacy, and individual states as well. We are also seeing this in a host of other areas. If the standards are set in China, that is a huge security concern for us. As we think about China in particular, with a very different mindset about how technology tools can build a surveillance state that would make 1984 blush in terms of how much more sophisticated it is, it is a real challenge, and I do not think we as policymakers talk enough about it. My final point: I feel there is a general feeling among the platform companies that they will say they want to work with the Congress, but when
we get to the details, they are not interested in those details because they think it will affect their business model. I think at the end of the day, that will be a huge business mistake. We are not going away. What is happening is that the EU and individual states are simply raising the floor. So when we do actually legislate, you will end up with a much tougher regime than if you had worked with us in a collaborative way, and one that could be much more detrimental. And at the end of the day, one of the reasons I have been concerned is that these are our national platforms, and I have been worried about replacing them with Chinese ones that have no protections at all. But again, I appeal to those of you in the companies: you need to come to the table, not just say you would like to work with us. You need to be part of this conversation. We are not going away, and the Wild West days are over. We are one significant event away, and it may not even be in the realm of politics; it could be market manipulation. We are one event away from Congress potentially overreacting, and the next event could be extraordinarily dramatic. Thank you for being here. I want to commend everyone for this kind of gathering. It is terribly important. I do not have a lot of this figured out, but I look forward to the results of today's discussion. [applause]

That was really great and inspiring and is going to be a tough act to follow, but we have some great speakers to come. I want to say we have got people standing; we kind of over-reserved to be cautious, but people can take those seats. I remind everybody, if you have not noticed by the plethora of cameras in the room, this is a public event, open to all. It is being recorded and live-streamed. If there are discussions pertaining to pending rulemakings, they may become part of the rulemaking record. The wifi password is posted on the door; the print is a little small. Restrooms are across the hall. That is all the housekeeping. Our first panel is on how new technologies affect the way people think and what we are learning from global experiences. It will be moderated by Ambassador Donahoe. Thank you all for coming. [applause]

Let me start by giving a big thank you to Ellen and Suzanne for joining forces in this event. It has been great to team up with you both. I need to give a big thanks to the staff, especially Camellia; you did the larger share of the work here. I want to say, and we will talk about this in the opening panel, that the goal of this event was basically to help generate public interest and political will to combat disinformation in the lead-up to the 2020 elections. We see that the disinformation threat is a bipartisan issue, as Senator Warner said, and that it is simultaneously a national security issue, a cybersecurity issue and a threat to our democracy. The animating energy for us today was our shared sense that there has been an inadequate level of public outrage and official response to the foreign disinformation threat. At the same time, we recognized that a big part of why the disinformation threat is so confounding is the domestic factor: the bizarre ways that foreign disinformation mixes with authentic discourse, domestic media, political commentary and the speech of our own elected officials. Free expression and access to information are supposed to be the lifeblood of democracy, and political speech is the most highly protected form of speech in our democracy, but this is now being turned against society, undermining the quality of discourse and confidence in our elections.
A confounding part of this threat is that most disinformation around elections is neither false nor illegal. It is often a combination of factual information and compelling political framing that gives the facts emotional punch, but that framing does not turn facts into falsehoods; they are not technically false. Even more, sharing false information is generally not illegal in a democracy. Given First Amendment doctrine, content-based restrictions on the basis of falsity simply will not pass Congress or survive the Supreme Court. There may be some limited carve-outs for false information related to election procedures, but we will hold that thought. The point is that protecting the integrity of information around elections is different from protecting election infrastructure security or the data security of campaigns. Those are really important issues, but disinformation is a much more nuanced and complex challenge. Many different stakeholders impact the quality of civil discourse, and many will be needed to combat this threat.

Now to the private sector. Digital information platforms and social media companies do have a unique and substantial role to play, given how they affect the speed, scale and amplification of disinformation, as well as complex dynamics that they know better than anybody else. The big question is what we want the private sector to do and what their responsibility is in protecting freedom of expression. As Senator Warner said, robust moderation is a good idea, but we need to tread carefully in asking the private sector to do what we do not want government to do, which is to take down content based on a political assessment of truthfulness. That is what authoritarian governments do, and we do not want to go there. If we push private sector companies to undermine our own freedom of expression principles in the name of protecting democracy, we end up serving the ends of those whose goal it is to erode confidence in the feasibility of adhering to our own democratic values, and ultimately in democratic government itself. Nor can we sit by and undermine the equally important right to democratic participation. This is what we are here to help resolve. There is no single obvious lever for combating disinformation. Bottom line, this is an all-hands-on-deck challenge. Our hope today is to flesh out nonregulatory solutions, assess the tools that are already being utilized by global stakeholders, facilitate greater cross-sector coordination and, probably most important, bring the public into this conversation and help build resilience to disinformation, which echoes the comments made at the end of the keynote about the importance of media literacy.

Our goal here is also to place the U.S. election integrity challenge in a global frame, to see how the U.S. election threat is one very important data point in a much larger global trend. We have an embarrassment of riches in terms of our speakers. We will look at how digital technology is being exploited, we will hear about the brain science of how humans process disinformation, and we will talk about a range of counter-disinformation tools that have been tried around the world and assess what may be transferable to the U.S.

I am going to introduce our speakers, starting with the former secretary of Homeland Security, who now co-chairs, with a former prime minister of Denmark, the Transatlantic Commission on Election Integrity, of which I am honored to be a member. Next is an assistant professor at Vanderbilt University who conducts research on how people process information and on people's perception of truthfulness.
Camille is a chief innovation officer; she leads work to detect and mitigate disinformation and media manipulation. Next is the global public policy lead for information integrity, who advises product, engineering and safety teams at Google and YouTube. And Susan, a former FCC commissioner, is now a public policy fellow. She chairs the Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression, in which we are all participating. Nate is a legal director, and he is going to speak about a counter-disinformation campaign ahead of the 2019 European elections and what lessons we can learn. After they each speak for five or so minutes, we will turn to all of you for your questions, comments and experiences. We will start with Michael. [applause]

Thank you for sponsoring this event. Just to dive into it: if you look at information operations, at active measures during the Cold War, at efforts to use information to undermine an adversary's unity of effort, you recognize that this is a domain of geopolitical conflict. There is a reference to this concept from about seven years back, when the chief of the Russian general staff argued that there is a domain of conflict that consists of manipulating the mind of your adversaries so you undercut their ability to resist and essentially disrupt their unity of effort. I should say this is not just about elections. The effort to weaponize information for the purpose of undermining your adversary's unity of effort applies more broadly to any democracy or free society, and the idea is to paralyze your opponent and make them shrug their shoulders and give up. This is not new. If you go back 100 years to the early stages of the Soviet Union, when they formed the Comintern, from the very beginning they sought the ability to propagate confusion and undermine the morale of their adversaries as an element in their efforts to dominate the world. During the Cold War, we saw propaganda and efforts to manipulate populations in order to achieve the results they wanted. And if you look at what happened in Europe in the early part of this century, you will see Russian efforts to use information manipulation, influence and money to drive behavior in a way that would favor parties or politicians that were viewed as pro-Russian. Look, for example, at funding from Russia in the first decade of this century, when a European right-wing party was experiencing financial difficulty; that is one example of this kind of effort, a way of propagating and influencing the discourse in a way that was favorable to Russia, another dimension of this kind of information operation or active measure.

So what is different now? Why are we so focused on this? There are a couple of elements that have changed, elements that have not created information operations or disinformation but that have made them much more dangerous and difficult to control. First, the volume of information out there and the number of different sources have really ramped up from what we saw even ten years ago. A lot of that is social media: the ability to use all these different avenues to influence what people read and hear and to drown out voices that may be inconsistent or contrary. Some of us are old enough to remember when the networks basically tried to balance their coverage. That is very different from what we have now. There is no arbiter, except probably the Russians. [laughter] That is one element. Social media has also been amplified by data analytics that can micro-target particular audiences. Rather than speaking to everybody, you can make sure a message reaches just the specific people most receptive to it.
There are two other elements that have increased the risk. One is the fact that the mainstream media, in an effort to drive revenue by getting people to tune in to a particular media outlet, has a tendency to amplify social media. In many respects, what you see on Twitter or Facebook is just an invitation to get cable television or talk radio to focus on a particular plot line and amplify and propagate it for viewers. Finally, the theft and release of data allows people who want to pursue their agenda to get access to material which they can publicize or distort to drive their message. And it is not just the Russians, although if you read the Mueller report, you see how plain the efforts were to get involved in the 2016 election. I think we have seen influence operations coming out of China and Iran, and maybe even more important, it is us; we are doing a lot to promote the disinformation ourselves. Sometimes we are doing it at the encouragement of foreign actors. Sometimes we are doing it on our own because of various extreme views, which foreign actors then amplify and use these tools to propagate. But it is not just the adversary that is doing it; it involves our own citizens who take extreme positions and exploit these techniques and technologies to propagate them.

I will conclude by saying two things are on the horizon, or closer than that, that take this to the next level. First, artificial intelligence. The amount of data out there about what people are interested in is so vast that no human can collect or analyze it in real time. That is what artificial intelligence is about. Second, for those of you following cybersecurity generally, the Chinese in particular have been accumulating vast treasure troves of data about citizens in the U.S. Some of it is stolen, some obtained legitimately, but it could be used to allow somebody to target particular people who may be susceptible to particular messages. This challenge will only become more acute. We are talking about affecting the election, but I do not know that I am as concerned about efforts to move people from one candidate to another as I am about efforts to suppress voters or discourage them from voting. And here is an even more challenging question: what happens after the election? Let us say there is a dispute. Take your mind back to 2000 and the Bush-Gore recount. Imagine that occurred, or something similar, and you had a concerted effort to drive disputes. That could affect not only the public's confidence in the outcome; it could affect the ability of the U.S. to function over a period of months, which would be a win for our adversaries. We need to start thinking now about ways you can validate and adjudicate the accuracy of election results so we do not have 2000 on steroids. Thank you.

I am going to highlight a couple of things you said that are really important: the geopolitical dimension, which is not new, although there are new dimensions related to social media. You highlighted the importance of data dumps and of professional media reporting. Last and most important, disinformation is not always about changing people's votes; it is about suppressing the vote and eroding confidence in the outcome, which could end up being the biggest problem. Now we will turn to Lisa and hear about how disinformation is processed in the human brain.

I am a cognitive psychologist. I study how we process true and false information and what makes things memorable. I want to talk about why misinformation is a problem. If we are all smart, why can we not realize something is false and not have it affect our beliefs? Why does it change our minds?
To start, I am going to put my professor hat on and ask you questions. You have to yell out the answer. In the biblical story, what was Jonah swallowed by? Big fish, whale, depends on your translation of the Bible. How many animals of each kind did Moses take on the ark? [laughter] Most of you yelled back "two," even though all of you know it was Noah, and not Moses, who took the animals on the ark. This is something we call knowledge neglect. We have relevant knowledge in our heads, yet we fail to use it in a given situation. We often fail to notice errors in what we read or hear, and those errors can then manifest in our thoughts and beliefs.

Imagine your friend tells you a new, interesting fact, as is likely to happen online or on social media. How do you decide if what they are telling you is true or false? There are at least two ways. One, think it through against your prior knowledge: does this make sense given what I already know about the world? Or there is the quicker and easier way of going with your gut: does it feel true? We know that in a lot of situations, humans take the easier and quicker path and go with their gut reaction. Over 30 years of research has shown that we use our prior knowledge to determine truth but also rely on a lot of other cues. One of those is how easy a sentence is to understand or process, and one of the big things that increases that ease of processing is repetition. The more times something is repeated, the more likely we are to think it is true. Hundreds of research studies have shown that repeated statements are judged to be more true than things you have only heard once. This happens even when you have prior knowledge. We have a few studies showing that even among people who know the cyclops is the legendary one-eyed giant of Greek mythology, if they read twice that the minotaur is the legendary one-eyed giant of Greek mythology, they think it is more likely to be true than if they only read it once. Repetition increases your belief in false statements even when you have prior knowledge. Some people have done research using typical false news headlines, pro-Trump and pro-Clinton fake news; when you repeat them, people think they are more true. This happens regardless of their political beliefs. Even when a headline goes against your actual political beliefs, you still think it increases in truth with repetition.

So why does this happen? One of the big reasons is that it is effortful to consult our prior knowledge. We just go with whatever is good enough or close enough to what we think is true. Our brains are amazing machines. Even the best AI can barely do what a human brain does: look out into this room and recognize the scene and everyone in it. But our brains are also really lazy. They do not like to work, and they will take shortcuts whenever they can. One shortcut is that when something is good enough, close enough, we assume it is true and move on. That makes it difficult to notice errors in what we read and hear. So what can we do? Prior knowledge is not enough in itself; I showed you an example where even though you know it is the cyclops, you think the statement is more true when it is repeated. But it helps if I ask you: how many animals of each kind did Nixon take on the ark? You all noticed the error. There are limits; prior knowledge can help, but the big thing that helps is thinking deeply and critically about what we are reading, taking a second to pause and think: how do I know this is true? Where is this coming from? Adopting an accuracy focus and norm is useful: having people think about how they know something is true or false.
Franklin Roosevelt once said that repetition does not transform a lie into a truth, and while that is true on its face, repetition cannot change the actual truthfulness of a statement, he was wrong in terms of what it can do in our heads and minds. It turns out repetition does have a strong impact on what we read and believe. [applause]

Hopefully we will come back to you about how we build resilience, deeper critical thinking and an accuracy norm. This goes to the question we had earlier about media literacy and how we build a more effective program. Now we are going to turn to Camille.

I want to talk about threat actors and the for-hire market of disinformation. When we think about Russia, I think there is a bit of Russia fatigue almost, but I think it gives us a marker for what we have learned. For the U.S. and Silicon Valley, this was something they should have seen coming. What have we learned now, and what are we still ignoring? I spent time looking at the details of the Russian interference campaign against the U.S. and other nations when I was working with the Intelligence Committee on 2016 and 2017 and on the 2018 midterms. Here are some things that are perhaps surprising. There is still a lot that we do not know about what happened in 2016. For all the headlines, the data Silicon Valley shared, the wonderful reports, the discussions and the hearings, we still have major blind spots in understanding how Russia targeted the U.S. and what we have to learn from that.

I want to give you two examples. A lot of people think about the Internet Research Agency, people on staff creating messages to target communities in the U.S. across social media platforms. But the IRA was not the only Russian entity involved in producing disinformation on social media. Another major actor was the GRU, Russian military intelligence. Silicon Valley has shared details and data on what the IRA did; we do not have the same grasp of the GRU's activity. The GRU is better funded and more persistent, and it is easier to just focus on the IRA as a network. It also matters because the GRU is responsible for the most complex techniques that we saw in this campaign: you hack someone's email, you make it leak. Those are the techniques that strike at the core of our networked public sphere; they attack us exactly where we are weak in our own democratic processes. The second example is the amount of targeted messaging. When I worked with activists who were on the receiving end of the Russian campaign in 2016, a lot of the targeting was done through direct messages. We had never seen any of this; we only had a sense of how much of that activity happened, and it was so much more insidious. So we have to recognize that some of these techniques had an impact we have not fully measured. Direct messaging is, of course, also a key tool in targeting the media, which was likewise targeted by this foreign adversary.

I know people do not want to talk about Russia every day; I could if you ever wanted to. There are many foreign players in this game. Sometimes I hear that others are now copying what Russia did. That is actually historically untrue; others have been doing this for far longer. I think it is a testament to how much we missed that some of these actors started targeting the U.S. on social media as early as 2013. This was us waking up to the fact that a series of foreign adversaries has been using these techniques to target our conversations for a very long time. We have seen a lot of details on the recent Iranian campaigns, we are seeing the first data points on how China is using social media to target American audiences, and data is also available on Saudi Arabia and the fact that they have built this apparatus.
These foreign actors have their own preferred techniques and their own preferred targets and communities, and it is important to analyze them separately and together to try to understand the telltale signs they leave when they come from here or there. Adversaries are also engaged with us in a cat-and-mouse game. It was interesting around the midterms in 2018, where we saw Russia, and specifically the IRA, come back again, and we were given a way to assess how much they had progressed. They were better at hiding their traces, so the tactics were new and frankly more devious. This is not something we can detect once and take down; this is something we are going to have to engage with in the long run, a bit of a cat-and-mouse game. That being said, we have to be very straightforward about what we have learned and how things are evolving.

Very quickly on the rise of the for-hire market: this part was a little bit more fun, and a bit depressing, but looking at all that you can purchase online and what people are selling is quite disturbing. The for-hire market for disinformation is growing every day. It has small players, people who hack into pages and sell them, and very large mercenary-like troll farms, and it is global at this stage. It used to be more domestic than it is now. If you look at a country like the Philippines, it has a strong industry of troll farms, and what they are offering is an international and global business. What do we do about that? There are two important legs to attacking the problem, and the first is detection. We work very hard on better detection techniques, including with our Silicon Valley partners. The idea is to find forensic signals that show whether someone is manipulating the public discourse. Sometimes it is done through bots, if you do not have a big budget. If you have a little more budget, perhaps you can do something more subtle and complicated using an actual troll farm. Detection will not be enough, though, because frankly a lot of this for-hire market is close to what advertisers and marketers are doing and to the technologies being developed for them. There is a gray area. Here is an example. In 2018 we saw a candidate suddenly being pushed by accounts posting the exact same thing at the exact same time. It created massive chaos in Silicon Valley: the candidate must have hired a troll farm, or maybe has hired bots; something is wrong; this is disinformation and we have to take it down. In fact, the campaign had built an app, their supporters had downloaded the app, granted access to their own social media accounts and agreed to participate in this one push campaign. They had downloaded the app and were fully part of something they had agreed to. That gave Silicon Valley pause: what do we do with that? Is that okay? Is that coordinated inauthentic behavior, or is that simply how people will be campaigning? Because of our lack of serious dialogue on what we are willing to accept on social media and what we are not, we will find increasing numbers of gray-area situations like that as we head toward 2020. I would really encourage a serious conversation with candidates, parties and PR firms on what is an acceptable practice on social media, what borders on disinformation and what is simply a modern and creative use of digital tools. I think without that we are bound to have very complex and difficult conversations that will not help us or our institutions.
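To make the idea of a forensic signal concrete, here is a minimal, purely illustrative sketch in Python of one such signal: many distinct accounts posting identical text within a narrow time window. The field names and thresholds are assumptions for the example, not any platform's or researcher's actual criteria, and, as the campaign-app story above shows, a signal like this cannot by itself distinguish paid manipulation from authentic, opted-in coordination.

```python
from collections import defaultdict
from datetime import datetime

def flag_identical_bursts(posts, min_accounts=20, window_seconds=60):
    """Flag groups of posts where many distinct accounts published the exact
    same text within a short time window.

    posts: iterable of dicts with 'account', 'text', and 'timestamp'
    (ISO 8601 string), e.g.
    {'account': 'a1', 'text': 'Vote X!', 'timestamp': '2019-09-17T10:00:05'}.
    """
    by_text = defaultdict(list)
    for post in posts:
        by_text[post["text"]].append(post)

    flagged = []
    for text, group in by_text.items():
        accounts = {p["account"] for p in group}
        if len(accounts) < min_accounts:
            continue  # not enough distinct accounts to look coordinated
        times = sorted(datetime.fromisoformat(p["timestamp"]) for p in group)
        spread = (times[-1] - times[0]).total_seconds()
        if spread <= window_seconds:
            flagged.append(
                {"text": text, "accounts": len(accounts), "spread_seconds": spread}
            )
    return flagged
```

Real investigations layer many weak signals of this kind (timing, shared infrastructure, account-creation patterns) and still end with a human judgment about intent.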
So, two big things to keep in mind from that. How worried are you about coordination between all of those adversaries, Russia, Iran, China, Saudi Arabia? Will they be coordinating? And for all of us, we have to be thinking about the norms of political campaigns and what counts as an acceptable political strategy, what we would want campaigns to be doing, versus what counts as inauthentic, coordinated, manipulative behavior that we want to stop. That is a really hard question. I think we will turn to Simone. Maybe you can help answer that question.

Thank you and good morning. Can you all hear me? Is this better? Yeah. Good morning and thank you. The comments so far have been very insightful. Obviously for YouTube, the challenges raised by malicious actors who would try to use the platforms to deceive our users and harm them run counter to our mission, which is to connect people to useful information, and also to our business interests. These are the kinds of challenges we have had to deal with since the early days of our platforms: more broadly, the ways in which people try to elevate their content inauthentically for the purposes of scamming or making a profit, cross-linking websites and so on. The challenges raised by disinformation when it comes to the functioning of democracy are top of mind for us, and we take them extremely seriously. I would add that we take them seriously during, but not only during, elections. We try to have responses that extend beyond the scope of civic events, precisely because the actors who run these information operations do not wait for elections to begin; they use the quiet time to plan their efforts and plan ahead, and we try to stay ahead of that. Very briefly, I will go into the high-level approach we deploy across multiple platforms, including Google and YouTube. We try to thwart these efforts with the understanding that there is no point at which our job is done; each time we do something new, the other side does something different as well. Then I will go into some of the emerging threats we see around the world and how we try to stay ahead of those. I am happy to dive back into these points during the Q&A.

In terms of how we approach these issues at the company level, we have three major types of intervention, in addition to collaborating with others in the industry and with educators, built into our products. First, we try to address the challenge by designing our systems so they make quality count. By this we mean they try to form, at the algorithmic level, an understanding of which sources are authoritative on an issue, and to elevate those in response to searches and over the course of users' experience. That, as it turns out, is also what our services were built on to begin with, so even though we do not have a perfect solution, we keep innovating; Google Search alone made a great many changes last year, so it is not a done deal. This is something we have been working on for quite some time. The flip side of elevating quality is trying to understand what content should have reduced spread, and reducing its spread in recommendations on YouTube. The second layer is trying to understand what kinds of factors and behaviors malicious actors are going to use to try to deceive our systems and to game them. To that end, we have had for quite some time now policies, rules of the road, sorry, that provide a sense of what is permitted and what is not, and automated systems and human teams try to catch bad actors who would try to skirt these policies. If you had created a piece of content that you wanted to propagate very fast, you might try to game our systems in that way.
Thwarting these inauthentic forms of amplification is top of mind for us. We also have policies about deceptions that are harder to catch, which we invest a lot of research into, with teams looking at misrepresentation: it is not okay to misrepresent ownership or to impersonate another channel on YouTube. So that is the malicious actors. The third layer is to provide context as often as possible, giving users information that helps them navigate what they are looking for. These are things like information panels; on Google and on YouTube you would see panels that show you, for example, whether a channel is a public broadcaster, and you see those with breaking news on YouTube. We also have ways to provide more perspectives, like a holistic full-coverage function that shows what coverage exists on an issue, not personalized, and lets you explore what is available. Those are three ways we try to counteract these efforts on our products, and it is always a work in progress. We double down when we know elections and other similar events are coming, because we know we are likely to see more attacks or efforts to thwart our systems.

It has been interesting to observe over the past few years, as we have seen variations of these challenges in numerous countries around the world, how much the local specificity of each society in which we operate matters. A simple example is the notion that in many places around the world group messaging apps are a significant part of information discovery, in ways that are far more pronounced than here in the United States. That changes the vectors of attack for malicious actors; it means attempts to manipulate our platforms are delivered at a different point in the process. We have to be mindful of that as we expand into each new country and try to stay ahead of these actors' goals. It does not mean we have to start from scratch every time, but we do have to be mindful of those differences. The last thing I will say is that beyond those local specific needs, there are two other flags for this group. One is the rise in concern around manipulated media, whether or not it is AI-generated, which poses a new challenge for us. YouTube has seen, from its very early days, individuals manipulating the content of videos by editing them, splicing them and so on to deceive users. It is something we are mindful of because it is quite cost-effective for the malicious actors who use these techniques, and they can gain some traction. We do have policies around those to make sure we can thwart them, and we have systems trying to catch them, but it remains top of mind for us. The other is that as large platforms create more friction for the operations of these various actors, it might make sense for them to set up their own spaces, in which they reach a smaller audience but can reach scale faster and with less friction, and then try to come back to the larger platforms. That is not something we have observed widely at this point, but it is one of the many ways we try to ask: what do they do next, and how do we stay ahead of that? We try to stay ahead of threat actors by looking at what they do and understanding how it may affect our systems.
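As a purely illustrative aside on the "make quality count" idea described above: one simple way to think about it is a ranking score that blends topical relevance with a per-source authoritativeness estimate. The sketch below, in Python, is a toy under assumed weights and scores; it is not a description of Google's or YouTube's actual ranking systems.

```python
def rank_results(results, authority, alpha=0.6):
    """Rank (source, relevance) pairs by a blend of relevance and source authority.

    results:   list of (source, relevance) tuples with relevance in [0, 1]
    authority: dict mapping source -> authoritativeness score in [0, 1]
    alpha:     weight on relevance; (1 - alpha) goes to authority
    """
    scored = [
        (source, alpha * relevance + (1 - alpha) * authority.get(source, 0.0))
        for source, relevance in results
    ]
    return sorted(scored, key=lambda item: item[1], reverse=True)

# Toy example: for a breaking-news query, a well-established outlet can outrank
# a slightly more "relevant" page from a low-authority site.
print(rank_results(
    [("established-outlet.example", 0.80), ("unknown-site.example", 0.85)],
    {"established-outlet.example": 0.90, "unknown-site.example": 0.20},
))
```

In practice, of course, the difficult and contested part is estimating authoritativeness in the first place, which is why this kind of approach can draw resistance from actors who dispute those assessments.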
I guess we will now turn to Suzanne.

Thank you, Eileen. If you begin with the premise that freedom of expression is fundamental to democracy, then government-mandated removal of content deemed false or deceptive is a challenge for democracy. Western democracies are stymied over how to tackle content that is odious and manipulative, that is disseminated with the goal of dividing society and destroying our faith in institutions, but that is not illegal. Most of the lessons that we have learned, and that the group I formed has learned, in looking at laws, regulations, proposals, and initiatives, public and private, really point to what not to do. It is hard to come down to what will actually work while protecting freedom of expression. Governments often lump together hate speech, disinformation, and viral deception in an effort to regulate platforms per se. Some European member states, including the UK, at least for the next couple of weeks, and France, have proposed new regulatory regimes using existing models, either financial regulation or broadcast regulation. The UK white paper would appoint a new regulator who would not police truth on the internet but would mitigate the harm caused by disinformation and online manipulation, which it describes as threats to our way of life. It purports to focus on platform behavior rather than content itself, recognizing that it is impossible to catch all harmful content. Platforms would owe an ill-defined duty of care to the public, and they would be evaluated on whether they have taken proportionate and proactive measures to help users understand the nature and reliability of the information they are receiving and to minimize the spread of misleading information. They would be subject to a code of practice requiring them to beef up transparency, including clarity around political advertising, which we are seeing as a major theme across the board, to cooperate with fact checkers, and to boost authoritative news while making content flagged by reputable fact checkers less visible to users. All of this sounds good, but the focus on vaguely defined harm to society could clearly have a chilling effect on free expression, and platforms are more likely to remove more content to avoid heavy fines. The amorphous duty of care might lead to proactive monitoring of legal content, which again is very troubling. The UK Parliament is expected to take up legislation in the fall that would set up a regulator to address illegal content, leaving the question of legal but harmful content for a later day. France enacted a law in 2018 to address manipulation of information online around elections. It imposes strict rules on the media for the three months leading up to an election and gives the authorities the power to remove fake content spread by social media and to block sites that publish it. It requires platforms to disclose how much money was paid for sponsored political content, candidates can sue for removal of contested news stories, and, importantly, judges have 48 hours to rule, so it does go through a judicial process. Last May the French government also issued a paper proposing an innovative regulatory regime that would focus on platform behavior, again, not content. It would examine transparency, terms-of-service enforcement, and redress for those who were harmed. It avoids regulating the content itself. The European Commission put forward a package of activities to address disinformation in the run-up to the European Union parliament elections this past May. It expanded digital literacy and supported quality journalism.
It elevated fact checking, supported a network of fact checkers, and also promulgated a code of practice. The commission itself has wrestled with the tension between regulating content and the conviction that it does not want to be a ministry of truth, a conviction shaped by the experience of those parts of Europe that lived under communist regimes. The code of practice is a self-regulatory measure. It applies only to platforms, and only a handful of platforms and advertising trade associations actually signed on to the code. It encourages transparency, media literacy, researcher access to data, and ad transparency. It accelerated the closing of fake accounts, the labeling of bots, and the prioritization of relevant, reliable information, and it created greater cooperation between the platforms and EU governments. Still, however, the EU is talking about regulating the platforms and will more likely than not introduce legislation once the new commission is impaneled this fall. They are looking at the Digital Services Act, which would also revisit the e-commerce directive, the EU's version of the safe harbor for platforms. In conclusion, looking at the efforts taken so far, apart from some of the Nordic and Baltic countries, which have avoided content regulation in favor of public education, fact checking, and encouraging quality journalism, there is no silver bullet to address disinformation in a manner that is true to freedom of expression. The consensus, however, is that transparency with respect to political advertising needs to be pursued, and that it is best to address behaviors and actors as opposed to the content itself. Thank you.

That is great. I want to get the audience thinking; after our next speaker we will ask you to come in, but really grapple with this question Suzanne raised about how hard it is to craft a regulatory response that actually works in combating disinformation but does not undermine free expression, and also with what you think about the distinction of not regulating on a content basis but instead going after manipulative behavior. Does that distinction make sense to you? So reflect on that. Our last speaker is David Miller.

Thank you. Can you hear me? Okay. So I cannot ask for a better introduction than that, because I am actually going to talk a little bit about content. I want to talk today about a policy solution that I think can help us bridge the gap between the democratic value of freedom of expression and ways that we can combat disinformation, and I will talk about that in the context of some of the work we did in Europe in advance of the elections for the European Parliament. Briefly, for those of you who do not know us, we are a global civic advocacy organization with some 53 million members in countries all over the world. We are exclusively funded by those members, and they take a very active role in deciding what we do, what issues we work on, and what campaigns we launch. Our members are deeply concerned about the threat that disinformation poses to our democracies, to the extent that they funded a large program we launched in Europe: the European elves to combat the trolls. Our team of elves, and I will not belabor the point, looked for and discovered disinformation networks in European countries, and we reported our findings to Facebook.
Facebook took action on everything we reported, and we estimate that the posts, pages, and groups Facebook acted on reached 750 million views just in the three months leading up to the European elections. I want to pull a couple of important things out of that. The first is that any reports you may have read about the demise of disinformation in the electoral context have been greatly exaggerated. Second, I want to mention that civil society can do detection and reporting, and there are ways to optimize that. The top issue I would point to is that it was difficult for us to get Facebook to take action in a timeframe that was relevant to the upcoming European elections. We were able to do it only because we have a certain measure of access and relationships and were able to bring a certain amount of pressure, or the threat of pressure, to bear, and that is too high a bar. There need to be better, more open, and easier methods for civil society to report disinformation they find and get it taken care of. The most important thing I want to pull out of this is that however many tens or hundreds of millions of people saw this toxic content, this disinformation, the vast majority of them, as we sit here today, do not know that they were duped. We are not currently doing enough to counteract the effects of disinformation that has already gotten out there. The solution I want to offer today, to start thinking about how to do that, is correct the record. It starts from the very simple premise that, of the tens or hundreds of millions of people who saw the disinformation content in Europe, really the only entities capable of reaching them and letting them know that they were duped are the platforms themselves. And so correct the record is a very simple idea: once a piece of content has been verified as disinformation by independent fact checkers, the platforms can and should tell each person who has seen, liked, commented on, shared, or otherwise interacted with that piece of content that it was false or misleading. This approach has a few advantages. First and foremost, it is not censorship. There is no ministry of truth. We are not asking a private actor or a government to determine what is true or false. There is no censorship; people are free to post and share whatever they like. We have seen exceptional results in studies comparing control groups with groups exposed to corrections, and we have done qualitative research online in the context of elections. Thank you.

So the big question for me is whether we can create a culture that does not want to be duped [laughter] and what that means for media literacy campaigns. We would now like to turn to all of you and see what kinds of questions you might have for any of the panelists on any of the themes. Please state your name and organization, and the mike will be passed. I see one right there. Janine.
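As a minimal sketch of the "correct the record" proposal just described, assuming toy data structures, an interaction log and a notify callback, rather than any platform's real API:

# Minimal sketch: notify every user who saw or interacted with an item that
# independent fact checkers later flagged as disinformation.
from dataclasses import dataclass
from typing import Iterable, Set

@dataclass(frozen=True)
class Interaction:
    user_id: str
    content_id: str
    kind: str  # "view", "like", "comment", "share"

def users_exposed(log: Iterable[Interaction], flagged_content_id: str) -> Set[str]:
    """Everyone who saw, liked, commented on, or shared the flagged item."""
    return {i.user_id for i in log if i.content_id == flagged_content_id}

def correct_the_record(log, flagged_content_id, correction_text, notify):
    """Send the correction to each exposed user via the platform's own channel."""
    for user_id in sorted(users_exposed(log, flagged_content_id)):
        notify(user_id, correction_text)

# Example usage with a stand-in notifier that just prints.
log = [
    Interaction("alice", "post-123", "share"),
    Interaction("bob", "post-123", "view"),
    Interaction("carol", "post-456", "like"),
]
correct_the_record(
    log,
    "post-123",
    "Independent fact checkers rated this post false; see the linked fact check.",
    notify=lambda user, text: print(f"notify {user}: {text}"),
)

The premise is the one the speaker states: only the platform holds the interaction log, so only the platform can reach everyone who was exposed to the flagged item.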
I was a longtime journalist and now I teach at Stanford. I appreciate the effort you are doing on correct the record. It was interesting to hear from Professor Fazio on the extent to which people do change their minds when you fact check; I know a lot of research shows the opposite, that once people are exposed to mistruths it is very hard to change their minds, and I do appreciate the fact checking of that point. If something is not fact checked and it is out there in the information universe, you cannot fact check an infinite amount, as you alluded to. I am also interested in the manipulation of mainstream media and in making sure we are not aiding and abetting that effort; I think there would be a will among the major media organizations to abide by limits. In the interest of time I will just put that out there: when a news organization decides to report on disinformation that is going viral, it does not have to link to it, and I know there are ways, like screenshots, to avoid boosting it, and a lot of ways the mainstream media can decide not to expose people to it. This brings me to my question. For me, the share or the like button is a national security threat, and a lot of you have talked about the spread of misinformation and disinformation. When I came here in an Uber, I had the opportunity to rate the Uber driver. Why is there not a way to score the quality of information, and is there a way to put in a pause before you spread or share something, to say this thing was checked and was not credible, are you sure you want to share it? Something to slow it all down and allow the critical thinking that Professor Fazio alluded to. I know that does not go with the monetary incentives of the companies to spread and share, but is that something we can add to the discussion? Thank you.

Three important things: correct the record, the role of mainstream media in disinformation as opposed to social media, and then designing in friction and what people think of that. Anybody want to take any pieces of that?

When correcting misinformation, sometimes the best thing to do is not to expose people to the false information to start with; once they have seen false information, you are in a hole that you are trying to dig yourself out of. Fact checks do change beliefs. Researchers were concerned with backfire effects, the idea that if I really believe something and then you correct it, I will actually believe it more than I did before, but that does not seem to happen all that often; it is very limited, maybe in select cases and for certain kinds of information, but on the whole it is not a big concern. Fact checks help, but they do not help as much as never seeing the false information in the first place. So one of the issues I think we have to work through with something like correct the record is how much interaction with the false information you need before you get the correction, because what you do not want is to show the correction to people who would never have been exposed to the false information if they had not seen that correction. You want to make sure they are engaging enough that they actually need the correction you are giving them, but if they do need it, I think it is a useful thing to have.

Anybody else?

Can I just jump in on the efficacy? I will be brief. Total agreement that the studies show it works; we are actually launching an academic study in cooperation with researchers, and I will be happy to share those results.
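The "pause before you share" idea raised in the question above can also be sketched minimally; the DISPUTED_ITEMS table and the confirm callback here are hypothetical stand-ins, not any platform's actual sharing flow.

# Minimal sketch: if fact checkers have disputed an item, surface a warning
# and require explicit confirmation before the re-share goes through.
DISPUTED_ITEMS = {
    "post-123": "Independent fact checkers rated this claim false.",
}

def share(item_id: str, confirm) -> bool:
    """Return True if the share goes through, False if the user backs out."""
    warning = DISPUTED_ITEMS.get(item_id)
    if warning is None:
        return True  # nothing disputed: no extra friction
    # Friction step: show the warning and ask for an explicit yes.
    return confirm(f"{warning} Are you sure you want to share it?")

# Simulated user who reads the prompt and then declines to share.
def declining_user(prompt: str) -> bool:
    print(prompt)
    return False

print("shared:", share("post-123", confirm=declining_user))
print("shared:", share("post-999", confirm=declining_user))

As the panelists note next, the trade-off is that the same friction that slows disinformation can also slow legitimate, time-sensitive sharing.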
Real quickly, I think it is an important question. We do look at all of the product design options we have at our disposal to try to reduce the spread of deceptive practices, and there are trade-offs for all of them. For the example you mentioned, there might be cases in which you want fast sharing to happen because somebody might be calling for help; those are the things you balance. Elevating quality content and making sure to elevate authoritative sources are ways we try to get in the way of malicious information spreading. There is probably more we can do, and we think about these issues every day, but quite often solutions have trade-offs and are not easy to navigate without causing harm in some other case.

We help academics access social data and have done so for about 10 years. Without oversimplifying, the question is: how do you know it when you see it? Some things are easy to do with machines, the low-hanging fruit that Google and Twitter catch, the easy ones; some things are hard and require human judgment. The problem is the diversity of domains; some systems cannot tell the difference between marijuana and tobacco. My question, to get away from the specifics: what made Google special is that it looked at the web and said that on the web not all pages are created equal. Some are strong and some are weak, and some are strong in specific domains for specific tasks. You have to have a way to rank humans the way Google ranks pages to beat the demons. So that goes back to the quality of sources, and also to humans and machines and how you rank humans. Anybody? Yeah, that might be you.

The only thing I would say is that, as far as I can tell, we do not rank humans, nor do we intend to engage in that pursuit. However, we do of course have comprehensive teams that complement our systems and make sure they look at the harder calls you were mentioning, the ones machines are not good at. I am not suggesting by any means that our trust and safety efforts across the company are insignificant.

State your question.

Very quick. Just note that Google does rank humans; google yourself. You have left aside the question of Google Plus and everything around that, but it was very much an effort to associate different people with different amounts of authority on different terms. Let us be honest about that: we are ranking humans, we are ranking output, we are ranking people, and there are really important ideas here about not just verification in social media but trust in social media. I might know a celebrity is who they say they are, but they are still sharing something that is false, and I would like to see more work on authenticity in what you are doing. My name is Alex Howard; I previously worked at a foundation on these issues. The question I would ask is why we are not talking about cable news and talk radio. We are talking all about technology, but you all know very well that these are two platforms for spreading the information, so why is that not part of this discussion? And a specific question: nice to see you again, and thank you for your work on these issues. Where is the current leadership at DHS on these issues? Why are they not here? What did they do wrong in 2016, and what should they be doing right now? Thank you.
Yes, hello, Kathy. A question for Google. You mentioned the knowledge boxes; those fact boxes are dependent on Wikipedia. Wikipedia editors are anonymous, only a small number of them decide the controversial entries, and entries can get locked after someone dies or after a major event. Do you see yourselves as dependent on Wikipedia, which is the first search result on many topics, and do you feel that Wikipedia needs to be fixed, perhaps by you buying it and fixing it?

I am just going to underscore the point that cable news and mainstream media are getting off the hook here, and that is a big deal. But there were two direct questions, one to Michael and one back to Google, so why don't we start with Michael.

So, first, as I think I mentioned in my remarks, cable news and talk radio are a part of this. A lot of what is going on on social media platforms is a desire to engage with personalities who are driving discussion on certain cable news networks and talk radio, and to propagate it; you have to look at it as an ecosystem. On the DHS question, I cannot tell you whether they were invited or not. I will say DHS is now doing two things: they are working with the states to help them raise their level of cybersecurity with respect to voting machine infrastructure, and they have at least begun the process of putting together a strategy for dealing with foreign-nation-driven misinformation. It gets delicate for the U.S. government because there are restrictions on what the government can do when it comes to rating information and the quality of information. As to what happened in 2016, I was not in the office then, but part of the problem was uncertainty about whether the government ought to publicize Russian disinformation, or whether doing so would in and of itself be viewed as political manipulation. I think that is a problem. I will tell you what the Canadians have decided to do: they have set up a panel of senior civil servants who are not politically affiliated, and in the run-up to the election this panel will judge whether the degree of risk of foreign interference is sufficiently high that it warrants warning the public that this is going on. That creates a more or less neutral, independent arbiter of when to raise the red flag, so you do not have to worry about a political actor either overreacting or underreacting.

On the knowledge panels: we make sure we have resources in case those get gamed or hacked, and we do not have any other plans at this time that would be dedicated toward the sources behind those panels.

Unfortunately we are at time. As I said, we had an embarrassment of riches in terms of speakers and an incredibly complex, rich topic, but this is just the beginning and to be continued. Thank you all. [applause] [inaudible conversations]

The next panel is where I hope we take our collective sense of alarm and get concrete about what can be done. You will notice in the title of our event today that it is not just democracy, integrity, and elections, but the 2020 elections. We have a time horizon of 14 months.
I think Michael put it very well in terms of what the scenario looks like: it is sort of a butterfly ballot on steroids. If we have a close election, if we have a contested result in even just a couple of counties, and there has been an exposed disinformation campaign that may have skewed the results, what I worry about is that I do not know how we survive a second presidential election in this country that comes into serious question. At PEN America we come at this from a slightly different angle; we are a free speech organization and
