Transcripts For CSPAN3 Georgetown Law Discussion On Combating Disinformation On Social Media 20240713

Next, a look at how political campaigns can fight disinformation on social media. This is an hour and a half.

All right, everyone, we'll get started. Thank you so much for joining us today. We do original policy work in collaboration with our faculty and students, and we spend a lot of time thinking about how we train lawyers and policymakers to better serve our society. Today's topic is election integrity in the networked information era. Technology is reshaping many facets of our society, including changing the venues and tone of information sharing and public conversation. Understanding the effect of these changes on our elections is a priority of the highest order, since elections are an area where, above all others, we need public trust. There is no question that public trust in democratic institutions is being challenged in seismic ways today. All of which is to say: we have a lot to talk about. You will hear from a wide variety of people thinking about these questions from a variety of disciplines. That matchup is entirely intentional, and it reflects our approach here at Georgetown, where we strive to break out of traditional academic silos. We are approaching these questions through four different lenses in our program today: questions of information sharing and public discourse; the significant risk of voter suppression and depressed turnout; election security; and, lastly, a specific conversation on policy solutions. We'll have student presentations and posters in the back of the room reflecting the work our students have been doing in this area. We are grateful to our editor in chief and the full journal staff for their work pulling this symposium together. So with that, I am going to pass it over to Josh to welcome you, and we'll start the program. Thank you. [ applause ]

Good morning, everyone.
My name is Josh Banker. Welcome to today's symposium. Thank you all for coming, and thank you to our wonderful guests for showing up today. I want to extend a quick thanks to the Georgetown faculty: without your support, none of this would be possible. I also want to say how proud I am of our journal here at Georgetown Law. We are a student-run law review focused on the intersection of law and technology, on the hardest questions faced by legal scholars, technologists, and policymakers. We have covered online manipulation, the regulation of global health data, platform speech regulation, deepfakes and their impact on the news, robot corporate board members, the Committee on Foreign Investment in the United States, and the ways different technologies impact our daily lives. Over our four years, we have established ourselves as a place of ideas. Today's event fits neatly into that tradition, and we are incredibly excited for the day ahead, between the wonderful speakers we have and the students who are here to present their posters. We have an event that is going to speak to issues we have seen throughout the past week. I look forward to hearing from everyone today. I know the journal is helping to bring these papers forward to publication, and I encourage you all to look out for that. Thank you all for coming. Without further ado, I would like to turn it over to Professor Carroll to start us off. [ applause ]

Thanks, Josh, and thank you all for being here today. We are grateful. I am glad to have this panel on networked information ecologies be the first of the day. I see it as establishing the terrain, the landscape, that underlies what we'll be talking about the rest of the day. I want to offer a warning along those lines: the terrain is rough. Our panelist Whitney Phillips has a piece entitled "The Internet Is a Toxic Hellscape." The subhead on the essay was "But We Can Fix It."
I don't want to overpromise, but in addition to spending some time surveying the hellscape, we'll be offering some metaphors and possibilities, maybe, for thinking about something less hellish and what that might mean for us. You have four impressive people here in front of you. In introducing them, rather than just reciting their bios, I would like to tell you why I am so grateful that each of them was able to join us today. I am going to ask them to say all the good things they have to say in about 12 minutes apiece, then I will ask a question or two, and then we'll open it up to all of you for about 20 minutes of questions. Feel free to jot things down or think of questions as we are speaking. First, to our left, we have Mike Ananny. He thinks about the networked press and the networked public sphere. What I love about Mike's work, and why I assigned it, is the way he pushes us to think about what we want our relationships to look like, what we want our democracy to look like, what we want our public to look like, and what we want our press to look like, and to consider the preconditions for getting there. Then we have Leticia Bode, a professor in the Communication, Culture and Technology program. Leticia has done really interesting work on how we correct misinformation online. We have Whitney Phillips, an associate professor. Whitney has been thinking about information disorder, including misinformation, for many years, and she has done vital work getting journalists to see how they can contribute to information pollution and giving them practical tools to address it. I find her use of ecological metaphors to be powerful in helping me think all this through. Finally, we have Lam Vo, a senior reporter. Lam has made it her job to understand how the social web works and to help readers and other journalists understand it better. She is a leader in using data in journalism, and she uses her power to make journalism more impactful, diverse, and responsive to its audience.
With that, I want to hand it over to Mike to get us started. [ applause ]

Thank you very much. That was the kindest and most generous introduction I have ever had. I appreciate it. Thank you all for being here; it is a total treat. What I want to do today, as somebody who comes to these questions as a communication scholar, as somebody who never practiced as a journalist but thinks about what the press is, is to tell you the story of a study I did on the intersection between platforms and journalists, and to offer three ways to think about this terrain that Erin describes: three ways of seeing what platforms do with news. I am doing a little bit of a take on Ian Hacking's argument about "making up people." A lot of the ways we describe people, the ways we think about who they are and put them into categories and boxes, are acts of making them up, of constructing them, of choosing which parts of them we pay attention to. What I want to argue is that a lot of the language of contemporary platforms is a kind of making up, and it shapes what kind of political action people get to take. I want to unpack that a little bit. I come to this with two questions. First, what assumptions do platforms make about speech when they cast people as "users"? They are trying to convert people, in a way, to think about them as users of a system and not necessarily as political beings. Second, could the assumptions that platforms are making be sites of governance? We are in this moment where we are struggling with what it means to govern platforms, trying to understand the pressure points and opportunities for governance that may exist. Drawing on work in science and technology studies, I try to think about whether this concept of infrastructure can be something we think with and deploy toward platform governance.
If we see platforms as infrastructure, what opportunities might that give us to regulate platforms in different ways, and to trace the public dimensions, or tensions, of platforms? Infrastructure is the concept I want to deploy. What do I mean by infrastructure? The scholarship defines it in a few different ways. First, infrastructure is largely invisible: it works best when it stays in the background and we don't notice it. Second, it is the condition for shared meaning: all the rules, assumptions, values, and language that go into how we think and act together, so that we know how something is going to behave. When the system works, it stays invisible; we only notice infrastructure when it breaks or is reconfigured. Third, infrastructures build what scholars call stable bases: you should not be questioning things or revisiting assumptions; you do the work of maintaining the infrastructure, but you don't question it. Finally, infrastructure entails boundary work: infrastructures are moments when professionals of different backgrounds, ideologies, and languages come together to sustain something. In a lot of ways, our platforms meet these criteria. We don't know what it means when they are broken, or what it means when they are working. I want to offer that as a way to think about it. To get more grounded and concrete, I want to offer the story of a study I did. It focused on the fact-checking network that Facebook formed with five U.S. news organizations. It has changed a lot since; I won't talk about the current configuration. This is a snapshot in time of a particular arrangement between a platform and news organizations, and I want to use it as an infrastructure to think with, about what kinds of regulatory opportunities may exist. Just to give you a little background, this partnership was formed immediately after the 2016 U.S.
presidential election. You are probably aware that Facebook was under pressure to do something about this phenomenon of "fake news." That pressure continues, but what I want to drill down and focus on is a particular workflow the partnership established. That workflow centered on a dashboard. You want to study the thing that seems uninteresting: dashboards are fascinating even though they look boring. Fact-checkers would go to that dashboard, pick the stories they wanted to work on, do their fact-checking, and put the results back into the system. Facebook would suck that up, learn some patterns, do some magic they would not talk about, and that would affect how speech circulated on the platform. This dashboard was a place where Facebook engineers, Facebook's speech regulators, and journalists all had to figure out how to work together on the circulation of speech on a platform. What I want to ask is: how is this an infrastructure of free speech? How does this arrangement between organizations, working through this dashboard, work or fail to work? It is a dashboard that probably none of us has seen: largely invisible, but incredibly powerful in shaping how speech circulates. What assumptions does this infrastructure make, and are those assumptions something we can use as sites of governance? I want to offer three ways of seeing it, as potential opportunities or objects of focus for thinking about what it would mean to regulate a partnership like that, one using an infrastructure like that, with that kind of impact. I will go through them, and in Q&A we can talk more. First is how the partnership assumed people talk: an informational idea of citizenship. Facebook framed the problem for the fact-checkers as one of information, of things that were wrong, and Facebook talked about the community coming together through truthful reporting.
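The dashboard workflow described above, where the platform queues flagged stories, fact-checkers claim and rate them, and ratings flow back to demote (not remove) false content, can be sketched as a toy model. This is a hypothetical illustration, not Facebook's actual system: the class names are invented, and the 80 percent demotion figure is the number quoted later in this talk.

```python
from typing import Optional
from dataclasses import dataclass

@dataclass
class Story:
    url: str
    reach: float = 1.0            # relative distribution weight in the feed
    rating: Optional[str] = None  # None = unreviewed, else "true" / "false"

class Dashboard:
    """Toy queue connecting the platform to partner fact-checkers."""
    def __init__(self) -> None:
        self.queue = []

    def flag(self, story):
        # The platform enqueues stories its systems suspect are false.
        self.queue.append(story)

    def pick(self):
        # A fact-checker claims the next story to review.
        return self.queue.pop(0)

    def submit(self, story, rating):
        # The rating flows back; false stories are demoted, not removed.
        story.rating = rating
        if rating == "false":
            story.reach *= 1 - 0.80  # the "80 percent" figure cited in the talk

dash = Dashboard()
story = Story("https://example.com/dubious-claim")
dash.flag(story)
reviewed = dash.pick()
dash.submit(reviewed, "false")
print(reviewed.rating, round(reviewed.reach, 2))  # false 0.2
```

The sketch makes the speakers' governance point concrete: the demotion constant and the flagging criteria live entirely on the platform side, invisible to the fact-checkers who produce the ratings.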
It is very much an information-centered model of what good public life looks like. Fact-checkers disagreed about whether to sequester false information: whether it should be put in a separate area where people could still see it, or whether you should juxtapose claims and counterclaims. There was almost no support for removing the information entirely. When I interviewed fact-checkers, they said: yeah, that's too far. We'll make a strong determination of what is false or true, but it is not for us to decide that it disappears. Even among people heavily invested in making strong claims, there was strong resistance, rooted in the idea that people share things because they're passionate about them, because they're invested. The second thing I want to talk about is categories. Categories, and the stability of categories, are part of the foundation that made this infrastructure work. First, Facebook essentially outsourced the construction of true and false to a set of professionals, these fact-checkers, so it can point to them and say: that's where the determination of truth or falsity lies. We don't want fact-checkers to second-guess themselves; we want them to be strong. Second, Facebook defines popularity. In the dashboard, one of the columns was "popularity." This was a highly contentious column. The fact-checkers, first of all, said: I have no idea what it means. We asked Facebook what popularity means, and we don't know the mechanism that determines it; we couldn't see inside it, and we had no information. Some stories were not flagged as popular, and fact-checkers could only surmise why; maybe because false or fake news makes money. They wanted Facebook to define "popular" for them, to help them organize their work. One of them said: you don't want to write about something that has not gone viral, because you don't want to elevate its visibility. Third, Facebook's words were changeable. They talked about "false news" and "misinformation" quite interchangeably.
Last, I want to talk about how the infrastructure thought about governing itself, about the circulation of speech. Probability seemed to be the logic, the way of dealing with scale, that it consistently engaged. A Facebook manager says: when partners rate something as false, we rank those stories down on News Feed by 80 percent. This number, 80, came up a lot. Fact-checkers, unsurprisingly, said: I don't know how that number is calculated. We have no proof of it. I can't fact-check that claim. A claim that their own partnership produced, and they can't fact-check it: that's a problem. Facebook has said elsewhere that it doesn't want to be an arbiter of truth. The last point I want to make on probability is that Facebook is not alone in this phenomenon. YouTube, Twitter, and Amazon are all engaged in it, making somewhat arbitrary claims about whether the network should work 80 percent of the time or 90 percent of the time, and in what contexts. There is a struggle for a way of dealing with scale that runs through probability. Okay, to finish: why does this matter? I want to come back to Hacking for a minute. There is a debate among computer scientists and philosophers from the '80s and '90s that can be revived: be careful of the idea of probability. So what I have tried to do, quickly, is to say that if we see platforms and journalists as creating speech infrastructures, we can start to see some dimensions coming out of that, and maybe those dimensions can be ways of thinking about regulation and opportunities going forward. Okay, I will stop there. Thank you. [ applause ]

I apologize for spilling coffee on you. I am from the other campus; I have been here for eight years, and this is the first time I have been to the law school. Thanks for getting me downtown. I am going to say a lot that will echo what Mike already said, but in a much less eloquent, bigger-picture kind of way, so you can look forward to that.
It does suggest that we are on the right panel, and maybe we need to be working together on some of these ideas, too; that's great for me. Specifically, I'm going to talk about what platforms are doing right now, how people are thinking about what platforms are doing right now, and, based on my own research, an alternative, or a supplement, that can help in combating misinformation on social media: user correction. So the problem we're talking about today is that the internet is terrible. The specific element of that I'm going to talk about is that there's a lot of misinformation on social media. I don't think I have to make this argument too strongly; I think people basically agree with me on that. One caveat, though: some recent studies have looked at how much misinformation sharing actually happens on social media, and there's probably less than you think. Keep that in mind; the best guess based on empirical research is that about 8.5 percent of users have shared misinformation. Still, I think there is a lot to be done about this. A lot of the proposals that I've seen have been technology-based. These are technology platforms, we think the problem is technological, the scale is technological. We've had misinformation forever, but it is spreading farther and faster because of the nature of the technology it's spreading on. What I'm going to argue is that there are ethical problems and empirical problems with that approach. This is uncomfortable for me, because I almost never say the word "argue" when I'm presenting; I'm a very quantitative researcher. Mike has already talked about it, and others have too: maybe we don't want technology firms to decide what is true and what is false. This is the first big problem. I'm going to use an example: platforms have been taking a lot of action on misinformation related to vaccines.
Here you see, on the far side, Pinterest; they were kind of the first movers on this issue. If you search for vaccines on Pinterest, you don't get pins from users. You only get pins from certified health organizations. So here you see the WHO and the AAP, the American Academy of Pediatrics. Twitter and Facebook followed suit on this: if you search on Twitter, the one on top, you get a recommendation to go to Health and Human Services; if you search on Facebook, you get a recommendation to go to cdc.gov. Interestingly, all the platforms are recommending different organizations, but that's a completely different thing to talk about. I think people see this example as the way platforms can do this, right? It's easy enough to identify where there's lots of misinformation and then shut that down for users who are seeking out information. And this ties back to what Mike was saying about what we want users to be doing: if they're seeking information, that's a particular behavior we want to encourage and reward with good information and not bad information. So the idea that we can box off all the crap peo...
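The search intervention described here, suppressing user content for misinformation-prone queries and returning only vetted sources, amounts to a simple lookup layer placed in front of ordinary search. A minimal sketch, in which the topic list, source domains, and function names are all invented for illustration:

```python
# Hypothetical sketch of a "certified sources only" search intervention
# for misinformation-prone topics. Topic and source lists are invented.

CURATED = {
    "vaccines": ["who.int", "cdc.gov", "aap.org"],  # vetted health sources
}

def search(query, user_results):
    """Return ordinary user content, except for flagged topics,
    where only curated authoritative sources are shown."""
    for topic, sources in CURATED.items():
        if topic in query.lower():
            return sources       # user content is boxed off entirely
    return user_results          # other queries pass through untouched

print(search("vaccines and autism", ["pin1", "pin2"]))
print(search("knitting patterns", ["pin1", "pin2"]))
```

The design choice the speaker highlights is visible even at this scale: the intervention only fires for users who actively search a flagged topic, which is exactly the information-seeking behavior she says platforms can most easily reward with good information.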