
CSPAN2 American Bar Association Discussion On Internet Communication Law July 12, 2024

General's office, and I worked with the SEC [inaudible] over the years. We have a great panel with us today. We have, from Georgetown University, the former director of the Bureau of Consumer Protection at the Federal Trade Commission, and we have Matt Schruers, president of the Computer and Communications Industry Association, here to discuss the Department of Justice's proposal. We have Lauren Willard, who is counselor to the Attorney General. We are lucky to have this great panel here. Ms. Willard will walk us through that proposal, and the others will then offer commentary, including about the consumer side of things, and we have [inaudible].

Again, I want to give a little level-set. I'm sure most of the people who dialed in have some familiarity with Section 230 and understand what it's about, but I want to give some history on these things to provide context for these issues. We can talk about the Communications Decency Act, Section 230, through the recent, or fairly recent, film The Wolf of Wall Street, and in particular the Jonah Hill character, who is loosely based on Danny Porush. He was an employee of Leonardo DiCaprio's character. To refresh your memory as to who these people are, Mr. Hill's character was the one who went up to one of his employees and said, do more business or I'll eat your goldfish. He later commented on that line from his time in the movie. But I reference this because Stratton Oakmont probably did more for the internet than anyone knew when, around 1994, they sued Prodigy. Prodigy ran a bulletin board for financial discussions, and there were posts made on the board saying the Stratton firm was engaged in major criminal fraud, that its president was soon to be proven a criminal, and that it was, quote, brokers who either lie for a living or get fired. Prodigy got sued by Stratton Oakmont on various libel claims, and the court at the time analyzed this on the basis of newspaper-type theories of liability for publishing libelous content.
The court held that, yes, Prodigy was liable. The key point that we are all familiar with, and that is important for today's conversation, is that by taking on the task of trying to police [inaudible], Prodigy took on responsibility for everything, so if they made any moderation effort at all, they could be held liable. Some members of Congress, including then-Representative Ron Wyden, caught this and thought it was a problem, and proposed what has evolved into the Communications Decency Act, Section 230. Senator Wyden was recently interviewed about this and said that part of the genesis for this effort was that if someone exercising control could be held personally responsible, then, as he put it, nobody will invest in something when they themselves will be personally on the hook. So, seeing this as germane to the growth of the internet and to moving things forward, they passed the Communications Decency Act, Section 230, which has essentially defined at least the social media aspect of the internet as we know it. It is one of the most important laws out there. With that in mind, that is the history from the ice age known as the mid-1990s, and with that taken care of we can hand it over to Ms. Willard, who will talk about some of the recent developments and hopefully some of [inaudible]. Thank you very much.

Thank you for listening in today. The Department of Justice started looking at Section 230 of the CDA almost one year ago now. It grew out of our broader review of online platforms, which was primarily focused on competition and antitrust, but one thing we discovered as we looked at the concerns people had about online platforms is that not all of those concerns were antitrust concerns. Section 230 is one of those examples.
As we looked at and listened to the widespread and bipartisan concerns that people had about the broad immunity under the CDA, we decided to open up an internal DOJ working group, and then in February of this past year we had a big public workshop over at the Department, and my fellow panelists were among the esteemed panelists in that program. It was one of the last big events we had before everything changed. Then we held a series of expert roundtables with dozens of listening sessions with industry and stakeholders, and thought long and hard about 230 and what we were hearing. Last June we issued a summary of our public events and roundtables as well as our key findings on where we think we need to go. We decided, based on everything we had learned through our engagement as well as our internal analysis, that it is time to reform Section 230 of the CDA. It's a law that has been around since 1996, and the combination of the significant technological changes of the last 25 years with very expansive court interpretations of the immunity has left online platforms free to host a wide array of illicit activity and free to censor speech without accountability. We have put together a series of concrete but measured reforms, in part to move the dialogue beyond the binary question of keeping the immunity entirely or scrapping it altogether. We have a series of reforms, and I know my time is limited so I won't walk through every single one, but I can say they fall into two general buckets. On the one hand, there is a set of reforms aimed at incentivizing platforms to better address illicit criminal activity on their platforms, and on the other hand there is a series of reforms to help platforms be more transparent and accountable when removing lawful speech.
I read a critique of our report early on criticizing it for requiring platforms to take down more speech, and I thought, well, that's exactly right: we do want platforms to take down more criminal content, but to be more fair and transparent when addressing lawful speech. The law treats criminal and lawful speech differently, so those two buckets are important and work together. I think of them as bookends in a sense. We don't want a scenario of over-censorship driven by liability, but we also don't want the wild wild west on the internet with illegal content running rampant. To get into some of these in more detail, one of the categories we looked at was the Bad Samaritan carve-out: a carve-out from Section 230 immunity for bad actors. This tees off of something my fellow panelist Matt mentioned in his remarks back in February, the idea that under Section 230 those bad actors that solicit or participate in unlawful conduct become publishers and are therefore not entitled to immunity. The problem is that not all the courts see it that way, and you have decisions where platforms that have induced unlawful behavior turn around and claim the benefit of 230. It could be helpful, and hopefully not problematic for the industry, to clarify in legislation that if you facilitate criminal activity you are not a Good Samaritan, and you are therefore not entitled to this broad immunity from civil liability. This will hopefully deter sites like Backpage.com, which prompted the reforms in FOSTA, or imagine a new version of that site dealing in child pornography. We don't want those types of bad actors to be able to benefit from Section 230 immunity.
Related to that, even if you are a good platform, once you have knowledge that something on your platform violates federal criminal law, whether it's child pornography or terrorism or drug trafficking, as soon as you know that and you understand it is clearly unlawful, you have an obligation to do something about it; otherwise you are becoming complicit in that activity. We have a narrow notice liability provision that we are proposing in order to address that scenario, to allow victims to seek redress as well as to incentivize platforms to be better about taking down the worst of the worst on their services. I will pause there, because there's a reason we drew the line at federal criminal law. I think that is where platforms are clearly on notice that something is unlawful, when you're talking about something like illegal drug trafficking. It's harder, when you look at civil claims related to defamation, for a platform to know whether content is unlawful. Imagine a diner posts that the soup at a restaurant was cold. The platform has no way to know whether the soup was cold or hot, so at that point we don't want the platform to be in the position of adjudicating those claims unless it receives a court judgment that someone has determined the content to be unlawful. When it comes to something that objectively violates the federal criminal law, we are okay with chilling that behavior and want platforms to do more, so that the victim of child sexual abuse material does not have to go through a court process before a platform will do something about it. I also want to pause there and distinguish this from the carve-out for federal criminal enforcement by the government. Those are two very different things, and sometimes the lines become blurred. Obviously we, as the government, can go after bad actors that are violating criminal law, but our resources are scarce and it's a very high bar.
What the notice liability regime is intended to do is allow victims to pursue civil claims against platforms; the claims themselves may not even rise to the level of federal criminal law, but the content the victims are asking the platforms to take down is content that would violate the criminal law. That is also important because you have anonymous personas on platforms, and you might not be able to get to the underlying perpetrator, so your only recourse is asking the platform itself to do something. The other topic I want to touch on, because it might be of interest to this crowd, is the carve-out for federal civil enforcement, and this touches on the comments from Matt's workshop, where I think he mentioned that the industry and the government could do more. We 100 percent agree. One way the government could do more is by having a greater ability to go after some of these online platforms civilly as well as criminally, and in a sense, by not having a federal [inaudible], we are fighting these issues with one hand tied behind our back. So that is one important reform we propose in our series. Related to that, and what might also be interesting to this section, is a set of carve-outs for narrowly focused federal claims such as terrorism and cyber-stalking, intended to address areas of the law that were never meant to be covered by Section 230. At the time it was enacted, Section 230 was a response, as Richard said, to decisions about defamation, and that remains the heart of Section 230. There are certain outlier claims that I think we could carve out from Section 230 without crippling the internet as we know it.
That covers a lot on the illicit activity and criminal side. Let me also briefly touch on the transparency and accountability side, and then I'll hand it back and answer your questions later. The other reforms we are thinking about, on the other side of the bookends, are these ideas that to get immunity when you take down lawful speech, you need to be more transparent and accountable about why you are doing so. We are not requiring platforms to be neutral, but we are requiring them to be transparent and to abide by their own rules. It should not be very controversial to say that in order to get this broad blanket immunity you have to abide by your own terms of service when taking down speech, you have to have plain and clear rules to put people on notice about why you are taking down their content, and there has to be some minimal level of due process in that sense. A lot of that is to be put into the context of the good-faith provision, and because courts have not given full explanations of what that means, we would provide a definition of good faith. I think we would also want to replace the term "otherwise objectionable," which has been read over time to give platforms essentially carte blanche to take down any content with which they disagree, even though, looking at it alongside the other terms in the statute, you would think it should be read in light of just the type of content that would be harmful to children, going back to the purpose of CDA 230. Given that it has caused a lot of mischief over time, we would propose replacing it with clearer language like "unlawful," "promotes terrorism," or "promotes self-harm," language that would give greater guidance to platforms and users about when you can take down content with broad blanket immunity.
It is worth noting that this does not render platforms automatically liable: removing the Section 230 shield, which stops claims at the threshold, means you could still litigate in a separate instance whether a takedown of content was consistent with your terms of service. I think that covers a number of our reforms. I also just want to say thank you to everyone who has come and spoken with DOJ, our experts, industry, and stakeholders. We really tried to take this process seriously, recognizing that there is a benefit to Section 230, but that it has grown so far beyond its origins that we need to reform it, to update it for the changes in technology, and to address these outlier court decisions in order to protect Americans in a number of different ways.

Thank you so much, Lauren. We appreciate the Department of Justice having you here to walk us through all of these points and positions. We now turn from the government to the other speakers. With that in mind, Mr. Schruers, if you would, give us a brief talk about how this industry sees CDA 230 as it stands, and tell us what you see it maybe becoming.

Thank you, I appreciate that, Richard. We thank the ABA for bringing up this important topic, and we are happy to have the opportunity. As we have covered thus far, everyone knows Section 230 encourages online intermediaries to moderate and remove problematic third-party content. It does that in two ways. First, it has what we sometimes refer to as the "don't shoot the messenger" rule, which says that the intermediary should not be responsible for the actions of individual bad actors. Second, it says that digital services should not be sued for decisions they make trying to improve online content or to suppress the misuse of their services. For example, if a digital service terminates the account of a self-proclaimed [inaudible], it closes the courthouse doors to those misusers of the service, potential defendants who would want to relitigate those moderation decisions.
Similarly, suppose someone says, hey, this content is defamatory about me, like Richard's story about The Wolf of Wall Street, and the service makes a call and says, actually, this is a perfectly reputable firm and we will not take that content down; the complainant then tries to get through litigation what they lost in the moderation process. So, to recap, Section 230 encourages firms to moderate without fear that they will own what they don't remove, and without fear that when they do remove something, they will be subjected to litigation by the bad actors who posted it. This has the effect of protecting viewpoints, including those of marginalized users who might not be able to get a voice in the newspapers or on cable television, allowing them to be heard online without being heckled off by their critics. This all, as our story began, came from the Stratton Oakmont decision in 1995, which led to the statute in 1996, where Congress foresaw the consequences of litigation as a tool for controlling internet content and disincentivizing moderation. In my view, Congress never said that plaintiffs can't pursue the bad actors who post the content. It simply limited actions against the services that those bad actors use. At the end of the day, no matter what any of us say here, you can always act against bad actors; it's just the reality that sometimes it is easier to sue the service that sits between that user and the rest of the internet instead of the bad actors themselves. The problem is that on the internet, small digital services and startups don't have the resources that some of the larger companies do to build out elaborate content moderation. So, industry-wide, what has been the effect of all this?
It has created a vibrant internet community that is without question the envy of the world. There are hundreds of thousands of jobs and tens of billions of dollars in annual productivity tied to online services, and that is not just the companies themselves but small and medium-sized businesses, brick-and-mortar companies using these digital services to engage in business activity, and of course you and I, everyday users, derive value too. My researchers did more work [inaudible]: the average American gets $17,000 a year in value from free digital services, virtually all of which rely upon Section 230, so undermining those protections could put that $17,000 at risk for Americans and small businesses. A lot of the proposals we are talking about, and I don't want to single out the DOJ proposal, since NTIA has submitted a proposal and there have been some legislative proposals as well, a lot of these proposals run the risk of undermining those protections. Let me point out one example the DOJ has surfaced, which is about providing notice when one engages in content moderation. That seems an easy thing to do: you give notice when you moderate someone's content. That may sound good, but consider digital services terminating 70 million accounts a month for disinformation, and then requiring those services to explain with particularity, including to foreign agents, the basis for each account termination. There are also proposals to require pointing to the explicit provision of the terms of service every time an account is terminated or content is taken down, and that may have the effect when
