
Officials testify about consumer protection. They looked at internet service providers, tech companies and internet users who repost third-party material. The committee will now come to order. The chair now recognizes himself for five minutes for an opening statement. Online content moderation has largely enabled the internet experience that we know today. Whether it's looking up restaurant reviews on Yelp, catching up on SNL on YouTube, or checking in on a friend or a loved one on social media, these are all experiences that we've come to know and rely on, and the platforms we go to to do these things have been enabled by user-generated content as well as the ability of these companies to moderate that content. Section 230 of the Communications Decency Act has enabled that ecosystem to evolve. By giving online companies the ability to moderate content without equating them to the publisher or speaker of that content, we've enabled the creation of massive online communities where millions and billions of people come together and interact. Today, this committee will be examining the world that section 230 has enabled, both the good and the bad. I would like to thank the witnesses for appearing before us today. Each of you represents important perspectives related to content moderation and the online ecosystem. Many of you bring up complex concerns in your testimony, and I agree that this is a complex issue. I know that some of you have argued that Congress should amend 230 to address issues such as online criminal activity and hate speech. I agree these are serious issues. Like too many other communities, my hometown of Pittsburgh has seen what unchecked hate can lead to. Almost a year ago our community suffered the most deadly attack on Jewish Americans in our nation's history. The shooter did so after posting antisemitic remarks on a site before finally posting that he was going in. 
A similar attack occurred in New Zealand, and the gunman streamed his despicable acts on social media sites, and while some of these sites moved to quell the spread of that content, many didn't move fast enough, and the same algorithms that help sports highlights and celebrity news go viral helped spread a heinous act. In 2016 we saw similar issues when foreign adversaries used the power of these platforms against us to disseminate disinformation and foment doubt in our institutions. Clearly, we all need to do better, and I would strongly encourage the witnesses before us that represent these online platforms, and other major platforms, to step up. The other witnesses on the panel bring us serious concerns with the kind of content available on your platforms and the impact that content is having on society, and as they point out, some of those impacts are very disturbing. You must do more to address these concerns. That being said, section 230 doesn't just protect the largest platforms or the most fringe websites. It enables comment sections on individual blogs, people to leave honest and open reviews, and free and open discussion about controversial topics. The kind of ecosystem that has been enabled by more open online discussions has enriched our lives and our democracy. The ability of individuals, particularly in marginalized communities, to have their voices heard cannot be overstated. The ability of people to speak truth to power has created political movements in this country and others that have changed the world we live in. We all need to recognize the incredible power this technology has for good as well as the risks we face when it's misused. I want to thank you all again for being here, and I look forward today to our discussion, and I would now like to yield the balance of my time to my good friend Ms. Matsui. Thank you, Mr. Chairman. I want to thank the witnesses for being here today. In 2018 Mark Zuckerberg came to Congress and said it was my mistake and I'm sorry. 
When pushed about Facebook's role in allowing Russia to influence the 2016 presidential election. Fast forward 555 days, and I fear that Mr. Zuckerberg may not have learned from his mistake. Recent developments confirm what we have all feared. Facebook will continue to allow ads that push falsehoods and lies, once again making his online ecosystem fertile ground for election interference in 2020. The decision to remove blatantly false information should not be a difficult one. The choice between deep fakes, hate speech and online bullies on the one hand and a fact-driven debate on the other should be easy. If Facebook doesn't want to play referee about the truth in political speech, then they should get out of the game. I hope this hearing produces a robust discussion, because we need it now more than ever. Mr. Chairman, I yield back. Thank you. Thank you. The gentle lady yields back. The chair recognizes the ranking member of the subcommittee for five minutes for his opening statement. Thank you for holding today's hearing, and thank you very much to our witnesses for appearing before us at this hearing on content moderation and section 230 of the Communications Decency Act. This hearing is a continuation of a serious discussion we began last session as to how Congress should examine the law and ensure accountability and transparency for the hundreds of millions of Americans using the internet today. We have an excellent panel of witnesses that represent a balanced group of stakeholders closely tied to section 230. They range from large to small companies as well as academics and researchers. Let me be clear: I'm not advocating that Congress repeal the law. Nor am I advocating for changes that could lead to a slippery slope of death by a thousand cuts, which some would argue is what would happen to the internet industry if the law were entirely repealed. Before we discuss whether we should make modifications to the law, we should first understand how we got to this point. 
It is important to look at section 230 in the context of when it was written. At the time, the decency portion of the telecom act of 1996 included other prohibitions on objectionable or lewd content on the early internet. Provisions that were written to target obscene content were struck down by the Supreme Court, but the section 230 provisions remain. Notably, CDA 230 was intended to encourage internet platforms, that is, interactive computer services like CompuServe, to take down offensive content. As Chris Cox stated on the House floor, we want to encourage people like Prodigy, like CompuServe and America Online and Microsoft Network, to do everything possible for us, the consumer, to help us control, at the portals of our computer, what comes in. The courts have adopted such a broad interpretation that platforms receive liability protection without having to demonstrate that they are doing, quote, everything possible. Instead of encouraging what Congress envisioned, they've hidden behind the shield in litigation without having to take responsibility. Not only are Good Samaritans sometimes being selective in taking down harmful or illegal activity, but section 230 has been interpreted so broadly that bad samaritans can skate by without accountability. That's not to say platforms never use the tools afforded by Congress; many of the bigger platforms remove billions, and that's with a b, of accounts annually. Oftentimes, though, these instances are the exception and not the rule. Today we'll learn how platforms decide to remove content, and whether it's through the tools provided by section 230 or through their own terms of service. Under either authority, we should be encouraging enforcement to continue. Mr. Chairman, thank you for holding this important hearing to have an open discussion on the intent of the law and whether we should reevaluate it. We must ensure the platforms are held accountable without drastically affecting innovative startups. With that, chairman, I yield back the balance of my time. 
I should mention this is a joint hearing between our committee and the committee on consumer protection and commerce, and I would like to recognize the chair of that committee for five minutes, Ms. Schakowsky. Good morning, and thank you to all of the panelists for being here. Today the internet has improved our lives in many, many ways and enabled Americans to more actively participate in society, education and commerce. Section 230 of the Communications Decency Act has been at the heart of United States internet policy for over 20 years. Many say that this law allowed free speech to flourish, allowing the internet to grow into what it is today. In the early days of the internet, it was intended to encourage online platforms to moderate user-generated content, to remove offensive, dangerous or illegal content. The internet has come a long way since the law was first enacted. The amount and sophistication of user postings has increased exponentially. Unfortunately, the number of Americans who report experiencing extreme online harassment, which includes sexual harassment, stalking, bullying and threats of violence, has gone up over the last two years. 37% of users say that they have experienced that this year. Likewise, extremism, hate speech, election interference and other problematic content is proliferating. The spread of such content causes harm that multibillion-dollar companies like Facebook or Twitter can't or won't fix. If this weren't cause for concern enough, more for-profit businesses are attempting to use section 230 as a liability shield for activities that have nothing to do with third-party content or moderation policy. In a recent Washington Post article, Uber seemed to be opening the door to claiming vast immunity from labor, criminal and local traffic liability based on section 230. 
This would represent a major unraveling of 200 years of social contracts, community governance and congressional intent. Also at issue is the Federal Trade Commission's section 5 authority over unfair or deceptive practices. The FTC has pursued section 5 cases over website-generated content, but cases over terms of service violations involving third-party content may be precluded by 230 immunity. I wanted to talk a bit about injecting 230 into trade agreements. It seems to me that we've already seen that now in the Japan trade agreement, and there is a real push to include it now in the Mexico-Canada-U.S. trade agreement. There is no place for that. I think that the laws in these other countries don't really accommodate what the United States has done with 230. The other thing is, we are having a discussion right now, an important conversation, about 230. And in the midst of that conversation, because of all of the new developments, I think it is just inappropriate right now, at this moment, to insert this liability protection into trade agreements, and as a member of the working group that is helping to negotiate that agreement, I am pushing hard to make sure that it just isn't there. Whether or not we need any adjustment to 230, it just should not be in trade agreements. So all of the issues that we are talking about today indicate that there may be a larger problem: that 230 no longer is achieving the goal of encouraging platforms to protect their users, and today I hope that we can discuss holistic solutions. I am not talking about eliminating 230, but about having a new look at it in light of the many changes that we are seeing in the world of big tech right now. I look forward to hearing from our witnesses how it can be made even better for consumers, and I yield back. Thank you. The gentle lady yields back. The chair now recognizes the ranking member of the committee, Ms. McMorris Rodgers. Good morning. 
Welcome to today's joint hearing on online content management. As a republican leader on the consumer protection and commerce subcommittee, it is my priority to protect consumers while preserving the ability for small businesses and startups to innovate. In that spirit, today we are discussing online platforms and section 230 of the Communications Decency Act. In the early days of the internet, two companies were sued for content posted on their websites by users. One company sought to moderate content on its platform; the other did not. In deciding these cases, the courts found the company that did not make content decisions was immune from liability, but the company that moderated content was not. It was after these decisions that Congress created section 230. Section 230 is intended to protect, quote, interactive computer services from being sued while also allowing them to moderate content that may be harmful, illicit or illegal. This liability protection has played a critical and important role in the way we regulate the internet. It's allowed small businesses and innovators to thrive online without the fear of frivolous lawsuits from bad actors looking to make a quick buck. Section 230 is also largely misunderstood. Congress never intended to provide immunity only to websites that are, quote, neutral. Congress never wanted platforms to simply be neutral conduits, but instead wanted platforms to moderate content. The liability protection also extends to allow platforms to make good faith efforts to moderate material that is obscene, lewd, excessively violent or harassing. There is supposed to be a balance to the use of section 230. Small internet companies enjoy a safe harbor to innovate and flourish online, while companies are also incentivized to keep the internet clear of offensive and violent content by being empowered to act and to clean up their own sites. 
The internet has also revolutionized the freedom of speech by providing a platform for every American to have their voice heard and to access an almost infinite amount of information at their fingertips. Medium and other online blogs have provided a platform for anyone to write an op-ed. Wikipedia provides free, in-depth information on almost any topic you can imagine through mostly user-generated and moderated content. Companies that started in dorm rooms and garages are now global powerhouses. We take great pride in being the global leader in tech and innovation, but while some of our biggest companies certainly have grown, have they matured? Today it's often difficult to go online without seeing harmful, disgusting and sometimes illegal content. To be clear, I fully support free speech and believe society strongly benefits from open dialogue and free expression online. I know that there have been some calls for big government to mandate or dictate free speech or ensure fairness online, and it's coming from both sides of the aisle. Though I share similar concerns to those that are driving some of these policy proposals, I do not believe these proposals are consistent with the First Amendment. Republicans successfully fought to repeal the FCC's fairness doctrine for broadcast regulation in the 1980s, and I strongly caution against advocating for a similar doctrine online. It should not be the FCC's, FTC's or any government agency's job to moderate free speech online. Instead, we should continue to provide oversight of big tech and their use of section 230 and encourage constructive discussions on the responsible use of content moderation. This is a very important question that we're going to explore today with everyone on the panel: how do we ensure that companies are responsibly earning their protection? We want them to not only benefit from the shield, but also use the sword Congress afforded them to rid their sites of harmful content. 
I understand it's a delicate issue and certainly very nuanced. I want to be very clear: I'm not for gutting section 230. It's essential for consumers and entities in the internet ecosystem. Misguided and hasty attempts to amend section 230 for bias or other reasons could have unintended consequences for free speech and for the ability of small businesses to provide new and innovative services. But at the same time, it's clear we've reached a point where it is incumbent upon us as policymakers to have a serious and thoughtful discussion about achieving the right balance on section 230. I thank you for the time, and I yield back. The gentle lady yields back. The chair recognizes Mr. Pallone for his opening statement. Thank you. The internet is one of the single greatest innovations. It promotes community, and it fosters economic opportunity, with trillions of dollars exchanged online every year, and one of the principal laws that paved the way for the internet to flourish is section 230 of the Communications Decency Act, which, of course, passed as part of the Telecommunications Act of '96. We gave platforms the ability to moderate their sites without excessive risk of litigation, and to be clear, section 230 has been an incredible success. But in the 20 years since section 230 became law, the internet has become more complex and sophisticated. In 1996 the global internet reached only 36 million users, or less than 1% of the world's population. Only one in four Americans reported going online every day. Compare that to now, when nearly all of us are online almost every hour that we're not sleeping. Earlier this year the internet passed 4.39 billion users worldwide, and here in the U.S. there are about 230 million smartphones that provide Americans instant access to online platforms. 
The internet has become a central part of our social, political and economic fabric in a way that we couldn't have dreamed of when we passed the Telecommunications Act, and with that complexity and growth, we also have seen the darker side of the internet grow. Online radicalization has spread, leading to mass shootings in our schools, churches and movie theaters. International terrorists are using the internet to groom recruits, and platforms have been used for the illegal sale of drugs, including those that sparked the opioid epidemic, and for disinformation campaigns using new technology like deep fakes to sow civil unrest and disrupt democratic elections, and there are constant attacks against women, people of color and other minority groups. There has also been a surge of sexual exploitation material online: last year 45 million photos and videos depicting this material were removed, and recent reports say that law enforcement is overwhelmed. Each of these issues demonstrates how online content moderation has not stayed true to the values underlying section 230 and has not kept pace with the increasing importance of the global internet. There's no easy solution to keep this content off the internet. As policymakers, I'm sure we all have ideas about how we might tackle the symptoms of content moderation online while also protecting free speech, but we must seek to fully understand the breadth and depth of the internet today, how it's changed and how it can be made better, and we have to be thoughtful, careful and bipartisan in our approach. So it's with that in mind that I was disappointed that Ambassador Lighthizer, the U.S. Trade Representative, refused to testify today. The U.S. has included language similar to section 230 in the United States-Mexico-Canada trade agreement. We raised concerns about why the USTR has included this language in trade deals even as we debate these issues here at home, and I was hoping to hear his perspective on why he believes that was appropriate. 
Because including provisions that are controversial to both Democrats and Republicans is not the way to get these agreements approved. Hopefully they'll be more responsive to bipartisan requests in the future. With that, Mr. Chairman, I would yield back. The chair would like to remind members that all opening statements shall be made part of the record. Could mine be... I apologize. The chair now yields to my good friend, the ranking member. How times have changed. For five minutes. Thank you, Mr. Chairman. I want to welcome our witnesses and thank you for being here. This is very important work. We have another subcommittee meeting upstairs, and I'll be bouncing in between, and I look forward to your comments. It's without question a balanced roster of experts in this field, so we're really blessed to have you here. Last Congress we held significant hearings that jumpstarted the discussion on the state of online protections, as well as the legal basis underpinning the modern internet ecosystem, as you heard today, and the future of content moderation as determined by algorithms, and that's an issue our constituents want to know more about. Today we'll undertake a deeper review of the Communications Decency Act portion of the Telecommunications Act. In August of this year, as you just heard, Chairman Pallone and I raised the issue of exporting language mirroring section 230 in trade agreements, and we did that in a letter to the U.S. Trade Representative, Robert Lighthizer. We expressed concerns about the language being taken out of the context of its intent and said the trade representative should consult the committee in advance when negotiating on these very issues. Unfortunately, we have learned that derivative language of section 230 appeared in an agreement with Japan and continues to be advanced in other discussions. I'm very frustrated about that, and I hope the administration is paying attention on this matter. The administration itself is examining how the CDA is being utilized in American society. 
That makes it even more alarming for the USTR to act without the involvement of this committee. To be clear, this section of the telecom act served as a foundation for the information age. So we are here by no means to condemn, but rather to understand what the law truly is and to see that the entirety of this section is faithfully followed rather than cherry-picking a portion. I want to go back to the trade piece. I thought the letter to the ambassador would send the right message. We're not trying to blow up USTR or USMCA. I'm a big free trader, but we're getting blown off on this, and I'm tired of it, so let it be heard here. Then we find out it's in the Japan agreement. Clearly, they're not listening to our committee or us. So we're serious about this matter. We've not heard from the USTR, and this is a real problem. So take note. If we only refer to section 230 as the 26 words that created the internet, as has been popularized by some, we're missing the mark; by my word count, which you can use software to figure out, that excludes section (c)(2). We should start talking more about that section, about the 83 words that can preserve the internet. All of the provisions of CDA 230 should be taken together and not apart, and many of our concerns could be readily addressed if the companies had just enforced their terms of service. To put that in context, a quick history lesson is in order. Think back to the days of CompuServe and Prodigy, which dominated in the 90s; while the internet is more dynamic and content-rich than ever before, there were vast amounts of speech online even then. As Chris Cox, the co-author of the legislation, said: no matter how big the army of bureaucrats, it's not going to protect my kids, because I do not think the federal government would get there in time. That's his quote. Congress recognized then, as we do now, that policing harmful and illegal content is best done by the platforms and is not something to be regulated and managed by the government. 
Congress clearly bestowed on providers the ability to go after illegal and harmful content without fear of being held liable in court. While the law was intended to empower, we have seen social media platforms slow to clean up their sites, even as they are shielded from legal responsibility for such content. In some cases, the platforms have shirked responsibility for the content on their platforms. The broad liability shield now in place through common law has obscured the central bargain that was struck: that internet platforms with user-generated content are protected from liability in exchange for the ability to make good-faith efforts to remove harmful and illegal content and to enforce their own terms of service. I look forward to an informative discussion today on differentiating protected speech from illegal content, on how we should think of CDA consumer protections, and on the various elements that shape what consumers see or don't see. Mr. Chairman, thank you for holding this hearing, and I look forward to the witnesses. So the administration doesn't listen to you guys either, huh? My statement spoke for itself pretty clearly, I think. We'll find out if they're listening or not. The gentleman yields back. I would reiterate that the members' written opening statements will be made part of the record. We will now introduce our witnesses for today's hearing: Mr. Steve Huffman, cofounder and CEO of Reddit, welcome. Ms. Danielle Citron, professor of law at Boston University School of Law, welcome. Dr. Corynne McSherry, legal director of the Electronic Frontier Foundation, welcome. Ms. Gretchen Peters, executive director of the Alliance to Counter Crime Online, welcome. Ms. Katherine Oyama, global head of intellectual property policy for Google, welcome. And Dr. Hany Farid, professor at the University of California, Berkeley, welcome to all of you. We want to thank you for joining us today. We look forward to your testimony. 
At this time the chair will recognize each witness for five minutes to provide their opening statement. Before we begin, I would like to explain our lighting system. In front of you is a series of lights. The light will initially be green at the start of the opening statement. The light will turn yellow when we have one minute remaining, so please wrap up your testimony at that point. When the light turns red, we just cut your microphone off. No, we don't, but try to finish before then. So Mr. Huffman, we'll start with you, and you are recognized for five minutes. Thank you. Good morning, chairpersons, ranking members and members of the committee. Thank you for inviting me. My name is Steve Huffman. I'm the cofounder and CEO of Reddit, and I'm grateful for the opportunity to explain why section 230 is good for our company and the open internet. Reddit operates in a fundamentally different way from much of the industry: we empower communities, and this approach relies on 230. Changes to 230 pose an existential threat to our business and would destroy what little competition remains in our industry. We started Reddit in 2005 as a place to find news and interesting content. Since then it's grown into a vast, community-driven site where millions of people find not just news and a few laughs, but new perspectives and a real sense of belonging. Reddit is communities, and they're moderated by our users. Our model has taken years to develop, with many hard lessons learned along the way. As some of you know, I left the company in 2009, and for a time Reddit lurched from crisis to crisis over the very questions of moderation that we're discussing today. In 2015 I came back because I realized the vast majority of our communities were providing an invaluable service to our users. The way Reddit handles content is unique. We use a model akin to our own democracy, where everyone follows a set of rules, has the ability to vote and self-organize, and shares responsibility for how the platform works. First, we have the content policy. 
These are the fundamental rules that everyone on Reddit must follow; think of these as the federal laws. We have a team, collectively known as the anti-evil team, to enforce these policies. Beyond that, each community creates its own rules. These rules are tailored to the unique needs of their communities and are often stricter than ours. It is the most scalable solution to the challenges of moderating content online. Individual users play a crucial role: they can vote on any post or comment and report it to the team. Through this system of voting and reporting, users can reject or accept any content. The effectiveness of this system has improved with our efforts, and it has proven largely effective in curbing bad behavior. In 2016 we found that of all the accounts that tried to interfere, less than 1% made it past the routine defenses of our team, community moderation, and simple down votes from everyday users. We also constantly evolve our content policies, and since my return we've made a series of updates addressing deep fake pornography and harassment. We've worked to moderate in good faith, which brings us to the question of what Reddit would look like without 230. For starters, we'd be forced to defend against anyone with enough money to bankroll a lawsuit, no matter how frivolous. The claims most commonly dismissed under 230 are about defamation. Reddit is an open platform where people are allowed to voice critical opinions; without 230, those opinions could be effectively censored through litigation. Weakening 230 would also create a regulatory burden on the entire industry, benefiting the largest companies by placing significant costs on smaller ones. Reddit may be large enough to be considered a large company, but today we're an underdog compared to our nearest competitors, who are 10 to 100 times our size. Still, we recognize there's harmful material on the internet. It's important to understand that rather than helping, narrow changes to 230 can undermine the power of community and hurt the vulnerable. 
Take the opioid epidemic: we have many communities where those struggling with addiction can find support on their way to recovery. Without 230, we could be forced to close them when they simply become too risky to host. This is exactly the type of decision changes to 230 would force on us. It is a uniquely American law that has allowed internet platforms like ours to flourish while also incentivizing good faith attempts to mitigate the downsides of free expression. While these problems are serious and demand the attention of both us in industry and you in Congress, they do not outweigh the overwhelming good that 230 has enabled. Thank you. I look forward to your questions. Thank you, Mr. Huffman. Ms. Citron, you are recognized for five minutes. Thank you for having me and for having such a thoughtful bench with me on the panel. When Congress adopted section 230 20 years ago, the goal was to incentivize tech companies to moderate content. Although Congress, of course, wanted the internet of that time to be open and free, they also knew that openness would risk, and I'm going to use their words, offensive material. So what they did was devise an incentive, a legal shield for Good Samaritans trying to clean up the internet, accounting both for the failure to remove content, that is, under-filtering, and for over-filtering of content. The purpose of the statute was fairly clear, but its words weren't, and so we've seen courts massively overextending section 230 to sites that are irresponsible in the extreme and that produce extraordinary harm. We've seen the liability shield extended to sites whose entire business model is abuse; sites that do nothing but curate deep fake sex videos get to enjoy immunity. It also covers sites that have nothing to do with speech, but traffic in dangerous goods, like armslist.com, and the costs are significant. 
This overbroad interpretation allows bad samaritan sites, reckless, irresponsible sites, to impose real costs on people's lives, and I'll take the case of online harassment, because I've been studying it for the past ten years. The costs are significant, especially to women and minorities. Online harassment that's often hosted on these sites is costly to people's central life opportunities. When a Google search of your name contains rape threats, your nude photo without your consent and your home address, it's hard to get a job and it's hard to keep a job. Victims are also driven offline in the face of online assaults. They're terrorized, and they often change their names and they move. In many respects, the free speech calculus is not necessarily a win for free speech, as we're seeing diverse viewpoints and diverse individuals being chased offline. Now, the market is not going to solve this problem. Many of these businesses make money off of online advertising, and salacious, negative and novel content attracts eyeballs. So I don't think we can rely on the market itself to solve this problem. The answer, of course, is legal reform. The question is how we should do it. I think we have to keep section 230. It has tremendous upsides, but we should return it to its original purpose, which was to condition the shield on being a Good Samaritan engaged in what we would call reasonable content moderation practices, and there are other ways to do it. In my testimony I draw up some solutions, but we've got to do something, because doing nothing has costs. It says to victims of online abuse that their speech and their equality are less important than the business profits of some of these most harmful platforms. Thank you. The chair now recognizes Dr. McSherry for five minutes. Thank you. 
As legal director for the Electronic Frontier Foundation, i want to thank the chairs, Ranking Members and members of the committee for the opportunity to share our thoughts with you today on this very, very important topic. For nearly, excuse me, for nearly 30 years, eff has represented the interests of Technology Users, both in court cases and in broader policy debates, to help ensure that law and technology support our civil liberties. Like everyone in this room, we are well aware that online speech is not always pretty. Sometimes its extremely ugly, and it causes serious harm. We all want an internet where we are free to meet, create, organize, share, debate and learn. We want to have control over our online experience and to feel empowered by the tools we use. We want our elections free from manipulation, and we want women and marginalized communities to be able to speak openly about their experiences. Chipping away at the legal foundations of the internet in order to pressure platforms to Better Police the internet is not the way to accomplish those goals. Section 230 made it possible for all kinds of voices to get their message out to the whole world without having to acquire a broadcast license, own a newspaper or learn how to code. The law has thereby helped remove much of the gatekeeping that once stifled social change and perpetuated power imbalances, and thats not just because it protects tech giants. It protects regular people. If you have forwarded a picture or a piece of political criticism, youve done so with the protection of section 230. If you maintain an online forum for a neighborhood group, youve done so with the protection of section 230. If youve used wikipedia to figure out where George Washington was born, you benefitted from section 230. If you have watched videos from northern syria, you are benefitting from section 230. Intermediaries, whether social media platforms, websites or forums, are protected by section 230, but not just for their own benefit.
Theyre protected so they can be available to all of us. Theres another very practical reason to resist the impulse to amend the law to pressure platforms to more actively monitor and moderate user content. Simply put, theyre bad at it. As eff and many others have shown, they regularly take down all kinds of valuable content. It is very difficult to distinguish between lawful and unlawful speech, particularly at scale, and those mistakes often silence already marginalized people. Moreover, increased liability risk will inevitably lead to overcensorship; it is a lot easier and cheaper to take something down than to pay lawyers to fight over it, particularly if youre a smaller business or nonprofit. And automation is not the magical solution. Context matters very often when youre talking about speech, and robots are pretty bad at nuance. For example, in december 2018, the blogging platform tumblr announced a new ban on adult content. In an attempt to explain the policy, tumblr posted examples of content that would still be acceptable under the new rules; tumblrs own filters flagged those same images as unacceptable. Heres the last reason. New burdens are likely to stifle competition. The largest platforms can pay for automation and litigation; smaller competitors dont have that kind of budget. So in essence, we would have opened the door for a few companies and then slammed that door shut for everyone else. The free and open internet has never been fully free or open, and the internet can amplify the worst of us as well as the best, but at root, the internet still represents and embodies an extraordinary idea, that anyone with a Computing Device can connect with the world to tell their story, organize, educate and learn. Section 230 helps make that a reality, and its worth protecting. Thank you, and i look forward to your questions. Thank you, dr. Mcsherry. Ms. Peters, you are recognized for five minutes. Thank you. Distinguished members of the subcommittee.
It is an honor to be here today to discuss one of the premier security threats of our time, one that congress is well positioned to address. I am the executive director of the Alliance to Counter Crime Online. Our team is made up of experts who have come together to eradicate Serious Organized Crime and terror activity on the internet. I want to thank you for your interest in this research and for asking me to join the panel of witnesses to testify. Like you, i hope to hear the testimony of the u.s. Trade representative, because keeping the 230 language out of trade agreements is critical to our national security. Distinguished Committee Members, i have a long history of tracking organized crime and terrorism. I was a reporter and wrote a book about the taliban and the drug trade. That work led me to map transnational Terror Networks for the special Operations Command and centcom. In 2014 i received state Department Funding to map wildlife supply chains, and thats when my team discovered that the largest Retail Markets for endangered species are actually located on social media platforms like facebook and wechat. What we have found since, many times over, is that the problem is far worse than we ever imagined, and that is because of cda 230. There was supposed to be a shared responsibility between tech platforms and organizations like acco, but tech firms are failing to uphold their end of the bargain. Because of broad interpretations by the courts, they enjoy undeserved safe harbor for hosting Illicit Activity. Distinguished Committee Members, the Tech Industry may try to convince you today that most illegal activity is confined to the dark web, but thats not the case. Surface web platforms provide much the same anonymity and Payment Systems, and a much greater reach of people. We are tracking illicit groups from mexican drug cartels to triads that have weaponized u.s. platforms to sell a wide range of illegal goods.
Now we are in the midst of a crisis, the Opioid Epidemic, which is claiming the lives of 60,000 americans a year, but facebook, the Worlds Largest social Media Company, only began tracking drug activity on its platform last year. It identified 1.5 million posts selling drugs. Thats 100 times more than the silk road website ever carried. Study after study by acco members has documented dealers using facebook, reddit and youtube to market and sell fentanyl, oxycodone and other drugs in direct violation of u.s. federal law. Every major internet platform has a drug problem. Why? Because there is no law that holds tech firms responsible, even when a child dies buying drugs on an internet platform. Tech firms play an active role in facilitating harm. Their algorithms, well intentioned tools for connecting friends, also link terror groups to a social audience. Isis uses social media to recruit, fundraise and spread its propaganda. The acco alliance includes an Incredible Group of archaeologists and wildlife researchers who have documented items ranging from rhino horn and elephant ivory to live chimpanzees and cheetahs being sold on these platforms. Its literally threatening species with extinction. I could continue to sit here and horrify you all morning; children being sexually abused, weapons, explosives, human remains and counterfeit goods are just a few clicks away. Committee Members, you may hear that modifying cda 230 is a threat to freedom of speech, but cda 230 is a law about liability, not freedom of speech. Please try and imagine another industry in this country that has ever enjoyed such an incredible subsidy from congress, total immunity no matter what harm their product brings to consumers. Tech firms could have implemented internal controls to prevent Illicit Activity from occurring, but it was cheaper and easier to scale while looking the other way. They were given this incredible freedom, and they have no one to blame but themselves for squandering it.
We want to see reforms to the law that strip immunity from firms that knowingly host illicit content, and that require firms to report crime to the appropriate Law Enforcement authorities. Distinguished Committee Members, if it is illegal in real life, it ought to be illegal to host it online. The only way to get there is to reform cda 230 and make the internet safer for all. Thank you very much. The gentle lady yields back. You are recognized for five minutes. Chairman doyle, chairwoman schakowsky, distinguished members of the committee, thank you for the opportunity to appear before you today. I appreciate your leadership on these important issues and welcome the opportunity to discuss googles work in these areas. My name is katie oyama and im the global head of ip policy at google. In that capacity i also advise the company on Public Policy frameworks for the management and moderation of online content of all kinds. At google our mission is to organize the worlds information and make it universally accessible and useful. Our services, and many others, are positive forces for creativity, learning and access to information. This creativity and innovation continues to yield Enormous Economic benefits for the United States. However, like all means of communication that came before it, the internet has been used for both the best and worst of purposes. This is why, in addition to respecting local law, we have robust policies, procedures and Community Guidelines that govern what activity is permissible on our platforms, and we update them regularly to meet the changing needs of our users and society. My testimony today will focus on three areas: the history of 230 and how it has helped the internet grow, how 230 contributes to our efforts to take down harmful content, and how section 230 has created a robust internet ecosystem where Free Expression thrives while we fight online abuse. Digital platforms help millions of consumers find legitimate content across the internet, facilitating $29 trillion in online commerce each year.
Addressing illegal content is a shared responsibility, and our ability to take action is underpinned by 230. The law not only clarifies when services can be held liable for thirdparty content, but also creates the Legal Certainty necessary for services to take swift action against harmful content of all types. Section 230s Good Samaritan provision was introduced to incentivize self-moderation and innovation in content moderation. It also does nothing to alter platform liability for violations of federal criminal law, which is expressly exempted from the scope of the cda. Over the years the importance of section 230 has only grown, and it is critical in ensuring continued economic growth. A recent study found that over the next decade, 230 will contribute an additional 4.35 million jobs and 45 million in growth to the economy. Furthermore, investors in the Startup Ecosystem have said that weakening safe harbors would have an impact on investment, and internationally, 230 is a differentiator for the u.s. China, russia and others take a very different approach to innovation and to censoring speech online, sometimes including speech that is critical of political leaders. Perhaps the best way to understand the importance of 230 is to imagine what might happen if it werent in place. Without 230, Search Engines, video sharing platforms, startups and review sites of all kinds would either not be able to moderate content at all or they would overblock. Either way, they would be harming the consumers and businesses that rely on their Services Every day. Without 230, platforms could be sued for decisions around removal of content on their platforms, such as removal of hate speech or content related to pyramid schemes. Because of 230, we can enforce vigorous policies to ensure that our platforms are useful and vibrant for our users. For each product we have a specific set of rules and guidelines that are suitable for the type of platform, how it is used, and the risk of harm associated with it.
These range from clear content guidelines and tools that let users report content that violates them, to proactive Machine Learning that can flag content before a single human user has been able to access it. In the threemonth period from april to june, we removed 9 million videos from the platform for violating Community Guidelines. 87% of this content was flagged by machines first rather than by humans, and of those videos detected by machines, 81% were never viewed by a single user. We now have over 10,000 people across google working on content moderation. Weve invested hundreds of millions of dollars in these efforts. In my written testimony i go into more detail on our policies and procedures on search, google ads and youtube. We are committed to being responsible actors who are part of the solution. Google will continue to invest in the people and the technology to meet this challenge, and we look forward to continued collaboration with the committee as it examines these issues. Thank you for your time, and i look forward to taking your questions. Thank you. Dr. Farid, you have five minutes. Chairman, chairwoman, Ranking Members, members of both subcommittees, thank you for the opportunity to speak with you today. Technology and the internet, as youve already heard, have had a remarkable impact on our lives and society. Many educational and entertaining things have emerged from the past two decades of innovation, but at the same time many horrific things have emerged: a massive proliferation of child sexual abuse material; the recruitment of domestic and international terrorists; the distribution of illegal and deadly drugs; the proliferation of mis- and disinformation campaigns designed to sow civil unrest and incite violence; hateful conspiracy theories; the routine and daily harassment of women and other underrepresented groups, with threats of Sexual Violence and nonconsensual pornography; small and largescale fraud; and spectacular failures to protect our personal and sensitive data.
How in 20 short years did we go from the promise of the internet to make the world more understanding and enlightened to this litany of daily horrors? Willful ignorance and a growth-at-all-costs mentality have failed to install proper safeguards on these services. The problem that we face today, however, is not new. As early as 2003 it was well known that the internet was a boon for child predators. Despite early warnings, the Technology Sector dragged its feet through the early and mid2000s and did not respond to the known problems at the time, nor did it put in place the proper safeguards to contend with what should have been anticipated problems, the ones we face today. In defense of the Technology Sector, they are contending with an unprecedented amount of data: some 500 hours of video uploaded to youtube every minute, some one billion daily uploads to facebook, and 500 million tweets per day. On the other hand, these same companies have had a decade to get their houses in order and have failed to do so. At the same time, they cannot simply hide behind the scale and volume of the data thats uploaded to their Services Every day. They do not have trouble dealing with unwanted material when it serves their interests. They remove Copyright Infringement, and they effectively remove legal adult pornography, because otherwise their services would be littered with pornography. We are told that ai is the savior for content moderation, just five to ten years away. Put aside that its not clear what we should do in the intervening decade or so; this claim is almost certainly overly optimistic. Earlier this year facebook showcased its ai for discriminating images of broccoli from images of marijuana. Despite all the latest advances in ai and pattern recognition, this system is only able to perform the task with an average accuracy of 91%. This means that approximately one in ten times, the system is simply wrong. At a scale of a billion uploads a day, this technology cannot possibly automatically moderate content.
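Dr. Farid's scale argument can be checked with simple arithmetic: a classifier that is right 91% of the time, applied to a billion uploads a day, still makes on the order of ninety million mistakes daily. A minimal sketch of that back-of-envelope calculation (the 91% accuracy and one-billion-uploads figures are taken from the testimony; how the errors split between false positives and false negatives is not specified, so this counts all mistakes together):

```python
# Error volume at platform scale: even a "91% accurate" moderation
# classifier produces tens of millions of misclassifications per day
# when applied to a billion uploads.
DAILY_UPLOADS = 1_000_000_000  # ~1B daily uploads (figure from testimony)
ACCURACY = 0.91                # 91% average accuracy (figure from testimony)

# round() rather than int() to avoid floating-point truncation
errors_per_day = round(DAILY_UPLOADS * (1 - ACCURACY))
print(f"Misclassified uploads per day: {errors_per_day:,}")  # 90,000,000
```

The point is not the exact number but the order of magnitude: a single-digit error rate, harmless in a demo, becomes an unmanageable review queue at billions of items per day.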
And this discrimination task is much easier than the task of identifying a broad class of Child Exploitation, extremism and disinformation material. The promise of ai is just that, a promise, and we cannot wait a decade or more on the hope that ai will improve by some nine orders of magnitude before it might be able to contend with automatic online content moderation. To complicate things even more, earlier this year mr. Zuckerberg announced that facebook is implementing end-to-end encryption on its services, preventing anyone, government or facebook itself, from seeing the contents of any communications. Blindly implementing end-to-end encryption will make it more difficult to contend with the litany of abuses that i enumerated at the opening of my remarks. We can and we must do better when it comes to contending with some of the most violent, harmful, dangerous and hateful content online. I simply reject the naysayers who argue that it is too difficult from a policy or technological perspective, or those who say that reasonable and responsible content moderation will lead to the stifling of an open exchange of ideas. Thank you, and i look forward to taking your questions. Thank you. Well, weve concluded our opening statements. We will move to member questions. Each member will have five minutes to ask questions of our witnesses, and i will start by recognizing myself for five minutes. I have to say, as i said at the beginning of my remarks, this is a complex issue, a very complex issue, and i think weve all heard the problems. What we need to hear are solutions. Let me just start by asking all of you, just by a show of hands, who thinks that Online Platforms could do a better job of moderating content on their websites? So thats unanimous. And i agree. I think its important to note that we all recognize that content moderation online is lacking in a number of ways.
We all need to address this issue better, and if you, the platforms and the experts in this technology, dont do it and instead put that on our shoulders, you may see a law that you dont like very much, one that has a lot of unintended consequences for the internet. So i would say to all of you, you need to do a better job; you need to get together as an industry and discuss better ways to do this. The idea that you can buy drugs online and we cant stop that? To most americans hearing that, they dont understand why thats possible, why it would not be easy to identify people that are trying to sell illegal things online and take those sites down. Child abuse. Its very troubling. On the other hand, i dont think anyone on this panel is talking about eliminating section 230. The question is, what is the solution, between not eliminating 230, because of the effect that would have on the whole internet, and making sure that we do a better job of policing this? Mr. Huffman, lots of people know of reddit, but its a small company when you place it against some of the giants, and you host many communities and rely on your volunteers to moderate discussions. I know that you shut down a number of controversial subreddits that have spread deepfakes, disturbing content, violent content, and dangerous conspiracy theories. What would reddit look like if you were legally liable for the content your users post, or for your companys decisions about moderating communities? Thank you for the question. What reddit would look like is, we would be forced to go to one of two extremes. In one version, we would stop looking. We would go back to the pre230 era, which means, if we dont know, we are not liable. And that, im sure, is not what you intend, and its certainly not what we want; it would not be aligned with our mission to bring community and belonging to everybody in the world.
The other extreme would be to remove any content, or prohibit any content, that could be remotely problematic. And since reddit is a platform where 100% of our content is created by our users, that fundamentally undermines the way reddit works. Its hard for me to give you an honest answer of what it would look like, because im not sure reddit as we know it could exist in a world where we had to remove all User Generated Content. Dr. Mcsherry, you talk about the risk to free speech if 230 were repealed, but what other tools could congress use to incentivize Online Platforms to deal with dangerous content and encourage a healthier ecosystem? What would your recommendation be, short of eliminating 230? A number of the problems that weve talked about today, which i think everyone agrees are very serious, and i want to underscore that, are actually often addressed by existing laws that target the conduct itself. In the armslist case, for example, we had a situation where the sale of the gun at issue, however controversial, was perfectly legal under wisconsin law. Similarly, many of the problems that weve talked about today are addressed by federal criminal laws that already exist, and section 230 is not a barrier to federal criminal prosecution. I urge this committee to look carefully at the laws that already target the behavior that we are concerned about, and perhaps start there. Ms. Peters, you did a good job of terrifying us with your testimony. What solution do you offer, short of repealing 230? I dont propose repealing 230. I think we want to continue to encourage innovation in this country; it is a core driver of our economy. But i do believe that cda 230 should be revised so that if something is illegal in real life, it is illegal to post it online. I dont think that that is an unfair burden for tech firms, and certainly some of the wealthiest firms in our country should be able to take that on.
I myself run a Small Business, and we have to run checks to make sure that when we do business with foreigners we are not doing business with somebody thats on a terror blacklist. Is it so difficult for Companies Like google and reddit to make sure that they are not hosting an illegal pharmacy? My time is way expired, but i thank you, and i think we get the gist of your answer. The chairman now yields to the Ranking Member for five minutes. Thank you, mister chairman, and thanks to our witnesses. If i can start with you, a recent New York Times article dealt with the nature of online child sex abuse material and how it has grown exponentially over the last decade. My understanding is that Tech Companies are legally required to report child abuse material only when they discover it; they are not required to actively look for it. I understand many of you make voluntary efforts to find those types of content. How can we encourage platforms to better enforce their terms of service, or to proactively use subsection (c)(2) of section 230, which shields good faith moderation efforts from liability? Thank you for that question and for focusing in on (c)(2) and the incentives for content moderation. I can say that for google, we believe transparency is important, and we publish our guidelines and our policies. On youtube we publish a quarterly transparency report across the different categories of content, showing the volume of content that weve been removing. We allow for users to appeal. If their content is stricken and they think thats a mistake, they have the ability to appeal and to track what is happening with the appeal. We do understand that this piece of transparency is critical to user trust and for discussions with policy makers on these critically important topics. Thank you. A number of defendants have claimed section 230 immunity, some of which are platforms that may not host User Generated Content at all. Was the Communications Decency act intended to cover those platforms? I keep doing that.
If platforms are solely responsible for the content, if theres no user and no User Generated Content and it is their own Creative Concept, then the question is whether that is covered by the legal shield of 230. Im asking, is that the case? No, they would be responsible for the content that they have created and developed. So section 230, that legal shield, would not apply. Thank you. Mr. Farid, are there tools available, like photodna or copyright identification, to flag the sale of illegal drugs online? If the idea is that platforms should be incentivized to scan their platforms to take down illegal content, shouldnt there be indicators associated with opioids that can be searched through an automated process? The short answer is yes. There are two ways of doing content moderation. Once material has been identified, typically by human moderators, whether thats child abuse material, Illegal Drugs, terrorism related material, Copyright Infringement, whatever it is, that material can be fingerprinted, digitally fingerprinted, and then stopped from future distribution. That technology has been well understood and has been deployed for over a decade, but i think it has been deployed anemically across platforms, and not aggressively enough. Thats one form of content moderation that works today. The second form of content moderation is the day zero problem, finding, for example, the christchurch video on upload. That is incredibly difficult and still requires journalists or the platforms themselves to find the content, but once it has been identified, it can be removed from future uploads. And by the way, today you can go on to google and you can type, buy fentanyl online, and it will show you, on the first page, where you can click to purchase fentanyl. That is not difficult to find. We are not talking about the dark web; we are not talking about things buried on page 20. Its on the first page, and there is no excuse for that. Let me follow up, because you say its anemic, what some of the platforms might be doing out there.
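The first form of moderation Dr. Farid describes, fingerprinting known material and blocking re-uploads, can be sketched in a few lines. Real systems such as photodna use perceptual hashes that survive resizing and re-encoding, and the shared hash lists are maintained industry wide; the sketch below is a deliberately simplified stand-in that uses an exact SHA-256 digest purely to illustrate the matching step, with all names and data hypothetical:

```python
# Minimal sketch of hash-blocklist content moderation: once an item has
# been identified as prohibited, store its fingerprint; match every new
# upload against the stored fingerprints. (Exact hashing shown here for
# simplicity; production systems use robust perceptual hashes instead.)
import hashlib

def fingerprint(data: bytes) -> str:
    """Digest the raw bytes of an upload into a fixed-length fingerprint."""
    return hashlib.sha256(data).hexdigest()

class HashBlocklist:
    def __init__(self) -> None:
        self._known: set[str] = set()

    def add(self, data: bytes) -> None:
        """Record previously identified material (e.g. from human review)."""
        self._known.add(fingerprint(data))

    def is_blocked(self, data: bytes) -> bool:
        """Check an incoming upload against the shared fingerprint list."""
        return fingerprint(data) in self._known

blocklist = HashBlocklist()
blocklist.add(b"bytes of previously identified prohibited image")
print(blocklist.is_blocked(b"bytes of previously identified prohibited image"))  # True
print(blocklist.is_blocked(b"bytes of a new, unseen upload"))                    # False
```

This also illustrates why exact hashing alone is too brittle for the task: changing a single byte changes the digest, which is why deployed systems rely on perceptual hashing that tolerates small transformations.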
Last year we passed over 60 pieces of legislation dealing with the drug crisis that we have in this country, fentanyl being one of them. You mentioned that you can just type it in and find it. Okay. Because again, we are trying to make sure that we dont have the 70,000-plus deaths that we had in this country a year ago, with over 43,000 being associated with fentanyl. So how do we go to the platforms and say, weve got to enforce this, because we dont want this stuff coming in from china? How do we do this? This is what the conversation is about. Im with everybody else on the panel; we dont repeal 230, but we make it a responsibility, not a right. If your platform can be weaponized in the way that we have seen, in the litany of things that i had in my opening remarks, really, something is not working. If i can find this on google, on page one, and not just me, my colleagues at the table also, investigative journalists, then we know where this content is. Its not hiding. Its not difficult to find. And we have to ask the question, if a reasonable person can find this content, surely google, with its resources, can find it as well. And what is the responsibility? I think you said earlier as well, platforms should just enforce their terms of service. If we dont want to talk about 230, lets talk about terms of service. The terms of service of most of the major platforms are actually pretty good. Its just that they dont really do all that much to enforce them in a clear, consistent, and transparent way. Thank you very much. I yield back. The gentleman yields back. The chair now recognizes ms. Schakowsky. Ms. Oyama, you talked about what would happen without 230. I want to see if there are any hands that would go up saying that we should abandon 230. Does anybody say that? Okay, so this is not the issue. This is a sensible conversation about how to make it better. Mr.
Huffman, you said, and i want to thank you for the very productive meeting we had yesterday, explaining to me what your organization does and how its unique, but you also said in your testimony that section 230 is a unique american law. And yet, when we talked yesterday, you thought it was a good idea to put it into a trade agreement dealing with mexico and canada. If its a unique american law, let me just say that i think trying to fit it into the regulatory structure of other countries at this time is inappropriate, and i would like to just quote, i dont know if hes here, from a letter that the chairman and mr. Walden wrote to mr. Lighthizer, that said, we find it inappropriate for the United States to export language mirroring section 230 while such serious policy discussions are ongoing. Thats whats happening right now. We are having a serious policy discussion. But i think what the chairman was trying to do, and what i want to do, is try to figure out what we really want to amend or change in some way. Again, briefly, for the three of you who have talked about the need for changes. Let me start with ms. Citron. What would you change in 230? I want to bring the statute back to its original purpose, conditioning the legal shield on platforms engaging in responsible and reasonable content moderation practices. I have proposed language to change the statute so that the immunity is preconditioned: no provider or user of an interactive computer service that engages in reasonable content moderation practices shall be treated as the publisher or speaker. It would keep the immunity. Let me suggest that if there is language, i think wed like to see suggestions. Ms. Peters, you pretty much scared us as to what is happening. How can we make 230 responsive to those concerns?
Thank you for your question, congresswoman schakowsky. We would be happy to share some language on how to reform 230 to protect better against organized crime and terror activity. One of the things that im concerned about, and that a lot of tech platforms are involved in, is that when they detect Illicit Activity, or it gets flagged by users, the response is to delete it and forget about it. What im concerned about is two things. Number one, that essentially destroys Critical Evidence of a crime, and it helps criminals cover their tracks. Contrast that with the situation we have in the financial industry, and even aspects of the transport industry, where when firms know that Illicit Activity is going on, they have to share it with Law Enforcement, and they have to do it in a certain timeframe. I certainly want to see the content removed, but i dont want to see it simply deleted. I think that is an important distinction. Id like to see a world where the big tech firms work collaboratively with Civil Society and with Law Enforcement to root out some of these evil entities. Im going to cut you off because my time is running out; i want to get to dr. Farid with the same thing. You would welcome these kinds of suggestions? I would agree with my colleague citron that immunity should be a privilege, not a right. You have to show that you have reasonable content moderation. And we should think about the small startups; if we start regulating now, the ecosystem could become even more monopolistic. We have to think about how we can carve out space for these small platforms to compete, where those companies do not have to deal with the same economic pressure. The rules have to be clear, consistent and transparent. Thank you. I yield back. The chair now recognizes mrs. Mcmorris rodgers. Section 230 provided both a shield from liability and a sword to encourage good faith efforts to filter or block certain content online. Professor citron, do you believe companies are making use of the sword enough, and why do you think that is?
Ive been working with facebook and twitter for about eight years, and i think the dominant platforms, including those on this panel, are engaging in what i would describe, at a broad level, as fairly reasonable content moderation practices. They could do far better on transparency, about what they mean by their policies and what harm they want to avoid. For example, they could be more transparent about the processes they use to make decisions, so we have more accountability. What really worries me are the renegade sites, [inaudible] the forums that encourage incitement, do no moderation, and claim they have no ability to ban impersonators or track ip addresses. And it is not only the small providers; some of the biggest know they have illegality happening on their platforms and do nothing about it. Why are they doing that? Because of section 230. The dating app grindr comes to mind, hosting impersonations of someones ex. Someone was using the platform to send thousands of men to this mans home. Grindr heard 50 times from the individual being targeted and did nothing about it. Finally, when they responded, after getting a lawsuit, their response was, our technology doesnt allow us to track ip addresses. And grindr is very dominant in this space. But when the person went to scruff, where the impersonator was also posing as the individual, they responded right away. They said, we can ban the ip address, and they took care of it. So the notion of smaller versus larger, by my lights, is not the real distinction; there are good, responsible practices, and there are irresponsible, harmful practices. Thank you for that. Mr. Huffman and ms. Oyama, your companies have policies regarding your terms of service. How do you monitor content on your platforms to ensure that it does not violate your policies? I will start with mr. Huffman. Sure. So in my opening statement, i described three layers of moderation that we have on reddit.
First, our company's own moderation team: this is the group that both writes the policies and enforces them. Primarily, they work by enforcing these policies at scale, looking for patterns of bad behavior and problematic sites. We participate in cross-industry hash sharing, which allows us to find images of the sexual exploitation of children, or fingerprints thereof, that are flagged industry-wide. Then there are the community moderators and the users themselves; those groups participate together in removing content that is inappropriate for the community or in violation of our policies. We have policies against hosting illegal content: no regulated goods, no drugs, no guns, anything of that sort. You are seeking it out, and if you find it, you get it off the platform? That's right. Section 230 does not provide us criminal liability protection, so we are not in the business of committing crimes or helping people commit crimes; that would be problematic for our business. So we do our best to make sure it's not on there. Ms. Oyama, could you address that, and tell us what you do when you find that content? Across YouTube, we have very clear community guidelines. We publish those online, and we have YouTube videos that give examples in more specific ways so people understand them. Of the 9 million videos we removed from YouTube in the last quarter, 87 percent were detected first by machines. So automation is one very important method, and the second is human review: we have community flagging, where any user who sees problematic content can flag it and follow what happens with that complaint, and we also have human reviewers, and we are very transparent about explaining that.
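The cross-industry hash sharing Mr. Huffman describes works by comparing a fingerprint of each uploaded image against a shared list of fingerprints of known abusive images. A minimal sketch of the set-membership mechanism, with invented placeholder values; a plain SHA-256 digest stands in here for the perceptual hashes (such as PhotoDNA) that production systems actually use so that re-encoded copies still match:

```python
import hashlib

# Hypothetical shared industry list: hex digests of known prohibited images.
# The entry is derived at runtime from placeholder bytes for illustration.
BLOCKED_HASHES = {
    hashlib.sha256(b"known-bad-image").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Fingerprint an upload. Real systems use perceptual hashes instead."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block(image_bytes: bytes) -> bool:
    """True if the upload matches an entry on the shared hash list."""
    return fingerprint(image_bytes) in BLOCKED_HASHES
```

Because a cryptographic hash changes completely under any edit to the file, real deployments rely on perceptual hashing; the lookup against a shared list, however, has the same shape.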
When it comes to criminal activity on the internet, of course, 230 has a clean carve-out for federal criminal law. On the case of Grindr: we have policies against harassment, but in the Grindr case, where there was real criminal activity, my understanding is that there is a defendant, and there are criminal proceedings for harassment and stalking against him. In certain cases, opioids again, there is a section of criminal law on the sale of controlled substances on the internet; that is a criminal provision. In cases like that, where there is actually a law enforcement role, if there is correct legal process, then we would work with law enforcement to provide information under due process or a subpoena. Thank you. Okay. My time has expired. I yield back. Thank you. Ms. DeGette, you are recognized for five minutes. I want to thank this panel. I'm a former constitutional lawyer, so I'm always interested in the intersection between criminality and free speech. Professor Citron, I was reading your written testimony, which you discussed with Ms. Schakowsky, about how Section 230 should be revised to continue to provide First Amendment protections but also to return the statute to its original purpose, which is to let companies act more responsibly, not less. In that vein, I want to talk during my line of questioning about online harassment. This is a real issue that has only increased. The Anti-Defamation League reported that 24 percent of women and 63 percent of LGBTQ individuals have experienced online harassment because of their gender or sexual orientation.
This is compared to only 14 percent of men and 37 percent of all Americans who report experiencing online harassment, which includes harassment, stalking, and physical threats. Professor Citron and Ms. Peters, very quickly: talk to me about how Section 230 facilitates illegal activities. Do you think it undermines the value of those laws, and if so, how? Ms. Citron: in cases involving harassment, there is a perpetrator, and the platform enables it. Most of the time, the perpetrators are not pursued by law enforcement. Cyberstalking exploits the fact that law enforcement often does not understand the abuse or does not know how to investigate it. In the Grindr case, there were ten protective orders that were violated, and New York has done nothing about it. It's not true that we can always find the perpetrator, especially in cases of stalking, harassment, and threats. We see severe under-enforcement of the law, particularly when it comes to gendered harms. That's where it really falls to the sites to try to protect people. Ms. Peters, do you want to comment on that? On this issue, there ought to be something akin to a cyber restraining order: if someone is stalking somebody on Grindr or OkCupid or Google, that site could be ordered to block them from communicating with the other person. Even under Section 230, have the platforms ignored requests to remove this type of material? They have. Professor Citron, you're nodding your head. They do, and they can, especially if those protective orders are coming from state criminal law. Okay. I wanted to ask you, Ms. McSherry: harassment has become a problem on Twitter and other social platforms. I know Section 230 is a critical tool that facilitates content moderation. But as we've heard in the testimony, a lot of the platforms are not being aggressive enough in applying their terms and conditions. I want to ask you: what can we do to encourage platforms to be more aggressive in moderating issues like harassment?
I imagine this hearing will encourage many of them to do just that. We keep having hearings all the time. I understand. So, absolutely: many of the platforms are pretty aggressive already in their moderation policies. I agree with what many have said here today, which is that it would be nice if they started by clearly enforcing their actual terms of service. We share that concern, because enforcement is very inconsistent and very confusing to users. A concern that I have is that if you institute, which is one proposal, a rule that whenever you get a notice you have some duty to investigate, that can actually backfire for marginalized communities. One thing that happens is that if you want to silence someone online, one thing you can do is flood a service provider with complaints about them, and they will be the ones who are silenced. Dr. Farid, what's your view on that? Pardon me? What's your view of what Ms. McSherry said? There are two issues at hand here. One is moderation, and the risk of over-moderating versus under-moderating. We are way under-moderating. We look at where we fall down, where we make mistakes and take down content that we shouldn't, and we weigh that against 45 million pieces of child abuse material reported just last year, and against terrorism and drugs. The weights are imbalanced, and we have to rebalance. We are going to make mistakes, but we are making way more mistakes on allowing content than on removing it. Thank you very much, Mr. Chairman. I yield back. The chair now recognizes Mr. Johnson for five minutes. Thank you, Mr. Chairman, and to you and Chairwoman Schakowsky, for holding this very important hearing. I have worked in information technology for most of my adult life, and social responsibility has been an issue that I have talked about a lot.
It is the absence of heavy-handed government regulation, I think, that has allowed the internet and the social media platforms to grow like they have. And I hate to sound cliché, but there's that old line from the Jurassic Park movie: sometimes we're so focused on what we can do that we don't think about what we should do. I think that is where we find ourselves with some of this. We have heard from some of our witnesses about the accessibility of a global audience on internet platforms, and that it is being used for illegal and illicit purposes: terrorist organizations, and the sale of opioids, which is affecting communities across the nation, including rural areas like where I live in southeastern Ohio. The platforms also provide an essential tool for legitimate communications and a free, safe, and open exchange of ideas, which has become a vital component of modern society and today's global economy. I appreciate hearing from all of our witnesses as our subcommittee examines whether Section 230 of the Communications Decency Act can effectively regulate under this light-touch framework. Mr. Huffman, in your testimony you discussed the ability of not only Reddit employees but also its users to self-regulate and remove content that violates Reddit's community standards. Do you think other social media platforms like Facebook or YouTube would be able to implement some of these self-regulating functions and guidelines? If not, what makes Reddit unique in its ability to self-regulate? Thank you, Congressman. I'm only familiar with the other platforms to the extent that you probably are, which is to say I'm not an expert, but they are not sitting on their hands; they're making progress. But Reddit's model is unique in this industry in that we believe the only thing that scales with users is users.
When we're talking about user-generated content, we share that burden with our users, the same way that society in the United States has agreed on unwritten rules about what is and is not acceptable to say. The same thing exists on our platform. By empowering our users to enforce those unwritten rules, it creates an overall healthy ecosystem. Ms. Oyama, your testimony discussed determining which content is allowed on your platform, including balancing respect for marginalized voices. In a system like Reddit's, downvotes affect the visibility of viewpoints; do dislikes on YouTube affect a video's visibility? Thank you for the question. As you've seen, users can give a thumbs up or a thumbs down to a video. It's one of many signals, so it alone would not be determinative of the recommendation or relevance of the video. I really appreciate your point about responsible content moderation. I do want to make a point on the opioids piece: we removed 35,000 such videos from YouTube, and we were able to do this because of 230. When content is removed, the uploader might be upset, so absent 230 there could be cases against the service provider for defamation or breach of contract. Service providers large and small are able to have those policies, and to implement procedures to identify that content and take it down, because of the provisions of 230. Okay, I have some other questions I want to submit for the record, but let me just summarize with this, because I want to stay within my time; you're going to require me to stay within my time. In the absence of regulation, as I mentioned in my opening remarks, social responsibility rises to a much higher arc. I would suggest that the entire industry, the internet and the social media platforms, had better get serious about this self-regulation, or you're going to force Congress to do something that you might not want to have done. With that, I yield back.
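Ms. Oyama's point that a thumbs-down is "one of many signals" can be illustrated with a toy ranking function: the dislike ratio contributes to a video's score but cannot by itself zero it out. Every signal name and weight below is invented for illustration; it is not YouTube's actual formula:

```python
def rank_score(watch_time_min: float, likes: int, dislikes: int,
               relevance: float) -> float:
    """Toy blend of signals; no single signal is determinative."""
    total_votes = likes + dislikes
    approval = likes / total_votes if total_votes else 0.5
    # Invented weights: relevance and watch time dominate the vote ratio.
    return (0.5 * relevance
            + 0.4 * min(watch_time_min / 10.0, 1.0)
            + 0.1 * approval)

# A heavily disliked but relevant, well-watched video can still outrank
# a well-liked but irrelevant one.
panned = rank_score(watch_time_min=12, likes=10, dislikes=90, relevance=0.9)
adored = rank_score(watch_time_min=1, likes=90, dislikes=10, relevance=0.2)
```

The design point is simply that blending several weak signals makes any one of them, including dislikes, non-determinative.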
The chair recognizes Ms. Matsui for five minutes. Thank you, Mr. Chairman. I want to thank the witnesses for being here today. Ms. Oyama and Mr. Huffman, the Senate Intelligence Committee released a bipartisan report on social media. The report found that the use of social media platforms to sow discord was widespread in the 2016 election. What role can Section 230 play in ensuring that this does not happen again? Ms. Oyama, then Mr. Huffman. 230 is important for enabling services to act against interference in elections, and protecting citizens against that interference is critical with the election cycle coming up. Across Google in the 2016 election, due to the measures we were able to take on ad removal, there were only two accounts that infiltrated our systems; they were suspended, and they had spent less than 5,000 dollars in 2016. We continue to be extremely vigilant, and there is a transparency report that requires election ads to be disclosed, showing who paid for them, and they show up in an ad library. You feel that you were effective? We can always do more, but on this issue we are extremely focused. Mr. Huffman? In 2016 we found the same fake news and disinformation on our platform that we saw on others. The difference is that on Reddit it was largely rejected by the community and the users before it came to our attention. That's one thing Reddit is good at: being skeptical, rejecting falsehoods, questioning everything, for better or worse. Between then and now, we have become dramatically better at finding groups of accounts that are working in a coordinated, inauthentic manner, and we cooperate with law enforcement. From what we've seen in the past and what we can see going forward, we're in a pretty good position coming into the 2020 election. Dr. Farid, in your written testimony you said that misinformation campaigns are poised to disrupt the upcoming election. Election interference worries a lot of people. Is there more the platforms could be doing to moderate this content online?
What more should they be doing about this issue at this time? Let me give you one example. A few months ago we saw the doctored video of Speaker Pelosi make the rounds, and the response was interesting. Facebook said, we know it's fake, but we're not in the business of telling the truth. That was not a technological problem. The video was not satire; it was not comedy; it was meant to discredit the Speaker. Fundamentally, we have to look at the rules. If you look at Facebook's rules, you cannot post things that are misleading or fraudulent, so this is a clear case where the technology worked and the policy was ambiguous. To YouTube's credit, they took it down; Twitter didn't even respond. In some cases there is a technological issue, but more often than not the platforms are simply not enforcing the rules that are in place. That's a decision that they made? Okay. Ms. Oyama, what do you think about what Dr. Farid just said? I'll respond. There are two aspects to this. First, specifically on Reddit: we have a policy against impersonation, so a video used on our service to mislead people would be treated as disinformation. But a video like that also raises questions about the veracity of the things we see, and prompts important discussion, so the decision about whether it stays up or comes down on Reddit is a difficult one. I will observe that we are entering a new era in which video can be manipulated. Historically, we could manipulate text, and images with Photoshop, but not video. Not only do the platforms have a responsibility, but as a society we have to understand that the source of material and its publication history are critically important. There will come a time, no matter what my detectors say, when we will not be able to detect that sort of thing. Exactly. Ms. Oyama, you have a few seconds. You mentioned we do have a policy against deceptive practices, but there is ongoing work that needs to be done to better identify these deepfakes.
They can sometimes be used in political contexts in ways that could undermine democracy, so we have released data sets to researchers so that technology can better detect what has been manipulated, in support of those policies. I have a lot more to say, but you know how this is. Anyway, I yield back the rest of my time. Thank you. The gentlewoman yields back, and we now recognize Mr. Kinzinger. The last line of questioning touched on one of the things at issue: our ability to have free speech and share opinions online can also be something that poses a real threat. I thank the chairman for yielding, and I think it's safe to say that no member of Congress has a finished plan for what to do about Section 230 of the Communications Decency Act, but we can agree that this hearing is warranted. We need to have a discussion of intent regarding which companies should enjoy these liability protections, and let me state upfront that I appreciate the efforts of certain platforms over the years to remove and block unlawful content. I also say it's clearly not enough, and I think the status quo is unacceptable. It's been frustrating for me in recent years to see these platforms used by criminals to defraud people on social media, and this goes back ten years. The scams are increasingly aggressive. I brought this up not only in the hearing with Mark Zuckerberg last year, but again this summer, asking what Facebook will do to protect its users. Sources indicate that in 2018, hundreds of millions of dollars in losses to these scammers were reported, including 143 million dollars through romance scams. Many people have asked why it isn't more important for platforms to verify user authenticity. Mr. Huffman and Ms. Oyama: do your platforms verify the authenticity of user accounts? Thank you for the question. Again, two parts to my answer. The first is on the scams themselves.
My understanding is that you're referring to scams that target veterans in particular. We have a number of veterans' communities built around support and shared experiences, and like all of our communities, they create their own rules. These communities have created rules around fundraising in general, because members of those communities know that they can be targeted by this scam in particular. That is the kind of nuance that highlights the power of our community model: as a non-veteran, I might not have that same intuition. In terms of what we know about our users: where Reddit differs from our peers is that we don't require users to share their real-world identity with us. We know where they registered from, what IP address they use, and maybe an email address, but we don't force them to reveal their full name or their gender. This is important because on Reddit there are communities that discuss sensitive topics in that same vein: drug addiction communities, or communities for parents who are struggling with being parents. There are things a person would not post on a platform like Facebook, such as saying "I don't like my kids." I don't mean to cut you off, but I want to get to Ms. Oyama. Very sorry to hear that that happened to you, Congressman. On YouTube we have a policy against impersonation: if you were to see a channel impersonating you, there is a way you could submit a complaint, with a government ID, and that would result in the channel being struck from search. Spam can show up across the web, and in the index of the web; to return relevant information to our users, every single day on search we suppress 19 billion links that are spam, including spam meant to defraud users. And there's something called a risk engine that can kick fraudulent accounts out of the system. What I'm not upset about is "Kinzinger is the worst congressman ever"; that's understandable for some people.
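The "risk engine" Ms. Oyama mentions is not publicly specified; a hedged sketch of the general idea is an account scorer that sums several weak fraud signals and suspends past a threshold. Every signal name, weight, and threshold below is an assumption for illustration only:

```python
def risk_score(account: dict) -> float:
    """Sum of heuristic risk signals; names and weights are illustrative."""
    score = 0.0
    if account.get("age_days", 0) < 7:
        score += 0.3                      # brand-new account
    if account.get("messages_per_hour", 0) > 50:
        score += 0.4                      # machine-like send rate
    if account.get("reports", 0) >= 3:
        score += 0.4                      # repeatedly reported by users
    return score

def should_suspend(account: dict, threshold: float = 0.7) -> bool:
    """Kick an account out once its combined risk passes the threshold."""
    return risk_score(account) >= threshold

fraudster = {"age_days": 1, "messages_per_hour": 120, "reports": 5}
regular = {"age_days": 400, "messages_per_hour": 2, "reports": 0}
```

Requiring multiple signals to fire before suspension is what keeps a single false report from silencing a legitimate account, the failure mode Ms. McSherry warned about earlier.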
But in the case I raised, as one example among many, a woman flew from India, spending her entire life savings, because she thought they had been dating for a year; not to mention all the money that she gave to the perpetrator, and all these other stories. One of the most important things is that people need to be aware that if somebody has been stringing them along over a long period without ever authenticating who they are, that person is probably not real. Ms. Peters, what are the risks of not being able to trust other users online? There are multiple risks, but I want to come back to the key issue: if it's illicit, the sites should be required to hand over data to law enforcement and to work proactively with law enforcement. We've heard a lot today from the gentleman from Reddit about better moderation. Some of our members were able to go online the other day and type in a search for "buy fentanyl online," and it came up with many results; same for "buy Adderall online," or "buy Adderall cheap without prescription." I'm not asking for a super-high bar on your platforms; it doesn't seem too hard to automatically redirect that to a site that would advise you to get counseling for drug abuse. We are not trying to be the fun police; we are trying to protect people from organized crime and this kind of activity. I yield back, and my other questions I will submit. Thank you. The gentleman yields back. I want to say that I think he is the worst member of Congress. [laughs] I don't even think you're at the very bottom here, Adam; you're not that guy. We now recognize Ms. Castor for five minutes. Thanks to all of our witnesses for being here today. I'd like to talk about the issue of 230 and a horrendous tragedy in Wisconsin a few years ago involving Armslist.com, where a man walked into a salon where his wife was working, shot her dead in front of their daughter, and killed two others. And then he killed himself. This is the type of horrific tragedy that's all too common in America today.
I [inaudible] think you misspoke, because you said that it was all legal, but it wasn't: two days before the shooting, there was a temporary restraining order issued against that man. He went online shopping on Armslist.com two days after it was issued, and the next day he went on his murder spree. What happened is that Armslist knows they have domestic abusers and felons and terrorists shopping for firearms, and yet they're allowed to proceed with this. Earlier this year, the Wisconsin Supreme Court ruled that Armslist was immune, even though they know that they are facilitating illegal content and these kinds of tragedies. The Wisconsin Supreme Court said they were immune because of Section 230. They basically said it did not matter that Armslist actually knew, or even intended, that its website would facilitate illegal firearms sales; Section 230 still granted immunity. Ms. Peters, as you've highlighted, this is not an isolated incident. We're talking about child sexual abuse content and illegal drug sales; it has gone way too far, and I appreciate that you will propose some solutions for this. Professor Citron, you highlighted a safe harbor where, if companies use their best efforts to moderate content, then they would have some protection. But how would this work in reality? Would this be left up to the courts? This type of liability, it seems to me, speaks to the need for very clear standards from Congress. Yes, it would. Thank you so much for the question. How would we do this? It would be in the courts? It would come up on the initial motion to dismiss: the company, whoever is being sued, would have to show that it engaged in reasonable content moderation practices writ large, not with regard to any one piece of content or activity. And it is true that it would be a forcing mechanism, through this motion in federal court, to have companies explain what constitutes reasonableness.
I think we can come up with some basic threshold of what we consider reasonable content moderation, what I have described as technological due process: accountability, having a process, and clarity about what it is you prohibit. And it's going to be case by case, context by context, because what's a reasonable response for a dating app might not work for deepfakes. It is going to be different from the kind of advice I would give to Facebook, Twitter, and others about what constitutes a threat and how to figure that out. Thinking about Dr. Farid's testimony: it wouldn't be in the public interest, when content is explicitly illegal, for that to wind up as an issue of fact in a lawsuit. What do you think, Dr. Farid? Illegal content online really shouldn't be a debatable question? I'm a mathematician by trade, so I don't know why you're asking me legal questions, but I completely agree with you. What we've seen over the years, and what we saw building PhotoDNA, is that the technology companies get muddled up in the gray areas. We had conversations about child abuse material: what happens when it's an 18-year-old, what happens when it's sexually explicit but ambiguous. Those are complicated questions. But there is really clear-cut bad behavior: people doing awful things to kids as young as two months old. I'll just highlight for the witnesses the issue of the number of moderators needed to go through this content; The Verge has run a horrendous story about Facebook moderators, because one of those workplaces is in Tampa, Florida. I'm going to submit follow-up questions about the moderators and some standards for that practice, and I encourage you to answer. Thank you, and I yield back. The gentlewoman yields back, and the chair recognizes the gentleman from Illinois. I'm sorry I missed a lot of this because I was upstairs, but in my 20 years of being a member, I've never had a chance to address the same question to two different panels on the same day.
It was an interesting convergence: upstairs we were talking about vaping and underage use of that product. I was curious, because in the opening statements here someone, and I apologize, someone mentioned two cases: one platform's case was dismissed because it really did nothing, and the one that tried to be the good actor got slammed. I don't know about slammed, but I see a couple of heads nodding. Ms. Citron, can you address that first? You're nodding the most, very enthusiastically. Those are the two cases that prompted the rise of Section 230, and as Section 230's co-author Chris Cox has explained, it was this pair of decisions under which, if you do nothing, you're not going to be punished for it, but if you try to moderate, it heightens your responsibility. No good deed goes unpunished. And we're all in heated agreement about that today, in many respects. Let me tie this to what's going on upstairs. If someone uses a platform to encourage underage vaping, with unknown nicotine content, and the platform then decides to clean it up, under the way the law stood before, this good deed, which most would agree is probably a good deed, would get punished. Now we have Section 230, and that's why we have Section 230: platforms are encouraged to moderate, and as long as they're doing it in good faith, under Section 230(c)(2), they can remove that content. That is the benefit of it. And is there a fear, in the debate we had earlier, where we had comments from some of our colleagues on the USMCA debate, that including 230-style language there, or removing it, would cause us to fall back to a regime in which the good-deed person could be punished? Is that correct? Everyone is shaking their heads, mostly. Ms. Peters, you're not; go ahead. Just turn your mic on. We need to keep the 230 language out of the trade agreements. It is currently an issue of debate here in the United States, and it's not fair to put it into trade agreements; that will make reform impossible, or at least harder.
Don't get me wrong: I want to see this agreement pass as soon as possible, without any encumbrance, but that hasn't happened. I'm not a proponent of trying to delay this process; I'm trying to work through this debate. And the concern upstairs, for those of us who hold it, is that these products have not been approved by the FDA, and that black-market operations will use these platforms to sell to underage kids. That would be how I would tie these two hearings together. During the Facebook hearing a couple of years ago, I referred to a book called The Future Computed, which talks about the ability of an industry to set its own standards. Industries do this across the board, whether it's engineering or equipment standards: a body comes together for the good of the whole and says, here are our standards. The fear is that if this sector doesn't do that, then heavy-handed government will do it for them, which will cause a few more problems. Dr. Farid, you're nodding. We say we have to do better, because if we don't do it, someone else will do it for us, and that would be a nightmare. Part of that book talked about fairness, reliability, transparency, and accountability. I would encourage those in the industry who are listening to help us move in that direction on our own, before we do it for them. With that, Mr. Chairman, I yield back my time. The gentleman yields back, and the chair recognizes the next member for five minutes. This is very interesting testimony, and jarring in some ways. Ms. Peters, your testimony is particularly jarring. Have you seen any authentic offers of weapons of mass destruction for sale online? I have not personally, but we certainly have members of our alliance who track weapons activity. What is more concerning to me, anyway, is the number of illegal groups, from [inaudible] designated groups to al-Qaeda, that maintain web pages, link to their Twitter and Facebook pages, and run fundraising campaigns.
They are interested in the weapons of mass destruction issue. It is inside those closed groups, which are the epicenter of illicit activity, and it is hard for us to get inside them. We've had undercover operations to get inside some of them. Dr. Farid, you talk about the tension in tech companies between the motivation of maximizing the amount of time users spend on their platforms on the one hand, and content moderation on the other. Can you talk about that briefly, please? We've talked about 230, but there is another tension point here, which is the underlying business model today: it is not to sell a product; you are the product. In some ways, that's where a lot of the tension is coming from, because the metrics these companies use for success are how many users they have and how long those users stay on their platforms. You can see why that is fundamentally in tension with removing users and removing content. The business model is an issue, and the way we deal with the privacy of user data is also an issue here. If the business model is monetizing user data, then I need to feed you information, and that's why I call it the rabbit-hole effect. There is a reason why, if you start watching certain types of videos, of children, or conspiracies, or extremism, you are fed more and more and more of that content down the rabbit hole. There is real tension there; it is the bottom line, and it's not just ideological; we're talking about profits. Would you like to add to that? Yes, thank you. On many of these issues that we're discussing today, whether it is harassment or extremism, it's important to remember the positive potential of the internet. On YouTube, we've seen counter-messaging in a program called Creators for Change, where creators are able to make content for youth that counters these extremist messages. It's also good to remember that Section 230 was born out of this committee and out of this policy, and it is relevant for foreign policy and for the U.S. economy as well: these free markets are responsible for the 172-billion-dollar trade surplus that the United States has in digital services, and 230 is critically important for small businesses to be able to moderate content and to resist censorship from regimes abroad. It is hard to restrain yourself to brief answers, and I understand that. Companies could be doing more today, within the current legal framework, to address problematic content. I'd like to ask each of you, very briefly, what you think could be done today with the best tools to moderate content. Very briefly, please. For us, the biggest challenge is evolving our policies to meet new challenges. As such, we've evolved our policies dozens of times and will continue to do so into the future. For example, recent ones include expanding our harassment policy and our policy on deepfake pornography. "Deepfake" wasn't even a word a few years ago, but there will be new challenges in the future, and being able to address them is really important; 230 gives us the space to adapt to these challenges. What it enables is responding to changing threats, and that is not going to change. We can have a checklist right now, and I would encourage companies not only to have policies but to be clear about them, as we have urged. Now to Dr. McSherry. The issue, for me, with a reasonableness standard is that it's terrifying: as a practical matter, it means a lot of litigation risk for small businesses as everyone tries to figure out what counts as reasonable. To your question: one of the crucial things we need, if we want better moderation practices and users not to be treated just as products, is to incentivize alternative business models. We need to make sure that we clear space for competition, so that when a given site is behaving badly, such as Grindr, people have other places to go with other practices. If other sites are encouraged to develop and evolve, market forces can push practices in the right direction and make that work.
My time is now up, so I will yield to the gentleman from Indiana. Thank you, Mr. Chairman, and thank you so much for this very important hearing. Dr. Farid, to set the record straight, the reason I am asking these questions is that I am a former U.S. Attorney and was involved in the Crimes Against Children Task Force. We did a lot of work from 2001 to 2007. Deepfake pornography was not a term at that time. We certainly know that law enforcement has been challenged for decades in dealing with child pornography, and yet I believe we have to continue to do more to protect children here and all around the globe. A tool called PhotoDNA was developed a long time ago to detect illegal online child pornography, and yet it means nothing to that illegal activity when platforms do not do anything about it. We have been dealing with this now for decades. This is not new, and we now have tools like PhotoDNA. Is it a matter of tools, or of effort? How is it that this is still happening? Dr. Farid? It is a source of incredible frustration. We built PhotoDNA back in 2008 with Microsoft. For an industry that prides itself on rapid and aggressive development, there have been no tools in the last decade that have gone beyond PhotoDNA, and that is pathetic. It is truly pathetic that, when we are talking about this kind of material, the industry prides itself on using ten-year-old technology to combat some of the most gut-wrenching and heartbreaking content online. It is totally inexcusable. It is not a technological limitation; we are simply not putting the effort into developing these tools. Let me just share that we have watched some of these videos, and it is something that you never want to see and cannot get out of your mind. I am curious, Ms. Oyama, how is it that we are still at this place? Thank you for that question. At Google, that is not true at all. We have never stopped working on and prioritizing this.
We can always do better, and we constantly adopt new technologies. We had one of the first tools, which enabled us to create digital fingerprints of this imagery and prevent it from ever being uploaded to YouTube, and we now have a new tool, the Content Safety API, that we are sharing with others in the industry and with NGOs; some of these efforts have resulted in increased detection of this type of content, and it is going to continue to be a priority. I want to be clear: from the very top of our company, we believe the internet needs to be a safe and secure place for parents and children. We will not stop working on this issue. I am very pleased to hear that there have been advances and that you are sharing them; that is critically important. I will say that an Indiana State Police captain who testified before the Commerce Committee recently told me that some of the internet companies he works with are what he called minimally compliant. He says that internet companies do not preserve content that can be used for law enforcement investigations, and that their systems automatically flag content for review without checking whether it is truly objectionable or not. Do any of you have thoughts specifically on his comments? He is an expert. Do any of you have thoughts on how to balance this critical law enforcement need? They are saving children all around the globe. Ms. Peters? Without restricting companies' immunity for hosting concerning content, I feel that if companies faced punitive damages every time there was illicit content, we would see a lot less illicit content very quickly. If it is illegal in real life, it should be illegal to host it online. That is a very simple approach that I think we can apply worldwide. I have a question. I asked Mark Zuckerberg this relative to terrorism and to recruitment by ISIS, and we need to be more concerned about ISIS.
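The digital fingerprinting that Dr. Farid and Ms. Oyama describe can be illustrated with a toy perceptual hash. PhotoDNA itself is proprietary, so this sketch substitutes a simple "average hash" purely to show the underlying pattern: reduce an image to a compact, perturbation-tolerant fingerprint, then compare uploads against a database of fingerprints of known illegal imagery. All function names and thresholds here are illustrative assumptions, not the real algorithm.

```python
def average_hash(pixels, size=8):
    """Reduce an image (2D grid of grayscale values) to a 64-bit hash.
    Stand-in for a robust perceptual hash like PhotoDNA (illustrative only)."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // size, w // size  # assumes dimensions divide evenly
    # Downsample by block-averaging to a size x size grid.
    grid = [
        sum(pixels[y][x]
            for y in range(r * bh, (r + 1) * bh)
            for x in range(c * bw, (c + 1) * bw)) / (bh * bw)
        for r in range(size) for c in range(size)
    ]
    mean = sum(grid) / len(grid)
    # Each bit records whether a cell is brighter than the overall mean,
    # so small pixel-level changes leave most bits intact.
    return sum(1 << i for i, v in enumerate(grid) if v > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(hash_value, known_hashes, max_distance=5):
    """Flag an upload whose hash is near any hash of known illegal imagery."""
    return any(hamming(hash_value, k) <= max_distance for k in known_hashes)
```

Because the comparison tolerates a few flipped bits, a re-encoded or slightly edited copy of a known image still matches, which is the property that lets platforms block re-uploads rather than only exact duplicates.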
I understand that you have teams of people taking this on. How many people are on your teams dedicated to removing content? Writing our policies and enforcing them, about 20 percent of our company, about 100 people. Ms. Oyama? More than 10,000 people working on content. That actually remove content? That are involved in content moderation policy. How many people are on the team that actually does that work? I am happy to get back to you. Thank you, and I yield back. The lady yields. I would like to introduce a document for the record, without objection. The chair recognizes the gentlewoman from New York, Ms. Clarke, for five minutes. I thank our chairman and our chairwoman and ranking members for having this subcommittee hearing today, and for fostering a healthier internet to protect consumers. I introduced the first House bill on deepfake technology, the DEEPFAKES Accountability Act, which would regulate fake videos. Deepfakes could be used to impersonate political candidates, create fake revenge porn, and threaten the very notion of what is real. Ms. Oyama and Mr. Huffman, what are the implications of Section 230 for your deepfake policies? I will go first. Thank you for the question. Along with most of our peers, and around the same time, we prohibited deepfake pornography on Reddit, because we saw it as a new emerging threat and wanted to get ahead of it as quickly as possible. The challenge here is the one you raise, which is the increasing difficulty of detecting what is real and what is not. This is where we believe our model actually shines: by empowering our users and communities to scrutinize every piece of content, not only videos and images but news and other resources. I believe very strongly that we as a society, and not just the platforms, must develop defenses against this manipulation. It is only going to increase. Thank you. On YouTube, our overall policy is a policy against deceptive practices, and there are instances where we have seen these deepfakes.
The Speaker Pelosi video is one example, where we identified it and it was removed from the platform. For Search and for YouTube, surfacing authoritative, accurate information is our business and our long-term business objective. I would agree with what Mr. Huffman said. One of the things that we are doing is investing deeply in the academic side, the research side, and the machine learning side, opening up data sets of these deepfakes so we can get better at identifying them, and we also have a revenge porn policy for people victimized by that, which we have expanded to include synthetic images. Ms. Citron, can you explain how Section 230 figures in the revenge porn problem? The activities that we have just heard about on YouTube are precisely the kinds of proactive activities we want against clear illegality. The real problem is the outliers: a recent report found that eight out of the ten biggest porn sites host deepfake videos, and there are sites now whose business model is deepfake sex videos, and the overwhelming share of those videos involve people who never consented. Section 230 provides them immunity. Does the current immunity structure reflect the unique nature of this threat? No. Section 230 at its best incentivizes the nimbleness that we are seeing in these dominant platforms, but the way the language is written, under Section 230(c)(1), immunity does not turn on being responsible and reasonable. You have these outliers that cause enormous harm, because any search of your name can surface a deepfake video that has been indexed and is findable, and people will contact you; it is terrifying for victims. It is outlier companies whose business model is abuse, and Section 230 is what they point to when they tell victims, too bad, so sad. That is the problem. Very well. One of the issues that has become an existential threat to civil society is hate speech and propaganda on social media platforms.
Ms. Oyama, if these protections were removed, would it change platforms' incentives around moderating such speech? Thank you for the question; this is a really important area that shows the power of Section 230. There are First Amendment restrictions on government regulation of speech, so there is additional responsibility for service providers like us in the private sector. We have a policy under which incitement to violence is prohibited and hate speech is prohibited, including speech targeting specific groups for attributes such as race, religion, or veteran status. The takedowns that we do every single quarter, with automated flagging and machine learning, are lawful and possible because of 230. When we take down content, it is someone's content being taken down. Any service provider, big or small, can be sued for defamation over such removals, and the small-business interest here is really important, because I think small sites would say they are even more deeply reliant on this flexibility to find new ways to identify bad content and take it down without fear of unmitigated litigation or legal uncertainty. Thank you very much; I yield back, Madam Chairman. The gentlelady yields back, and Mr. Walberg, you are recognized for five minutes. I thank the chairwoman, and I appreciate the panel being here. Today's hearing and the issues at hand have been discussed here for a long time. The internet is an amazing tool; it has brought about great innovation and connected millions of people in ways we never thought of before. Truthfully, we look forward to what we will see in the future, but these are issues we have to wrestle with. I was pleased to invite Haley from our district to the State of the Union as my guest, to highlight her good work in my district and elsewhere to combat cyberbullying. She is a compassionate individual who understands what young people are going through.
She is having a real impact in high schools and colleges now with her experience, trying to make something positive out of it after she attempted suicide as a result of cyberbullying; thankfully, the attempt was not successful. She has shined a light on this, and it is important. Mr. Huffman and Ms. Oyama, what is your company doing to address cyberbullying? Thank you for that question, Congressman. Just two weeks ago, we updated our policies around harassment. It is one of the most complex and nuanced challenges we face, and it appears in many forms. One of the big changes we made is to allow harassment reports from third parties, so that when somebody sees an instance of harassment, they can report it to our team to investigate. This matters across our platform, because people come to us in times of need. For example, a teenager struggling with their own sexuality, who may have nowhere else to turn, not even friends or family, can come to a platform like ours to talk to others in difficult situations, as can people who are having suicidal thoughts, and it is our first priority, working with lawmakers on this initiative, to make sure that these people have a safe experience on Reddit. We will continue to do so in the future. Ms. Oyama? Thank you for the question. Harassment and cyberbullying are prohibited, so we enforce our policies either through automated detection, human flagging, or community flagging, which identifies such content so we can take it down. Last quarter we removed 35,000 videos under our policy against harassment and bullying. I want to echo Mr. Huffman's perspective that the internet and content sharing can be valuable for victims of harassment; we see that all the time, when someone who is isolated in their school can reach out to someone in another state, which has really helped a lot, and we want to continue to invest in important educational resources and content like that.
I am glad to hear that you are both able to continue helping us as we move forward in this area. Google's ad network has, in the last few years, sometimes served ads tied to illegal activity. You have come a long way on illegal activity, but once you identify such activity, why not just take down the content? That question is for Ms. Oyama. In our ads systems we have a risk engine to catch illegal content; there are many different policies, and more than two billion ads every year are stricken from the network for violating those policies. So you have taken them down. Yes, absolutely, before they are able to hit any page. Bad actors move very quickly online, and we want publishers and users to feel that our network and our platforms are safe; we want good ads matched to good content. Google offers a feature, Content ID, that will automatically flag uploads matching copyrighted material, but publishers must agree to Google's terms for this technology to be applied to their content. Why is Content ID offered only on those terms rather than freely? Thank you for the question. I think there may be a misperception here. Content ID is our copyright management system, and I think every leading publisher is part of it. It is part of our Partner Program and offered for free; it does not cost them anything, and it is in fact a revenue generator. We have paid out three billion dollars based on claims that rights holders made on copyrighted material. We were able to take the majority of the revenue associated with that content and send it back to them. That system of identifying protected content, and then letting rights holders set controls so that it is either monetized or, in the case of violent extremism, absolutely blocked, powers much of this. Thank you, I yield back. The gentleman yields back, and Mr. Loebsack, you are recognized for five minutes. Thank you, Madam Chair. I want to thank Chair Schakowsky and the two ranking members of the committee for holding this hearing today.
I want to thank the witnesses for your attendance as well; it has been very informative, even if you have not been able to answer all the questions. This is not the first time our committee has examined social media, a force for information and human connection, which we all enjoy when those connections are positive, but also a vehicle for criminality. I think everyone here is an expert in your field, and I appreciate hearing from you all today as we consider how Section 230 has been interpreted by the courts since its initial passage, what the changes should be, and the fact that there is a lot to consider as we discuss the full scope of what Section 230 covers, from cyberbullying and hate speech on Facebook, YouTube, and elsewhere, to illicit transactions in substances or weapons. The question today is twofold. First, we must ask whether content moderators are doing enough, and second, we must ask whether congressional action is required to fix these challenges. The second question has been referred to by some of you, and by some of us, but I think that is essentially the real question we are facing today. We have some difference of opinion on where Section 230 reform should focus. I would like to ask everyone the same question, which may seem the easiest to answer but is the most difficult, because it is seemingly vague: what does the difference between good and bad content moderation look like? Start with you, Mr. Huffman. Thank you, Congressman, for that philosophically impossible question. I think there are a couple of easy answers that I hope everyone on this panel would agree with. No content moderation at all is a problem; that was the situation in the pre-230 era of perverse incentives. There are many forms of good content moderation, and what is important to us at Reddit is twofold.
One is empowering our users to set standards and enforce them amongst themselves, which is the only truly scalable solution, and the second is what 230 provides us, which is the ability to look at our platforms, to investigate, and to use some common sense and nuance. As for what makes content moderation good or bad: there is a difference between good and bad content moderation, and that is what we are talking about. Of course, that raises the question of why we are here, which is the harms, and why we should talk about changing Section 230. What is bad, and incredibly troubling, is when sites are permitted to have an entire business model built on abuse and harm. In my mind, that is the worst of the worst, and they are induced by this immunity. The problem is how to deal with that problem. I have some answers for you that would go a long way toward doing that; I described in my testimony an approach that would deal with the bad samaritans, and a broader approach as well. Thank you for the question. I think it is a great question. As someone whose primary goal is supporting civil liberties online, I would say good content moderation is precise and transparent. What we see far too often is that, in the name of content moderation and making the internet safer for everybody, valuable and lawful content is taken down. We have details in our submitted testimony, and one example is the Syrian Archive, videos attempting to document war atrocities that are often flagged as violating terms of service because they contain horrible material. The point of those videos is actually to support political conversations, and it is very difficult for service providers to tell the difference. Thank you. Ms. Peters? If it is illegal in real life, it should be illegal online. Content moderation ought to focus on illegal activity. I think there has been little investment in technology that would improve this on the platforms, because of Section 230. Thank you.
I realize I am out of time; I am sorry I had to press so quickly, but I would like to get a response in writing from the rest of the witnesses, if I could. Thank you so much, and I yield back. The gentleman yields back, and I recognize Mr. Carter for five minutes. Thank you, Madam Chair, and thank you all for being here. I know that you all understand how important this is, and I hope that you all take it seriously. Thank you for being here and for participating. Ms. Peters, I would like to ask you: in your testimony, you pointed out that there is clearly quite a bit of illegal conduct that online platforms are still hosting. For instance, illegal pharmacies where you can buy pills without a prescription; terrorists profiteering off of artifacts; products from endangered species; and it gets even worse. You mentioned the sale of human remains and child exploitation, gross things, if you will. How much effort do you feel the platforms are putting in to containing this and stopping it? It depends on the platform, but that is a very good question. I would like to respond with a question to the committee: when was the last time anybody here saw a pornographic picture on Facebook? They can keep genitalia off of these platforms; they can keep drugs off these platforms; they can keep child sexual abuse off these platforms. The technology exists; these are policy issues, whether it is the policy on allowing the doctored video of Nancy Pelosi or the policy on allowing pictures of human genitalia. Let me ask you this: do you ever go to them and meet with them and express this to them? Absolutely. How are you received? We are typically told that the firm has intelligent people working on it, that they are creating AI, and that in a few years it is going to work. What we have presented is evidence specific to identifiable crime networks, and we have been told they will get back to us, and they do not. It has happened multiple times. Are you ever told that they do not want to meet with you? We have usually gotten meetings.
Do you feel like you have a good relationship, and do you feel like the effort is being put forth? I do not feel it is being put forth. That is the struggle. I am doing my best to keep the federal government out of this. I do not want to stifle innovation; I am concerned about that. But at the same time, look, we cannot allow this to go on; it is irresponsible. If you do not do it, you are going to force us to do it for you, and I do not want that to happen. It is as clear as that. Let me ask you, Ms. Peters: you mentioned in your testimony that you received funding from the State Department to study wildlife supply chains, and that is when you discovered there is a retail market for endangered species on platforms like Facebook and WeChat. Have any of these platforms made a commitment to stop this? If they have, is it working? Is it getting any better? That is a terrific example. A number of tech firms have joined a coalition with the World Wildlife Fund and have taken a pledge to remove endangered species content and wildlife markets from their platforms by 2020. I am not aware that anything has changed. We have researchers going online and logging wildlife markets all the time. I will be fair; I will let Google, I am sorry, I cannot see that far, I will let you respond to that. Do you feel like you are doing everything you can? Thank you. We can always do more, and we are committed to always doing more. I appreciate that, but I know that; you do not need to tell me that. Do we have a plan in place to fix and stop this? Let me tell you what we are doing in the two categories you mentioned. Wildlife trafficking and the sale of endangered species are prohibited in Google ads. On the national epidemic that you mentioned, opioids, we are committed to playing our part in combatting this epidemic. There is an online component and an offline component. On the online component, research shows that less than 0.05 percent of misused opioids come from the internet, and what we have done with Google Search is that the FDA can send us a warning if they see a link in Search to a rogue pharmacy, and we will delist it from Search. There is an important offline component too, so we work with the DEA on prescription take-back days, featuring drop-off locations on Google Maps with partners like CVS, and we are happy to come in and discuss it. I invite you to do just that, okay? I would like to talk with you further about this. Mr. Huffman, I will give you the opportunity, because my staff has gone on Reddit and they have searched, if you will, for illegal drugs, and it comes up. I suspect you will tell me the same thing: we are working on it and we have it under control, but it is still coming up. You will get a slightly different answer. First of all, it is against our rules to have controlled goods on our platform. It is also illegal, and 230 has no effect on criminal liability. If you went to any technology service with a search bar, including your own email, and searched your spam folder, you would find spam; there is a lag between something being submitted and being removed by filters, so it comes in first and is then removed. That is how the system works. With that said, we do take this issue very seriously, and our technology continues to improve along these lines. That is exactly the sort of ability that 230 gives us: the ability to look for this content. To the extent that your staff has found content that is still on our platform, we would be happy to follow up on the specifics. My son is grown, but I feel like a parent pleading with their child again: please do not make me have to do this. Thank you, Madam Chair, and I yield back. The gentleman yields back, and now I recognize Congresswoman Kelly for five minutes.
Thank you, Madam Chair, and thank you for holding this hearing on Section 230 and a more consumer-friendly internet. Section 230 allows companies to moderate content under its Good Samaritan provision, yet this law seems to be widely misunderstood. The Good Samaritan provision of 230 was intended to protect good-faith actions to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not that material is constitutionally protected. Congress amended Section 230 through FOSTA to make platforms liable for certain activity related to sex trafficking, and some have criticized that law for being too ambiguous. In addition to my work on this committee, I chair the House Tech Accountability Caucus, and in that capacity I engage stakeholders on keeping users safe in a manner that still allows innovators to innovate. As we look to foster a more consumer-friendly internet, I hope that our discussion will set the standard for doing so in a responsible and effectively balanced way. Professor Citron, in your testimony, you propose giving platforms immunity from liability only if they show that their content moderation practices, writ large, are reasonable. As the chairman referenced, how would companies know where the line is, whether they are doing enough? The genius of reasonableness is that it depends on the context. There is a baseline assumption, a default, of what constitutes reasonable content moderation practices, which begins with having them at all. Some sites do not engage in moderation at all and in fact encourage illegality, but there are baselines; in the academic work that I have done with companies, there is a set of policies and best practices, and it is going to change over time. Depending on the challenge, you are going to have different approaches to different newly evolving challenges. That is why reasonableness preserves the liability shield but conditions it on those efforts.
Would you agree that changes need to be made to ensure that harm does not come from ambiguity? What is disappointing, as someone who helped some offices work on the language, is that when you include the language knowingly facilitate, you create the moderator's dilemma: to sit on your hands or to be overly aggressive. I think it is a disappointment, which is unfortunately how it came out. We find ourselves back in those initial cases, and we are seeing wildly overaggressive responses to sexual expression online, which is a shame, and we also see sites doing nothing, so I hope we do not repeat that. The way people communicate is changing, as we all know; content starts on one platform, jumps to another, and goes viral very quickly. The 2016 election showed how disinformation can spread and how effective it can be at manipulating and deterring populations. Content is shared in groups and then out to a wider audience. Ms. Peters, what do you believe is the responsibility of tech companies to monitor content that is rapidly spreading before being flagged by users? I believe that companies need to moderate and remove content when it concerns clearly illegal activity. If it is illegal in real life, it ought to be illegal to host online. Drug trafficking, human trafficking, wildlife trafficking, serious organized crime, and designated terror groups should not be given space to operate on our platforms. I also think that Section 230 should provide more opportunities for civil claims, and for state and local law enforcement to have legal tools to respond to illicit activity. That is one of the ways it should be changed. Ms. Oyama and Mr. Huffman, what do you do to stop the spread of extremist content that is shared widely, when flags come up on the same content shared a few thousand times? Thank you for the question. On YouTube, we are using machines and algorithms, and once violent extremist content is identified, our technology prevents it from being re-uploaded. That raises a really important point.
Working across platforms in collaboration, the example would be the Global Internet Forum to Counter Terrorism, which shares hashes across member companies; the leading tech companies are part of that. One of the things we saw during the Christchurch shooting was the value of the crisis protocol that was put into place; later there was a shooting in Germany, where a piece of content appeared on Twitch, and because companies were able to engage as the content spread across services, the shared hashes enabled all of us to block it. I am now out of time. The gentlelady yields back, and now Mr. Shimkus is recognized for five minutes. I appreciate it very much. My question is for Dr. McSherry. I understand the position of EFF has included supporting 230-like language in trade agreements for the purpose of protecting the statute domestically. Is the intent of including such 230-like language in trade agreements to ensure that we may not revisit this statute? No. Okay. All right, thank you very much. What I would like to do, Madam Chair, is ask that a letter from EFF dated January 23rd, 2018, to the chairman be entered into the record. Without objection, so ordered. Thank you, Madam Chair, I appreciate it. My next question is for Mr. Huffman and Ms. Oyama, if that is okay. Thank you. In April 2018, I questioned Mr. Zuckerberg about how soon illegal opioid ads would be removed from his website. His answer was that the ads would be reviewed when they were flagged by users as being illegal or inappropriate. This is a standard answer in the social media space. However, Mr. Zuckerberg also said at the time that, quote, industry needs to build tools that proactively go out and identify ads for opioids before people have to flag them for us to review, end quote. This would significantly, in my opinion, cut down the time an illegal ad stays on the website. Again, Mr.
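The cross-platform hash sharing described above can be sketched in a few lines. This is a toy model of a GIFCT-style shared database, an illustrative assumption rather than the real system: production systems share perceptual hashes with metadata, while this sketch uses exact SHA-256 digests purely to show the contribute-and-check flow between platforms.

```python
import hashlib

class SharedHashDatabase:
    """Toy model of a cross-industry hash-sharing database.
    One platform contributes the hash of flagged content; every
    participating platform can then block re-uploads of that content."""

    def __init__(self):
        self._hashes = set()

    def contribute(self, content: bytes) -> None:
        # In practice a perceptual hash would be shared, not a raw digest.
        self._hashes.add(hashlib.sha256(content).hexdigest())

    def is_known(self, content: bytes) -> bool:
        return hashlib.sha256(content).hexdigest() in self._hashes
```

The design point is that no platform shares the content itself, only a fingerprint, so a video first flagged on one service can be blocked at upload time everywhere else during a crisis.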
Huffman and Ms. Oyama, it has been a year and a half, and this is a devastating epidemic; people are dying, and I am sure you agree with this. Has the industry been actively working on artificial intelligence flagging that will identify illegal ads, and when can we expect implementation of the technology? Whoever would like to go first is fine. Reddit is a little bit different from our peers, because our ads go through a strict human review process to ensure not only that they are on the right side of our content policy, which prohibits the buying and selling of illegal substances, but also of our much stricter ads policy, which has a much higher bar to cross. We do not want ads that cause any sort of controversy on the platform. Okay. We have to be proactive as far as this is concerned, as Mr. Zuckerberg indicated. These kids are dying, people are dying, and we just cannot stand by and let this happen while people have access to opioids and other types of drugs. Ms. Oyama, would you like to comment on this? Thank you. I agree with your comment about the need for proactive efforts. On Google ads, we have a risk engine that helps us identify an ad as bad, and through that system in 2018 we kicked 2.3 billion ads out of our system for violating our policies. Any pharmacy that wants to show up in an ad is also independently verified by an independent group called LegitScript, and opioids specifically are a controlled substance under federal law. There is also a lot of important work that we have done with the DEA and the FDA and pharmacies like CVS offline, to help promote things like drug take-back days, where people can drop off opioids so they are not misused later on. One of the things we have seen is that the vast majority of opioid misuse happens in the offline world.
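The proactive review Ms. Oyama describes, an ads "risk engine" that blocks pharmacy-related ads from unverified sellers, can be illustrated with a minimal rule-based sketch. The keyword list, function name, and verified-pharmacy set are all illustrative assumptions; Google's actual engine relies on machine learning and the independent LegitScript verification program, not a keyword match.

```python
# Illustrative keyword list; a real system would use trained classifiers.
PRESCRIPTION_TERMS = {"oxycodone", "fentanyl", "opioid", "prescription"}

def review_ad(text: str, advertiser: str, verified_pharmacies: set) -> str:
    """Reject drug-related ads unless the advertiser is on an
    independently verified pharmacy list (hypothetical policy sketch)."""
    words = set(text.lower().split())
    if words & PRESCRIPTION_TERMS and advertiser not in verified_pharmacies:
        return "rejected"   # drug-related ad from an unverified seller
    return "approved"
```

The point of the structure is that the check runs before the ad is ever served, which is what makes the approach proactive rather than dependent on user flags.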
It starts with a doctor prescribing, or a family member or a friend, and we use technology to educate and inform people who might potentially be victimized by this, which is equally important to some of the work we do on ads. How about anyone else on the panel, would you like to comment: is the industry doing enough? I do not think the industry is doing enough. We have seen open drug sales taking place on Google Groups, Instagram, and Facebook groups, and the groups on these platforms are the epicenter; that is why the industry has to get better at this. If they leave it up to users to flag, and it is a private or secret group, that will simply not happen. These firms know what users are up to well enough to sell them things; they can figure this out. Can I also add that there are two issues here: there are the ads, and there is the native content. You heard Ms. Peters say that she went on this morning and searched on Reddit and that content is there, and the same goes for Google Search; there are two places you have to worry about, not just the ads. Very good. Thank you, Madam Chair, and I yield back. The gentleman yields back, and I now recognize the chairman of the full committee for five good minutes. Thank you, Madam Chair. I want to start with Ms. Oyama. In her testimony, she talks about the community guidelines for hate speech on YouTube, and hate speech and abuse are on the rise on social media platforms. How does Section 230 incentivize platforms to moderate such speech, and how does Section 230 treat platforms that take a hands-off approach to removing hate speech, if you will? Thank you so much for the question.
On the category of hate speech, YouTube's community guidelines prohibit it. We have a very clear policy: speech that incites violence, or speech that is hateful against groups on the basis of specific attributes — race, religion, sex, age, disability status, veteran status — is prohibited. That content can be detected by machines: more than 87 percent of the removals we made in the last quarter were first flagged by machines rather than by users, and we saw a five-x increase in the amount of content our machines were able to find and remove. Those removals are vitally dependent on the protection Section 230 gives providers to moderate content, fight bad content, and take it down. We do face claims when we remove speech — people sue us for defamation and bring other legal claims — and 230 is what enables not only Google or YouTube but any site where users comment and generate content, any site on the internet large or small, to moderate that content. I think we would encourage Congress not to harm the good actors, the innocent actors that are taking these steps, in an effort to go after the bad ones: criminal activity is fully exempted from 230, bad actors should be penalized, law enforcement played a role in Backpage being taken down, and there is platform liability when a platform itself breaks the law.

Dr. Farid, the internet has been used to further international terrorism, and there are criminal and civil liabilities under the antiterrorism laws. But I want to start with Dr. McSherry: Section 230 does not apply to federal criminal law, so how do companies use Section 230 to shield themselves from civil liability when their platforms are used to spread propaganda?

There are ongoing cases in which platforms have been accused of violations for hosting certain kinds of content, and they have invoked Section 230 in those cases quite successfully.
I think on the facts of a lot of those cases that is quite appropriate. The reality is that it is very difficult to draw the line between content that is protected political speech and content that steps over the line. These cases are hard and complicated and need to be resolved on their facts. Section 230's protections also create a space in which service providers can moderate under their own policies.

Let me go back to Dr. Farid: how do you see this progressing from a technological perspective?

When you hear about the moderation that Google and Reddit do, it has come from intense pressure — from advertisers, from Capitol Hill, from the EU, from the press. There were bad news stories about ads running against terrorism-related content from Syria; we were struggling with extremism and terrorism online and had hit a hard wall. The EU applied pressure, advertisers applied pressure, and we started getting responses. That is exactly what this conversation is about: the underlying problem is that the process is not working on its own, and change comes from outside avenues. The modest changes that limited pressure has produced are in the right direction, and the good news from Ms. Oyama is that there are good actors, and we should encourage the change and deal with the problems they are dealing with. I have been in this fight for over a decade now, and there is a very consistent pattern: companies minimize the extent of the problem, say the technology does not exist, and only under pressure start to make change. We should skip to the end of that pattern and start doing better. Thank you, Madam Chair.

The gentleman yields back, and I now recognize Mr. Gianforte for five minutes.

Thank you, Madam Chair. About 20 years ago I tried to improve customer service with RightNow Technologies, from a spare bedroom in our home.
We eventually grew that business into one of the largest employers in town, with about 500 high-wage jobs. The platform we created served about eight billion unique visitors. I understand how important Section 230 can be for small business. But its important liability shield can get mixed up with complaints about which content gets moderated, and I will cite one particular case. The Missoula-based Rocky Mountain Elk Foundation reached out to my office because Google had denied one of their advertisements. The foundation did what it had done many times before and tried to use paid advertising on a global network to promote a short video about a father hunting with his daughter. This time, however, the foundation received an email from Google, and I quote: "Any promotion about hunting practices, even when they are intended as a healthy method of population control or conservation, is considered animal cruelty and deemed inappropriate to be shown on our network." The day I heard about this I sent a letter to Google, and you were very responsive, but the initial position taken was absurd. Hunting is a way of life in Montana and many parts of the country. I am very thankful that you worked quickly to reverse that decision, but I remain very concerned about efforts to stifle the advocacy of the foundation, and other groups have faced similar efforts to shut down their advocacy. We do not know how many hunting ads Google has blocked in the last five years. In my March letter I invited Google's CEO to meet with leaders of our outdoor recreation businesses in Montana, and I have not heard anything back. Ms. Oyama, I extend the invitation again in person. Frankly, I think it would help Google to get out of Silicon Valley, come to Montana, sit down with some of your customers, and hear from them directly about the things that are important to them.

I would be happy to host a visit and would love to meet you there.
I think it is important to understand that these groups work to further conservation and to help species thrive. As an avid hunter and outdoorsman myself, I know many businesses in Montana focus on hunting and fishing. I worry that they may be denied the opportunity to advertise on the largest online platform, which you built, to your credit. I also worry that overburdensome regulatory regimes could hurt small businesses and stifle Montana's rapidly growing high-tech sector. The invitation is open. Dr. Farid, I have one question for you: how can we walk the line between protecting small business and overburdening it with regulation?

It is the right question to ask. In the conversations in the EU and the UK, where they are talking about regulation, they distinguish the small platforms from a service with three billion users, and we should tread very lightly with the small ones. They also make the point that we want to inspire competition toward better business models, and there are mechanisms to do that.

We have had a lot of discussion today about the efforts being taken against criminal activity, but as a follow-up, Doctor: how do we ensure that content moderation does not become censorship and a violation of the First Amendment?

The way we have been thinking about content moderation is as a collaboration between humans and computers. What computers are very good at is doing the same thing over and over again. What they are still not good at is nuance, subtlety, and context. In content moderation, for example in the child-safety space, a human moderator says this is a child and this is sexually explicit; we fingerprint that content so it can be removed wherever it reappears. We developed that technology a decade ago, and today it operates against some 50 billion uploads — that is the scale we need to be operating at.
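The fingerprint-and-match workflow Dr. Farid describes can be sketched in a few lines. This is an illustrative toy, not any platform's implementation: production systems use perceptual hashes (such as PhotoDNA, which Dr. Farid helped develop) that survive resizing and re-encoding, whereas this sketch uses an exact SHA-256 hash purely to show the flow — a human reviews an item once, and machines block every re-upload after that.

```python
import hashlib

# Hypothetical blocklist of fingerprints of content already confirmed
# illegal by human moderators. Real systems store perceptual hashes;
# SHA-256 here only illustrates the lookup mechanics.
blocklist = set()

def fingerprint(content: bytes) -> str:
    """Reduce content to a fixed-size fingerprint for fast set lookup."""
    return hashlib.sha256(content).hexdigest()

def moderator_confirms_illegal(content: bytes) -> None:
    """Human review happens once; the fingerprint then blocks all re-uploads."""
    blocklist.add(fingerprint(content))

def allow_upload(content: bytes) -> bool:
    """Automated check applied to every upload, at any scale."""
    return fingerprint(content) not in blocklist

moderator_confirms_illegal(b"known bad image bytes")
print(allow_upload(b"known bad image bytes"))  # False: re-upload blocked
print(allow_upload(b"unrelated image bytes"))  # True: passes through
```

The design point this illustrates is the division of labor in the testimony: expensive human judgment is spent once per item, and the cheap automated match is what runs billions of times.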
For this part of the technology to work it has to be highly automated and operate at very high scale; humans cannot do it on their own and computers cannot do it on their own, so we need both — and we need more from the moderators. Ten thousand moderators against 500 hours of video uploaded per minute is not a healthy ratio; those moderators would have to review hours and hours of video every hour, so we need to beef up our moderation.

Ms. Oyama, I look forward to seeing you in Montana. I yield back.

The gentleman yields back. I now recognize Congresswoman Blunt Rochester for five minutes.

Thank you, Mr. Chairman, for holding this important hearing. I think many of us here today are seeking to more fully understand how Section 230 of the Communications Decency Act fits into an ever-changing technological world. This hearing is really significant, and as Ms. Oyama said, I do not want us to forget the important things the internet has provided us, from movements to applications like TikTok. But also, as Mr. Huffman said — you applied it to Reddit, but I think it applies to all of us — we must constantly be evolving our policies to face new challenges while also balancing our civil liberties. We have a very important balance to strike here. My questions surround content moderation. I want to start with the use of machine-learning algorithms and artificial intelligence trained on content posted on sites as large as YouTube. Technological solutions are important given the amount of content there is to moderate. But as we become more and more reliant on algorithms, we are increasingly finding blind spots and gaps that are difficult to reach — bias simply embedded in their code. There is a real concern that groups already facing prejudice will be further marginalized and censored. As I thought about this, I thought about groups like veterans, and the African American community in the 2016 election.
Dr. Farid, can you describe some of the challenges with algorithmic bias?

When we automate at this scale we will have problems, and we have already seen that. We know that face recognition performs worse on people of color, and the broader problem with automatic moderation is that it does not work at scale. If you are talking about billions of uploads and your algorithm is 99 percent accurate — which is very good — you are still making one mistake in a hundred, and that is literally tens of millions of mistakes a day at the scale of the internet. The underlying idea that we can fully automate and not take on the responsibility of human moderators — I fear that "give us time with the AI algorithms" is what gets said when companies do not want to hire human moderators. This will not work in the next year, five years, or ten years, because it also assumes that an adversary does not adapt. We know adversaries adapt: the algorithms that classify content are vulnerable to adversarial attacks, in which a small perturbation of the content can fool the systems.

I want to ask a quick question of Mr. Huffman and Ms. Oyama. Both of you talked about the number of human moderators you have available. I know we have had many hearings on the challenges of diversity in the tech field. I am assuming, Mr. Huffman, that yours are more from the user perspective. In terms of the moderators — the people that you hire, and the 10,000 or so that you mentioned — are these people you hired, or are they users? Just quickly, so everybody knows — or is it a combination?

For Reddit, it is about 100 employees out of 500, with millions of user moderators as well.

The 10,000 I mentioned are full-time employees; they specialize in reviewing flags, which can come from the community, from law enforcement, or from an average user.

I do not have a lot of time, but could you provide us with information on the diversity of your moderators? That is one of my questions.
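Dr. Farid's arithmetic about accuracy at scale can be made concrete. The volume figure below is illustrative, not a platform statistic: a classifier that is 99 percent accurate, applied to a hypothetical billion decisions a day, still mislabels ten million items daily.

```python
# Illustrative numbers only: even a highly accurate classifier produces
# an enormous absolute number of errors at internet scale.
uploads_per_day = 1_000_000_000   # hypothetical daily decision volume
accuracy = 0.99                   # 1 mistake per 100 decisions

mistakes_per_day = uploads_per_day * (1 - accuracy)
print(f"{mistakes_per_day:,.0f} misclassifications per day")  # 10,000,000
```

This is why the testimony keeps returning to human review: the error rate that sounds excellent as a percentage is unmanageable as a count.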
Also, I do not like to make assumptions, but I am going to assume that it will be a challenge to find a diverse population of individuals for this role; what are you doing in that vein? Will you follow up with me on that? My last question is for the panel: what should the federal government be doing to help in this space? I am really concerned about the capacity to do this, and to do it well, so I welcome any suggestions and recommendations. Mr. Farid is pushing his button.

I think this conversation is helping, and the scrutiny it puts on the companies is a good thing.

Thank you so much to all of you, and for the work you are doing.

The gentlewoman yields back, and I will now recognize, last but not least, Representative Soto for five minutes.

Thank you all for being here. I am the last one, so you are in the home stretch. It is amazing that we are here today when we think about how far the internet has progressed: one of the greatest inventions in human existence, connecting the world, connecting billions, giving a voice to those whose stories might never otherwise be told, providing knowledge at our fingertips. It is just incredible. Section 230 has been a big part of it — the safe harbor holding back the lawsuits created that innovation — but it has also created a breeding ground for defamation and harassment, for impersonation and election interference, from white supremacy to global terrorism and other extremism. A wonderful gift to humanity on one side, and all of these terrible things on the other. My biggest concern is that lies spread faster than the speed of light while the truth seems to travel at a snail's pace. That is one thing I constantly hear from my constituents. I want to start with your opinions on who should be the cop on the beat. With the choices being the FCC, the FTC, or the courts — if those are the only three options, and with a bit of a short answer — what do you think?

United States society.
And who do you think should be the cop on the beat?

Of those options, I think the courts are the best, and society will be the norm producers.

Dr. McSherry?

A cardinal principle for the EFF is that at the end of the day users should be able to control their internet experience; we need to give users many more tools.

I think that is a ridiculous argument. I study organized crime, so I will answer the question: the courts and law enforcement. Most people are good, and statistically a small percentage of people in any community commit crime; we have to control for it.

Ms. Oyama?

Content moderation has been a multi-stakeholder effort, but I would note that the courts and the FTC have jurisdiction — the FTC has broad jurisdiction, and the courts are already looking at Section 230.

Dr. Farid?

We all have a responsibility.

Since you raised the courts, it would be great to hear from you, Dr. Farid: if we had injunctive relief available, do you think that would be enough, and should attorney's fees be part of it?

I am not a policy maker, so with all due respect I do not think I should be the one answering that question.

Ms. Oyama and Mr. Huffman, would relief in the courts change certain behaviors?

I think courts do have the power of injunctive relief today. For small businesses, a framework that created that kind of uncertainty would have a chilling effect on content moderation and expand their liability.

Thank you. Mr. Huffman?

A similar answer. I shudder to think what would happen if we were on the receiving end of those lawsuits.

Dr. McSherry, on injunctive relief — what should we be looking at?

When you say injunctive relief, what I see is the First Amendment problem of prior restraint, so I think we need to be careful about the kinds of remedies we consider. The law already operates to hold people liable when they act unreasonably and recklessly, but I think the full array of possibilities should be available.
Lastly, I want to talk a little bit about Section 230 being incorporated into our trade deals. In a land where a fictional mouse and a fictional wizard are two of our greatest assets, Ms. Peters, I know you have talked a little bit about the issue of including Section 230 in trade deals. How could that be problematic, given that intellectual property is so critical?

It is problematic because it is going to tie Congress's hands, and that is precisely why the industry is pushing for it — there are pages of language the industry is insisting go into U.S. trade agreements.

Section 230 would be treated the same as U.S. law, and it does not tie Congress's hands at all.

If we adjusted the law here, would that affect the trade deals?

There is no such language in the trade deals. Trade agreements regularly have provisions, like those on climate; there is nothing in a trade deal that prevents the U.S. from changing its own framework, and countries like China and Russia are developing their own frameworks. There is nothing in the free trade agreement that would limit Congress's ability to amend 230 should it decide to later on.

Thank you; I yield back.

The gentleman yields back, and that concludes our period of questioning. I seek unanimous consent to enter into the record: a letter from CreativeFuture; a letter from the American Hotel and Lodging Association; a letter from the Consumer Technology Association; a letter from the Travel Technology Association; a white paper from Airbnb; a letter from Common Sense Media; a letter from the Computer and Communications Industry Association; a letter from Representative Ed Case; a letter in support of the PLAN Act; a letter from the i2Coalition; a letter to the FCC from Representative Gianforte; a letter from TechFreedom; a letter from the Internet Association; a letter from the Wikimedia Foundation; a letter from the Motion Picture Association; an article from The Verge titled "Searching for Help"; and a statement from R Street. Without objection, so ordered.

Let me thank our witnesses.
I think this was a really useful hearing. Those of you who have suggestions on the issues that came up today — our committee would appreciate them very much, and I am sure the joint committee would as well. I want to thank all of you for your thoughtful presentations and for your written testimony, which so often went well beyond what we were able to hear today. I want to remind members that, pursuant to the committee rules, they have ten business days to submit additional questions for the record to be answered by the witnesses who have appeared, and I ask the witnesses to respond promptly to any questions they receive. At this time, the committees are adjourned. Thank you.

