Today we welcome representatives from the world's largest social media companies and online platforms. We hear from Ms. Monika Bickert, head of global policy management for Facebook; Mr. Nick Pickles, public policy at Twitter; Mr. Derek Slater, global director of information policy at Google; and Mr. George Selim, senior vice president of programs for the Anti-Defamation League. Platforms have dramatically changed the way we communicate and have been used positively for like-minded groups to come together and to shed light on despotic regimes throughout the world. No matter how great the benefit to society these platforms provide, it is important to consider how they can be used for evil at home and abroad. On August 3rd, 2019, 20 people were killed and more than two dozen injured in a mass shooting at an El Paso shopping center. Police have said that they are reasonably confident that the suspect posted a manifesto to a website called 8chan 27 minutes prior to the shooting. 8chan moderators removed the original post, though users continued sharing copies. Following the shooting, President Trump called on social media companies to work in partnership with local, state and federal agencies to develop tools that can detect mass shooters before they strike. I certainly hope we talk about that challenge today. Sadly, the El Paso shooting is not the only recent example of mass violence with an online dimension. On March 15, 2019, 51 people were killed and 49 were injured in shootings at two mosques in Christchurch, New Zealand. The perpetrator filmed the attacks using a body camera and live streamed the footage to his Facebook followers, who began to re-upload the footage to Facebook and other sites. Access to the footage quickly spread, and Facebook stated that it removed 1.5 million videos of the massacre within 24 hours of the attack; 1.2 million of those were blocked before they could be uploaded. Like the El Paso shooter, the Christchurch shooter also uploaded a manifesto to 8chan. The 2016 shooting at the Pulse nightclub in Orlando, Florida, killed 49 and injured 53 more. The Orlando shooter was reportedly radicalized by ISIS and other jihadist propaganda through online sources. Days after the attack, the FBI director stated that investigators were highly confident that the shooter was self-radicalized through the internet. According to an official involved in the investigation, analysis of the shooter's electronic devices revealed that he had consumed, quote, a hell of a lot of jihadist propaganda, unquote, including ISIS beheading videos. Shooting survivors and family members of victims brought a federal lawsuit against three social media platforms under the Anti-Terrorism Act. The Sixth Circuit dismissed the lawsuit on the grounds that this was not an act of international terrorism. With over 3.2 billion internet users, this committee recognizes the challenge facing social media companies and online platforms in their ability to act and remove content threatening violence from their sites. There are questions about the tracking of a user's online activity: does this invade an individual's privacy, thwart due process or violate constitutional rights? Automatic removal may also impact an online platform's ability to detect possible warning signs. Indeed, the First Amendment offers strong protections against restricting certain speech. This undeniably adds to the complexity of our task.
I hope these witnesses will speak to these challenges and how their companies are navigating them. In today's internet-connected society, misinformation, fake news, deep fakes and viral online conspiracy theories have become the norm. This hearing is an opportunity for witnesses to discuss how their platforms go about identifying content and material that threatens violence and poses a real and potentially immediate danger to the public. I hope our witnesses will also discuss how their content moderation processes work. This includes addressing how human review or technological tools are employed to remove or otherwise limit violent content before it is posted, copied and disseminated across the internet. Communication with law enforcement officials at the federal, state and local levels is critical to protecting our neighborhoods and communities. We would like to know how companies are coordinating with law enforcement when violent or extremist content is identified. And finally, I hope witnesses will discuss how Congress can assist in ongoing efforts to remove content promoting violence from online platforms and whether best practices or industry codes of conduct in this area would help increase safety both online and offline. So I look forward to hearing testimony from our witnesses. I hope we engage in a constructive discussion about potential solutions to a pressing issue. And I'm delighted at this point to recognize my friend and ranking member, Senator Cantwell. Across the country we are seeing and experiencing a surge of hate. As a result we need to think much harder about the tools and resources we have to combat this problem both online and offline. While the First Amendment to the Constitution protects free speech, speech that incites imminent violence is not protected, and Congress should review and strengthen laws that prohibit threats of violence, harassment, stalking and intimidation. In testimony before the Senate Judiciary Committee in July, Federal Bureau of Investigation Director Chris Wray said that white supremacist violence is on the rise. He said the FBI takes this threat, quote, extremely seriously and has made over 100 arrests so far this year. In my community we suffered a shooting at a Jewish community center in Seattle, among others. And over the last year we've seen a rise in the desecration of both synagogues and mosques. The rise in hate across the country has also led to multiple mass shootings, including at the Tree of Life congregation in Pittsburgh and the Walmart in El Paso. Social media is used to amplify that hate. The shooter at one high school in Parkland posted images of himself with guns and knives on Instagram and wrote social media posts prior to the attack on his fellow students. In El Paso, the killer published a white supremacist, anti-immigration manifesto on the 8chan message board, and my colleague just mentioned the streaming of live content related to the Christchurch shooting, the horrific incidents that happened there. In Myanmar the military promoted violence against the Muslim Rohingya. These human lives were all cut short by the deep hatred and extremism that we have seen become more common. This is a particular problem on the dark web, where we see certain websites like 8chan. Adding technology and tools to mainstream websites to stop the spread of these dark websites is a start, but there needs to be an effort to ensure that people are not directed into these cesspools.
I believe calling on the Department of Justice to make sure that we are working across the board, on an international basis, with companies to fight this issue is an important thing to be done. We don't want to push people off of social media platforms only to then be on the dark web, where we are finding less of them. We need to do more at the Department of Justice to shut down these dark websites, and social media companies need to work with us to make sure that we are doing this. I do want to mention, just last week, as there's much discussion here in Washington about initiatives, the state of Washington has passed three gun initiatives by the vote of the people, closing background check loopholes and also relating to private sales and extreme risk protection laws. All were voted on by a majority of people in our state and successfully passed. So I do appreciate, just last week, representatives from various companies in the tech industry sending a letter asking for passage of bills requiring extensive background checks. Very much appreciate that, and your support of extreme risk protection laws to keep guns out of the hands of people who a court has determined are dangerous if in possession of them. So this morning we look forward to asking you about ways in which we can better fight these issues. I do want us to think about ways in which we can all work together to address these issues. I feel that, working together, these are successful tools that we can deploy in trying to fight the extremism that exists online. Thank you, Mr. Chairman, for the hearing. Thank you very, very much. Now we'll hear oral testimony from our four witnesses. Your entire statements will be submitted for the record without objection. We ask you to limit your comments at this point to five minutes. Ms. Bickert, you're recognized. Thank you for being here. Thank you, Chairman Wicker, Ranking Member Cantwell. Thank you for the opportunity to be here today and answer your questions and explain our efforts in these areas. My name is Monika Bickert and I am Facebook's manager for global policy management and counterterrorism. I'm responsible for our rules around content on Facebook and our company's response. I'd like to begin by expressing my sympathy and solidarity with everybody affected by the recent attacks across the country. In the face of such acts we remain committed to assisting law enforcement and standing with the community against hate and violence. We're thankful to be able to provide a way for those affected by this horrific violence to communicate with loved ones, organize events for people to gather in grief, raise money to help support communities and begin to heal. Our mission is to give people the power to connect with one another and to build community. But we know that people need to be safe in order to build that community. That's why we have rules in place against harmful conduct, including hate speech and inciting violence. Our goal is to ensure that Facebook is a place where people can express themselves, but where they are also safe. While we're not aware of any connection between the recent attacks and our platform, we certainly recognize that we all have a role to play in keeping our community safe. That's why we remove content that encourages real-world harm. This includes content involving violence or incitement, promoting or publicizing crime, or encouraging suicide or self-injury.
We don't allow any individuals or organizations that proclaim a violent mission, advocate for violence or are engaged in violence to have any presence on Facebook, even if they are talking about something unrelated. This includes organizations and individuals involved in or advocating for terror activity, organized hate or other violence. We also don't allow content posted by anyone that praises or supports these individuals or their actions. When we find content that violates our standards, we remove it promptly. We also disable accounts when we see severe or repeated violations. We work with law enforcement directly when we believe there is a risk of physical harm or a direct threat to public safety. While there is always room for improvement, we already remove this content at a significant scale every year. Our efforts to improve the enforcement of these policies are focused on three areas. First, building new technical solutions. Second, investing in people who can help us implement these policies. At Facebook we now have more than 30,000 people across the company working on safety and security efforts. This includes more than 350 people whose primary focus is countering hate and countering terrorism. And third, building partnerships with other companies, civil society, researchers and governments so that together we can come up with shared solutions. We're proud of the work we've done to make Facebook a less hostile place. We know bad actors will continue to attempt to skirt detection, and we're dedicated to continuing to advance our work and share our progress. We look forward to working with the committee, regulators, others in the tech industry and civil society to continue this progress. Again, I appreciate the opportunity to be here today and I look forward to your questions. Thank you. Thank you. Mr. Pickles. Twitter has publicly committed to improving the collective health, openness and civility of public conversation on our platform. Our policies are designed to keep people safe on Twitter, and they continuously evolve to reflect the realities of the world we operate in. We are working faster and investing to remove content that detracts from a healthy conversation before it's reported, including terrorist content. Tackling terrorism requires a whole-of-society response, including from social media companies. Let me be clear: Twitter is incentivized to keep violent content off our platform. Communities have been impacted by instances of mass violence with tragic frequency in recent years. These events demand a robust public policy response from every quarter. We recognize that content removal alone cannot solve these issues. Firstly, Twitter takes a zero-tolerance approach to terrorism on our service. Since 2015 we have suspended more than 1.5 million accounts for violations of our rules related to terrorism. In the majority of cases we take action at the account creation stage, before an account has even tweeted. The remaining 10 percent is identified through a combination of user reports and partnerships. Secondly, we prohibit the use of Twitter by violent extremist groups. Since the introduction of this policy in 2017, we've taken action on more than 186 groups globally and suspended more than 2,000 unique accounts. Thirdly, Twitter does not allow hateful conduct on our service. An individual on Twitter is not allowed to promote violence. Where any of these rules are broken, we will take action to remove the content and will permanently remove repeat violators from Twitter.
Fourthly, our rules prohibit the selling, buying or transacting of weapons. We'll take appropriate action on any account found to be engaged in this activity, including permanent suspension where appropriate. Additionally, collaboration with our industry peers and civil society is critically important to addressing the threats from terrorism globally. This facilitates information sharing, technical coordination and research collaboration. Twitter and technology companies have a role to play in addressing mass violence. This cannot be the only public policy response, and removing content alone will not stop those determined to cause harm. When we remove content from our platform, it moves those views into the darker corners of the internet. This content continues to migrate to less governed platforms and services. Addressing mass violence requires a whole-of-society response. We continue to work with industry peers, government institutions, law enforcement, academics and civil society to find the right solutions. Thank you for your time today. Thank you very much. Mr. Slater. Chairman Wicker, Ranking Member Cantwell. My name is Derek Slater. I would like to take a moment on behalf of everyone at Google to express our horror upon learning of the tragic attacks. While Google services were not involved in these recent incidents, we have engaged on steps we are taking to ensure that our platforms are not used to incite hate or violence. I will focus on three key areas where we are making progress to help people: first, how we work with governments and law enforcement; second, our efforts to prohibit products that cause damage or injury; and third, how we address hate and violence on YouTube. First, Google engages in ongoing dialogue with law enforcement agencies to understand the threat landscape. For example, when we have a good-faith belief that there is a threat to life or serious bodily harm made on our platform in the United States, the Google cybercrime team will report it to the Northern California Regional Intelligence Center. In turn that intelligence center quickly gets the report into the hands of police officers to respond. We are also deeply committed to working with government and the tech industry. Second, we take the threat posed by gun violence in the United States very seriously, and our advertising policies have long prohibited the selling of weapons, ammunition and similar products that cause damage, harm or injury. We also prohibit the promotion of instructions for making guns, explosives or other harmful products. We employ a number of tools to make sure that these policies are enforced. Third, on YouTube we have rigorous policies and programs to defend against the use of our platform to spread hate or incite violence. We have invested heavily in machines and people to quickly identify and remove content that violates our policies. This includes machine learning technology, hiring over 10,000 people for reviewing and removing content, an intel desk of experts that looks for new trends, and finally creating beneficial counter-speech. This work has led to tangible results. Over 87 percent of the 9 million videos we removed in the second quarter of 2019 were first flagged by our automated systems. Overall, videos that violate our policies generate a fraction of a percent of the views on YouTube. Our efforts do not end there, as we are constantly evolving new policies. The updated policy specifically prohibits videos claiming that a group is superior. We have already seen a 5x spike in removals and channel terminations on hate speech.
In conclusion, we take the safety of our users seriously and value our close relationships with law enforcement and government agencies. We want to be responsible actors and part of the solution. As these issues evolve, Google will invest in people and technology to meet the challenge. We look forward to continuing collaboration with the committee as it examines these issues. Thank you for your time. Thank you very much. Mr. Selim, your group prefers to be known as ADL these days. The Anti-Defamation League goes by ADL for short. We appreciate you being with us today and we're happy to receive your testimony. Thank you for the opportunity to be here. My name is George Selim and I serve as the senior vice president for programs at the ADL. For decades the ADL has fought against bigotry and antisemitism by exposing those who spread hate to incite violence. Today the ADL is the foremost non-governmental authority on domestic extremism and hate. I'd like to share some key data, findings and analysis, and urge this committee to take action to counter a severe national security threat: the threat of online white supremacist extremism that threatens our communities. The El Paso shooter expressed support for the accused shooter in New Zealand, who also posted on 8chan. Before the massacre in Poway, California, the alleged shooter posted a link to his manifesto on 8chan, citing the terrorists in New Zealand and the Pittsburgh Tree of Life attack. Three killing sprees, three white supremacist manifestos. One targeted Muslims, another targeted Jews, and the third targeted Latinx and other immigrant communities. One thing these three killers had in common was 8chan. Unfettered access to online platforms, both fringe and mainstream, has significantly driven the scale, speed and effectiveness of these forms of extremist attacks. Our research shows that domestic extremist violence is trending up and that antisemitic hate is trending up. FBI and DOJ data show similar trends. Immediate action is paramount to prevent the next tragedy that could take innocent lives. ADL has worked with the platforms represented at this table. We appreciate this work greatly, but much more needs to be done. ADL has called on these companies, at this hearing as well as many others, to be far more transparent. We need meaningful transparency to give actionable information to policymakers and stakeholders. The problem will not be solved by addressing these issues online alone. We urge this committee to take immediate action. First, our nation's leaders must call out bigotry in all its forms at every opportunity. Our nation's law enforcement leadership must make enforcing hate crimes laws a top priority. Our communities need this Congress's immediate action in a range of ways, notably to codify federal offices and create comprehensive reporting. Our federal legal system currently lacks the means to prosecute a white supremacist terrorist as a terrorist. Congress should explore whether it is possible to craft a rights-protecting domestic terrorism statute. In addition, the State Department should examine whether certain foreign white supremacist groups meet the criteria for designation as foreign terrorist organizations. For technology and social media companies, we look forward to companies expanding their terms of service and exploring accountability and governance challenges, aspiring to greater transparency in how you address these efforts. This is an all-hands-on-deck moment to protect all of our communities. I look forward to your questions.
Thank you. Thank you. On your platforms, how do you define violent content? How do you define extreme content? Thank you, Mr. Chairman. We will remove any content that celebrates a violent act where there is serious physical injury or death. We also will remove any organization that has proclaimed a violent mission or is engaged in acts of violence. We also don't allow anybody who has engaged in organized hate to have a presence on the site, and we remove hate speech. Hate speech we define as an attack on a person based on his or her characteristics, like race, religion or sexual orientation. Harder to define extreme than violent, isn't that correct? Yes. And we see different people use that word in different ways. What we do is, any organization that has proclaimed a violent mission or engaged in documented acts of violence, we remove them. What is your platform's definition of extreme? We agree that the word extremism itself is very subjective and in some contexts can be a positive thing. People can be extremely active on an issue and it isn't a bad thing. We have a three-stage test that identifies violent extremist groups. That test is that we identify them through their stated purpose or publications, they promote violence as a means to further their cause, and they target civilians. We've got that three-stage test because we believe that framing allows us to protect speech and debate but also remove violent extremists from our platform. Mr. Slater, can you add anything? Thank you, Chairman. Probably similar, in that we ban designated foreign terrorist organizations from using our platform, along broadly similar lines. Now, Mr. Selim has suggested that your three platforms need to be more transparent. What do you say to that, Mr. Slater? Thank you, Chairman. I think transparency is a bedrock of the work that we do, particularly around online content. In the last year on YouTube we have provided our community guidelines enforcement report, where you can go and see how many videos we removed in a quarter, and we break that down. I think this is a really key issue and we look forward to continuing to improve. Perhaps you could help them understand how you frankly don't believe they're quite transparent enough at this point? Thank you for your question. To be clear, the point I'm making on transparency is to make sure there are more clearly delineated categories between the point that Mr. Slater was making, in terms of what the machines or algorithms flag, and what users on any of these platforms go on to say: we think this is a violation of the terms of service. There are degrees of inconsistency across the platforms at this table and others. To get a holistic picture of what a certain issue may be, while individuals may flag content so that some gets pulled down, there are different consistencies in that. We're looking for a much more balanced approach across all the platforms. Mr. Pickles, is he touching on something that has a point? Absolutely. I think the balance for companies investing in technology, and understanding how the technology found the content, is very important. We now publish a breakdown of six policy areas and the number of user reports we receive. It's about 11 million reports every year. But telling that story in a meaningful way is absolutely a challenge. What's that percentage at Facebook? When it comes to violent content and terror content, more than 99 percent of what we remove is flagged by our technical tools. By artificial intelligence?
Some of it is artificial intelligence, some of it is image matching of known content. We've worked with these tools for years, and I think transparency is key. For the past year and a half we have published not only our detailed implementation guidelines for exactly how we define hate speech and violence, but also reports on exactly how much we're removing in each category and how much of that is actually flagged by our technical tools before we get user reports. Thank you very much. Senator Cantwell. What do you think we need to do to monitor incitement on 8chan and other dark websites? So I think you can really approach this issue from two categories. There are a number of increased measures, some of which I noted in my written statement submitted to this committee, that these companies as well as others can take to create a greater degree of transparency and standards so we can have a really accurate measure of the types of hatred and bigotry that exist online. We can make better policies. I think having good data is a framework. You're saying there's more they can do? Yes, ma'am. I look at your statement. You include auditing and third-party evaluation for that transparency as well as responsibility. As I mentioned in my opening statement, does this basically then drive all of this to a dark web that we have less access to? What more do you think we should be doing together to address the hate that is taking place on these darker websites too? So, a number of measures. I mean, the first is having our public policy start from a place where we're victim-focused. We know, whether it's Pittsburgh or any of the number of cities that other panelists have mentioned in their statements, we need to start taking measures that combat extremism or domestic terrorism to prevent other such tragedies. I think we can make better policy and programs at the federal, state and local levels, and also in private industry as well. One of the reasons I'm definitely going to be calling on the Department of Justice to ask what more we can do in this coordination is that several years ago Interpol, Microsoft and others worked on trying to address, on an international basis, child pornography, to try to better skill law enforcement at policing crime scenes online. I would assume that the representatives today would be supportive and maybe helpful, maybe even financially helpful, in trying to address these crimes as they exist today as hate crimes on the dark side of the web. Do I have any responses from our tech companies here? This is something that across the industry we've been working on for the past few years, in a manner very similar to how the industry came together against child exploitation online. We launched the Global Internet Forum to Counter Terrorism as a way of getting industry to create sort of a no-go zone for this terrorist and violent content. As part of that, we train hundreds of smaller companies on best practices and we make technology available to them. The reality is, for the bigger companies, we often are able to build technical tools to stop videos at the time of upload. We now have 14 companies involved in a hash sharing consortium so we can stop content at the time of upload. I agree there is more you can do on your own sites. Setting that aside for a minute, what do you think we should do about 8chan and the dark websites? I can tell you what we do on Facebook, Senator, which is we ban any link that connects to 8chan, where these manifestos have appeared.
These were not available through Facebook. I'm saying, what more do you think, in government and law enforcement working together, besides what you do, to address this? Anybody else? I think, to follow up, certainly if there's criminal activity happening on these platforms, a law enforcement response is primary. If people are promoting violence against individuals and committing criminal offenses, a law enforcement intervention is something I think should be looked at. If we can strengthen, as industry, our cooperation with law enforcement, we can make sure the information sharing is as strong as it needs to be. You think we need more law enforcement resources addressing this issue? I think it's a question of resources. There was a paper from George Washington University last week looking at some of the statutory framework around these issues. I definitely believe you need more law enforcement resources on this issue, and I look at what progress we made with Interpol and the tech industry fighting on other issues. I think this is something more resources would help. Thank you all, Mr. Chairman. Thank you. Senator Fischer. Thank you, Mr. Chairman. In June Senator Thune held a subcommittee hearing on persuasive design, whether it's through predictions of the next video to keep us watching or what content to push to the top of our news feeds. I think we have to realize that when social media platforms fail to block extremist content online, this content doesn't just slip through the cracks. It's amplified. And it's amplified to a wider audience. And we saw those effects during the Christchurch shooting. The Facebook Live broadcast was up for an hour; that was confirmed by the Wall Street Journal before it was removed. And it gained thousands of views during that time frame. Ms. Bickert, how do you concentrate on how your algorithms boost content while still getting dangerous content off the platform? You touched on that a bit in your response, but how are you targeting solutions to address that specific tension that we see? Senator, thank you for the question. It's a real area of focus. And there are three things that we're doing. Probably the most significant is technological improvements, which I will come back to in a second. Second is making sure we are staffed to very quickly review reports that come in. So the Christchurch video, once that was reported by law enforcement, we were able to remove it within minutes. That response time is critical to stopping the virality you mentioned. We have hundreds of safety and civil society organizations. If they are seeing something, they can flag it for us through a special channel. Going back to the technology, briefly. With the horrific Christchurch video, one of the challenges was that our artificial intelligence tools did not spot violence in the video. What we are doing going forward is working with law enforcement agencies, including in the U.S. and the UK, to try to gather videos that could be helpful training data for our technical tools. And that's just one of the many efforts we have to try to improve these machine learning technologies so that we can stop the next viral video at the time of upload, the time of creation. When you said law enforcement, is that reciprocal? Do you see something show up and then you, in turn, try to get it to law enforcement as soon as possible so individuals can be identified? What's the working relationship there? Absolutely, Senator. We have a team that is our law enforcement outreach team.
Any time that we identify a credible threat of imminent harm, we will reach out proactively to law enforcement agencies. We do that regularly. Also, when there is some sort of mass violence incident, we reach out to them even if we have no indication that our services were involved at all. We want to make sure the lines of communication are open and they know how to submit emergency process to us. We respond around the clock. Every minute is critical in this type of situation. I'm a former prosecutor myself, so these things are very personal to me. I know that the platforms that are represented today have increased your efforts to take down this harmful content. But as we know, there are still shortfalls that exist in getting that response made, not just in a timely manner but in a way that is going to truly have an effect. Mr. Slater, when it comes to liability, do media platforms need more skin in the game, so you can ensure better accountability and be able to incentivize some kind of timely solution? Thank you, Senator, for the question. I think if you look at the practices that we are all investing in, certainly looking from our perspective and the way we are getting better over time, the current legal framework strikes a reasonable balance. In particular, it both provides protection from liability that would otherwise go too far, that would be overbroad, and acts as a sword, not just a shield, giving us the legal certainty we need to invest in the technologies and the people to monitor, detect, review and remove this sort of violative content. The legal framework continues to work well. Can you comment on this as well? Do you think there is enough legal motivation for social media platforms to prioritize some kind of solutions out there? I think that's what this hearing is about, to find solutions so we can curb that online hate that I think continues to grow. When thinking through the issues of content moderation, the authorities that exist within the current legal frameworks, which reside within the companies represented at this table, are sufficient for them to take action on issues of content moderation, transparency, reporting, et cetera. So there certainly is a degree of legal authority that affords these companies, as well as others, the opportunity to take any number of measures. Ms. Bickert, in your testimony, you said Facebook Live will ban a user 30 days for a first-time violation of its platform policies. Is that enough? Can users be banned permanently? Would that be something to look at? Senator, thank you for the question. One serious violation will lead to a temporary removal of the ability to use Live. However, if we see repeated serious violations, we simply take that person's accounts away. That is something we do across the board, not just with hate and inciting content but with other content as well. Thank you. Thank you, Senator Fischer. Senator Blumenthal. Thank you, Mr. Chairman. Thank you all for being here today. Thank you for outlining the increased attention and intensity of effort that you are providing to this very profoundly significant area. I welcome that you are doing more and trying to do it better. But I would suggest that even more needs to be done. And it needs to be better. And you have the resources and technological capability to do better. And just to take the question that Senator Fischer asked of you, Mr. Selim, about incentives, your answer was that they have authority to provide them with opportunities.
The question is, really, don't they need more incentives to do it more and to do it better, to prevent this kind of mass violence that may be spurred by hate speech on the sites or may, in fact, actually be a signal of violence to come? And I just want to highlight that 80 percent of all perpetrators of mass violence provide clear signals and signs that they are about to kill people. That is the reason the senator and I have a bipartisan measure to provide incentives for more states to adopt extreme risk protection order laws that will, in fact, give law enforcement the information they need to take guns away from people who are dangerous to themselves or others. And that is so critically important to prevent mass violence, suicides and domestic violence, and the keys and information and signals often appear on the internet. Just this past December in Monroe, Washington, a clearly troubled young man made a series of antisemitic rants online. He bragged about planning to, quote, shoot up an expletive school, end quote, in a video while armed with an AR-15-style weapon, and on Facebook posted that he was, quote, shooting for 30 Jews. Fortunately, the ADL saw that post. It went to the FBI. And the ADL's vigilance prevented another Parkland or Tree of Life attack. Fred Guttenberg of Coral Springs, Florida, met with me yesterday. He told me about a similar incident involving a young man in Coral Springs who said he was about to shoot up the high school there, and law enforcement was able to forestall it using an extreme risk protection order statute. So my question is to Facebook, Twitter and Google: what more can you do to make sure that these kinds of signs and signals involving references to guns are identified? It may not be hate speech, but it is references to possible violence with guns or the use of guns. Thank you, Senator Blumenthal. One of the biggest things we can do is engage with law enforcement to find out what is working in our relationship and what isn't. That dialogue over the past years has led to us establishing a portal through which they can submit requests for content and we can respond very quickly. What are you doing proactively? I apologize for interrupting, but my time is limited. Proactively, what are you doing with the identification of signs and signals that somebody is about to use a gun in a dangerous way? Senator, we are now using technology to identify any of those early signs, including gun violence but also suicide or self-injury. You report it to law enforcement? We do. In 2018, we referred a number of cases of suicide or self-injury, where we detected them using artificial intelligence, to law enforcement so they were able to then intervene and, in many cases, save lives. We have a similar approach. When people are at risk, we work with the FBI to ensure they have the information they need. Mr. Slater? Thank you, Senator. Similarly, when we have a good-faith belief of a threat, we will proactively refer it to the Northern California Regional Intelligence Center, which will fan it out to the authorities. Because my time has expired, I'm going to ask each of you, if you would, please, to give me more details in writing as a follow-up on what identification and signs you use, what kind of technology, and how you think it can be improved, assuming that Congress approves, as I hope it will, the emergency risk protection order statute to provide incentives for more than just the 18 states that have them now, but others to do the same. Thank you. Thank you, Senator Blumenthal. Senator Thune.
Thank you all for being here today. Your participation in this hearing is appreciated as we continue the oversight of the tasks each of your companies face while seeking to responsibly manage and thwart those who use your services to spread extremist and violent content. Last Congress, when we held a hearing looking at propaganda online, we discussed the cross-sharing of information between Facebook, Microsoft, Twitter and YouTube, which allowed each of those companies to identify potential extremism faster and more efficiently. So I would just direct this question and ask: how effective has that shared database of hashes been? Senator, thank you for the question. Through the shared database, we have 200,000 distinct hashes of terror propaganda. And that has allowed, and I can speak for Facebook only, that has allowed us to remove a lot more than we otherwise would have been able to do. I would just add, since that hearing, actually, the reassuring thing is we don't just share hashes now. We have grown that partnership. So we share URLs. If we see a link to something like a manifesto, we share it across industry. After Christchurch we recognized we needed to improve. We now have real-time communications in a crisis, so industry can talk to each other in real time operationally, to share even not content-related but situational awareness. That partnership between industry now also involves law enforcement. That wasn't there when I think we had the hearing last. So I think it is not just about the hash program but broadening out new programs that are developing the work further. Yeah. I think broadly I would say, look at how we have been improving over time. Surely the systems are not perfect. We're always going to have to evolve to deal with bad actors. On the whole, we are doing a better and better job, because of information sharing, in removing this sort of content before it has wide exposure of any sort or before it is viewed widely. Senator, I would only add that the threat environment that we are in today as a country has changed and evolved in the last 24 to 36 months. Likewise, the tactics and techniques that these platforms, as well as others, use to address the evolving nature of the terrorist landscape online, foreign or domestic, need to keep pace with the threat environment that we are in today. So, just as a follow-up, are there similar partnerships among your companies, as well as the smaller platforms, to specifically identify mass violence? Senator, one of the things we have done over time is expand the mandate of the Global Internet Forum to Counter Terrorism. So we relatively recently expanded it to include mass violent incidents. And we are now sharing, both through our crisis incident protocol and hash sharing, a broader variety of incidents. Mr. Slater, YouTube's automated recommendation system has come under criticism for steering users towards violent content. Earlier this year I led a subcommittee hearing on the use of persuasive technologies on internet platforms, including algorithmic content selection. I asked the witness that Google provided for that hearing several specific questions for the record that were not thoroughly answered. And I would just say that providing complete answers to questions members submit for the record is essential as we work to look together as partners to combat many of the issues discussed here today. So I would like your commitment to provide thorough responses to any questions you might get for the record. Do I have that?
Certainly, Senator, to the best of our ability. I would like to explore the nexus between persuasive technologies and today's topic. Specifically, what percentage of YouTube video views result from YouTube automatically suggesting or playing another video after the user finishes watching a video? Senator, I don't have a specific statistic there, but I can say the purpose of our recommendation system is to show people videos they may like that are similar to what they have watched before. At the same time, we recognize this concern about recommendations for borderline content, that is, content that maybe isn't removed but brushes right up against those lines. And we have introduced changes this year to reduce recommendations for those sorts of borderline videos. Okay. If you could get the number. I assume you have that somewhere. That's got to be available. And furnish it for the record. The question again is to ask you specifically, what is YouTube doing to address the risk of some of these features, which you note are pointing a user in the direction of increasingly violent content? Yes. And that change we made in January to reduce recommendations has been key. It's still early days, but it is working well. We have reduced views by 50 percent just since January. As the systems get better, we hope that will improve, and I'm happy to discuss it further. Thank you. Thank you, Mr. Chairman. Thank you, Senator Thune. Based on presence at the gavel, we have Senator Blackburn, followed by Senator Scott. Senator Blackburn. Thank you, Mr. Chairman. And I want to thank each of you for being here this morning and for talking with us. This committee has looked at this issue of the algorithms and their utilization for some time. And we are going to continue to do this. Looking at content, and the extremist content that is online, is certainly important. We know there are a host of solutions that are out there. And we need to come to an agreement and an understanding of how you are going to use these technologies to really protect our citizens. And social media companies are, in a sense, open public forums, and they should be, where people can interact with one another. And part of your responsibility in this vein is to have an objective cop on the beat and be able to see what is happening, because you're looking at it in real time. But what has unfortunately happened many times is you don't get an objective view. You don't get a consistent view. You get a subjective view. And this is problematic. And it leads to confusion by the public that is using the virtual space for entertainment, for their transactional life, for obtaining their news. So, indeed, as we look at this issue, we are looking for you to approach it in a consistent and objective manner. And we welcome the opportunity to visit with you today. Ms. Bickert, I have a couple of things that I wanted to talk with you about. We've all heard about these third-party facilities where contractors are working long hours and they're looking at grotesque and violent images, and they're doing this day in and day out. So talk a little bit about how you transition from that to using modern technologies, what Facebook is going to do in order to capture this, to extract it, to minimize harm. You've talked about how you've got 30,000 employees that are working on safety and security. And then there are third-party entities that are working on this.
So let's talk about that impact on the individuals, and then talk about the use of technologies to speed up this process and to make it more consistent and accurate. Thank you for the question, Senator. Making sure that we are enforcing our policies is a priority for us. Making sure that our content reviewers are healthy in their jobs is paramount. So one of the things that we do is make sure that we are using technology to make their jobs easier and to limit the amount of content, and the types of content, they have to see. And I will give you a couple of examples: with child exploitation videos, with graphic violence, with terror propaganda, we are now able to use technology to review a lot of that content so that people don't have to. And in situations where... Now, let me ask you this, sorry to interrupt, but we need to move forward. Your 30,000 reviewers, are they all located in Palo Alto or are they scattered around the country or around the globe? No, Senator. We have 30,000 people working in safety and security. Some of them are engineers or lawyers. The content reviewers, we have more than 15,000. They are based around the world. Yes. Okay. Great. Thank you. For any of them, not only are we using technology, but even where we cannot make a decision on the content using technology alone, there are things we can do, like removing the audio, or separating a video into still frames, that can make the experience better for the reviewer. Okay. Now, let me ask you about this. Mark Zuckerberg, in a Washington Post op-ed, had called for us to regulate, to define lawful but awful speech. So tell me how you think you could define, or we could define, lawful but awful speech but not overreach or infringe on somebody's First Amendment free speech rights. Senator, one of the things that we are looking to with our dialogue with governments is clarity on the actions that governments want us to take. So we have our set of policies that lays out very clearly how we define things. But we don't do that in a vacuum. We do it a lot with civil society organizations and academics around the world. But we also like to hear the views from government so we can make sure we are mindful of all the different safety... Well, ours are constitutionally based. I am out of time. Mr. Pickles, I'm going to submit a question to you for the record. Mr. Selim, I have one that I am going to send to you. Mr. Slater, I always have questions for Google, so you can depend on me to get one to you. And we do hope you all are addressing your prioritization issues also. With that, Mr. Chairman, I'll yield back. Thank you very much. Senator Scott. Thank you for being here today. I'm glad we're having a hearing today to have a meaningful conversation about what's happening in our nation. It's time we face the fact that our culture has produced a class of predominantly young white men who live purposeless lives given over to the most evil desires, sometimes with racial hatred. As you all know, while I was governor, we had the horrible shooting at the school in Parkland. Within three weeks, we passed historic legislation including the risk protection orders that Senator Blumenthal was talking about. We did it by sitting down with law enforcement, mental health counselors and educators to come up with the right solution. Now, with regard to the shooting at Parkland, the killer, Nikolas Cruz, had a long history of violent behavior.
In September 2017, the FBI learned that someone with the user name Nikolas Cruz posted a comment on a YouTube video that said, I am going to be a professional school shooter. In addition, Nikolas Cruz made other threatening comments on various platforms. The individual whose video he posted the comment on reported it to the FBI. Unfortunately, the FBI closed the investigation after 16 days without ever contacting Nikolas Cruz. The FBI claimed they were unable to identify the person who made the comment. Unfortunately, we now have 17 innocent lives that were lost because of Nikolas Cruz. My question is for Mr. Slater. How is a platform like YouTube, which is owned by Google, not able to track down the IP address and the identity of the person who made that comment? When did YouTube remove the comment? Did YouTube report this comment to law enforcement? If so, to whom and when? If you did report this comment to law enforcement, did you follow up? What was the process? And was there any follow-up to see if there was any corrective action? Senator, thank you for the question. First, it was a horrendous event. And we strive to be vigilant, to invest heavily, to proactively report where we see an imminent threat. I don't have the details on this and the specific facts that you are describing. I'd be happy to get back to you. Let me say this going forward. Looking ahead, Parkland was a moment that did spur us to proactively reach out to law enforcement to start talking about how we can do this better. That's part of how we reached out and started working with the Northern California Regional Intelligence Center, so we could go to a one-stop shop that could get it to the right law enforcement locally, rather than cold-calling people. Just this month, in fact, in the last month, there was an incident where PBS was streaming the NewsHour on YouTube. Somebody put a threat in the live chat. We referred that to the regional intelligence center. They referred it to Orlando police, who then took the person into custody appropriately. And this was reported in the news. So that's not to say things are perfect. We always have to strive to get better, and we look forward to working with you and law enforcement on that. But I do think we continue to improve over time. So with regard to Nikolas Cruz, you will give me the information of, you know, who did you contact, when did you contact them, when was it taken down? To this day I cannot get an answer on what anybody did with regard to this shooter, what YouTube did, what the FBI did. Nobody wants to talk about it, which is fascinating to me. So if you will get me that information. Second, are you comfortable that if another Nikolas Cruz puts something up, you have the process now such that you will contact somebody and there will be a follow-up process? Senator, I think our processes are getting better all the time. They are robust. I think this is an area where it is an evolving challenge, both because technology evolves and people's tactics evolve. They might use code words and so on. I will be happy to follow up on how we will continue to work together. Thank you. Mr. Pickles, how can Nicolás Maduro, who is committing genocide against his citizens, who is withholding clean water, food and medicine, still have a Twitter account with 3.7 million followers? You rightly highlight that the behavior that has been taking place there is abhorrent, and the question for us, as a public company that provides a public space for dialogue, is whether someone is breaking our rules on our service.
We recognize there are situations where there are geopolitical circumstances, where world leaders have accounts, where Twitter is blocked, where there is no free speech. We take the view that we hope the dialogue that person's presence on the platform starts helps contribute to solving the challenges that you outline. He has been doing it for a long time. It is not getting better in Venezuela. It's getting worse. And I think this is a good illustration of the role of technology companies along with other parts of the public policy response. If we removed that person's account, it would not change facts on the ground. So we need to bear in mind how the other levers come into play. I disagree. Maduro sits there and talks about things and continues to act like he is a world leader, and he's a pariah. It sure seems to me that what you are doing is allowing him to continue to do that. Well, as I say, his current account has not broken our rules. Were he to break the rules, he would be treated as any other user and we would take action as necessary. We have votes already started and you are trying to get to other people. I would be happy to work with the senator from Florida on this issue. I think we are not doing enough. And I think the specific case I mentioned in my opening statement about the Rohingya and what happened on Facebook is another example. So, happy to work with you on this issue. Well, yes. And thank you, Senator Cantwell, and Senator Scott, for raising this. I'm told there is a vote. And I'm shocked, shocked to hear they are going to leave it open until 11:30, which is generally what happens. Senator Duckworth? Thank you, Mr. Chairman. While I do appreciate the focus on the intersection of extremism and social media, many, I think, would agree that today's hearing is another data point in a long history of congressional hand-wringing on gun violence. Since 2019 began, 260 days ago, we have witnessed 318 mass shootings in the U.S., more than one per day. Mass shootings are those in which at least four people are shot, excluding the shooter. After 20 children, six adults and the shooter lost their lives at Sandy Hook in 2012, many elected officials, including myself, declared an end to congressional inaction. No more, we said. But since that day, our nation has endured 2,226 mass shootings. Think about that number for a minute. But here we are, not focused on ways to stop gun violence but rather on the scourge of social media. I'm not going to say that there is no connection. But every other country on the planet has social media, video games, online harassment, hate groups, crime and mental health issues. But they don't have mass shootings like we do. Nothing highlights the absurdity of Congress's inability to solve the gun crisis more than seeing 318 mass shootings in 260 days and holding a hearing on extremism in social media. This is a chart from the Digital Marketing Institute that, according to their website, highlights the average number of hours that social media users spend on platforms like Facebook and Twitter. As you will see, in the United States our users are relatively middle of the pack when it comes to time spent online. My question to you both is this: do you agree that Americans' use of social media is not unique on a per capita basis? In other words, are you aware of specific trends on your platforms that would explain the amount of gun violence in the United States? Senator Duckworth, and this won't come out of your time, could you sort of explain this to us, because some of us can't see the detail. Sure.
This is how much time, the average number of hours, that social media users spend using social media each day via any device. And the arrow points to the United States. The highest is the Philippines. The lowest is Japan. The U.S. is right in the middle. So American users... look, I have a 4-1/2-year-old and an 18-month-old at home who says iPhone, iPhone. She's on it. She knows how to select YouTube Kids and knows how to go right to what she wants to watch. So I'm just as concerned. The United States, in terms of social media usage, would you both agree, is somewhere in the middle of the pack compared to the rest of the world? Yes, Senator. According to the study, which I'm not more familiar with, yes. In other words, are you aware, either one of you, of specific trends on your platforms that would explain the amount of gun violence in the United States? No. I think your study reflects our view; about 80 percent of our users are outside the United States. So I think your image speaks for itself. Thank you. Mr. Selim, you brought up the role that video games can play in online hate and harassment. I don't disagree that any platform for dissemination can be used that way. If a connection between video games and gun violence exists, you would think that the widespread use of video games in Japan and South Korea would reflect that connection, correct? I think there is something to be said for the availability of guns in the U.S. If you look, the amount of time that folks in Japan and South Korea spend on video games is far greater than in the U.S.; we're third. If you look at gun violence and gun deaths in 2017, here's the U.S. We're not the biggest users of video games. Would this be accurate? Senator, thank you for your question. I have not read this specific study. But I do have one data point, if I may, to share with you for just a moment. According to an ADL report looking at extremist-related murders and homicides over the past decade, our research shows that 73 percent of extremist-related murders and homicides were, in fact, committed with firearms. So to the extent you are making the point that extremists with weapons result in violence and homicide, we have the data that backs that point up. Thank you. As we are reminded daily, the world is full of people who use social media to disparage others and question facts. Some will use the anonymity to spread hate. But our use of social media, video games and other variables does little to explain the 2,226 mass shootings since Sandy Hook. The internet has emboldened and empowered hate by allowing individuals to develop online communities and share their warped ideas. It is our weak gun laws here in the U.S. that allow the hate to become lethal. There is a clear and undeniable connection between the number of guns in the United States and the number of gun deaths in our communities. Look at this chart. This is the number of guns per 100 people. This is the number of gun-related deaths. We are up here. Here's the rest of the world, some of whom use more social media than we do, some of whom engage in more video games than we do. We are saturated in weaponry that was designed for war but made available to nearly anyone who attends a local gun show. The Dayton shooter had a 100-round drum. I didn't have one when I served in Iraq. We didn't send Marines into Fallujah with 100-round drums. Yet you can buy them at gun shows. Many agree Congress should expand red flag laws and background checks. Banning high-capacity ammunition clips is what we need to do. This is not controversial.
It is well past time that Leader McConnell brings the background checks bill the House passed to the Senate floor for a vote. I hope Leader McConnell will allow votes on the Keep Americans Safe Act, the Disarm Hate Act, and the Domestic Terrorism Prevention Act. Each of these bills will keep our children and our neighbors safer. I hope my Republican colleagues will join in these bipartisan efforts. Thank you. And I yield back.

Senator Duckworth, let's do this so we can have a complete record. If you would reduce those three posters to a size that we can copy, they will be admitted into the record at this point in the hearing without objection.

Thank you very much, Mr. Chairman. That's generous of you.

So ordered. Senator Young?

Thank you, Mr. Chairman. I want to thank our panelists for being here today. I really do appreciate your testimony and your answering our questions. Look, we all need to collaborate in curbing online extremism, which I understand to be one of multiple causes that we could cite as we all think about the issue of mass casualty events and extremist events more generally. The nation is wrestling with mass violence, extremism, and issues of responsibility, including digital responsibility, for some of these events. In fact, in my home state of Indiana, Hoosiers in Crown Point, Indiana, recently experienced firsthand how a person can become radicalized over the internet, something I know that many of your companies have studied and are working on. In 2016, a Crown Point man was arrested and convicted for planning a terrorist attack after becoming radicalized by ISIS over the internet. Thankfully, the FBI and the Indianapolis Joint Terrorism Task Force intervened before any violent attack occurred. However, that isn't always the case, as we know. We have seen this across the country. And that's why it is critically important we have this hearing, that we continue to work together collaboratively, knowing that your products and platforms provide incredible value to consumers, and they obviously weren't intended for this purpose. So it's our responsibility in Congress, and it is definitely your responsibility as business people, to make sure that we monitor how the great value that you provide can be used in an illicit, improper, dangerous and nefarious manner. In one minute or less, because I have three minutes left, I would request that the representatives from Google and Facebook and Twitter tell us why Americans should be confident that each of your companies is taking this issue seriously and why Americans should be optimistic about your efforts going forward. One minute each. Indeed. Google?

Thank you, senator. I would start by pointing to the YouTube Community Guidelines enforcement report, which details every quarter the videos we have removed, the reasons why, and indeed how much is being flagged first by machines. Dealing with this issue, removing violative content, takes both technology and people. Technology can get better and better at identifying patterns. People can deal with the nuances. And we have seen over time that the technology is getting better and better at taking the content down faster and before people have viewed it. Of the 9 million videos that we removed in the second quarter of this year, 87 percent were first flagged by our machines. 80 percent of those were removed before a single view. When we talk about violent extremism, the numbers are even better in terms of removal before wide viewing.
So, you know, we are already seeing advancements in machine learning, not just in this area but across the industry broadly. And the thing about machine learning is that as it is fed more data, as it learns from mistakes, as we say "you got it wrong here," those systems will get better. So why wouldn't you be optimistic? Those systems ideally will continue to get better. Will they be perfect? No. Bad actors will continue to evolve. But I think there is reason for optimism, and I think there is reason for optimism based on the collaboration between all of us here today. Thank you. Facebook?

Thank you, senator. The first thing I'll say is Facebook won't work as a service if it is not a safe place. And this is something that we are keenly aware of every day. If we want people to come together to build this community, they have to know they're safe. So the incentives are there to make sure we do our part. Our team of 350 people primarily dedicated to countering terrorism and hate is built on expertise. I lead this team; my background is more than a decade as a federal criminal prosecutor, so safety and security are personal to me. But the people that I have hired onto this team have backgrounds in law enforcement and in academia, studying terrorism and radicalization. This is something that people come to work on at Facebook because this is what they care about. They're not simply assigned to work on it while they're at Facebook. This is bringing in expertise, and I want to make that very clear. And then finally, similar to my colleagues here, we have taken steps to make what we're doing very transparent. The reports we have published in the past year and a half show a steady increase in our ability to detect terror, violence, and hate much earlier, when it is uploaded to the site and before anybody reports it to us. Now more than 99 percent of the violent videos and the terrorist propaganda we remove from the site we are finding ourselves before anybody reports it to us. Thank you. Twitter?

Thank you, senator. I think people can be optimistic. A few years ago, at the peak of this so-called caliphate, people challenged our industry to do more, be better. I now look at a time where 90 percent of the terrorist content that Twitter removes is detected through technology. I look at independent academics like Professor Conway who talk about the IS community being decimated on Twitter. I look at the collaboration between our companies, which didn't exist when I joined Twitter five and a half years ago. All of those areas have driven better technology, faster response, and a much more aggressive posture towards bad actors that is now showing benefit in other areas. But I think we can also take confidence that no one is going to tell this committee that our work is done. Every one of us will leave here today knowing we have more to do and we can never sleep. These actors are adversarial and we have to keep adapting.

Thank you so much. I could spend five days, five weeks, maybe five months or five years. I only had five minutes, and I'm already one minute over. Mr. Chairman?

Thank you. Senator Rosen, you're next. I'm going to go vote. I can assure you I will not let them close that vote until you have asked your questions and gotten over there. Senator Rosen?

I appreciate it, senator. Thank you for holding this important hearing. I want to thank all the witnesses for being here to talk about this very real and difficult issue. The rise of extremism online is a serious threat.
And the internet has unfortunately proven a valuable tool to extremists, who are connecting through various forums to spread hate and dangerous ideologies. As we address extremism online, we must not lose sight of the fact that violent individuals who find communities online to fuel their hatred have also acted in the name of hate. We cannot ignore the fact that the absence of sensible, commonsense gun safety measures like background checks is allowing individuals to access dangerous weapons far too easily. And we know the majority of Americans want us to support that. But I represent the great state of Nevada. And as we approach, unfortunately, the two-year anniversary of the 1 October shooting in Las Vegas, the deadliest mass shooting in modern American history, we know that coordination with and between law enforcement is more important than ever. The Southern Nevada Counter-Terrorism Center, also known as our fusion center, is an example of a dynamic partnership between 27 different law enforcement agencies to rapidly and accurately respond to terrorist and other threats. With Las Vegas hosting nearly 50 million tourists and visitors each year, the fusion center is responsible for preventing countless crimes and even acts of terrorism. So to all of you, can you please discuss with us your coordination efforts with law enforcement when violent or threatening content is identified on your platforms, and what do you need from us as a legislative body to promote, enable, facilitate, whatever word you want to use, this partnership to keep our communities safe from another shooting like 1 October? Please.

Thank you, senator. The attack was incredibly tragic and our hearts are with those who suffered in that attack. Our relationship with law enforcement, first, is an ongoing effort. We have a team that does trainings to make sure that law enforcement understands how they can best work with us, and that's something that we do proactively; we reach out and offer those. Any time there is a mass violence incident, we reach out to law enforcement immediately, even if we're not aware of any connection between our service and the incident. We want to make sure they know where we are and how to reach us. We also have an online portal through which they can submit legal process, including emergency requests, and we have a team staffing that office 24 hours a day to respond quickly. Finally, we proactively refer imminent threats of serious physical harm to law enforcement whenever we find them. Thank you.

Thank you, senator. And I just want to echo Monica's sympathies for the victims of that horrible tragedy. The lessons I think we have learned since that attack have continued to inform our thinking. For example, not waiting for the ideological intent to be known before acting. One of the challenges we have is that in the traditional terrorist space you might look for an organization affiliation before saying this is a terrorist attack. We don't wait for that any more. We act first to stop people from using the services, and we cooperate with law enforcement to report credible threats. One of the questions, and I, along with colleagues from other companies, met with agencies yesterday to discuss how we can further deepen our collaboration, one of the questions we had there is that there is a huge amount of information within the law enforcement community, within the DHS umbrella, that is classified. It might help us understand the threats, the trends, the situational awareness.
So understanding how more information can be shared with industry would better inform us about the threats.

So can you provide us, in writing, some of the tools you think you might need to help you better cooperate to protect our communities?

Absolutely. That was the subject of the meeting yesterday. We had a very productive conversation. Thank you.

Senator, it is broadly similar here, both in the horror and sympathy we feel over tragedies like that one and in the ways we proactively cooperate with law enforcement, refer credible threats, and receive valid requests, including emergency disclosure requests, and respond to them expeditiously.

Thank you. I see my time is up. I will submit a question about combating violent anti-Semitism online. I know other people are waiting and we have votes. I appreciate your time and your commitment to working on this issue. Thank you.

Thank you, Senator Rosen. Your questions will be accepted for the record. I want to start with a simple yes or no question. I don't mean this to be a trick yes or no question. It is either yes or no, or yes or no with a brief one-sentence caveat if you need to. I'd like to hear from each of the three of you, Ms. Bickert, Mr. Pickles, Mr. Slater. Do you provide a platform that you regard as neutral in the political sense?

Yes, senator, our rules are politically neutral and we apply them neutrally.

So you aspire to political neutrality?

We want to be a service for political ideas across the spectrum.

Okay.

We enforce our rules. Our rules are crafted without ideology being included.

Mr. Slater?

Similarly, we craft our services without regard to political ideology. We're not neutral against terrorism or violent extremism or hate speech.

I appreciate you pointing that out. That is of course not what I'm talking about. That leads into the next question I wanted to raise with each of you. I think the work each of you is doing in this area is important, and it's important for anyone occupying this space to be conscious of those things. You do a service to those who access your services by removing things like pornography, terrorism advocacy and things like that. There is a lot of debate surrounding this issue and the legal framework surrounding it. As you know, section 230 of the Communications Decency Act has received a lot of criticism. It protects a website from being held liable as the publisher of information provided by another information content provider. Section 230 gives you the promise that you won't be held liable for taking down the type of objectionable content we are talking about, whether it's something that is constitutionally protected or not. So for each of the same witnesses: each of you represents a private company, and each of you is accountable to your consumers. This means in some sense you have incentives to provide a safe experience on your respective platforms. So I've got a question about section 230. Does section 230, particularly the Good Samaritan provisions, help you in your efforts to swiftly take down things like pornography and terrorist content off your platforms? And would it be more difficult without the legal certainty that section 230 provides?

Absolutely, senator. Section 230 is critical to our efforts in safety and security.

Mr. Pickles?

Absolutely. And I would go further and say section 230 has been critical to the leadership of American industry in the information technology sector.

Mr. Slater?

Absolutely, yes.

On a related point, imagine a world where this is suddenly taken away.
Where those provisions no longer exist. Large companies like yours may be able to, in fact, I strongly suspect still would be able to, filter out this content, between the artificial intelligence capabilities at your disposal and the human resources that you have. I suspect you could and probably would still do your best to perform the same function. What about a startup? What about a company trying to enter the space that each of your companies entered when they were created not very many years ago? What would happen to them? Ms. Bickert?

Thank you for your question. This reminds me of industry conversations involving smaller companies before we formed the Global Internet Forum to Counter Terrorism in June 2017. We were having closed-door sessions with companies large and small to talk about the best ways to combat terrorism online. Section 230 is very important for them to be able to proactively act on and assess content.

I'd say it's a fundamental part of maintaining a competitive online ecosystem. Without it, the ecosystem is less competitive.

Mr. Slater?

Yes. And I would just add that the U.S. has section 230. That's part of the reason why we have been a leader in economic growth, innovation and technological development. Other countries that don't have something like that suffer. Study after study shows that. And I would be happy to discuss it more.

If it were to be taken away... All three companies, yours in particular, Mr. Slater, are not known for being small businesses or businesses with a modest economic impact, so you can identify with the concern I'm expressing. If we were to take that away, Google might be able to keep up with what it has to do. Wouldn't it be harder for a new tech platform, somebody starting out in the same position where your company was a couple of decades ago? Wouldn't that be exponentially more difficult?

I think it would create problems for innovators of all stripes. Certainly small and medium-sized businesses would have a lot of trouble potentially getting their arms around that sort of significant change to the fundamental legal framework of the internet.

Thank you. I see my time has expired. Senator Baldwin?

Thank you. I wanted to begin by thanking our full committee chairman, Chairman Wicker, for holding this hearing. I think it is a vital conversation for us to be having. We need to be taking a hard look at how we address the rising tide of online extremism and its real-world consequences in our country. I do have some questions for you on this important topic. But first, I wanted to echo some of what my colleagues have already said, which is that there is much more the Senate must do to address gun violence, whether or not it's connected to hatred espoused on the internet. More than 200 days ago the House of Representatives passed a bipartisan universal background check bill. It has an extraordinary level of public support. It deserves a vote on the Senate floor. And I feel like we can't simply have hearings; we have to act to reduce gun violence.

Mr. Selim, ADL's Center on Extremism has closely studied hate crimes and extremist violence in this country. Is it fair to say that there has been an alarming increase in bias-motivated crimes, including extremist killings, in the last several years?

Yes, senator, that's accurate.

In the case of extremist killings, what role do you find that access to firearms has played in that increase?

Senator, thank you for that question.
As I briefly alluded to earlier, just to expand on what I was mentioning: according to our recent ADL report on extremists of all ideological stripes who committed murders or homicides in the United States, 73 percent of those acts were committed with firearms.

Thank you. What impact do you believe this increase in hate crimes, including extremist killings, has on the minority communities who have been the targets of these attacks? And let me just add to that question: one of the unique aspects of a hate crime is that it not only victimizes the targeted victim but strikes fear among those who share the same characteristics with the victim or victims.

Senator, thank you for making this point. In the past 24 months, we saw calendar year 2017 bring a 57 percent increase in anti-Semitic incidents across the country, and a rise in bias-motivated crimes in 2017 as well. We continue to see these troubling statistics year after year. So it is imperative, and part of my testimony today, both my submitted written testimony and my oral testimony, speaks to the need for greater enhancement and enforcement of hate crime laws and protections for victims.

I am an original cosponsor of Senator Bob Casey's legislation, the Disarm Hate Act, which would bar those convicted of misdemeanor hate crimes from obtaining firearms. Do you agree this could help keep guns out of the hands of individuals who might engage in extremist violence?

Yes, senator. Thank you for your legislation. ADL supports this legislation.

Thank you. I appreciate the efforts our witnesses from the social media companies have described regarding their companies' work to combat online extremism, including providing some transparency to their users and the general public. It's of course critically important to understand how you're addressing problems within your existing services and platforms. I'd actually like to learn more from you about how you are thinking about this issue as you develop and introduce new products. In other words, I think a lot of us feel that the approach of rapidly introducing a new product and then assessing the consequences later is a problem. So I'd like to ask you, how do you plan to build combating extremism into the next generation of ways in which individuals engage online? And why don't we start with you, Ms. Bickert.

Thank you for the question, senator. Safety by design is an important part of building new products at our company. One of the things we've built in the past maybe five years is a new products policy team, and their responsibility is to make sure they're aware of new products and features being built and to explain to the engineers, who are thinking of all the wonderful ways the service can be used, all of the abuse scenarios we can envision, making sure we have reporting mechanisms and other safety features in place.

I think, as I said earlier, we are in a very adversarial space. One of the key processes in that discussion is asking how this can be used against us, how this can be gamed, how people will change their behavior. And I think you're absolutely right; we need to take that learning and share it with smaller companies. Working with some 200 small companies around the world to share that knowledge with them, to help them understand the challenges, is also invaluable.

Similarly, our trust and safety teams are at the table with product managers and engineers from the conception of an idea all the way through development and possible release. So from the ground up, it is safety by design.
I want to thank the witnesses. I'm going to be taking over as chair, and I will call on myself as the next questioner. I want to ask all of you: your companies, your technology, are famous for algorithms, which seem to have the ability to pinpoint what people want. You can put an email out, or some people think even talk about, say, your interest in yellow sweaters, and the next thing you know you have ads popping up on your Facebook or other accounts about yellow sweaters. Who knows how that happens, but for a lot of us it happens. Pretty impressive, but here's my question. If your algorithm technology is so good at pinpointing things like that, what people are interested in, particularly as it relates to ads, what are the challenges with regard to directing that kind of technology toward helping us, and helping you, find what has been talked about on both sides of the aisle, which is that the people committing this kind of violence are typically disaffected young males? Aren't there signs? Aren't there things you can do with the technology you do so well in other spaces to at least provide more warning signs of this kind of violence from these kinds of individuals, who in some ways already have a profile online? I'll throw that out to any of you. And are you working on that?

Thank you for the question, senator. Technology plays a huge role in what we are doing to enforce our safety policies on Facebook. In the areas of terrorism, extremism and violence, it's not just the matching software we have to stop organized terror propaganda videos; we're now using artificial intelligence and machine learning to get better at identifying new content that we haven't seen before that might be promoting violence, and we proactively send that to law enforcement. These systems are getting better every day. Different products work in different ways.

Is it a priority of yours, like it would be for selling yellow sweaters?

Absolutely.

Can I ask that of all the companies here?

Absolutely. Investing in technology to find content that is terrorist content, extremist violent content, is absolutely a top priority.

It is a priority, yes.

Senator, I'd only add to this part of the conversation, as someone who's studied the research and data around these issues for nearly two decades, that the threat environment we're in today has changed significantly. White supremacist terrorists in the United States don't have training camps in the same way that foreign terrorist groups like al-Qaeda and ISIS and others do. Their training camp, where they connect, learn and coordinate with one another, is the online space. So it's imperative that the things you're asking about, the machine learning, the technology, the artificial intelligence, continue to advance to disrupt that environment and make it an inhospitable place for individuals who want to promote violent content of any ideological stripe.

All of your companies kind of have this tension: you want eyeballs, more clicks, more time on Facebook or Google or Twitter, and yet I think there are increasing studies showing, for example, the number of young men and women, young girls, who feel a sense of loneliness from their time online. You know, there are indications that among teenagers suicide rates are increasing, particularly for young girls.
One of the things I deal with is looking back and saying, my god, how did we do that, how did we get to this position, with the policies of the '90s, and the fact that 72,000 Americans died of overdoses last year. So we're kind of looking backwards saying, how did this happen? In your policymaking suites, do you ever wonder whether we are going to be looking back in 20 years saying, how in the hell did we addict a bunch of young Americans to looking at their iPhones 8 hours a day, and 20 years from now we are going to be seeing the social and physical and psychological ramifications, where we all might be kicking ourselves in the head saying, why did we allow that to happen? I think about that and it worries me, but you have a tension, because don't you want more face time? Don't you want more teenagers spending 7 hours a day staring at their iPhones, because that helps your revenues? Do you worry that 15, 20 years from now we're going to be in the same spot as with opioids, saying, what did we do to our kids, what did we do to our citizens? Do you guys worry about that, about your power, about the negative implications in what's happening in society right now?

Senator, thank you for the question. As a mother, I take these questions about wellness very seriously, and our company does as well. This is something we look at, and we talk to wellness groups to make sure we're crafting products that are in people's best interests. We have seen social media be a tremendous place of support for those thinking of harming themselves or struggling with eating disorders or opioid addiction or being exposed to hateful content. And so we're also exploring and developing ways of linking people up with help resources. We already do that now for opioid addiction and for thoughts of self-harm; for people who are asking about or searching for hateful content, we now provide them with help resources. We do think this can be a really positive thing for overall wellness.

We have similar programs in place for both opioid searches and for people who are using terms referencing self-harm or suicide, where we will intervene and provide them with a source of support, and that's something we do around the world. We also recognize that things like digital literacy are certainly something we as an industry and we as Twitter need to invest in, to make sure that as people are using our services they have the skills to use them discerningly. And finally, our CEO is committed to looking at the health of the conversation, looking at much broader metrics, the health of the conversation rather than just revenue.

Thank you, Mr. Chairman. And I will say thank you to my friend from Alaska for sharing, in that deep voice, the longing in his heart. I want to start with you. I want to talk a little bit about Project Dragonfly. In August of 2018 it was reported that Google was developing a censored search engine under the alias of Project Dragonfly. In response to those concerns, Alphabet shareholders requested that the company provide an impact report this year. However, during Alphabet's shareholder meeting on June 19th, the proposal for the assessment was rejected. In fact, Alphabet's board of directors explicitly encouraged shareholders to vote against the proposal, saying, quote, Google has been open about its desire to serve users in China and other countries. We've considered a variety of options in a way that is consistent with our mission and have gradually expanded our offerings to consumers in China, unquote. So I want to start with just some clarity. Has Google ceased any and all development and work on Project Dragonfly?
Senator, to my knowledge, yes.

And has Google committed to forgoing future projects that may be named differently but would be focused on developing a censored search engine in China?

Senator, we have nothing to announce at this time, and I think whatever we would do, we would look very carefully at things like human rights. In fact, we work with the Global Network Initiative on an ongoing basis to evaluate how our principles, our practices, our products comport with human rights and the law.

So, roughly contemporaneously, Google decided that it did not want to work with the U.S. Department of Defense. How does Google justify being willing to work with the Chinese government on complex projects including artificial intelligence, and at the same time not being willing to help the Department of Defense, under Project Maven, develop ways to minimize civilian casualties? How do you reconcile those two approaches?

Senator, as we've talked about today, we do partner with law enforcement, and we do partner with the military in certain ways, offering some of our services. Also, as a business, we draw responsible lines about where we want to be in business, including limitations on getting into the field of building weapons and so on. And, you know, we will continue to evaluate that over time.

Let me shift to a different topic. This panel has talked about combating extremism and the efforts of social media to do that. Many Americans, including myself, have a longstanding concern that when big tech says it's combating extremism, that is often a shield for advancing political censorship. I want to talk about how recently Twitter extended its pattern of censorship to the level that it took down the Twitter account of the Senate Majority Leader, Mitch McConnell. That I found a pretty remarkable thing for Twitter to do. And it did so because that account, as I understand it, had sent out a video of angry protesters outside of Senator McConnell's house, including an organizer of Black Lives Matter in Louisville who is heard in the video saying that the Senate Majority Leader, quote, should have broken his little raggedy wrinkled ass neck, unquote, and someone else who had a voodoo doll of the Majority Leader. The Senate Majority Leader sent out those threats of violence against him and found, remarkably, his own Twitter account taken down. How does Twitter explain that?

Thank you, senator, for the opportunity to discuss this. Something we've been asked about around the world is the climate in many political jurisdictions regarding the safety of people who hold public office. So when we saw a video posted by numerous users that clearly identified someone's home and clearly contained, as you referenced, those threats, out of an abundance of caution we did remove that video. We didn't remove the accounts. We removed that single tweet containing that video from everybody who had posted it. Because the presence of a video showing someone's personal home, where the Senate Majority Leader may have been residing at the time, with several violent references, was something we felt, out of an abundance of caution, we should remove. We then discussed this further with the office. We understood their intent was to call attention to those threats of violence, and so we do now present the video behind a notice saying this is sensitive media. But as for the balance we're striking, I've been in many situations where I've been asked the opposite, that similar content should be removed. That balance is something we struggle to get right every day.
You would agree there's a difference between someone posting a video where they are threatening someone else and the target of that threat posting the video. Would you agree those are qualitatively different?

I believe that's wholly fair. I believe there is still a risk there, and we were motivated by the offline harm that could have occurred because the home was visible. And we appreciate their insight, but our motivation here was to prevent harm, not the kind of potential ideological issues you may allude to.

But Mr. Pickles, have you rethought your policy since then, given what Senator Cruz asked about? And I would recall your written testimony on page 2, which says, and I quote, we do not allow propaganda symbols to be shared on the platform unless they're being used to condemn or inform. Is that language instructive to your platform, and don't you think it was readily evident from the beginning that Senator McConnell and his campaign had posted that video to condemn and inform?

I think this is an absolutely relevant issue. We as a company have taken a more aggressive posture. After the Christchurch attack we did see people posting the manifesto and related content to condemn it, and we decided even in those circumstances we would remove it. With manifestos, or large chunks of manifestos, even when people are condemning them, we have taken the decision to remove the material. I think the case you illustrate highlights for us the complexity in getting this right. Again, if we're going to err on the side of caution, fewer violent threats and fewer people's homes being visible on our platform is, notably, a good thing. But this is something where, and I've been with the company for five and a half years, this is the first time I've ever been asked why we didn't leave something up, given the content of the threat.

Well, in terms of the context in this instance, it was the owner of the home who chose to inform the world about what was being said against him, and it was the individual himself who posted this. And it seems to be a clear-cut case in that instance that differentiates it from the condemnation of the larger incident of the Christchurch violence. I would just suggest that it shouldn't have taken very long for Twitter to understand that. Senator Sullivan, you are recognized.

Thank you, Mr. Chairman. I have a couple of follow-up questions. On Senator Cruz's question: whether a company wants to work with the Pentagon is, I think, something the leadership of the individual companies has to decide. I think that's certainly fine. I think what troubles a number of us is that there's a declaration that you're not willing to work with the Department of Defense on certain issues, and yet there's a willingness to work with one of our country's potential adversaries, particularly on sensitive technological issues that are important to the competition between the two nations. Do you understand why that has caused bipartisan concern here, and how should we address it? Should Congress take action on those kinds of situations? I'm not saying everybody has to work for the Pentagon; that's your decision. But if you don't want to work with our nation's defense but you're working with the country that poses a significant long-term threat to the United States, do you understand why that causes concern here?

Senator, I do appreciate the concern. We are a proudly American company.
We are a business that wants to draw responsible lines, and we look forward to continuing to engage with you, the committee and others to make sure we're doing that.

If there's a clear-cut example, hey, we're not going to do anything with the U.S. Department of Defense but we're going to work with the Chinese, something very clear and obvious, do you think there's something we should do, we in Congress, to prevent that or penalize that?

I think it's an important question. I think as a business we try to strike responsible and consistent lines, but the details would certainly have to matter.

Okay. Mr. Pickles, let me ask just one last question. It's a really good follow-up to Senator Scott's earlier question. You said that the Twitter account of Maduro in Venezuela has, quote, not broken any of our rules. What are those rules? When you have somebody who is certainly not treating his citizens well, and Senator Scott has been a leader on this issue, at what point would you look at what they're doing to their own citizens as a reason to maybe not provide them the platform that you have?

Thank you. Well, firstly, the rules that apply to any user on Twitter are the same. I can give a concrete example. First of all, there is encouragement of violence: if a Twitter account was used, in ways we have seen around the world, to encourage violence against minorities, to organize violence, we would take action on those accounts for breaking those rules.

Would Twitter allow Putin to have an account, or Xi Jinping to have an account?

If they were acting within our rules. But one thing I would note, and this is slightly different but important, is that some governments have sought to manipulate our platform to spread propaganda and disinformation by breaking our rules. One of those governments is Venezuela, and we have made a public disclosure of every account that we removed from Twitter for engaging in information operations that we believe that government was covertly responsible for. We've made that whole archive available to the public. We've taken the same step with information operations that we believe have been directed from countries including China, Iran and Russia. Because we believe it's not just those single Twitter accounts; some governments do also seek to manipulate our platform.

So if a government carries out violence against its own citizens, is that breaking the Twitter rules?

Well, I think that is actually happening offline, and the key question for us is what's happening on Twitter.

Thank you. Thank you, Mr. Chairman.

Thank you, Senator Sullivan, and thank you to our witnesses. The hearing record will remain open for two weeks. During this time senators are asked to submit any questions for the record. Upon receipt, the witnesses are requested to submit their complete written answers to the committee as soon as possible, but no later than Wednesday, October 2, 2019, by close of business. I thank each and every one of you for appearing today. This hearing is now adjourned.