
C-SPAN: Former Social Media Executives Testify on National Security (September 16, 2022)

Senator Peters: In recent years, domestic terrorism, and specifically white supremacist, conspiracy-related, and anti-government violence, has become one of our nation's greatest homeland security threats. Last October the committee held a hearing to examine the role that social media platforms play in the amplification of domestic extremist content and how that content can translate into real-world violence. We heard from expert witnesses who discussed how recommendation algorithms, targeting, and other amplification tools push extreme content to users, because that type of content is what keeps people active on the platforms.

Unfortunately, because these platforms are designed to push the most engaging posts to more users, they end up amplifying extremist, dangerous, and radicalizing content. This includes QAnon, Stop the Steal, and other conspiracy theories, as well as white supremacist and antisemitic rhetoric. In some cases the content may not necessarily violate a company's community guidelines; in other cases, even content that is in clear violation of company policy remains on the platforms and is often only removed after public pressure. In both cases it does significant harm to our society and stokes real-world violence. We have seen this happen time and time again: from the 2017 neo-Nazi Unite the Right rally in Charlottesville, which was organized using a Facebook page, to the violent attack on the U.S. Capitol, spread by Stop the Steal content that repeatedly surfaced online, to the shooter who livestreamed as he massacred shoppers at a Buffalo supermarket. There is a clear connection between online content and offline violence.

We have heard many explanations from social media companies about their content moderation policies, their efforts to boost trust and safety, and the actions they have taken to remove harmful accounts. There is no question that those efforts are important, but there is a question of whether those actions are enough to effectively address the spread of dangerous content online and the resulting threats it poses to our homeland security. The central question is not what content the platforms can take down once it is posted, but how they design their products in a way that spreads the content in the first place, and whether they build those products with safety in mind to effectively address how harmful content spreads. That is the focus of today's hearings.

During this hearing we will have the opportunity to hear from two panels of witnesses: outside experts, including former Facebook and Twitter executives, as well as current senior executives from Meta, YouTube, TikTok, and Twitter, who are charged with designing the social media products used by billions of people all across the world. The overwhelming majority of social media users have very little information about why they see certain recommended content, and there is very limited transparency into how social media companies balance their business decisions with the need for online safety, including what resources they invest in limiting the spread of harmful content. Our goal is to understand how companies' business models and incentive structures, including revenue generation and employee compensation, determine how social media products are built,
and the extent to which current incentives contribute to the amplification of content that threatens homeland security. For nearly a year, we have been pressing Meta, TikTok, and YouTube for more information on their policies to monitor content, as well as on the relationship between their recommendation algorithms and the targeted advertising tools that generate much of the companies' revenues, and the amplification of extremist content. The companies' responses to those inquiries have been incomplete and insufficient so far. This morning we will hear from two former executives and a technology journalist with social media expertise about the internal product development process and the business decisions these companies make, including trade-offs between revenue and growth and their trust and safety efforts, as well as how they interact with foreign governments. Later this afternoon we will hear directly from the chief product officers of Meta, YouTube, and Twitter and the chief operating officer of TikTok, the executives charged with making those business decisions and driving the strategic vision of the companies. I look forward to a productive discussion with both panels. Welcome to this committee; we look forward to your testimony. Ranking Member Portman, you are recognized for your opening comments.

Senator Portman: I would like to thank the experts for being here. It will be an interesting hearing. This past Sunday we observed the 21st anniversary of the tragic 9/11 terrorist attacks. Whistleblower testimony has revealed on numerous occasions that social media companies are aware that certain features create threats to their users. It is unfortunate that the American public must wait for whistleblower exposures to find out about the ways in which platforms are knowingly and unknowingly harming their users. The lack of transparency in the development process, algorithms, and statistics creates an asymmetric environment in which the platforms know all, yet users, policymakers, and the public know very little.

One consequence relates to China. I have concerns that the Chinese Communist Party has access to TikTok data on American users; over 100 million Americans, many under the age of 19, use TikTok. TikTok data remains vulnerable to the Communist Party of China as the CCP tries to exploit its access to U.S. data and influence what U.S. users see. Despite the company moving data servers to the U.S., TikTok and ByteDance employees retain the ability to access this data. If that is not true, we would like to hear about it today. We also learned yesterday, from Senator Grassley's opening statement at the hearing with the Twitter whistleblower, that Twitter failed to keep Americans' data from being accessed by foreign governments. The whistleblower spoke about how several employees worked as agents of India, China, and Saudi Arabia, which is concerning and speaks to why Congress needs more information from platforms on how they secure user data.

Another consequence of poor transparency relates to content moderation. I recognize that content moderation is a key component of creating safe platforms for users, but it cannot be the only thing. Transparency reports released by companies often detail the amount of content removed for violating company policy. The report does not account for violating content that is left up on the platform and goes undetected.
It also doesn't account for content that is incorrectly censored, as we see with many conservative voices on social media. I, like many of my colleagues, have been critical of the political biases held by big tech platforms, which have resulted in systematic takedowns of accounts that hold ideologies with which the left and the liberal media disagree. We will hear about that today. These takedowns come under the guise of combating misinformation, when in fact they are really just combating conservative viewpoints that conflict with their own. Any steps taken to address the impact of social media on homeland security must safeguard free speech.

For us to have a responsible conversation about the harms of online content on American users, we must talk about how current transparency efforts have and have not worked. Congress must enact legislation that will require companies to share data so that research can be done to evaluate how harms from social media affect Americans. I have been working on bipartisan legislation along those lines with Senator Coons. Platform accountability legislation would allow the largest platforms to share data with vetted, independent researchers, so we can all increase our understanding of the inner workings of social media companies and regulate the industry based on good information that we simply do not have now, but that we can learn through this process. I thank the witnesses for being here. I look forward to having your expertise help guide us on these complicated issues, and thank you, Mr. Chairman, for holding this hearing.

Senator Peters: Thank you. It is the practice of this committee to swear in witnesses, so if each of you would please stand and raise your right hand: do you swear that the testimony you will give before this committee will be the truth, the whole truth, and nothing but the truth, so help you God? You may be seated.

Our first witness is Alex Roetter, former Senior Vice President of Engineering at Twitter. He helped grow Twitter's monthly active users to over 300 million and build the network from zero revenue to $2.5 billion a year. He also spent six years at Google on a variety of projects, including building the world's largest computational platform. He was in the room for major decisions about products at Twitter and is familiar with the priorities that were weighed as products were created, as well as with how those products were built. Welcome to the committee; you may proceed with your opening remarks.

Mr. Roetter: Good morning, Mr. Chairman and members of the committee. Thank you for inviting me here today. We live in a world where an unprecedented number of people consume information from social networks. Viral content and misinformation can propagate on these platforms at a scale unseen in human history. Regulators must understand these companies' incentives, culture, and processes to appreciate how unlikely voluntary reform is. In over 20 years of working in Silicon Valley as an engineer and an executive, I have seen firsthand how several of these companies work. Today I will talk about how these companies operate and about actionable ways to demand transparency.

The product development lifecycle works as follows. First, teams of product managers, engineers, and designers are assigned specific metrics to maximize. These metrics carefully track user engagement and growth, as well as revenue and other financial indicators. Other metrics, such as user safety, are either not present or much less important. Second, teams use an experimentation system to launch changes to small percentages of users. The effect of every experiment on key metrics is measured extremely accurately. Absent are detailed metrics tracking impacts on user safety; for example, I never once saw a measurement such as: did a given experiment increase or decrease the spread of content later identified as hate speech? Third, executives review these experiment dashboards regularly and decide which experiments to launch. These reviews are run by product and engineering; legal and trust and safety are absent or do not play a substantial role.
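To make the asymmetry described in this testimony concrete, here is a minimal sketch, in Python, of what such an experiment scorecard and launch review might look like. Everything in it is a hypothetical illustration of the pattern Mr. Roetter describes, not any company's actual system: the metric names, weights, and launch threshold are all assumptions.

```python
# Hypothetical sketch of an A/B experiment scorecard: engagement and revenue
# deltas are measured precisely and gate the launch decision, while safety
# impact is simply never collected.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExperimentReadout:
    name: str
    engagement_delta: float   # e.g. +0.02 => +2% time spent vs. control
    revenue_delta: float      # e.g. +0.01 => +1% ad revenue vs. control
    # e.g. change in spread of content later flagged as hate speech;
    # per the testimony, never measured in practice, so it defaults to None.
    safety_delta: Optional[float] = None

def launch_decision(exp: ExperimentReadout) -> bool:
    """Launch review as described: driven only by growth and revenue."""
    score = 0.7 * exp.engagement_delta + 0.3 * exp.revenue_delta
    # Conspicuously absent: a safety guardrail such as
    #   if exp.safety_delta is not None and exp.safety_delta > 0: return False
    return score > 0.005  # hypothetical launch threshold

exp = ExperimentReadout("ranker_v42", engagement_delta=0.02, revenue_delta=0.01)
print(launch_decision(exp))  # True: ships, with safety impact unmeasured
```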
Culturally, these companies have informal hierarchies in which the builders, the engineers and product managers, are held in the highest regard, while other functions are viewed much more skeptically. The strong bias is to make sure that corporate bureaucracy does not slow down product development. These companies conduct regular performance evaluations and promotions, which drive peer recognition, career advancement, and cash and stock awards. The main data collected is what impact an individual's work has had on key metric families. Only a minority of builders get promoted based on impact to trust and safety metrics, because those are not valued highly.

What data has been shared to date is mostly non-illuminating statistics designed to create the appearance that the companies are taking the problem seriously. When one of the largest companies in the world says it is spending what seems like a large absolute number, that number must be put in context and compared with the size of other initiatives, for example product efforts or how much it spends on stock buybacks. Large investment amounts are not sufficient; we must demand transparency based on measuring actual results. Similarly, when a company points to how much content it has taken down, that figure has to be understood in terms of reach in the network: a million posts removed means little if, before removal, they were viewed hundreds of millions of times.

For real transparency, I recommend assembling a group of independent researchers and data scientists, tasking them with enumerating the right questions to ask and the set of data they need to answer them, and funding them to do this work continually, refining their questions and data requests. The government is able to demand transparency in technically demanding fields: third-party auditors of public company financial statements manage to balance the public's need for reliable financial statements with a company's need to keep information confidential. Until such transparency exists, every assurance by any of these companies has to be taken on faith.

Transparency is necessary but not sufficient. Until we change the fact that user attention and profits are what these companies care about above all else, all the data sharing in the world will not address the problem. Policy and legal experts have previously testified before the committee on ways that incentives could be changed, and incentives matter: companies behave differently when they care about the quality of content. For example, because inappropriate ads could materially harm financial performance, most advertising systems place ad review as a step that must occur before a new ad ever makes its way to users, whereas user-generated content is allowed to go live instantly. Incentives also shape companies' algorithms. TikTok and ByteDance feed young people in China a diet of educational science and math content via their recommendation algorithm, and the Chinese version of the app even enforces a daily usage limit. Contrast this with how U.S. companies target content to young Americans, optimizing for engagement and revenue at any cost.
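The contrast Mr. Roetter draws between ad review and user-generated content can be sketched as two publishing paths. Again, this is a hypothetical illustration assuming made-up function names, not any real platform's API; the point is only that one path blocks on review before publishing, while the other publishes first and reviews, if ever, afterward.

```python
# Two publishing paths: ads pass a blocking review BEFORE reaching any user;
# organic posts go live instantly and only enter a queue for possible
# after-the-fact review.
review_queue: list[str] = []  # posts awaiting (possible) later review

def passes_review(text: str) -> bool:
    # Stand-in for human/automated ad review; real systems are far richer.
    return "banned-term" not in text

def publish(text: str) -> None:
    print(f"live: {text!r}")

def submit_ad(ad_copy: str) -> bool:
    """Ads: review is a blocking, pre-publication gate."""
    if not passes_review(ad_copy):
        return False           # rejected ads are never shown to a single user
    publish(ad_copy)
    return True

def submit_post(post: str) -> bool:
    """User content: published first, moderated (maybe) later."""
    publish(post)              # live instantly; reach starts accruing now
    review_queue.append(post)  # any takedown happens only after exposure
    return True

submit_ad("buy our banned-term pills")    # blocked before anyone sees it
submit_post("buy our banned-term pills")  # live immediately
```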
Any suggestion for more useful transparency will be met with many objections; the status quo is simply too lucrative. Do not underestimate these companies' ability to fight requests for information: the legal team at Google has as many lawyers as the FTC has employees. Given what we know about these companies' processes and culture, we should not expect voluntary progress, and we should view their commitments extremely skeptically. However, with the proper transparency and regulatory environment, I believe we can change their incentives and start to see real, measurable progress against these problems. Thank you.

Senator Peters: Thank you. Our next witness is Brian Boland, who led partner engineering, marketing, strategic operations, and analytics at Facebook, where he worked for 11 years. He served in several roles, including leading a 500-person multifunctional team focused on product strategy, partner engineering, operations, analytics, and marketing. These high-impact teams worked across Facebook products and features, including Watch, video, news, and group admins. Before joining Facebook, he worked at Microsoft and other tech companies. Mr. Boland, you may proceed with your opening remarks.

Mr. Boland: Good morning, Mr. Chairman. Thank you for holding these hearings, which cover such important issues for our nation and the world, and thank you for inviting me here today to provide testimony on my experiences as a senior executive at Facebook, now known as Meta. For the last few years I have grown increasingly concerned about the roles that Facebook, Instagram, YouTube, Twitter, and TikTok play in driving the growth of misinformation, extremism, and generally harmful content. I worked at Facebook for 11 years in a variety of leadership roles, helping shape product and market strategies for a broad array of products including advertising, news, video, media, and more. During my tenure at the company I worked with the most senior executives, and I was deeply embedded in the product development process. In my last two years at the company, I worked on CrowdTangle, a tool that provides limited but industry-leading transparency into content on Facebook.

What finally convinced me it was time to leave was that, despite growing evidence that the News Feed may be causing harm globally, the focus on and investments in safety remained small and siloed. The documents released by Frances Haugen, the Facebook whistleblower who testified here last fall, highlighted issues around polarization globally and how the platform can lead people down a path to more extreme beliefs. These papers demonstrate thoughtful, well-referenced documentation of the harms that concerned me. The research was done by highly skilled Facebook employees who are experts in their fields, and it was extensive. Rather than address the serious issues raised by its own research, Meta's leadership chooses growing the company over keeping people safe. While the company has made investments in safety, these investments are small and routinely abandoned if they don't impact company growth. My experience at Facebook was that rather than seeking to research and discover issue
