Welcome, everyone, to City Lights. We are thrilled to have Samuel Woolley with us here tonight celebrating a very, very important new book called The Reality Game: How the Next Wave of Technology Will Break the Truth. It is from our friends at PublicAffairs. He is a writer and researcher specializing in the study of AI, emerging technology, politics, persuasion, and social media. He is an assistant professor in the School of Journalism and program director for propaganda research at the Center for Media Engagement at the University of Texas at Austin. Dr. Woolley founded and directed the Digital Intelligence Lab at the Institute for the Future, a fifty-year-old think tank based in the heart of Silicon Valley. He also co-founded and directed the research team of the Computational Propaganda Project at the Oxford Internet Institute at the University of Oxford. He has written on the political manipulation of technology for a variety of publications, including Wired, The Atlantic Monthly, Vice, TechCrunch, The Guardian, and many, many others. His research has been featured in publications such as The New York Times, The Washington Post, and The Wall Street Journal. He has also made appearances on the Today Show, 60 Minutes, and Frontline. His work has been presented to NATO, the U.S. Congress, and the UK Parliament. It is such a great honor to have you with us tonight doing this incredible and important research. Please give him a warm welcome. [applause]

Hi, everyone. Great to be here. It's been a ride. This is the last talk of my book tour, and it happens to be here in San Francisco, specifically here at City Lights. Thank you to everyone at the store for having me. I couldn't think of a better place to end the tour and talk about what is really, at the end of the day, a book about democracy and the ways in which we can reimagine and rebuild democracy in the technological age.

A lot of people assume, when I talk about my work, that I'm a computer scientist. That's actually not true; nothing could be further from the truth. For a long time I thought maybe I should try to play the game and be like that. I took a few classes, I knew a little bit, but at the end of the day I'm the kind of person who studies what I study by talking to people. I'm an ethnographer. I spend time in places, I spend time with people, and I go deep. I go deep on subjects. For the better part of the last decade I've been going deep on the subject of what I call computational propaganda. It's a fancy term for the ways in which automation, computer algorithms, and things like that get used to manipulate public opinion.

In the last four or five years, in 2016 during the U.S. election, during the Brexit referendum, in Myanmar with the massacre, and in India with the recent violence arguably caused by WhatsApp spilling offline, we've seen social media become a tool for manipulation, a tool for disinformation. A lot has changed. In the early 2000s we had a perspective that social media was going to be the savior of democracy in many ways. That was best captured by Google's famous phrase "don't be evil," and it was also showcased by a lot of the work that came out at the time about digital utopia and cyber-libertarianism. That's not where we are now, but we're not lost. Everything is not lost yet, and this book is not just a book about how screwed up everything is and how scary the world is. It's a book about solutions.
Every chapter ends with solutions. The conclusion is a solutions-oriented chapter, and after spending a decade working on this I've realized there are a lot of things we can do. I will end the talk today on those things. First, let's talk about storytelling and what it means to be an ethnographer, someone who studies technology by talking to the people who build and use technology. Tonight I'm going to introduce you to four people, four places, and four ideas that I've learned from over the last several years. These four people, places, and ideas have been instrumental in how I wrote this book and how I've been thinking about technology.

The first person is named Phil, and he's my advisor. He is who the book is dedicated to. He's now the director of the Oxford Internet Institute at the University of Oxford, and he took me under his wing. Philip at the time had been studying the Arab Spring. He had been in Tunisia, studying the people who were using technology in an attempt to bring about democracy, to organize protests, to do all those sorts of things. He had written a book with Oxford University Press called The Digital Origins of Dictatorship and Democracy. The discussion in that book was all about the ways in which the internet, from the moment it went public, has played a role in both facilitating dictatorship and facilitating democracy, in helping people realize freedom but also in helping others exert control. Phil was thinking about these kinds of things very early on.

I had just come from being a fellow on the Obama campaign in 2012, and I had become enthralled, while working on the campaign, with the way they were making use of data. I was blown away by how sophisticated the data operation was. There was a lot of excitement about the community-organizing aspects of the campaign and the personal-storytelling aspects, but none of that would have been anything without the connection to the data the Obama campaign had on independent or undecided voters. What they did was marry the data, massive amounts of data and a massive amount of work, with personal stories, humanizing the data in a way that was able to really reach people.

When I met Phil, he taught me something really important, something I had intuited but hadn't quite been able to articulate, and which I will say to all of you now: technology and politics today are inherently connected. You can't have one without the other. To some extent, if you think of technology as simply tools, if you think about media and the ways media gets used to communicate with people or to deliver information on behalf of others, this has often been the case, but in today's world technology and politics are very much intertwined. The campaigns that tend to do the best around the world these days are the campaigns with the most technological savvy, because the reality is (not to overuse the phrase on the front cover, but the reality is) that if you have a lot of data and you can marry it to a sophisticated AI system, then you can do very specific, individualized ad targeting: ads targeted to people that speak to them in the way they would like to be spoken to. That's something we've come to realize now. So Phil, in Seattle at the University of Washington, taught me that politics and technology are intertwined.
The next person I want to introduce you to is named Andrew, and I met Andrew in England when I had taken a job at the University of Oxford. What ended up happening was that, in about 2013, we got grant money from the National Science Foundation and the European Research Council to study computational propaganda. They wanted to know how Russia and other countries were using social media to try to influence public opinion in democracies. Phil got offered a job at Oxford and he said, do you want to come to Oxford with me? Yeah, twist my arm. Of course I want to come to Oxford with you. So I ended up there, and one day I was at a conference, actually at the LSE, standing around. I didn't know anyone, I was young, kind of scared about it (still am), and this guy approaches me. When you study propaganda there's always a conspiracy theorist who wants to talk to you. When random people approach me I brace myself a bit: am I going to get aliens, or flat earth, or anti-vaccine stuff? Am I going to have to carry on a conversation about that? The fact of the matter is I don't know how to talk to people about that stuff. But Andrew said, hey, I build automated profiles on social media for the Labour Party in England. What? Yeah, I control several hundred, maybe a thousand, accounts on Twitter, and I do it on behalf of the Labour Party. I'm doing it because I'm a member and I believe in it. I said, wow, that's pretty crazy. Let's talk. We got to talking and struck up an unlikely friendship, and he really taught me a lot about the ways in which people can use technology to amplify voices online.

A lot of what I talk about in this book is social media bots: profiles built to look like people on social media but that are not people. They are automated profiles. One person can manage many, many, many accounts online. You can use those accounts to drive up likes and to retweet messages, and you can also use them now, with the evolution of machine learning and AI, to talk to people in a more sophisticated fashion. Andrew taught me that there is always, always a person behind the technology. The technology doesn't exist on its own. Social media firms today would have you believe the algorithms are apolitical, that they don't have values, that they make decisions in a way no human would have decided upon. If you follow the work of people like the Social Media Collective at Microsoft Research New England, which has done some fantastic open work (a surprising thing coming from Microsoft on this), you will know that algorithms and software and technology always have human values in them. The people who build these things encode them with their own beliefs. For instance, to train a machine learning tool you have to go through a process of tagging the data, and you need people to do that. If all the people who tag the data are white men, then the algorithm may end up being racist, especially if it's deciding, hypothetically, which bus lines should go into which neighborhood. Suddenly the poor neighborhoods and the neighborhoods of color stop getting bus lines, and the places that most need access to the bus end up with fewer buses coming through. That is an algorithm, a tool, encoded with human values. Bots are the same thing. When you build bots to spread information, when you use social bots in an attempt to manipulate public opinion, when social media companies build algorithms that prioritize information, there's politics there. There are decisions that go into that process.
If I had a dollar for every time a social media company said "we're not the arbiters of truth," I would have ten thousand dollars, because they don't want you to think that they arbitrate truth. I'm here to tell you today that's not the case. Trending algorithms prioritize information; they shape how people share information, and they prioritize the things that you see. For the longest time, and even today, organizations like Google, Facebook, and Twitter have made decisions about how to prioritize news for people. Think about that. That matters, and this book is about that. Andrew taught me that you need to look at the person behind the tool. It's not enough to do the kind of research where we download data and say, we'll analyze the data now; we need to ask who is doing this and who they're doing it for. You might think it's savvy political campaigns doing this work, but it turns out, when you dig down deep, you start finding shadowy PR firms and marketing organizations that say, I can build you a social media profile and add 10,000 accounts in the next few weeks, and surprise, surprise, what they're using is fake profiles and fake information. It's a whole weird world out there.

The third person I want to introduce you to is a woman named Marina. She was my boss at the Institute for the Future, and I met her a day or two after Trump won the 2016 election. It was the first time I had ever been at the Institute for the Future, which is in Palo Alto, and I flew out for a roundtable with a bunch of research scientists and a bunch of politicians and State Department people who were really concerned with the weaponization of AI. At the time, the picture people had of weaponized AI wasn't quite right: there were not smart bots talking to people and convincing them to change their minds about politics. It was more that the people behind the algorithms were using them to manipulate public opinion in subtler ways. Marina listened to all of the experts speak and then, astutely, in her way, said at the end: this is all a continuation of KGB tactics and Russian tradecraft. She's Ukrainian. She grew up in Ukraine, in the former Soviet Union, and she said maybe what we need to realize is that this propaganda is not new. The tactics are not new. The things we're seeing are a continuation. It's the technology, and the way the technology is being leveraged, that is making this so much more potent. You see automation and anonymity; you see the problems of scale, parallel computation, quantum computing; all of these are things we need to be considering, and while we consider them we have to consider the way the people behind them are using them.

Marina taught me that we need to look at history. We need to think about how propaganda has been used in the past and have a deep understanding not just of the public issues but also of the terminology we use. What we say matters. Right now there's a sort of epidemic in this country of using the term "fake news." The term fake news has been weaponized by the very people who spread fake news. So I challenge you, if you want to know what you can do as an individual not to be part of the problem: use the term "misinformation," which means accidentally spread false information, or "disinformation," which means purposefully spread false information. If you want to be fancy, you can say "junk news," which is bad, stupid stuff. Junk news.
And so the terminology matters, history matters, and it's no surprise to me or to the researchers who study this stuff, like Caroline Jack, who wrote a great piece called "Lexicon of Lies" all about the ways we talk about propaganda, that the people who spread these lies have taken on the terminology and make the exact same arguments back at us. When you point the finger and say "you're doing this," they say "no, you're doing it." That's the playbook. The playbook isn't to change people's minds; it's to create confusion. It's to generate apathy. It's to make people mad and polarized. That's the thing we miss a lot of the time: we think the sophistication is that these bots are going to come talk to us and we'll suddenly become interested in owning a gun. That's not what they're doing. The bots are there to make you not want to vote, to not want to engage in democracy, to think the system is so broken and to make you so angry that you vote for someone who speaks to your anger rather than someone who has policies.

We have to look at history, and Marina is right that this stuff comes out of the Soviet playbook, but one thing you all should know is that the Russians aren't the only people who do computational propaganda. In fact, I think the Russians benefit a lot from us thinking they're the only ones who do it, from us talking about it as if it only comes from them. Computational propaganda, or what we also call information operations, happens in nearly every country around the world these days. There's a great report from my old team at Oxford suggesting that during elections in over 70 or 80 countries, this stuff has been weaponized by governments and by campaigns. It also happens domestically. There has been a democratization of computational propaganda; almost anyone can do it these days. My next book, which will be coming out with Yale University Press, is more scholarly, so it will probably be even more boring, but it's called Manufacturing Consensus, and the idea behind it is that we use this technology to create the illusion of popularity. The more you make something look popular, the more you make it seem like a viable idea.

Okay, one more person: the last person, the last place, and the last idea. The last person is Kathleen, my boss at the University of Texas at Austin. She was formerly at The New York Times, and before that she was a sports reporter. I don't know how you make a transition like that, Kathleen, but she's fantastic. When I went to UT I had lost a little bit of hope, because I'd spent all this time writing this damn book and thinking about the ways in which the information system is broken. I work in the School of Journalism at UT, and Kathleen is its director. Kathleen has taught me that we need to place faith in the institutions we already have. We don't need to create brand-new things. We have the Federal Election Commission and the Federal Communications Commission; we don't need a Federal Disinformation Commission. We don't need one more commission. But more specifically, we need to invest in journalism. Journalism in this country has done amazing things. There are so many people working for great publications around the United States who want to do good work and want to protect democracy, but they have had to learn on the fly. In fact, in the book I talk about the ways in which journalism has not just been challenged by the digital era; it's not that journalists are feckless individuals or that news organizations simply can't handle it.
Organizations like Google, Google News, YouTube, Facebook, and Twitter benefit massively from the work of journalists without giving any remuneration, any money, to those sites. The same can be said of organizations like Wikipedia. When YouTube faced the crisis of disinformation, what did it do? It started linking to Wikipedia articles. Wikipedia is a nonprofit, one of the five most accessed sites on the web but still a nonprofit, and YouTube is using it as the resource it sends people to when it thinks they've encountered disinformation. The same thing goes for journalists. Google News for the longest time took snippets of articles, and when people searched they could, in theory, click through to the actual article. But, sorry, the research showed that no one actually read the full article. No one clicked through. They just read the little snippet. So journalists put all the work into doing the investigation and writing the article, Google posts a snippet, and no one reads the piece. And you wonder why the news industry is failing, why it's having such a hard time. Maybe failing is the wrong word. What I think is that we can reinvigorate journalism. I'm working on an op-ed right now; the argument is that the technology firms should have to put, I don't know, ten or twenty billion dollars into a public trust in the United States and let it be overseen by civil society groups, people who have a stake in making sure the money is spent wisely and well. Google News Lab has committed 350 million dollars or so to the Google News Initiative, but they pick the partnerships, they pick the conversations, they make decisions about who gets in and who doesn't. And for a long time, when news organizations clashed with Google about its policies or about the algorithms not prioritizing their articles, Google deprioritized the news sites that complained. That's not good enough. The technology companies helped create this problem, and they need to help fix it too. There has been a big mea culpa moment; we saw Mark Zuckerberg sit before Congress after Cambridge Analytica saying, I know, I'm sorry. But they haven't systematized a response to this problem. They have done some things, and they have been working hard in many ways, but it's not enough. It's important to remember that these are multibillion-dollar companies, the richest companies in the world. They get treated more like nation-states these days than like regular companies. Kathleen taught me to reinvest in journalism, to be skeptical of what we see today, and not to give up on journalism, because journalism still gives us a lot.

All of these things taken together give us an interesting picture, and they give us this book. This book is a book about the future. I spent a lot of time thinking about what we've been through, but this book looks to the next wave of technology. It's about deep fakes, AI, virtual reality, and the automated voice systems that sound just like a person, like Google Duplex or Google Assistant, and the ways this next wave of technology could make for more potent artificial disinformation. The subtitle is provocative for a reason. It's there both to scare people and to invite you to prove it wrong: you will not let the next wave of technology break the truth. I'm going to do a little reading, end with some solutions, and then we'll do a Q&A.

Conclusion: designing with human rights in mind. Finding solutions to the problems posed by online disinformation and political manipulation is a daunting task. The disinformation landscape is vast, and it extends beyond our current ability to track it or contain it effectively.
Moreover, it grows larger every day. According to a 2017 report on the state of the internet from a software firm that combined industry research from multiple companies and news outlets, we create 2.5 quintillion bytes of data every day. 2.5 quintillion. I don't even know what that number means. Moreover, the number of internet users grew by one billion, to a total of 3.7 billion active users, in the five years before that report; from 2012 to 2017 the internet grew by a billion users. And a 2018 Forbes article reported that 90 percent of the online data available in the world had been generated in the previous two years. Let that sink in: 90 percent of the online data available was generated in the previous two years. This means that the people working to game public opinion and exert persuasion using online tools have almost unimaginable amounts of data available on potential targets, with more information pouring in every millisecond. They have access to a lot of potential targets, and when you factor in anonymity, automation, and the sheer scale of the net, they can seem nearly unstoppable. Important ethical and legal considerations, along with the difficulty of finding the skilled operatives behind these campaigns, make prosecution a poor strategy for stamping out computational propaganda. Instead we have to fix the ecosystem. It is time to build, design, and redesign the next wave of technology with human rights at the forefront of our minds.

In thinking about responses to rising synthetic and computational propaganda, I find it helpful to break them down into responses for the short term, the medium term, and the long term. Because of the nature of technology use, I consider tool- or technology-based responses to be the shortest-term fixes of all. Many of these efforts are band-aid approaches focused on triaging the most egregious issues and oversights associated with the infrastructure of Web 2.0, the social web. Such amendments include tweaks to social media news feed algorithms and to the code that identifies trends, or software patches for existing tools. They include ephemeral new applications for identifying junk news, or browser plug-ins that track and catalog political advertisements. These efforts are useful as far as they go, but online manipulation tactics are constantly evolving. What works to track disinformation or bots on Twitter today might not be useful tomorrow. In fact, many of the applications built for such purposes have become defunct owing to code-level changes made by the social media firms, a lack of money for upkeep, or propaganda agents finding a simple way around them. There are useful products of this kind, like Botcheck and SurfSafe, which detect computational propaganda on Twitter and check for fake news sites from within one's browser. These programs need to be constantly updated and translated to other platforms to stay relevant and useful. They represent a promising start for tools that alert users to threats of disinformation, but they must be combined with action from technology firms, governments, news organizations, and others to be truly effective. Another example of a propaganda tracker is the Hamilton 68 dashboard, a project from the Alliance for Securing Democracy at the German Marshall Fund that was built to track a set of Russian-linked Twitter accounts.
Although it purports to identify and report nefarious or anomalous social media traffic, and, equally important, to notify users that they may be encountering false news reports, these efforts are too passive and too focused on user-based fixes to counter computational propaganda. It is also important to remember that a good deal of research shows that post hoc fact-checks do not work, and that the social media firms are engaged in a constant battle to catch and delete new and innovative types of bot-, cyborg-, and human-based information operations. More than anything, I want to communicate in this chapter that everything is not lost. Not only researchers, policymakers, and civil society groups around the world but also technology firms are fighting to stem the tide of digital propaganda. Employees at Facebook have managed to dismantle predatory and deceptive advertisements on topics from payday loans to which candidate should get your vote. Google has taken a firmer stance against shady dealings in military drone research and manufacturing. The point is that today's large tech firms have to get real with themselves. They are now media companies: purveyors of news, curators of information, and, yes, arbiters of truth. They owe a debt to both democracy and the free market, and their allegiance to the latter does not mean they can ignore the former.

So, one of the things you might not know, and you probably don't know, because why would you, is that the English version of the book is subtitled How the Next Wave of Technology Will Break the Truth and What We Can Do About It. For some reason that didn't make it onto the American version, editorial decisions, but I really like "what we can do about it," so I'll tell you a few things. I talked about the short term, the medium term, and the long term. In the short term, the best things you can do as individuals are pretty simple. The first is to read the whole article, and I'm not being facetious. I caught myself the other day about to retweet an article I hadn't actually read, and I thought, what am I doing? Why am I doing this? I study this. I should do better than this. One of my closest friends, someone who got a PhD with me at the University of Washington and who is brilliant, got a message saying that they (I don't want to out them) had shared a known piece of Russian disinformation on Tumblr from the Internet Research Agency. And this is a person who studies this stuff, who knows it really, really well. If they can be fooled by it, and if I can be fooled by it, then we all can be fooled by it. We have to read the whole article and think very carefully before we share what we share. What we're seeing right now is the proliferation of cheap fakes, rather than deep fakes, heading into the 2020 election. It's not sophisticated video; it's crude digital editing, and it's a potent weapon, and we'll start to see more of it. What we're seeing is regular people sharing videos edited in something like iMovie: a video of Joe Biden selectively cut to make him look like a racist, or the video of a CNN reporter edited to make it look like he was abusing a White House intern, which got his press credentials revoked and then subsequently reinstated in a weird, strange episode. We've all got to be careful about what we share. The other thing people can do is talk to the people they love or the people they care about. I just wrote a report for the National Endowment for Democracy in D.C. about the demand for disinformation.
The main takeaway from the report (my colleague deserves the lion's share of the credit for the work) was that the only way people change their minds on these issues, given the polarized state of the United States and other countries at the moment, is by talking to the people they care about and love. Psychologically, you don't change your judgment based on a conversation you have on Facebook. You don't change your mind based on an argument with someone you don't know. It needs to be a civil conversation with someone you know, and it needs to be about a topic you care about. That's what needs to be conveyed. Those are the short-term things.

In the medium term, we need policy. We need regulation. I'm so sick of hearing from people that this is just a user issue and that self-regulation will solve it. We can't let Google, Facebook, Twitter, and the powers that be make decisions internally about what they're going to do. What ends up happening when that's the case is that they say to researchers like me: hey, your research is flawed. It's not scientific. What do you mean? They say, you didn't have access to the data we have; you didn't have a representative sample. I say, maybe you could share the whole data set with me and then I'll do the analysis. How about that? They're like, we'll figure that out. They have this thing called Social Science One that was supposed to get at this, but it hasn't happened. There is no regulation that holds them accountable for what's going on on their platforms. There's no oversight. In the early 2000s the FEC made a decision that basically said they were not going to look at any political communication online during elections, full stop. They look at TV, they look at radio, they look at magazines; they make sure campaigns don't do anything really illicit or messed up there. But online they don't look at all, and that's hugely problematic. The government has a huge role in this. The government needs to do something. Currently nothing will get done. There is policy that has been drafted and is waiting to go before a, let's say, less polarized Congress. There are bills I'm very happy with that I think are well informed, but right now there's not much appetite for them. However, we can look to other countries. We can look to other places around the world to figure out the ways in which they are dealing with this problem, including some countries in South America.

And then there's the long-term fix. In the book, the tagline is that you've got to design with democracy, design with human rights, in mind. My belief is that the platforms we use today (and this isn't much of a leap; the companies have said it themselves) were designed for engagement. They were designed to get people to stay on them. They were designed to scale, to grow to massive sizes; that's not something we asked for. Facebook has over two billion users. They were also designed, at the end of the day, to make money. What happens when that's the case? We get what we have now. We have gotten what they built, and what they built was a system that did not prioritize high-quality information or civility. They built systems that prioritize the opposite of those things. And so I believe it's possible to create technology in the interest of democracy and human rights. We see it with things like Mastodon, and we can see more of it. I co-created the Ethical OS, the Ethical Operating System, which you can find for free; it's a tool that gives technologists a bunch of provocations about the problems that could arise from the technology they're building, as they're building it. It's sort of a toolkit for thinking through these things.
The introductory computer science course at Stanford has used it. We've talked to other organizations about the ways in which they could leverage it, and that's exciting, because it means there are things we can do and questions we can ask. The long term is more of a challenge. At the simplest level, we have to reinvest in critical thinking in public schools. We have to reinvest in media literacy in public schools. It's not enough, and it's not fair, to treat this as a user fix or a people-based problem, as we've been told it is, when the educational institutions in this country don't teach critical thinking until you get to college, and a lot of us don't get there. It's also time for us to create a robust system of public interest technologists. In the 1950s a bunch of foundations came together in this country to create public interest law. You might say that, to some extent, it hasn't been a perfect answer, but we've come a long way. We need the same thing for technology. We can't always have the best and brightest going to Google and Facebook to make a lot of money because they think they're going to do something good there. We need the best and brightest also going to nonprofits and universities. We need people who understand technology helping to build legislation; we need people who understand code helping to write the laws that determine how we combat computational propaganda. While the bot accountability bill from Dianne Feinstein (a local hero), which was put before the Senate a year or two ago, was laudable, it was also not feasible at all, because I don't think technologists had looked at it. The bill was written in a way that didn't understand that bots are infrastructural to the web; over half of all internet traffic comes from automated accounts, and that bill would have swept up bots that serve the public interest.

When we studied Germany around 2016, during the last election there, we thought we would find a lot of people sharing disinformation, but we didn't. We were wondering why that was; we knew the far right in Germany had gained a foothold and become very powerful. What we quickly realized was that Germany has a really robust public media system. Germany has a robust program for critical thinking in public schools. And after World War II, given what Germany experienced with the Nazis, Germany made it illegal to promote white supremacy and that kind of thing in the public domain. Here we are very, very afraid of, we get squeamish thinking about, regulating hate speech, because we are concerned about free speech. But we can't always treat the First Amendment as if it's at odds with our right to safety and our right to security. We've got to find a better way.

In the introduction to this book I begin with a quote from Betty, who's from the Bay Area, from the East Bay; she's in her late 90s and is a park ranger at the Rosie the Riveter museum, which is amazing. My wife took me to see her give a talk, and I was reluctant to go, but I found myself in tears at the end, because she's one of the most inspirational speakers I've ever heard. It was so amazing; I highly recommend looking her up on YouTube. This is what I will end with before I let you ask questions. She says that every generation, she knows now, has to recreate democracy in its time, because democracy will never be fixed. It was not intended to be. It's a participatory form of governance, and we all have the responsibility to form that more perfect union. Thank you. [applause]

So now I think we can do some Q&A, and I'm happy to manage it. Don't be shy. Any questions, anyone? Yes. [audience member] I enjoyed the book and your talk.
I actually have a couple of comments and a question. First off, I thought it was interesting how you brought in the emergence component when you brought up the KGB, the idea that we've seen this stuff before. But to me, when I think about emergence, I think that as you scale some of these things you get more than just the sum of the parts. We do see emergent effects of this stuff at scale that render it a new beast in a lot of ways. And thinking about the deep fake stuff: for a few more years we can detect it for sure, five years maybe, being optimistic, but ultimately a lot of these techniques are asymmetric in favor of the attacker, and I think that's something really important that gets lost in the discussion. The other thing I wanted to ask about was something you brought up in the book, that a lot of the blame gets put on the people building these systems, which I agree with. I totally agree it's techno-libertarian thinking that shapes who is building these platforms and algorithms. But ultimately, to me, those systems are just answering to a CEO who answers to a board who answers to shareholders. To me, all of this is symptomatic of capitalism driving things at the base. I'm wondering how you see solutions existing within this kind of incentive model?

I think, on your first point, point well taken. To get into the heart of that, and for everyone else: we see disinformation and computational propaganda launched at the public, and these campaigns are disinformation when they begin, purposefully spread false information, but they become misinformation very quickly. When I worked at Google Jigsaw as a fellow for a year, it was interesting, kind of an experience that really shook me: what they called it was seeding and fertilizing. You basically plant the seed and the fertilizer and then you let regular people do the work and spread the propaganda for you. It is difficult to track. There is this problem where you can't figure out where the snake's mouth begins and the tail ends, to use the ouroboros as a metaphor. On the second question: yes, this is a problem of capitalism and the free market, and no, there's no getting around it. When you prioritize profit, you prioritize systems that take advantage of people all the time. I'm no political economist or cultural studies scholar, but I've spent time thinking about this stuff. I believe changes have to be made, but I don't think we should throw the baby out with the bathwater. Is this the best system we can have? Maybe, maybe not. Betty's quote suggests it isn't settled, because democracy has to be remade differently in each generation. I don't quite know what the answer is; I can't really answer your question. I wish I had the answer, because if I knew it all I'd probably be living in a beautiful house. Maybe we can rebuild democracy in a way that interacts with capitalism differently. Maybe we need to speak the language of the market. When I've been talking to people at Facebook and Google, what I've been saying is: maybe it would be good business for you to do something that's beneficial for society. Think about that. How do we market this as good business? Other questions? Anyone? All right. That's great. Short and sweet. That's by far the shortest and sweetest Q&A of the tour, so thank you. Thank you very much for having me. This has been great. [applause]