Stanford Publishes AI Index 2021 Annual Report
Stanford University’s Institute for Human-Centered Artificial Intelligence (HAI) has published its 2021 AI Index annual report. The underlying data for this year’s report has been expanded compared to the previous year’s, and the report includes several perspectives on the COVID-19 pandemic’s impact on AI research and development.
To improve the latest report, over the last year the Institute requested feedback from over 140 members of academia, government, and industry. In response, the new report contains more data and analysis on technical performance, diversity, and ethics. The final report contains seven chapters, and the report summary distills nine key takeaways. In particular, AI applications for molecular biology and drug discovery received the most private investment, in part due to the pandemic. The pandemic also caused many AI conferences to switch to a virtual format, resulting in increased participation.
April 4, 2021
Donald Trump and his allies relied on misinformation to bolster support for the former US president ahead of the last election. His campaign, meanwhile, turned to deceptive design to bump up donations from unsuspecting Americans, the New York Times reports.
Last September, the paper reveals, when the Trump campaign faced a cash shortage, it leaned on supporters to turn their one-time donations into monthly and, eventually, weekly contributions. The problem is that the campaign’s website didn’t ask people to opt in to this enhanced giving schedule; it asked them to opt out. Trump backers only discovered later that WinRed, the for-profit company that processed Trump campaign payments, was taking hundreds or thousands of dollars out of their bank accounts. The Times’ Shane Goldmacher writes:
Dark patterns, the tricks websites use to make you say yes, explained (Vox.com)
If you’re an Instagram user, you may have recently seen a pop-up asking if you want the service to “use your app and website activity” to “provide a better ads experience.” At the bottom there are two boxes: In a slightly darker shade of black than the pop-up background, you can choose to “Make ads less personalized.” A bright blue box urges users to “Make ads more personalized.”
This is an example of a dark pattern: design that manipulates or heavily influences users to make certain choices. Instagram uses terms like “activity” and “personalized” instead of “tracking” and “targeting,” so the user may not realize what they’re actually giving the app permission to do. Most people don’t want Instagram and its parent company, Facebook, to know everything they do and everywhere they go. But a “better experience” sounds like a good thing.
Information on individuals’ mobility (where they go, as measured by their smartphones) has been used widely in devising and evaluating responses to COVID-19, including how to target public health resources. Yet little attention has been paid to how reliable these data are and what sorts of demographic bias they possess. A new study tested the reliability and bias of widely used mobility data, finding that older and non-White voters are less likely to be captured by these data. Allocating public health resources based on such information could cause disproportionate harm to high-risk elderly and minority groups.
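The kind of coverage analysis the study describes can be illustrated with a minimal sketch: compare, for each demographic group, what fraction of known individuals actually appear in the mobility dataset. The records and group labels below are purely hypothetical, not data from the study.

```python
# Hypothetical sketch of a demographic coverage check for mobility data.
# Each record: (age_group, race, captured_by_mobility_data).
# All values below are illustrative only.
from collections import defaultdict

voters = [
    ("18-40", "White", True),
    ("18-40", "non-White", True),
    ("18-40", "White", True),
    ("65+", "White", True),
    ("65+", "non-White", False),
    ("65+", "non-White", False),
]

def capture_rate(records, key_index):
    """Fraction of each group that appears in the mobility dataset."""
    totals = defaultdict(int)
    captured = defaultdict(int)
    for rec in records:
        group = rec[key_index]
        totals[group] += 1
        if rec[2]:  # captured flag
            captured[group] += 1
    return {g: captured[g] / totals[g] for g in totals}

by_age = capture_rate(voters, 0)   # e.g. {"18-40": 1.0, "65+": 0.33...}
by_race = capture_rate(voters, 1)
```

A gap between groups' capture rates, as in this toy data, is the kind of signal that would flag the dataset as demographically biased before it is used to allocate resources.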
The study, by researchers at Carnegie Mellon University (CMU) and Stanford University, appears in the