
Coding Rights News Today : Breaking News, Live Updates & Top Stories | Vimarsana

How AI systems undermine LGBTQ identity

Access Now 6 April 2021 | 4:32 am Most of us interact with some sort of Artificial Intelligence (AI) system several times a day, whether it's using the predictive text function on our phones or applying a selfie filter on Instagram or Snapchat. Some AI-powered systems do useful things that help us, like optimizing electricity grids. Others capture our most sensitive personal information (your voice, your face shape, your skin color, the way you walk) and use it to make inferences about who we are. Companies and governments are already using AI systems to make decisions that lead to discrimination. When police or government officials rely on them to determine who they should watch, interrogate, or arrest, or even to "predict" who will violate the law in the future, there are serious and sometimes fatal consequences.

DW analysis: Google image search cements national stereotypes of 'racy' women | Press | DW

DW analysis: Google image search cements national stereotypes of 'racy' women. An exclusive data-driven investigation by DW reveals how Google's image search perpetuates sexist stereotypes. Women from Latin America, eastern Europe, and southeast Asia are particularly affected. In an exclusive analysis, DW has found that Google's image search propagates sexist stereotypes. The analysis was a collaborative project between DW's Data Journalism and Culture departments. The two teams analyzed over 20,000 images and websites and revealed an inherent bias in the search giant's algorithms. For instance, image searches for the phrases "Brazilian women," "Thai women," or "Ukrainian women" show results that are more likely to be overly sexualized than the results for "American women," the analysis shows.

© 2025 Vimarsana
