Olay tackles beauty bias in search algorithms (prweek.com)
InformationWeek
How We'll Conduct Algorithmic Audits in the New Economy
Today's CIOs traverse a minefield of risk, compliance, and cultural sensitivities when it comes to deploying algorithm-driven business processes.
Algorithms are the heartbeat of applications, but they may not be perceived as entirely benign by their intended beneficiaries.
Most educated people know that an algorithm is simply any stepwise computational procedure. Most computer programs are algorithms of one sort or another. Embedded in operational applications, algorithms make decisions, take actions, and deliver results continuously, reliably, and invisibly. But on the odd occasion that an algorithm stings (encroaching on customers' privacy, refusing them a home loan, or perhaps targeting them with a barrage of objectionable solicitations) stakeholders' understandable reaction may be to swat back in anger, and possibly with legal action.
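To make the "stepwise procedure that makes decisions" idea concrete, here is a minimal, hypothetical sketch of a toy loan-screening rule of the kind the passage alludes to. Every threshold and field name (credit_score, income, debt) is an illustrative assumption, not any real lender's policy.

    def screen_loan_application(income: float, debt: float, credit_score: int) -> bool:
        """Return True to approve, False to decline, via fixed stepwise checks."""
        if credit_score < 620:        # step 1: minimum credit score (assumed cutoff)
            return False
        if income <= 0:               # step 2: must report positive income
            return False
        if debt / income > 0.43:      # step 3: debt-to-income ceiling (assumed cutoff)
            return False
        return True                   # all checks passed

    print(screen_loan_application(income=55_000, debt=20_000, credit_score=680))  # True

A rule this simple runs invisibly and reliably, which is exactly why a biased or miscalibrated version of it can sting applicants without anyone noticing.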
Can auditing eliminate bias from algorithms?
For more than a decade, journalists and researchers have been writing about the dangers of relying on algorithms to make weighty decisions: who gets locked up, who gets a job, who gets a loan, and even who has priority for COVID-19 vaccines.
Rather than remove bias, one algorithm after another has codified and perpetuated it, while the companies behind them have largely continued to shield those algorithms from public scrutiny.
The big question ever since: How do we solve this problem? Lawmakers and researchers have advocated for algorithmic audits, which would dissect and stress-test algorithms to see how they work and whether they're performing their stated goals or producing biased outcomes. And there is a growing field of private auditing firms that purport to do just that. Increasingly, companies are turning to these firms to review their algorithms, particularly when they've faced criticism for biased outcomes.
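One common check such an audit might run is comparing an algorithm's positive-outcome rates across demographic groups and flagging large gaps, for example against the widely cited "four-fifths" disparate-impact rule of thumb. The sketch below uses invented group labels and decisions purely for illustration; it is not the methodology of any particular auditing firm.

    from collections import defaultdict

    # (group, approved) pairs produced by the system under audit -- made-up data
    decisions = [
        ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
        ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
    ]

    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += approved

    rates = {g: approvals[g] / totals[g] for g in totals}
    ratio = min(rates.values()) / max(rates.values())
    print(rates, f"disparate impact ratio = {ratio:.2f}")
    if ratio < 0.8:
        print("Potential adverse impact: lowest group's rate is under 80% of the highest.")

Outcome-rate comparisons like this are only one slice of an audit; a full review would also examine training data, features, and how the system is used in practice.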
Plus: Dead pop star brought back to life by ML, OECD develops effort to monitor AI power
Katyanna Quach, Mon 1 Feb 2021 // 20:47 UTC
In brief: Today's artificial intelligence can autocomplete a photo of someone's face, generating what the software predicts is the rest of their body.
As an academic paper pointed out, though, these neural networks are biased, presumably from their training data. That means when you show this code a woman's face, it's likely to autocomplete her in a bikini or other revealing clothes. White people tend to be shown holding tools while Black people are pictured holding weapons.
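Findings like these are typically quantified by tallying how often a model's completions fall into a given category for each group of face prompts and comparing the rates. The sketch below is a hypothetical illustration of that tallying step; the group labels, categories, and counts are invented placeholders, not the paper's data or code.

    from collections import Counter

    # (perceived group of the face prompt, category a rater assigned to the completion)
    completions = [
        ("woman", "revealing_clothing"), ("woman", "business_attire"),
        ("woman", "revealing_clothing"), ("man", "business_attire"),
        ("man", "suit"), ("man", "business_attire"),
    ]

    by_group = Counter(group for group, _ in completions)
    flagged = Counter(group for group, cat in completions if cat == "revealing_clothing")

    for group in by_group:
        rate = flagged[group] / by_group[group]
        print(f"{group}: {rate:.0%} of completions rated as revealing clothing")

A large gap between groups in such rates is what the researchers interpret as bias absorbed from the training data.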