Dear Quartz members—
Artificial intelligence is riddled with bias. Journalists, researchers, and activists have pointed out countless examples in recent years. They’re also working to fix those biases. This week’s field guide is about the quest to mitigate bias and put AI to good use.
In her state of play, contributor Helen Edwards explains how AI systems driven by machine learning become biased. She explores efforts to address those biases, not just through technical fixes but also with design thinking, law, and regulation. Ultimately, she says, biases in AI are forcing us to confront the power imbalances in our society.
One solution to AI bias is to hire ethicists. In this piece, Helen talks to some “AI ethicists” and explores whether the rush to hire them—by tech companies, governments, even the US Defense Department—will help, or whether it amounts to “ethics washing.” In another piece, she covers research that flipped the script on the use of AI in criminal justice: the researchers decided to see how judges liked being assessed by algorithms.
Finally, she provides a list of what to read and who to follow if you want to learn more.
WORKING FROM HOME
If you’re suddenly working from home for the first time due to the coronavirus, we’ve got you covered. As a Quartz member, you have access to our Quartz at Work guide to working from home as well as this one on managing remote employees. And don’t miss this article on prioritizing mental health when working remotely.
Finally, as a reminder, you can sign up for our coronavirus newsletter here.
Best wishes for a productive week,
Walter Frick
Membership editor, Quartz