Tech companies are beginning to accept that the artificial intelligence they’re building their futures on could be flawed. From studies showing that language-processing AI can be sexist to more recent research on facial recognition’s failures on darker skin tones, years of research have erupted into a flurry of actions from Microsoft, IBM, Google, Mozilla, Accenture, and even Congress.
It’s difficult to pin down why this is all happening now. It could be the unexpected speed at which AI has become pervasive on tech platforms and in the real world, with demos of the disconcertingly human-sounding Google Duplex and Amazon’s cashier-less Go store, where customers walk in, grab what they want, and walk out, with the entire affair monitored and recorded by cameras and computers. Or maybe it’s how big tech companies are suddenly seen as complicit in invasive national security projects with the US Department of Defense or Immigration and Customs Enforcement, contributing to the perception of a creeping police state. Or maybe the world is just becoming more tech-literate and conscious.
No matter the reason, announcements from these companies have been a lot to keep up with, so here’s a quick rundown of what has happened in the last few weeks:
- Google released its ethics principles. The principles specifically mention checking AI algorithms for bias, and a supplementary website for those using machine learning outlines ways to guard against bias.
- Congress continued to make clear that AI bias must be addressed by tech companies. During a hearing of the House Committee on Science, Space, and Technology, members of Congress asked experts from Google and OpenAI whether Congress should regulate aspects of the AI industry, ranging from automation to bias. “We need to grapple with issues regarding the data that are being used to educate machines. Biased data will lead to biased results from seemingly objective machines,” congressman Dan Lipinski (D-Ill.) said during the hearing on June 26.
- IBM announced a new dataset to train facial recognition to see more skin colors. The dataset contains 36,000 images curated from Flickr Creative Commons, and was made public to promote the development of more accurate facial recognition. IBM was previously shown to have facial recognition systems that worked considerably worse on women of color, presumably due to biased data.
- Microsoft announced its facial recognition algorithms now operate better for people of color. The algorithms the company sells to third-party developers now perform “up to 20 times” better than they did before on women and people of color, Microsoft wrote in a blog post. Microsoft was implicated in the same study as IBM.
- Accenture released a tool to combat bias in machine learning datasets. The tool, which is available to Accenture customers, can find relationships in datasets that correlate with age, race, gender, or any other demographic. By identifying these correlations, data scientists can retool the systems to give more equitable results to all demographics. The tool has only been in the works for a few months, Accenture’s global head of responsible AI Rumman Chowdhury told Quartz, but it will play a bigger role when AI governance and transparency become a larger part of business regulation.
- Mozilla announced a $225,000 fund for art illustrating dangers of AI like bias. “We really want to take this abstract looming sense of fear, and help people get their heads around them,” said Mozilla executive director Mark Surman. “Unless you can imagine what [the danger] is, people can’t be asked to take action. Artists often play a very critical role that’s surprising.”
- The CEO of a facial recognition company says that the technology isn’t ready for law enforcement. Read the whole story on Quartz.
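The core idea behind a bias-scanning tool like Accenture’s, as described above, is to flag features in a dataset that correlate strongly with a protected attribute and could act as proxies for it. Here is a minimal, hypothetical sketch of that idea; the dataset, feature names, and threshold are invented for illustration and are not Accenture’s actual implementation:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def proxy_scan(rows, protected="group", threshold=0.8):
    """Flag features whose correlation with the protected attribute
    exceeds the threshold -- i.e., potential proxies for it."""
    group = [r[protected] for r in rows]
    flagged = {}
    for feature in rows[0]:
        if feature == protected:
            continue
        r = pearson([r[feature] for r in rows], group)
        if abs(r) >= threshold:
            flagged[feature] = round(r, 2)
    return flagged

# Hypothetical toy dataset: each row is a record, "group" is a
# protected demographic attribute encoded 0/1, and "zip_risk" is a
# feature that secretly tracks group membership.
records = [
    {"income": 40, "zip_risk": 0.9, "group": 1},
    {"income": 85, "zip_risk": 0.2, "group": 0},
    {"income": 90, "zip_risk": 0.8, "group": 1},
    {"income": 55, "zip_risk": 0.1, "group": 0},
    {"income": 60, "zip_risk": 0.7, "group": 1},
    {"income": 95, "zip_risk": 0.3, "group": 0},
]

print(proxy_scan(records))  # flags "zip_risk" but not "income"
```

A real tool would go further (statistical tests, categorical features, nonlinear dependence), but the basic move is the same: surface the correlated features so data scientists can decide how to retool the system.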
It’s hard not to feel as though the industry is beginning to shift. Facial recognition is not a new technology, but now it seems there’s real momentum for companies to reform the way they collect data and train their algorithms. Now, we’ll watch for how this momentum carries into the services and products the world actually uses.