Until Sept. 7, LinkedIn users searching for female contacts on the site may have noticed some strange results. Searches for common female names were yielding suggestions for male names as well.
Take a LinkedIn search for “Stephanie Williams.” Earlier this week, that query returned the result, “did you mean Stephen Williams?” (in addition to the 2,500-plus users actually named Stephanie Williams). A search for “Stephen Williams,” however, simply displayed the 7,200 results for people with that name.
The same was true of searches for at least a dozen other popular female first names in the US, a Seattle Times investigation revealed. LinkedIn wondered whether users searching for Andrea meant Andrew, Danielle meant Daniel, and Alexa meant Alex. Searches for the US’ 100 most common male names didn’t return suggestions for female names.
LinkedIn’s “did you mean” results are fueled by an algorithm designed to suggest names with similar spellings. The algorithm makes recommendations based on how frequently names have shown up in past queries of the company’s more than 450 million member profiles, says spokesperson Suzi Owens. “It is not anything to do with gender,” she says.
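That kind of frequency-driven suggestion logic can be sketched in a few lines. The names, query counts, and similarity threshold below are illustrative assumptions, not LinkedIn's actual data or code; the point is that a model trained purely on spelling similarity and query frequency will "correct" a less-searched name toward a more-searched one, with no gender variable anywhere in sight.

```python
from difflib import SequenceMatcher

# Hypothetical past-query counts; LinkedIn's real data is not public.
query_counts = {
    "stephanie": 40_000,
    "stephen": 90_000,
    "danielle": 35_000,
    "daniel": 120_000,
}

def did_you_mean(query, counts, min_similarity=0.7):
    """Suggest a similarly spelled name that was searched more often.

    Mirrors the described behavior: a suggestion fires only when a
    candidate is spelled similarly AND appears more frequently in past
    queries than the query itself. Gender never enters the calculation.
    """
    query = query.lower()
    best = None
    for name, count in counts.items():
        if name == query:
            continue
        similarity = SequenceMatcher(None, query, name).ratio()
        if similarity >= min_similarity and count > counts.get(query, 0):
            if best is None or count > counts[best]:
                best = name
    return best

print(did_you_mean("stephanie", query_counts))  # → stephen
print(did_you_mean("stephen", query_counts))    # → None (already the more common spelling)
```

Because "Stephen" outnumbers "Stephanie" in the hypothetical counts, the suggestion only ever flows in one direction; a fix like the one LinkedIn shipped amounts to whitelisting names that are popular in their own right so no correction is attempted.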
All the same, on Sept. 7 the Silicon Valley-based company rolled out a change to the algorithm that enables it to explicitly recognize popular names as such, so that the algorithm doesn’t try to correct them.
It appears to be working: Searches for first names like Dana, Joan, Danielle, Alexa, and Stephanie no longer return any “did you mean” results.
The issue underscores the biases present in artificially intelligent systems that learn from user behavior. Earlier this year, Microsoft was forced to take its millennial chatbot offline after it learned to make racist and sexist remarks from users on Twitter. (Microsoft is also acquiring LinkedIn.)
“As with all machine-learned systems, there are always edge cases and we are constantly working hard to improve and create the best possible experience for our members,” says Owens.