Reuters/Kacper Pempel
Operating in the shadows.
GOOD TECH

Amazon’s AI is being used to rescue children from sex trafficking

By Natasha Frost


Domestic sex trafficking of minors in the US and Canada usually doesn’t resemble Hollywood’s dungeons-and-handcuffs nightmare.

Instead, in the vast majority of cases, it looks more like this: A traumatized child—usually, but not always, a girl—of about 15, from a difficult home situation, meets someone a little older, perhaps 18, 19, 20. Sorely in need of affection, the child sees nothing but promise in this new boyfriend, who offers everything they’ve missed out on. He’s generous, kind, understanding, loving—until he’s not.

“I thought that I loved him and then it came to the point where I was deathly afraid of him.”

After that, things take a sudden, treacherous turn for the worse. Sold for sex, these children may be told to bring home $1,000 in a single evening, or risk a beating or an ice bath. They write their own online ads and field their own calls while working for a trafficker who may be running three or four minors at a time. Until recently, they could be prosecuted for prostitution, despite being below the age of consent.

The nonprofit Thorn, founded by actors Ashton Kutcher and Demi Moore in 2009, wants to help to find these children and bring them to safety. To do so, it’s looking to AI.

Julie Cordua, Thorn’s CEO, said she had heard the same story time and time again. Over interviews with 260 survivors across 21 cities, a few patterns began to emerge. Most entered “the life,” as they called it, at 15; one in six was under 12. One person had been trafficked from birth. Over two-thirds grew up without a father present, and around 40% had spent time in foster care.

Even after escaping their trafficker, the vast majority had no interest in prosecuting him, and instead spoke of how much they missed their “boyfriend,” Cordua told an audience at Amazon’s Re:Mars conference in Las Vegas last week. “They need protection, because they’ve never had it, and then that person exploits them for money.”

“Hey babes! I’m Hailey. 25yrs old 5’8, 36DD, all the right young curves in all the right places!”

Where solicitation might once have taken place on the street, today it happens overwhelmingly online. Finding trafficking victims means going online too—though they aren’t always easy to spot in the throng of adult listings, where lurid, emoji-heavy descriptors—Busty Blonde Bombshell; Beautiful Asian Petite; Rave Queen—provide few clues.

About 150,000 new ads selling sex are posted per day in the US. Until 2018, when it was shut down by legal authorities, most of these online sex listings were hosted on classified advertising site Backpage. Its seizure splintered the market, resulting in dozens of new websites springing up to fill the gap. On these sites, every ad looks much the same, even though listings for women who said they were 25 might conceal children as young as 11. “The question for us,” Cordua said, “was: how do we know which ad was a child, if we’re going to focus our energies on that?” 

The solution has often involved first identifying a missing or at-risk child, then using machine learning technology to match listings to individuals.

That’s where tools such as DetectText or IndexFaces, both Amazon Rekognition products, come in. Increasingly, phone numbers in ads appear as images rather than as part of a text stream, making it much harder to perform conventional searches. DetectText quickly extracts this information from the images, allowing Thorn to work backwards to find children from their last known number. IndexFaces, meanwhile, detects and matches faces to images of missing and exploited children from open web data sources, such as the National Center for Missing and Exploited Children’s register of missing children. (In the saddest cases, Cordua noted, the children in question had never had photos taken of them.)
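To illustrate the phone-number step: once an OCR service such as DetectText has pulled raw text lines out of an ad image, those lines still have to be normalized before they can be matched against a child’s last known number. The sketch below is not Thorn’s code—it is a minimal, hypothetical example of that post-OCR cleanup, assuming the detected text lines have already been returned as plain strings.

```python
import re

def extract_phone_numbers(detected_lines):
    """Pull US-style phone numbers out of OCR'd text lines and
    normalize them to bare digits, so they can be compared against
    a known number regardless of formatting or obfuscation."""
    numbers = []
    for line in detected_lines:
        # Match runs of 10 digits separated by spaces, dots,
        # dashes, or parentheses—common ways ads break up numbers.
        for match in re.findall(r"(?:\d[\s().-]*){10}", line):
            digits = re.sub(r"\D", "", match)
            if len(digits) == 10:
                numbers.append(digits)
    return numbers

# Hypothetical text lines, as an OCR pass over an ad image might return them
lines = ["Call me anytime", "(555) 867-5309", "txt 555.123.4567 now"]
print(extract_phone_numbers(lines))  # ['5558675309', '5551234567']
```

Normalizing to bare digits is what makes the backwards search possible: however the ad formats the number, it reduces to the same ten-digit string that investigators already hold.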

“Very quickly, we can get a hit,” Cordua said. The first time they’re sold, children are not “seasoned” enough to know not to use their usual phone number or Facebook pictures. “After three or four months, that goes away,” she said, with the children then becoming much harder to find. “They’re lost in the ether.”

Both of these services make up part of Spotlight, Thorn’s cloud-based data-collection and analysis service, which the nonprofit says has been used to sift through torrents of information, in turn identifying thousands of victims of sex trafficking. (Spotlight has come under fire from critics over privacy concerns: Violet Blue, a writer for technology blog Engadget, described the tool as “terrifying and practically purpose-made for abuse,” in a story about the nonprofit’s somewhat murkier stance on adult sex workers.)

Volunteer machine-learning engineers, sometimes working through Google.org’s fellowship program, have helped to refine the tool further. Spotlight is now used by law enforcement agencies across the country and is said to reduce investigative time by 65%. In one case in Northern California, Cordua said, a search for a missing minor revealed 600 ads soliciting that child.

Finding an ad for a trafficked minor is only half of the battle, of course.

Actually rescuing them, Cordua said, necessitated meeting the child in person to bring them to safety. “The only way is to get an appointment,” she said—which meant being one of the first to pick up the phone and respond to the ad, in a scrum of as many as 85 calls in the first hour after posting. “Speed is of the essence.” Ideally, those who are found then work with victim advocates who can ascertain whether they are best returned home, put in touch with other family members, or placed in a center that assists with rehabilitation and recovery, Cordua said.

But this, in some respects, seems the hardest part of the process, and exposes the limits of technology as a solution to trafficking. Though Spotlight has been used to identify 9,000 individuals, the number who have actually been recovered is much lower—the most recent figure, from the end of 2016, was just 103 minors brought to safety.