If a computer tells you with 64% certainty that what you’re wearing isn’t your best, would you change? Would you maybe even buy new clothes?
Amazon announced a new addition to its Echo line of products, the Echo Look. Until now, Amazon's Echoes have been speakers that stream music and podcasts and, through the built-in voice assistant Alexa, control internet-of-things devices, report on the weather and traffic, and connect to third-party apps. The new Look gives Alexa eyes as well as ears.
The Echo Look is shaped like an elongated webcam, and combines the microphone and speaker technology of previous Echo models with a depth-sensing camera and LED lights, letting users take and share photos and short videos of themselves by calling out to Alexa. Thankfully, there's a button on the side to turn off the camera if you're not comfortable having an AI-powered device watching you at all hours of the day.
The online retailer is pitching the Echo Look as a fashion-forward device, meant to help you figure out what from your wardrobe suits you best each day. The device's app ships with a feature called "Style Check," which Amazon describes as "a new service that combines machine learning algorithms with advice from fashion specialists." It lets users compare two photos taken through the Look to decide which outfit is better.
Amazon wasn't immediately available to confirm what sort of data it used to train this software, or whether it would apply the same fashion sense to a customer in New York as it would in London or Hong Kong. It also didn't respond to questions about who exactly designed the product, and whether a device seemingly aimed at women (its advertising features almost exclusively women) was designed by women.
The device seems to hint at a larger strategy for Amazon in the world of fashion, and, potentially, at knowing when and what to sell you in general. With a camera that can discern what you're wearing and run it against Amazon's proprietary knowledge base of other clothes, it's not much of a stretch to see Amazon suggesting new clothes to buy that it believes (according to machine learning!) you'll like.
This is very similar to what Amazon does on its website now, recommending products related to things you've bought or searched for, except that the Look could in theory tell you with some degree of certainty what items of clothing will look good on you, thus convincing you to ask Alexa to order more of those clothes. Several startups are working on creating accurate 3D scans of a person's body from just a webcam, to help retailers figure out your clothing size without making you try anything on. Amazon could use such technology to scan customers through the Echo Look and send a new pair of jeans your way that fit perfectly, keeping the fun of shopping without the pain.
Amazon is quickly becoming one of the largest online retailers of clothing in the US, and any system that could help it automate the process of selling clothes on a massive scale—especially without the hassle and cost of returns and exchanges—could be a huge boon for the company. It recently patented a system for an on-demand clothing factory, in which a customer places an order with Amazon, a series of robots fabricates the clothes, and the garments are then checked, packed, and delivered. The company is also working on automated delivery systems, such as self-flying drones (and possibly self-driving trucks). In the not-too-distant future, you could ask Alexa to scan you, tell it to find a new outfit for you to wear that evening, and have a custom set of clothes show up on your doorstep a few hours later.
This system hints at even more invasive sales opportunities. Retailers already know how to glean life-altering moments from sales data, such as when Target managed to figure out a woman was pregnant before she told her family. When a company can actually see you, it’s a lot easier.
As Zeynep Tufekci, a sociology professor at the University of North Carolina at Chapel Hill, suggested in a tweet, Amazon's computer vision could be attuned to look for all sorts of personal signifiers beyond what you're wearing. It could tell if you're pregnant, and start offering you maternity clothes, cribs, and diapers; it could tell if you're unhappy, and offer you music and movies you like (or possibly alcohol); it could tell if you're tired, and suggest some coffee; it could tell if you're overweight, and suggest a Fitbit. When a company has a device that can infer your life from photographs, the possibilities for selling you things are practically infinite.