
In order for its Autopilot driver assistance system to work, Tesla fits cameras to every corner of its electric vehicles to monitor surroundings and assess the road ahead. While the car’s internal computers make on-the-fly decisions about how to respond to things like road signs and parked cars, Tesla has a whole team of researchers programming Autopilot to respond to different situations that may arise on the road.
However, while you might expect that these workers are training Autopilot to follow the rules of the road, a report from Business Insider found that they’re teaching cars to ignore certain regulations:
Workers can run into data from any number of countries in a single workflow, meaning they must constantly be aware of the different road rules for each region. At times, Tesla seemed to take a more relaxed stance on those rules, seven former and current workers said. For example, some workers said they were told to ignore “No Turn on Red” or “No U-Turn” signs, meaning they would not train the system to adhere to those signs.
“It’s a driver-first mentality,” one former worker said. “I think the idea is we want to train it to drive like a human would, not a robot that’s just following the rules.”
Sometimes the role requires workers to label videos from accidents and near-misses. Seven workers recalled labeling videos that included Tesla accidents or those involving nearby vehicles. At one point a worker even distributed a video among employees of an incident that involved a young boy on a bicycle getting hit by a Tesla, four workers said. It was one of many videos and memes workers used to exchange, they said.
As if programming cars to break the law wasn’t bad enough, the people doing it work under some pretty harsh conditions. At three Tesla facilities across America, employees tasked with watching the footage have their progress closely monitored by Tesla:
Employees are also very closely monitored using two different software systems.
One system, called HuMans, gauges how long they should spend on each clip, four workers said. Annotators who take consistently longer than the allotted time are likely to receive poor performance reviews or be put under a performance improvement plan, or PIP, they said. The software was originally designed to help pilots in the US Air Force and also has the capability of tracking employees’ eye movement and taking audio recordings, according to its website. But it’s unclear if Tesla uses the software to track staff’s eye movement.
The company also uses a measure called “Flide Time” to track annotators’ active time on the labeling software, 17 workers said. It can track keystrokes and how long workers spend with the labeling software open, but it won’t track the time workers spend using other tools on their computer, they said. Depending on their level, workers can be expected to log anywhere from five to seven and a half hours of Flide Time, meaning they must be active on the software for at least that amount of time.
Work such as this is designed to teach Tesla cars running Autopilot or Full Self-Driving how to behave correctly on the road. However, the software has so far been linked to dozens of crashes on America’s highways, including run-ins with parked police cars and rail crossings.
A version of this article appeared on Jalopnik’s The Morning Shift.