[Photo: FBI agents knock on a door. Reuters/Lucas Jackson. The doors aren’t always closed.]

Here’s how often Apple, Google, and others handed over data when the US government asked for it

By Joon Ian Wong

Silicon Valley is closing ranks in the fight that’s brewing between Apple and the US government. Facebook, Microsoft, and others have now issued statements supporting Apple against a court order compelling it to help unlock an iPhone belonging to one of the assailants in the San Bernardino, California shootings.

But while the battle looks like a fight between the defenders of user privacy, represented by the tech companies, and a prying US government, that framing may not accurately reflect the reality on the ground.

As the tech companies have pointed out in their statements, they comply with lawful requests for user data from the government. The companies involved publish statistics around these requests. These numbers show that the firms usually do what the government asks, and often produce user data that’s shared with law enforcement.

Here’s a look at how the companies responded to US law enforcement requests for user data, drawn from the companies’ own transparency reports. These requests can take the form of a court order, a subpoena, or a warrant, according to the reports. The latest figures cover the first half of 2015. Here’s the volume of requests received, for each company:

Here’s how successful those requests were:

Apple also provides a separate metric for “device requests.” These are defined as requests for information about a device’s registration, repair, and purchase history, and the dates it was used to access Apple services. Apple notes (pdf) that the “vast majority” of device data requests it gets concern lost or stolen devices. Here are the latest numbers for that:

This time it’s different

Apple’s argument in the case of the San Bernardino iPhone is that this time, it’s different. Tim Cook says the FBI’s request for a modified version of iOS amounts to a request for a “backdoor” that will create a “master key” that could be used to open all iPhones.

That has sparked some debate in the security world about whether Cook is being accurate with his use of the term “backdoor.” The FBI’s request might more accurately be called a “bypass”: a more limited way to circumvent a system’s security controls that the manufacturer never intended, Dominic Chell, author of the Mobile Application Hacker’s Handbook, told Quartz. That’s subtly different from a backdoor, he said. Both methods could potentially be used repeatedly on multiple devices, but they differ in the degree to which the manufacturer deliberately creates the possibility of intrusion. One way to think of it is the difference between giving the government a key to your house (a backdoor) versus requiring them to break down your door if they want to get in (a bypass).

The semantic distinction matters because there’s a big debate about whether the US should have a law forcing tech companies to build backdoors into their data. But the White House stopped its push for such legislation in Oct. 2015, after concluding that a backdoor accessible by the US government might also be exploited by others: a victory for the tech companies, and a loss for law-enforcement agencies.

In the San Bernardino case, the White House has said it’s not seeking a universal backdoor, but access to a single device. Apple’s decision to describe the request as a backdoor could now reignite that debate in a way that is unfavorable to the tech firm, as Anna Lysyanskaya, a computer science professor at Brown, told BGR. “It’s dangerous that Apple equates an encryption backdoor with software bypass because the public might not see the difference, and that may give the FBI leverage in the encryption backdoor debate,” she said.

Indeed, Chell, the security specialist, says that Apple could create a solution limited to a single device; that wouldn’t necessarily entail forging a master key, he said. But other observers, like Steve Murdoch, a computer science professor at University College London, think Cook’s argument that compliance would set a precedent to be exploited by others in future overshadows the technical question. “Whether it’s a backdoor or bypass, it doesn’t change Apple’s arguments, which is the consequences of them being asked to do so by the FBI,” he tells Quartz.