The fight between Apple and the FBI over iPhone security has turned nasty and dangerous. Especially for the US government’s credibility.
Power corrupts. Absolute power corrupts absolutely.
In an April 5th, 1887 letter to Bishop Mandell Creighton, Lord Acton coined words that were to be widely quoted–and too often forgotten when inconvenient [as always, edits and emphasis mine]:
“Power tends to corrupt and absolute power corrupts absolutely. […] still more when you superadd the tendency or the certainty of corruption by authority.”
Today, these words resonate in my mind as I try to get to the core of the fight between the FBI and Apple. Here, one challenge is avoiding technical or legal arcana without oversimplifying. Another difficult task is giving wide berth to inflammatory utterances, statements of principle included. Consequences are what we ought to focus on, not principles, always mindful that invocations of principle are often the anchor of a con.
Let’s walk slowly through the maze.
The FBI wants access to the information contained on the iPhone 5c that was used by San Bernardino shooter Syed Rizwan Farook, a phone that’s owned by an agency of the local government. iPhone security includes a provision against repeated attempts to guess the four-digit passcode that unlocks the device. After the first five attempts, the iPhone enforces a one-minute wait time, which then grows to 15 minutes, then one hour. After 10 unsuccessful attempts, you’re SOL, Sadly Out of Luck: the iPhone erases its data.
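The escalating delays described above can be sketched in a few lines. This is one plausible reading of the schedule (1 minute, then 15 minutes, then an hour, then erasure); the exact thresholds vary by iOS version:

```python
def delay_before_attempt(failed_so_far):
    """Forced wait (in seconds) before the next passcode attempt, given the
    number of consecutive failures so far. A simplified reading of the
    schedule described above; real iOS timing varies by version."""
    if failed_so_far < 5:
        return 0            # early attempts: no forced delay
    if failed_so_far == 5:
        return 60           # one minute
    if failed_so_far in (6, 7, 8):
        return 15 * 60      # fifteen minutes
    if failed_so_far == 9:
        return 60 * 60      # one hour
    return None             # tenth failure: the device erases its data
```

The point of the schedule is plain from the sketch: past the fifth failure, manual guessing slows to a crawl, and the tenth failure ends the game entirely.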
(Only for nerds: If you want a glimpse at Fortress iOS, take a look at Apple’s iOS Security Guide. Impressive for what it reveals, and hides.)
Since the only way to break the code is by guessing—and nine guesses aren’t enough—the FBI needs a workaround, a hack, that will suppress the doomsday security measures.
Did our servants turn to their own technical experts, or seek advice from another government agency with deep cryptography expertise? Did the G-Men turn to hackers under their thumb? We don’t know. What we do know is that the FBI got a court order aimed at forcing Apple to create a special iOS version that circumvents the Ten Strikes and You’re Out limitation.
(John McAfee, a well-known and sometimes rogue security expert, has offered to unlock the iPhone in three weeks, for free. This looks like a publicity stunt, but the piece is worth reading for its references to hacker culture:
“Cyberscience is not just something you can learn. It is an innate talent. The Juilliard School of Music cannot create a Mozart. A Mozart or a Bach, much like our modern hacking community, is genetically created. A room full of Stanford computer science graduates cannot compete with a true hacker without even a high-school education.”
The offer drew no response.)
This gets us to the next, pivotal question: Can Apple use its knowledge of iOS to bypass its own security lock? Is the request actually viable?
Unfortunately, yes. Without requiring a passcode, an iPhone can be put into Device Firmware Update (DFU) mode, which allows a new operating system to be loaded onto the device. (The method is used by some to “jailbreak” an iPhone, that is, to install unofficial software on it. This is probably a step McAfee had in mind.)
In its intended, “white hat” use, DFU mode is used to resuscitate a “dead” iPhone. The phone is connected to a Mac or PC running iTunes, DFU mode is turned on, and a genuine iOS image is pushed to the device. There’s a crucial element here: the iPhone isn’t completely dead. Deep in its firmware resides the ability to check for a key that’s presented by the iOS image. Only Apple has that key. Attempts to fool the iPhone into loading contraband software won’t work. (Again, McAfee holds a different view.)
We now reach the core of the FBI’s request:
OK, Apple. You know the key that will get an iPhone (any iPhone, mark those words) to accept an iOS image. So, write a firmware update that bypasses the passcode lockout. We’ll bring the iPhone 5c to your offices so your own engineers can “update” the device. All we ask is that you give us a remote connection—while the device is still under your control—so we can keep guessing the passcode. It’s only four digits, 10,000 attempts, tops. [If you’re interested in the odds of crypto-guessing, see the addendum below.]
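The “10,000 attempts, tops” arithmetic is easy to check. If the anti-guessing delays are stripped out, the floor on guessing speed is the passcode key derivation itself, which Apple’s iOS Security Guide says is calibrated to take roughly 80 milliseconds per attempt:

```python
ATTEMPT_SECONDS = 0.08  # ~80 ms per key derivation, a figure Apple cites
                        # in its iOS Security Guide

combinations = 10 ** 4                       # four-digit passcode: 10,000 codes
worst_case = combinations * ATTEMPT_SECONDS  # try them all: 800 seconds
average = worst_case / 2                     # expected: half the search space

print(f"worst case: {worst_case / 60:.1f} min, average: {average / 60:.1f} min")
# worst case: 13.3 min, average: 6.7 min
```

In other words, once the lockout and auto-erase are bypassed, a four-digit passcode falls in minutes, which is exactly why the FBI wants the special firmware.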
Assuming Apple can trump (I had to do it) the security check, why would it refuse to cooperate? It’s for an indisputably good cause and it’s just for this one phone, you know?
The problem is one of consequences, of what comes next on this road paved with good intentions. If Apple gives in for just this one phone, there will be requests for more…and more. Of course, they’ll all be in the interest of solving the most abominable crimes, or for thwarting credible threats. How can anyone be against fighting terrorists, drug dealers, pedophiles…?
Further down the road, other countries see that Apple has cried uncle and demand the same access. Will these foreign demands be as “enlightened” as those from our (not so) trusted US agencies?
Make no mistake, what we have here is, by any measure, a backdoor.
We know now, from a secret memo unveiled by Bloomberg, that the US government has a broad strategy for cracking phones:
“In a secret meeting convened by the White House around Thanksgiving, senior national security officials ordered agencies across the US government to find ways to counter encryption software and gain access to the most heavily protected user data on the most secure consumer devices, including Apple Inc.’s iPhone, the marquee product of one of America’s most valuable companies, according to two people familiar with the decision.”
If true, this is more than forcing Apple to “jailbreak” some iPhones, or forcing Google to do the same for some Android devices. This is our government playing with fire. Note that the Bloomberg piece doesn’t refer to legislation but to technical means to pierce existing encryption:
“National Security Council spokesman Mark Stroh declined to comment on the memo. But he provided a statement from a senior Obama administration official: ‘We should not preemptively conclude that technical and policy options to address this challenge are out of reach. While creating mechanisms for accessing encrypted information does create vulnerabilities, there may be technical and process steps that can be implemented to limit such risks.’”
We’re back to the old delusion: We’re the good guys, we’ll find a way to break encryption–but only for us. Criminals and enemies of our country won’t have access to such tools.
Authoritarian dreamers have to let go of their dangerous fixations on imaginary solutions. Backdoors For Good Guys Only is pliable material for chest-thumping politicians, but legitimate leaders must have the courage to tell the plain truth: It won’t work, it will only hurt good people.
As I attempted to explain in an earlier Monday Note titled “Let’s Outlaw Math,” savvy criminals will use encryption for which the government has no backdoor. Anyone (“with a command of Linux”) can download and use open-source encryption software that’s unbreakable and customizable. Those who truly have something to hide can obscure their communications using steganographic techniques.
If you think, correctly, that I lack the cryptography credentials to support this view, let’s turn to General Michael Hayden, the former CIA and NSA director who unequivocally states that backdoors are a bad idea:
“America is more secure with unbreakable, end-to-end encryption. It’s a slam-dunk if I widen the field of view to the broad health of the United States.”
There is more.
When the consumer version of the Internet of Things (IoT) is broadly deployed on smart connected objects everywhere, we’ll see an astounding flow of private data. Naturally, conscientious suppliers will protect these most intimate details of our daily lives with encryption…but government busybodies will want to know if we’re following the law as we fish, swim, eat, smoke, multiply, and die.
If you think that’s going too far, that I’m being paranoid, pause to consider the rich history of laws that have banned “unnatural” private acts between consenting adults, acts that, in other countries, have long been considered “no one’s business but your own”.
We can also turn to the sorry consequences of anti-terrorist provisions of the Patriot Act that are used to invade citizens’ privacy in cases that have nothing to do with terrorism. In an article in the Washington Post, Radley Balko writes:
“When critics point out the ways a new law might be abused, supporters of the law often accuse those critics of being cynical — they say we should have more faith in the judgment and propriety of public officials. Always assume that when a law grants new powers to the government, that law will be interpreted in the vaguest, most expansive, most pro-government manner imaginable.”
Let’s not forget the revelations from WikiLeaks, or Edward Snowden’s exposures of CIA and NSA practices. Keep in mind that a breach of the US Office of Personnel Management compromised the data of 18 million people. Breaches and leaks have happened and will happen again. Entrusting a government agency with a set of backdoors keys will inevitably lead to bad outcomes.
Furthermore, consider financial-system advances, such as bitcoin, that need unbreakable encryption to work. These systems will wither if backdoors allow well-intentioned guardians of the peace and criminals alike to peek and poke. How can any company that relies on security expect to export compromised technology?
Who do these government officials plotting to break encryption technologies think they work for?
It now transpires that the fat fingers of unnamed government personnel changed the iCloud password on the iPhone 5c in their possession. Had they not done that, the iPhone might have automatically backed itself up to iCloud, as it previously had, thus yielding the data the FBI is looking for. Apple has shared previous iCloud backups with investigators.
In what could be seen as a sign of nervousness, the Department of Justice declared Apple’s stance on security and privacy a “Marketing Strategy.” Thus, instead of arguing points of law, the DOJ has resorted to impugning motive. Well, yes, better security and privacy are valid selling points for Apple, just as are the company’s positions on social and environmental issues. Doing well and doing good don’t have to be mutually exclusive.
I think the DOJ is playing with fire. Does it want this fight to evolve into legislation that would force all device makers (phones, IoT, cars) to include backdoors, with dire consequences for individuals everywhere–and for US companies?
Does the DOJ want to make Tim Cook a martyr by holding him in contempt and dragging him to jail? I seriously doubt it will go that far…imagine the images–and the marketing benefits.
I just hope for yet another set of talks and hearings that will end, as on previous occasions, in a non-decision.
An addendum for numbers geeks:
According to calculations by The Intercept, a six-digit passcode (a million combinations) can be cracked in 22 hours at most, 11 on average. The cracking time escalates quickly with longer passcodes:
“Seven-digit passcodes will take up to 9.2 days, and on average 4.6 days, to crack
eight-digit passcodes will take up to three months, and on average 46 days, to crack
nine-digit passcodes will take up to 2.5 years, and on average 1.2 years, to crack
10-digit passcodes will take up to 25 years, and on average 12.6 years, to crack
11-digit passcodes will take up to 253 years, and on average 127 years, to crack
and so on…”
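The Intercept’s figures are consistent with one assumption: roughly 80 milliseconds per guess, the key-derivation time Apple cites in its iOS Security Guide. A few lines of arithmetic reproduce the table:

```python
SECONDS_PER_GUESS = 0.08  # ~80 ms per key derivation (Apple's iOS Security Guide)

def crack_hours(digits):
    """Worst-case hours to try every all-numeric passcode of this length."""
    return (10 ** digits) * SECONDS_PER_GUESS / 3600

for d in range(6, 12):
    hours = crack_hours(d)
    print(f"{d} digits: up to {hours:,.0f} hours ({hours / 24:,.1f} days)")
```

Six digits comes out to about 22 hours worst case (11 on average), ten digits to about 25 years, eleven to about 253 years, matching the quoted numbers.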
If you want to increase security on your phone—or any account—use a longer password. The Intercept helpfully suggests an easy way to remember an 11-digit number: Start with a phone number that’s already 10 digits, avoiding numbers that are within your social circle; the Los Angeles FBI office would work: 310-477-6565. Add a number “to taste,” as they say in cookbooks.
I’ll use phone numbers from my childhood in France; they include alpha characters for enhanced cracking fun.
This post originally appeared at Monday Note.