We could argue endlessly over the legal, political and technical fine points of the FBI getting a court order requiring Apple to assist it in cracking open a locked iPhone 5c. That’s not really the point.
True, Apple has cheerfully helped the government look into customers’ data before. No, the FBI isn’t asking for an iPhone backdoor — this time. All Magistrate Judge Sheri Pym of the U.S. District Court for the Central District of California really wants is for Apple to provide a one-off, signed iOS image that will enable the FBI to try different iOS 9 passcodes quickly without triggering the iPhone’s auto-erasure feature after 10 failed attempts.
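To see why those protections are the whole ballgame, consider some back-of-the-envelope math. The roughly 80 milliseconds per guess comes from the iPhone’s hardware key derivation, a figure Apple has published; the passcode lengths and the assumption that delays and auto-erase are disabled are my illustrative simplifications, not a claim about this particular phone.

```python
# Illustrative estimate only: how fast a numeric passcode falls once the
# retry delays and 10-attempt auto-erase are out of the way. The ~80 ms
# per-guess cost is the iPhone's hardware key-derivation time.

def brute_force_seconds(digits: int, per_guess_s: float = 0.08) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    return (10 ** digits) * per_guess_s

for digits in (4, 6):
    secs = brute_force_seconds(digits)
    print(f"{digits}-digit passcode: {secs / 3600:.1f} hours worst case")
```

A four-digit passcode falls in minutes; even six digits is a matter of hours. With auto-erase on, the attacker gets 10 guesses, period, which is exactly why the FBI wants it switched off.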
Some people think Apple shouldn’t do this. Nonsense!
As Dan Guido, CEO of Trail of Bits, an information security startup, points out, “As many jail-breakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own — the FBI does not have the secret keys that Apple uses to sign firmware.”
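The gate Guido describes can be sketched in a few lines. A caveat up front: real iOS firmware is verified with asymmetric cryptography against a key baked into the boot ROM, so only Apple’s private key can produce a valid signature. This toy model uses HMAC-SHA256, a symmetric stand-in, purely to show the accept/reject logic; every name in it is illustrative, not Apple’s.

```python
# Toy model of a firmware signature gate. Real iOS uses asymmetric signing
# (Apple's private key signs; the device verifies with a built-in public key);
# HMAC here is a simplified symmetric stand-in. All names are hypothetical.
import hashlib
import hmac

APPLE_SIGNING_KEY = b"apple-private-key"  # hypothetical; never leaves Apple

def sign_firmware(image: bytes, key: bytes) -> bytes:
    """Produce a signature over a firmware image with the given key."""
    return hmac.new(key, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    """The boot-time check: load the image only if the signature verifies."""
    expected = hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

# Apple-signed firmware passes the check:
official = b"stock iOS firmware image"
assert device_accepts(official, sign_firmware(official, APPLE_SIGNING_KEY))

# The FBI, lacking Apple's key, cannot forge a signature the device accepts:
modified = b"iOS with retry limits removed"
forged = sign_firmware(modified, b"not-apples-key")
assert not device_accepts(modified, forged)
```

That last assertion is the entire reason the court order exists: without Apple’s signing key, a modified image is a brick, no matter how clever the modification.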
Give me those keys and I could crack your iPhone! And, while I may know more tech than the average bear — or the average IT pro for that matter — I am far from being a top-notch programmer.
But giving the FBI those keys is still a lousy idea for reasons that have nothing to do with the law or the politics of the issue.
Because, as Rich Mogull, a security analyst at Securosis, pointed out, the real issue is whether companies should be “required to build security circumvention technologies to expose their own customers.” Or, as I put it: Should companies be required to put back doors in their software?
To that question, I answer: “Hell no.”
My reasoning for this position is very simple. First, if a business or a government can crack open our records for a good reason, how long will it be before they can do it for a bad one? Answer? No time at all.
And it’s not just a slippery slope because other government agencies could (will?) get their noses into our private business. It’s a slippery slope because, once there’s a back door of any sort, it’s only a matter of time before it’s misused by hackers.
Glance at Computerworld’s pages. Almost every day there’s a serious security breach of software that was designed to be as safe as possible. Now, take that same program and put in a deliberate weakness, a designed keyhole for a software lock picker, and ask yourself how long that keyhole stays secret.
Besides, why would you think the government can be trusted to keep secrets? My security clearance secrets, circa 1985, were revealed in the Office of Personnel Management (OPM) hacks. On The X-Files, the government covered up the Roswell UFO crash and implanted alien DNA in U.S. citizens. In the real world, it’s nothing like that competent.
Do you get my point? Even if the official decryption key is, by some miracle, kept secret and only used for good, transparent reasons — say, as in this case, the terrorist attack on the Inland Regional Center in San Bernardino, Calif. — there is no reason whatsoever to think that these built-in security holes won’t be used by criminals.
My view is that, while in this specific case the FBI has a compelling reason to want Apple’s help in breaking into iOS, it is, as my friend David Gewirtz, the director of the U.S. Strategic Perspective Institute, put it, a “dangerous and far-reaching precedent.”
Amen. Let’s not go any further down this road. Ultimately, it will only lead to even worse troubles in the future. So, Apple, I hope you win. I’m not at all sure you will, but take it all the way to the Supreme Court, if you must. This issue is too important — for all of us — for you to surrender meekly.