Although the FBI’s attempt to force Apple to backdoor an iPhone is over, the larger issues raised—and there are many—defy a straightforward resolution.
The iPhone 5c in question was used by the San Bernardino terrorist, Syed Rizwan Farook. He and his wife, Tashfeen Malik, were killed by police in a shootout after the couple’s terror attack on Farook’s co-workers.
I’ve been reading a lot of the analysis that has emerged in the ensuing weeks and months, and one thing is clear: a rather large and unsavory can of worms has been opened. At the core of the debate are some fundamental issues about privacy rights, government authority, and corporate responsibility—both to sovereign governments and to customers.
Not-so-compelling court order
The FBI did obtain a court order compelling Apple to unlock the iPhone 5c in question. Apple fought the order, and ultimately the FBI backed off, as it was reportedly able to unlock Farook’s iPhone by other means.
At the time, Apple CEO Tim Cook wrote on Apple’s website:
We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.
What should be made clear—and the transparency report Apple released last week does make it clear—is that the company cooperates with governments when it can. What it has refused to do is compromise the security it offers on its iPhones and other devices.
Microsoft gets involved
Now, in an unrelated but no less relevant case, Microsoft is suing the U.S. Department of Justice. It turns out the DOJ has sought information from Microsoft about customer data over 5,000 times in the last year, while insisting that the company not let its customers know about those requests.
With both the Apple and Microsoft cases, we are immersed in some very compelling questions of law.
We are also seeing these technology companies fighting desperately to maintain the trust of their customers. And it is in this area, I believe, where the government’s strong-arm tactics are misguided.
In Apple’s case, the government is effectively asking the company to compromise all of the data on all of its devices. Not just the data of terrorists and alleged criminals who happen to use those devices.
Under court order, Microsoft is providing the data. The company believes, however, that its customers have a right to know.
Seems clear to me under the Fourth Amendment. But I’m not a lawyer. Just a citizen who, if it were me, would want to know.
Say what you will about corporate greed. And let’s stipulate that Apple and Microsoft are at least somewhat motivated by business interests here. These and other technology companies are selling products, platforms, and entire ecosystems that are designed not only to hold customer data, but to secure it.
Compromise that, and watch customers head for the door.
Rights and stuff
There is the matter of the rights of U.S. citizens. First, do we have a right to privacy or not? I believe we do, and the Fourth Amendment seems to support this:
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
At the very least, the Fourth Amendment says that “the people” have a right to know “the persons or things to be seized.” I think data qualifies as a thing here.
Now, of course, it can be argued that the San Bernardino terrorist, Farook, committed a heinous act and forfeited any Fourth Amendment protection. That’s entirely fair.
However, if Apple were forced to create a backdoor to Farook’s encrypted phone, then the proverbial genie would be out of the bottle. Further government requests would follow. Apple products would no longer be trusted to be secure havens of personal data.
Second, is encryption legal in the U.S.? Yes it is. Encryption used to be categorized as a munition, and there are still regulations against selling encryption to “rogue states.” But, for the most part, encryption is out there, it’s legal, and it’s available to almost everyone.
Do companies, therefore, have a right to include encryption in their products? Yes they do. Apple is not alone in providing encryption. Google does in its Android operating system. Microsoft does. Facebook does. Many other companies do.
If companies can be compelled by government to maintain a back door to encrypted data, is encryption really worth anything?
Can companies outside of the U.S. build secure encryption into their products? Yes, and it’s unlikely that the U.S. government would have jurisdiction over those companies to compel them to break the encryption.
Can criminals and terrorists obtain secure encryption and use it for their nefarious purposes? Yes they can.
For all these reasons, I don’t think it’s ethical or wise for the U.S. government to compromise the security that U.S. companies promote in their products. Such actions could jeopardize the businesses of said companies, cause job losses, and hurt the economy.
Say no to capitulation
If Apple, Microsoft, and other companies capitulate to the U.S. government, then they’ll be expected to do likewise with China, Russia, Iran, and on and on.
Even if this particular effort fails, there will be further proposals from Congress. Or judges will make rulings that go one way in one jurisdiction, and another way in another jurisdiction.
In an era when our privacy and freedom appear to be largely eroding, it’s time for our government to build those rights back up, not continue to tear them down.