Privacy vs. Lawful Need to Know
This past week has seen a lot of heated discussion about the court order compelling Apple to assist the FBI with unlocking the iPhone used by the San Bernardino shooter Syed Rizwan Farook. Apple is fighting that order, opening the floodgates of opinion on the matter.
This case is a perfect storm -- it has it all:
Homeland terrorism resulting in tragic deaths of innocents
Law enforcement's legitimate need to investigate
Compelling third parties to serve the government
The very future of encryption
Close and keen observation by industry and foreign governments
Huge consequences for US and foreign companies, governments, and citizenry no matter how this case plays out.
I've been asked by a number of my clients what I think about this whole thing. As I've read various facts and opinions, I've noticed that many fail to cover some aspect or another of this case. Mine will probably fail on that point as well -- this is simply too big to coherently describe in a single article. To be sure, books will be written about this.
Like many controversial topics, there are legitimate ideas and concerns on all sides. There are no "good guys" or "bad guys" in this debate, just honest differences of position that deserve a full examination by experts from all involved quarters.
I'll try to deconstruct this case into logical components and offer my take.
First, some facts about this case and in general
The US magistrate's order is fairly narrow in scope. The order doesn't say that Apple must "crack the encryption" or provide any kind of "backdoor" in general. It narrowly states that Apple must assist the FBI in unlocking this particular phone (Farook's) -- and not necessarily by breaking the encryption (which is probably impossible anyway), but by disabling a couple of security features that prevent someone from even attempting to brute-force their way in. "Brute force" means trying all possible PINs or passwords until you find the right one.
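To give a sense of why those security features matter: without escalating delays or an auto-wipe limit, the PIN keyspace is small enough to exhaust quickly. Here's a back-of-the-envelope sketch in Python -- the 80 ms per-attempt cost is my illustrative assumption, not an official Apple figure:

```python
# Rough worst-case brute-force time for numeric PINs, assuming a
# fixed hardware-enforced cost per attempt. The 80 ms figure is an
# illustrative assumption, not a published specification.
SECONDS_PER_ATTEMPT = 0.08

for digits in (4, 6):
    keyspace = 10 ** digits          # all possible PINs of this length
    worst_case_s = keyspace * SECONDS_PER_ATTEMPT
    print(f"{digits}-digit PIN: {keyspace:,} combinations, "
          f"worst case ~{worst_case_s / 3600:.2f} hours")
```

The point: a 4-digit PIN falls in minutes once the retry limits are gone, which is exactly why the order targets those limits rather than the encryption itself.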
The court order is relying on the All Writs Act, passed in 1789, as the underlying justification for compelling Apple to toil on the government's behalf.
The government has been far less than candid with the American people about the level of surveillance it's been performing for many years.
Apple is fighting this order using legal remedies and processes available to everyone. I point that out as a rebuttal to Calif. Senator Dianne Feinstein's complaint that Apple is operating "above the law" and should stop its obstructionist behavior and follow the lawfully-issued order.
Encryption algorithms are either secure or they aren't. There is no middle ground.
It's not presently clear that Apple can do what the court order demands even if they wanted to. Apple has not said one way or the other at the time of this writing.
This isn't about Apple per se. It's about unbreakable security, which is supported by pretty much all technology companies.
My reasoned opinion
Apple has helped law enforcement in the past with private data requests but those efforts didn't require hacking activities of the kind being sought by the FBI in this case.
To the extent that Apple could assist the FBI without fear of even more bad precedent being set, it probably should. Indeed, as I said above, Apple has helped law enforcement access data from locked iPhones in the past. But it's not that simple. Without getting overly technical, suffice it to say that as the iPhone and its operating system (iOS) have improved over the years, the difficulty of obtaining private data and the level of hacking required to do so have increased exponentially. These increases aren't just technical; they reach into different areas of law and the rights a party has. These are the reasons Apple is fighting the order.
Whereas before Apple may only have needed to access backup data on its servers, today there are far more deliberately engineered roadblocks that make the data very difficult, if not impossible, for Apple to access.
Then there's the Snowden effect. We all remember Edward Snowden, whose explosive revelations in June of 2013 detailed the extent of government surveillance. Those revelations confirmed in black and white what some already suspected and others had no clue about: often warrantless, wholesale snooping on pretty much anyone, foreign and domestic, suspects or innocents. That has made many tech companies like Apple far more antsy about cooperating with the feds in investigations. It's also made tech companies develop ever more secure devices and protocols to protect against such spying. People are demanding secure devices, which is why device makers are building them.
Courts making law
Question: What's the definition of an "activist" court or judge?
Answer: One that rules in a way you don't agree with.
OK, that was supposed to be a bit funny, but there's a certain truth to it. We've all heard carping about "activist judges making law from the bench" from the losers of a big case or from interested parties that disagree with a ruling.
So how does a court "make law" when that power is clearly reserved for legislators? By establishing or furthering a precedent. Legislation is often vaguely worded (for all sorts of reasons we won't discuss here), so as courts interpret vague legislation and rule based on that interpretation, they create precedent. Then in future cases, lawyers may cite that precedent if it helps their case. Not all precedent is original or groundbreaking. Lots of precedent, probably most, is created incrementally. Today's incremental precedent-setting case, built on yesterday's precedent, makes tomorrow's case easier to win. That is, each incremental win moves the ball (precedent) a little further down the field.
That is what pundits mean when they express a fear of creating precedent. Precedent is important in shaping the outcome of court cases.
What Apple and others fear is that if Apple loses this case and is compelled to assist the FBI in this more complex hacking of an iPhone, the precedent created (or furthered along) may make the following easier in the future -- that is, harder to fight against:
Compelling a party (company or person) to perform "work" for the government.
And in technology, making it easier to promote and ultimately force technology companies to include decryption backdoors in their products.
Point one: If I have some (perhaps rare or esoteric) skill or knowledge that the government needs to investigate a case, I cannot be compelled to provide "work" even if it was critical to the success of their case. The canonical example is compelling a locksmith by court order to pick a lock or crack a safe. The feds are not supposed to be able to do that. But they are trying to and sometimes succeeding through the All Writs Act.
Point two: Precedents can also help influence related, parallel efforts. In this case, the parallel effort is to ultimately compel technology companies to install backdoors into their secure products, making it easier for law enforcement to gain access (with a proper warrant, presumably). The FBI is insisting that this order does not constitute a backdoor. And they're mostly right on that point. But if Apple loses the case, then the precedent set could make it easier to promote backdoor legislation later on. This is the larger fear in the minds of security advocates. Legislators don't legislate in a vacuum. They are lobbied intensely and endlessly by all parties. Precedent that is set in a court of law can easily be used by a lobbyist to help further the cause when pushing for a bill to be drafted.
A New York federal magistrate ruled today that federal investigators cannot rely on the All Writs Act to compel Apple to unlock an iPhone. That case involves one Jun Feng, a meth dealer. The ruling is at odds with the Calif. ruling discussed in this article.
This new decision, too, sets a precedent and sets up an interesting split between the two cases. Clearly, the courts are divided on the use of the All Writs Act in this manner. If the split survives appeal in the respective appellate courts, a Supreme Court hearing is almost a certainty given the importance of the topic.
A Red Herring? Farook's Phone
No one really knows what's on Farook's iPhone, which is putatively why the FBI wants to investigate. But consider a few facts here:
Farook destroyed two other personal cell phones. Police found them smashed in a trash bin. The FBI hopes to extract data but that is far from certain.
Farook removed the hard drive and motherboard from his computer and reportedly disposed of them in a nearby lake. Divers have searched the lake but police so far have not revealed the results of that search.
The iPhone in question didn't belong to Farook; it belonged to his employer, the San Bernardino Health Dept. Given that Farook had the presence of mind to destroy and dispose of his other data-containing devices, it stands to reason that he knew his employer could monitor the phone and, hence, did nothing incriminating on it -- or, if he did, that he would have destroyed the iPhone as well.
The San Bernardino Chief of Police concedes "there is a reasonably good chance that there is nothing of any value on the phone."
But even in light of all that, I still fully grant that those aren't reasons to disregard the iPhone lead. The iPhone is a piece of evidence involved in a crime, so the FBI is correct to pursue it. My concern is that the FBI is aggressively pursuing a lead that is in all likelihood a dead end. I hope I'm wrong.
Fear in the Hearts of Men
So why are the feds pursuing this so aggressively?
Just as the Republicans easily passed the Patriot Act in the panic-filled and heart-wrenching days following 9/11, this too is a perfect case for the government to win maximum popular support for expanded investigative powers. The Patriot Act never would have passed were it proposed for more mundane reasons like drug investigations and the like. Even horrific cases involving kidnapping, rape, child exploitation, and other such crimes probably wouldn't have been enough, either.
But homeland terrorism is the nuclear warhead of panic-inducing angst in our citizenry, and with good reason. It strikes us in the heart and soul. That religious radicals from centuries-old conflicts in the war-torn Middle East spread their terrorism to our land is upsetting, to put it mildly. These are the moments when government is able to seize new powers without a lot of pesky protest or blowback from We The People.
The Patriot Act was a disservice to American liberty. Let us not pass even more ill-advised legislation by using such an emotionally-charged event to manipulate the public into acquiescence. Once laws are passed, they are notoriously difficult to repeal even after the unintended consequences become apparent. The Patriot Act is a prime example of that.
The NYC PD has an estimated 150 smartphones in its evidence lockers that it cannot access. The number of smartphones in police evidence lockers nationwide probably runs into the thousands. Most of these phones were likely seized during arrests for relatively low-level offenses, mostly drug-related I expect -- the sort of cases that don't rise to the level of importance that would justify new legislation compelling companies to include backdoors. But they, and other similarly mundane cases, would certainly benefit. That benefit is the hidden force pushing forward bills that, in public discussion, are only about thwarting terrorism.
The Snowden Effect
One cannot overstate the Snowden effect in all this. Snowden revealed a stunning array of surveillance overreach by our government. The federal government really can't claim the high road in the Apple fight because of its history of abusive and secretive surveillance. One can convincingly argue that stronger device encryption exists today precisely because of that overreach. We really do want to trust our government. But the secrecy surrounding the FISA Court, national security letters, warrantless wiretapping, Stingray, and all the rest have contributed to this latest fight.
The Backdoor Itself
People and organizations that are against an encryption "backdoor" aren't against the government having investigative powers. Even if the government could be trusted to do only what it publicly says it'll do (which the Snowden revelations pretty much put to bed), backdoors are still an incredibly bad idea. Why is that?
Security is already frightfully difficult to get right. Hardly a month goes by without news of the next major data breach, malware outbreak, or other threat to our private data. The list is long: Equifax, Anthem, eBay, JP Morgan, Tricare, Citibank, Heartland (a huge credit card processor), Home Depot, Target, Adobe, Ashley Madison (that one made a lot of people very nervous!), medical records, financial records, government breaches, Yahoo (more than once), and many more.
A backdoor is an avenue to break encryption without knowing the password -- a master key of sorts that fits a deliberate weakness in the encryption used to protect your data. And make no mistake: a backdoor is a vulnerability. Dedicated hackers will find a way in. It's not a matter of if, but when.
Imagine you're an attorney and your laptop is filled with privileged client information. Imagine that laptop was stolen at the airport, from your car, from a hotel, or even from your office in broad daylight (it happens). How panicked would you be? Don't rely on that login password, either. That's easy to bypass; I can do it in five minutes. So maybe your laptop doesn't have client data on it and you think you're safe? Think again. Your browser probably has your email password saved. That's all I need to own you.
There are ways to guard against a data breach of this nature by using full disk encryption. In Windows Pro editions this is called BitLocker. On a Mac it's called FileVault. But what if that otherwise unbreakable BitLocker or FileVault protection had a federally-mandated backdoor? How secure would you feel then?
Do you trust the government not to lose the master key? The same government where the OPM (Office of Personnel Management) lost millions of personnel records containing incredibly sensitive personal information used to grant security clearances? The same government where the director of the CIA(!) had his personal AOL(!!) email account hacked? There's no indication that the director had any work-related content on that account, but still, it looks pretty sad.
Same goes for your mobile phone. Millions of phones are stolen every year in the US. How secure would those owners be if a lost or stolen phone could be accessed because of a backdoor?
Given how difficult security already is, do we want to mandate a systemic weakening of security that would affect literally the entire internet?
Targeting the Wrong People
All this talk of compelling technology companies to implement backdoors is really for naught anyway. There already exist many end-to-end-encryption apps designed by companies beyond the reach of the US government. e.g. ISIS uses an app called Telegram (which also has a ton of legitimate users) that provides end-to-end encryption secured with a user-generated encryption key.
Even if some of those foreign countries passed their own backdoor legislation, it would be for naught. Terrorist groups have the resources to easily bypass such regulated apps. They could even write their own, or pay someone to do so if need be. It's not that difficult.
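To illustrate just how little is required, here is a toy end-to-end scheme using only Python's standard library: a key stretched from a user-chosen passphrase, and an HMAC-based keystream. Everything here (function names, iteration count, the cipher construction itself) is my own sketch, and it is deliberately a toy -- a real app would use vetted, authenticated primitives such as AES-GCM, not a hand-rolled stream cipher:

```python
import hashlib
import hmac
import secrets

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Stretch a user-chosen passphrase into a 256-bit key.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # HMAC-in-counter-mode keystream -- a toy, not a vetted cipher.
    blocks, counter = [], 0
    while sum(len(b) for b in blocks) < length:
        blocks.append(hmac.new(key, nonce + counter.to_bytes(8, "big"),
                               hashlib.sha256).digest())
        counter += 1
    return b"".join(blocks)[:length]

def encrypt(passphrase: str, plaintext: bytes):
    salt, nonce = secrets.token_bytes(16), secrets.token_bytes(16)
    key = derive_key(passphrase, salt)
    ks = keystream(key, nonce, len(plaintext))
    return salt, nonce, bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(passphrase: str, salt: bytes, nonce: bytes, ciphertext: bytes):
    key = derive_key(passphrase, salt)
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))

# The key lives only at the endpoints; no vendor holds anything extractable.
salt, nonce, ct = encrypt("correct horse battery", b"meet at noon")
assert decrypt("correct horse battery", salt, nonce, ct) == b"meet at noon"
assert decrypt("wrong guess", salt, nonce, ct) != b"meet at noon"
```

Notice that the key exists only on the endpoints. No vendor holds anything a court order could extract, which is precisely the property that makes backdoor mandates on mainstream vendors ineffective against determined adversaries.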
Much has been said that foreign governments, perhaps those with a poor record of human rights, don't have to wait around for the US before demanding backdoors of their own. e.g. China or Russia could pass such legislation any time they so desire, regardless of what the US does.
It's certainly true that US law and court rulings carry no legal authority beyond our shores, but they do carry influence. A US government seen as protecting privacy rights and promoting data security by not mandating backdoors can meaningfully and with a straight face influence foreign powers that may want such backdoors. To the extent the US wields such influence, we'd certainly piss it away if we required those backdoors. Foreign governments do look at that sort of thing.
Dissidents of despotic regimes, journalists, and NGOs operating in those countries also have plenty to fear. Secure communications may well be the only thing keeping these people safe as they communicate and report.
Even hermetic North Korea, virtually cut off from the ideas and advancements of the free world, has managed to produce nuclear weapons and missile systems. And it is already suspected in at least one high-profile cyber-attack on US interests.
The loudest voices clamoring for weakened data security, e.g. backdoors, either simply do not understand the consequences of doing so or believe their need for those backdoors trumps the importance of data security.
Security is hard. You cannot hobble encryption and still have data security.