Apple's Open Letter About Encryption

Subject: Mobile
Manufacturer: Apple

It's Easier to Be Convincing than Correct

This is a difficult topic to discuss. Some perspectives assume that law enforcement has terrible, Orwellian intentions. Meanwhile, law enforcement officials, with genuinely good intentions, don't understand that the road to Hell is paved with those. Bad things are much more likely to happen when human flaws are justified away, which is easy to do when your job is preventing mass death and destruction. Human beings tend, without realizing it, to use large pools of evidence to validate assumptions rather than to discover truth.

Ever notice how essays can always find sources, regardless of thesis? With increasing amounts of data, you are progressively more likely to make a convincing argument, but not necessarily a more truthful one. Mix in good intentions, which promote complacency, and mistakes can happen.

But this is about Apple. Recently, the FBI demanded that Apple create a version of iOS that can be broken into by law enforcement. Apple frequently uses the term “back door,” while the government prefers other terminology. Really, words are words; the only thing that matters is what they describe -- and what they describe is a mechanism to compromise the device's security in some way.

This introduces several problems.

The common line that I hear is, “I don't care, because I have nothing to hide.” Well... that's wrong in a few ways. First, having nothing to hide is irrelevant if the person who wants access to your data assumes that you have something to hide, and is looking for evidence to convince themselves that they're right. Second, you need to consider everyone who wants access to this data. The FBI will not be the only agency demanding a back door, nor will the United States be the only country. There are a whole lot of nations that trust individuals, including their own citizens, less than the United States does. You can expect that each of them would request a back door of their own.

You can also expect each of them, and organized criminals, to want to break into each other's.

Lastly, we've been here before, and what it comes down to is criminalizing math. Encryption is just a mathematical process that is easy to perform but hard to invert. It all started because it is easy to multiply two numbers together but hard to factor the product. The simplest method we have is dividing by every possible number smaller than the square root of said product. If the two factors are prime, then you are stuck searching for one number out of all those possibilities (the other prime will be greater than the square root). In the 90s, encryption using keys over a certain size was legally classified as a munition for export purposes. That may sound ridiculous, and there would be good reason for that feeling. Either way, the rules changed; as a result, online banks and retailers thrived.
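As a rough illustration (nothing like a real cryptosystem, which uses vastly larger numbers and heavily optimized algorithms), a few lines of Python show the asymmetry between multiplying and factoring:

```python
from math import isqrt

def trial_factor(n: int):
    """Return the smallest nontrivial factor of n, or None if n is prime.

    Checks every candidate up to sqrt(n) -- for a product of two large
    primes, that is essentially the entire search space.
    """
    for d in range(2, isqrt(n) + 1):
        if n % d == 0:
            return d
    return None

p, q = 10007, 10009
n = p * q                   # multiplying: instant
print(trial_factor(n))      # factoring: walks ~10,000 candidates -> 10007
```

Even with toy five-digit primes, undoing the multiplication takes thousands of times more work than performing it; each extra digit in the primes multiplies that gap.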

Apple closes their letter with the following statement:

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Good intentions lead to complacency, which is where the road to (metaphorical) Hell starts.

February 17, 2016 | 03:59 PM - Posted by Anonymous (not verified)

Do we work for the government or does the government work for us?

February 17, 2016 | 04:18 PM - Posted by Scott Michaud

Ideally, there's a balance between The People, The Government, and The Market. Bad times happen when one of the three gains too much power.

February 17, 2016 | 04:50 PM - Posted by Anonymous (not verified)

More like, There should be balance between The People and The Government (Don't put Market in the same level). Bad things happen when Market or Government gains more power than the People.

February 17, 2016 | 05:26 PM - Posted by Scott Michaud

You don't really want individual people to be above the law.

February 19, 2016 | 09:35 AM - Posted by BBMan (not verified)

The presumption here is that the people are bad in the first place and the constitution was written for bad people. If that's the case then government and everything else- which is hopefully made of people- is inherently bad as well.

So when you get bad people monitoring bad people- wtf did you expect?

If the government wouldn't setup moral hazards and situations where dys-interpretations occur with workarounds in the first place we wouldn't be here. Every fukking good engineer understands 1 key rule: KISS- Keep it simple stupid. The Constitution- even with the bill of rights- is shorter than almost all legislation our FU'd government passes today. So how the hell does anyone comply?

Rule #2: They can't.

February 17, 2016 | 04:28 PM - Posted by djotter

While there are some legitimate concerns in their letter, this sentence is just scaremongering "The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge."

February 17, 2016 | 04:51 PM - Posted by Anonymous (not verified)

It's scary because it's true. The government has sought to intercept many encrypted online services before, especially Apple's iMessage. Health records (steps, heart rate, etc.) and other data are backed up online via iCloud if set up to do so. If the FBI establishes a close enough legal precedent, it could wreak havoc on privacy.

February 18, 2016 | 12:56 AM - Posted by brucek2

Did none of the Snowden documents convince you that there is a very real risk of the government overusing a surveillance power?

It's not just those though. For another example, think of how the extraordinary powers given to the government for extreme cases such as child kidnapping, organized crime, and drug kingpins start being used for much less dangerous ordinary situations over time.

We'd all be better off if they didn't keep doing that, and left us in a position of being comfortable with more trust, but we are where we are.

Personally I'd bet that all of those ideas that you are calling scaremongering are all actual requests that Apple has received from US and/or other governments in the past, but is prevented from saying so.

February 17, 2016 | 05:43 PM - Posted by thezfunk

Great editorial, Scott. This is exactly what I try to get through people's heads. Having done nothing wrong, or feeling like you have done nothing wrong, does not mean that someone else with carte blanche to check whether you have done nothing wrong won't find something.

This has the potential to set a scary precedent, because whatever Apple comes up with to get into this one iPhone can be used to get into any iPhone. The FBI is using this as a beachhead to get into any electronic device it can't already access.

I hate Apple products but good on them to actually attempt to protect their customers. I just hope it is for real and there aren't secret behind-the-scenes dealings between Apple and the FBI on this.

February 17, 2016 | 05:50 PM - Posted by Scott Michaud

Well, in this case, part of the problem is how terrible Apple's encryption is. It funnels through a 4-6 digit, purely numeric key. My concern is that this gives people false assumptions about how easy encryption is for the magical Silicon Valley to break.

February 17, 2016 | 09:20 PM - Posted by Anonymous (not verified)

Do you have some evidence of this? Because it sounds like you're confusing the 4-6 digit PIN required to unlock the device with the encryption key, which are two different things.

February 17, 2016 | 10:23 PM - Posted by Scott Michaud

When the phone is unlocked, it automatically decrypts the drive. This means that you just need to unlock the device to access the contents. It does prevent attackers from performing offline attacks outside of iOS, though, because that PIN is strengthened with a per-device UID.

That said, apparently you can lock modern Apple devices with alphanumeric passcodes, which would break the attack that the court ordered Apple to perform on this device.
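To make the "strengthened with a per-device UID" idea concrete, here is a minimal sketch in Python. This is an illustrative construction, not Apple's actual one -- the names `device_uid` and `derive_key` are hypothetical, and real devices entangle the passcode inside dedicated hardware rather than in software:

```python
import hashlib
import os

# Hypothetical sketch -- NOT Apple's actual construction. The idea: the
# passcode alone is never the key; it is entangled with a secret fused
# into the device, so every guess must be made on that device.
device_uid = os.urandom(32)  # stands in for the per-device hardware secret

def derive_key(passcode: str) -> bytes:
    # A deliberately slow key-derivation function makes each guess expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

key = derive_key("1234")
assert key == derive_key("1234")   # same passcode, same device: same key
assert key != derive_key("0000")   # different passcode: different key
```

Offline, without `device_uid`, even enumerating the full 10,000-guess PIN space gives an attacker nothing to test against the ciphertext -- which is why the FBI needs attempts to run on the phone itself.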

February 18, 2016 | 02:18 AM - Posted by Richard1101 (not verified)

After 10 failed passcode attempts, iOS devices can be set to erase themselves.

February 18, 2016 | 03:04 AM - Posted by Scott Michaud

That's why the FBI is telling Apple to create a version of iOS that doesn't have that security mechanism.

February 17, 2016 | 06:01 PM - Posted by Anonymous (not verified)

Looks more like a Big Brother test case :

If Apple's encryption is weak then it should already have been hacked shouldn't it?

If Apple's encryption is strong, then it will take even Apple months or years to break it, depending on the speed of the phone itself.

February 17, 2016 | 06:52 PM - Posted by Scott Michaud

From what I understand, the internal encryption is strong. It's gated with a 4-6 digit, numeric key, though. The thing is, too many attempts nukes the phone. They want to safely automate the process.

February 17, 2016 | 07:54 PM - Posted by Anonymous (not verified)

Well, then the encryption is effective. It doesn't need a 1024-bit key to be effective. The odds of getting a 4-6 digit keycode right in 10 tries with no guidance are low enough for me.

February 17, 2016 | 10:28 PM - Posted by Scott Michaud

Which is where Apple's open letter comes in. The FBI is demanding that Apple patch a specific phone's OS to remove the 10-try restriction (as well as the rate limiter when tethered to a PC).

Millions of combinations per second will brute-force a 6-digit, numeric key pretty quickly.

As I stated in another comment above, though, you apparently can force alphanumeric passwords, at least on new devices. If so, then we're back in "age of universe" land.
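The arithmetic behind that "age of universe" distinction is simple. Assuming a rate of one million guesses per second (a number chosen purely for illustration):

```python
# Back-of-envelope comparison of keyspaces at an assumed guess rate.
rate = 1_000_000                 # guesses per second (assumption)
year = 3600 * 24 * 365           # seconds per year

numeric_6 = 10 ** 6              # 6-digit numeric PIN
alnum_10 = 62 ** 10              # 10-char [A-Za-z0-9] passcode

print(numeric_6 / rate)          # 1.0 -> exhausted in one second
print(alnum_10 / rate / year)    # tens of thousands of years
```

Ten characters of mixed case and digits already pushes past twenty-six thousand years at this rate; longer passphrases, plus the per-guess delays hardware can impose, are what push the numbers into cosmological territory.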

February 17, 2016 | 11:17 PM - Posted by Luthair

Encryption is only as strong as its key. This is, what, a roughly 13-bit passphrase? That wouldn't have been considered secure 30 years ago.

February 17, 2016 | 06:03 PM - Posted by Lance Ripplinger (not verified)

Something to point out to people who say "I don't care because I have nothing to hide": the "government" can take and twist whatever they find out about you, and use it against you. While your interactions with people in your daily life are innocent, normal human behavior, they can be used against you regardless.

"Government is not reason; it is not eloquence; it is force! Like fire, it is a dangerous servant and a fearful master." - George Washington

February 17, 2016 | 06:54 PM - Posted by Scott Michaud

It's not so much "twist whatever they find out about you, and use it against you."
They could fully believe in the twisted version of reality, and still come to the same, bad results for you.

February 18, 2016 | 01:07 AM - Posted by brucek2

To me the bigger point is it's not about them in the first place.

I don't object to the NSA having the communications of everyone in the country because I'm concerned about my own boring emails (I don't read them all; I sure hope they don't have to).

I object because that "everyone" includes all the senators, representatives, judges, journalists, CEOs, and anyone else who is supposed to be able to help maintain the balance of powers in our government.

The official in charge of the NSA should not have unfettered access to every communication of the people who supposedly oversee him, but he does. That's a problem even assuming all those people are fully legitimate with only ordinary privacy concerns. The fact that in all likelihood at least some have secrets that expose them to blackmail control makes it even worse.

February 18, 2016 | 02:33 PM - Posted by thezfunk

brucek2, I feel like your point is the dirty secret/elephant in the room in this whole privacy debate.

These intelligence agencies want everything. They don't feel like there should be a corner of the internet or an electronic device that they can't penetrate and keep record of. How are our political leaders (let's assume for a second that they actually represent us; a Princeton University study argues that they don't, but let's assume for this discussion that they do) supposed to oversee these organizations if these organizations hold power over the overseers?

Intelligence organizations have an enormous amount of power that they don't want to lose. If they have everyone's skeletons at their disposal, do you really think they won't use them to protect themselves?

It has been said that a fear of speaking freely is a first step to losing the right to free speech. The 1st amendment isn't there to protect speech that is popular, it is there to protect the speech that is unpopular.

February 17, 2016 | 06:51 PM - Posted by Anonymous (not verified)

I would never trust the FBI or any other government intelligence agency to respect an individual's right to privacy! There is too much past history of such broad and unchecked powers, vested in the hands of these agencies, being abused! Apple is the maker of the hardware, firmware, and OS on its phone platforms, and certainly Apple has the knowledge and the source code to very easily override any of iOS's security functionality. Apple even made/designed the fully custom CPU/SOC in the phone, so Apple has the means -- through the modifiable microcode loaded into the CPU's cores at start-up -- to turn off any security features of its SOC SKUs and override any security measures for its A-series CPU cores. That is Apple's IP and its bread and butter, and for sure Apple is not going to turn that very IP over to the FBI to use on a single device! And every SOC maker has its own proprietary back door into that company's specific IP (SOC/other processor) systems and the associated firmware and microcode for its CPU/GPU/other processor cores.

Let the FBI/other agencies get a proper court order, with appeal rights and judicial review, that allows Apple to protect its IP -- IP that, if it were ever released to anyone, would cause Apple great financial harm! Let the FBI make a copy of the encrypted drive and send it to Apple; let Apple decrypt the drive and send the contents back to the court; and the court can give the unencrypted info to the FBI. But never should the FBI be allowed at any of Apple's private and essential IP. It should be done under the court's supervision, with the FBI getting the data from the court and Apple never having to share its methods for getting around any security/privacy settings! If there is no other way but brute force to decrypt the drive, then give it to Apple and let them at least copy the still-encrypted contents off of the phone's SSD onto a standard storage medium, and let the FBI have at the contents on that standard drive to brute-force decrypt. But Apple should have to share NONE of its proprietary methods with the FBI or any government agency without both houses of Congress, the executive branch, and the high courts getting involved.

February 18, 2016 | 01:36 PM - Posted by thezfunk

This does seem to make the most sense if you are looking for a compromise. However, Apple must also weigh the cost of even this option. What will current or potential customers think about Apple being willing to do even this much?

Any admission that this is possible is also an admission that their encryption and protection is not absolute. That might seem obvious to you and me, but how will it affect their business? Keep in mind, corporate America is not interested in protecting you and me; they are interested in protecting their profits above all else.

February 17, 2016 | 08:03 PM - Posted by CJ (not verified)

The fact that some are trying to frame this as a privacy/security issue is beyond ridiculous. It is well known that Apple has the ability to disable certain security features that help protect iOS devices.

Apple isn't being asked to give the government a backdoor into its devices, they're being asked to deactivate a feature on the phone of a known terrorist for law enforcement in the investigation of a terrorist act. The government still wouldn't be able to decrypt other Apple devices unless Apple assisted.

Apple's stance is nothing more than posturing, and they will comply.

February 18, 2016 | 01:40 PM - Posted by eagle63

" It is well known that Apple has the ability to disable certain security features that help protect iOS devices."

Citation needed here. (because I don't believe you for a second)

February 18, 2016 | 05:41 AM - Posted by Anonymous (not verified)

1984 is coming

February 18, 2016 | 10:55 AM - Posted by sean eskilsen (not verified)

First rule of the Cosa Nostra: do all business face to face; no deals are ever to be talked about on a PHONE. This is why I have no stupid f'ing PHONE.

February 19, 2016 | 12:53 PM - Posted by btdog

Let me be clear: I don't like Apple. I own one Apple product (the original iPod Touch), and I only own it because it was free. I dislike their software, but more so I dislike their business practices.

Apple stifles competition, mistreats its own suppliers, and shifts its profits offshore to reduce its taxes, all at the expense of customers who happily pay a 50% premium so they can be part of the "in" crowd. And Apple's legal team is vicious, ruthless, and one of the most dangerous and scariest forces in America, much less the tech industry.

Still, I support Apple's decision 100%. I realize the authorities want to do the right thing, but the government has proven it can't be trusted. I'm not willing to sacrifice my freedom. I have nothing to hide, but that doesn't mean I want my government (or anyone) to have access to everything I do.

Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.
-Benjamin Franklin

February 20, 2016 | 04:34 AM - Posted by Relayer (not verified)

If potential evidence were locked inside a safe (or your house, or anywhere else) and there was a court order to open it, could you or a company refuse? I don't really understand, whether right or wrong, why this is any different.

February 21, 2016 | 06:08 PM - Posted by brucek2

There's nothing new about opening a safe to which you have a combination. The legal rules about that have long since been settled.

What's new here is a situation where the combination to the "safe" is not known to any living party. The government is ordering a technology company to use its software development skills to create a new piece of safe cracking technology, something that had not formerly existed in the world.

Ordering a company to create a new technology or process is a much different deal and is not at all settled law. Since this case may end up partially settling it, it's getting so much attention.

February 23, 2016 | 10:36 AM - Posted by Anonymous (not verified)

Therein lies the rub... They don't want access to one "safe"; they want a modified version of iOS that can open everyone's "safe" at the same time. Can we say illegal search and seizure?

February 27, 2016 | 04:32 PM - Posted by Scott Michaud

Everyone who uses four- or six-digit PINs, at least. If you use a strong, alphanumeric passcode, then you will be immune to this by statistics (unless they demand more).

February 23, 2016 | 03:19 PM - Posted by david (not verified)

What if there's a chance to prevent a dirty bomb from going off by unlocking that phone? Jihadists everywhere must be applauding Apple's decision. As far as I'm concerned, someone will eventually figure out how to hack that phone anyhow. Until then, it's a philosophical debate what to do.

February 25, 2016 | 03:32 PM - Posted by Anonymous (not verified)

All this talk about precedent is very good. I expect that we do the same for all topics this coming election.
