Take it easy. Woof. So the FBI is asking Apple to change their encryption so it is crackable, give the FBI the key, plus either make flash memory that can't be erased but looks like it is when the phone is reset, or include a built-in backup that cannot be deleted? Am I close?
Not really. It's a VERY complex case, and what the FBI is asking for is not at all obvious to someone who doesn't know the complex innards of the iPhone.
So here's the scoop. Bear with me, this might get long and technically complicated.
When you secure your iPhone with a passcode, the phone's contents are encrypted. But--and it's really important to understand this for reasons I'll explain in a second--the passcode you type in is not the encryption key.
The phone generates a very long (256-bit) random encryption key. This key is not stored in the phone's flash storage or RAM; it's stored in a special, high-security, tamper-resistant chip called the Secure Enclave.
The Secure Enclave cannot be accessed directly by the phone's CPU, and it does not communicate with the rest of the phone over the normal data bus; instead, it talks to the CPU over a special, dedicated, encrypted link.
When you type your passcode, the passcode is sent to the Secure Enclave. The Secure Enclave looks at the passcode and returns "yes" or "no". If it is correct, the Secure Enclave uses the key stored in its own tamper-resistant memory to decrypt the phone. If it is not, the Secure Enclave does not decrypt the phone.
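The flow described above can be sketched as a toy model. To be clear, the real Secure Enclave is dedicated hardware, not Python; the class and method names here are invented for illustration, and the real key-derivation details are more involved.

```python
import hashlib
import hmac
import secrets

class ToySecureEnclave:
    """Illustrative stand-in for the enclave's check-and-release behavior."""

    def __init__(self, passcode: str):
        # A random 256-bit data-encryption key, generated once and kept
        # only inside the enclave's tamper-resistant memory.
        self._data_key = secrets.token_bytes(32)
        # The enclave never needs to store the passcode itself --
        # a salted, slow hash is enough to check it.
        self._salt = secrets.token_bytes(16)
        self._passcode_hash = self._hash(passcode)

    def _hash(self, passcode: str) -> bytes:
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                   self._salt, 100_000)

    def try_unlock(self, passcode: str):
        """Return the data key if the passcode is right, else None."""
        if hmac.compare_digest(self._hash(passcode), self._passcode_hash):
            return self._data_key   # phone can now decrypt its flash
        return None

enclave = ToySecureEnclave("1234")
assert enclave.try_unlock("0000") is None       # wrong: no key released
assert enclave.try_unlock("1234") is not None   # right: key released
```

The key point the model captures: the passcode is only a gate; the actual decryption key never leaves the enclave unless the gate opens.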
For this reason, you can't just read the contents of flash and put them in another phone, for example. The flash contents are encrypted using military-grade 256-bit AES; there is no known way to attack this encryption, and all the world's computers combined would take more than a billion years(!) to brute-force decrypt it.
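The "billion years" claim is easy to sanity-check with back-of-the-envelope arithmetic. The guess rate below (10^18 keys per second for all the world's computers combined) is an assumed, deliberately generous figure, not a measured one:

```python
# How long would an exhaustive search of the AES-256 keyspace take?
keyspace = 2 ** 256                    # possible 256-bit keys
guesses_per_second = 10 ** 18          # assumption: wildly optimistic rate
seconds_per_year = 60 * 60 * 24 * 365

years = keyspace / guesses_per_second / seconds_per_year
print(f"{years:.2e} years")            # on the order of 10**51 years
```

Even granting that absurd guess rate, the answer comes out around 10^51 years, which is "more than a billion years" by a factor of about 10^42.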
Now here's where it gets weird.
The iPhone 5C and earlier models differ from newer iPhones in what happens if you enter the wrong passcode several times.
If you enter the wrong passcode in an iPhone 5C or earlier, then with each wrong attempt, iOS forces you to wait a little longer to try again. If you enter the wrong passcode 9 times, iOS forces you to wait an hour before you try the 10th and last time. If you enter it wrong the 10th time, iOS erases the phone.
On an iPhone 6 or later, the Secure Enclave chip handles all these functions. When you enter the wrong passcode, the Secure Enclave starts a hardware timer and will not permit you to enter the passcode again for a longer and longer time. If you enter the passcode wrong on the 10th try, the Secure Enclave wipes its special high-security memory containing the decryption key, irreversibly vaporizing the key and making the phone's contents forever unreadable.
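The retry policy described in the last two paragraphs can be sketched as a small state machine. The exact delay values here are illustrative assumptions (only the hour before the 10th try is stated above); what matters is the shape of the policy: escalating waits, then a wipe on the 10th wrong attempt.

```python
class PasscodeGuard:
    """Toy model of the escalating-delay / wipe-on-10th-failure policy."""

    # Seconds you must wait before attempt N+1, indexed by failures so far.
    # Values before the last are assumed for illustration.
    DELAYS = [0, 0, 0, 0, 0, 60, 300, 900, 1800, 3600]
    MAX_ATTEMPTS = 10

    def __init__(self, correct_passcode: str):
        self._correct = correct_passcode
        self._failures = 0
        self.wiped = False

    def delay_before_next_try(self) -> int:
        return self.DELAYS[min(self._failures, len(self.DELAYS) - 1)]

    def attempt(self, passcode: str) -> bool:
        if self.wiped:
            raise RuntimeError("key destroyed; contents unrecoverable")
        if passcode == self._correct:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self.wiped = True   # the enclave erases the key forever
        return False

guard = PasscodeGuard("1234")
for _ in range(9):
    guard.attempt("0000")
print(guard.delay_before_next_try())   # 3600 -- an hour before the 10th try
```

On a 5C this logic lives in iOS, where software can be replaced; on a 6 it lives in the enclave's hardware, where it can't. That one difference is the whole case.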
Nobody, not even Apple, has a realistic means of getting the encryption key off the Secure Enclave chip. It is theoretically possible to do, maybe, but the process would involve taking the phone apart in a cleanroom, using acid to dissolve the top casing of the Secure Enclave chip, turning on power, and then attempting to read the Secure Enclave's memory using something like an atomic force microscope. This might work, but it has a very high likelihood of destroying the Secure Enclave chip, and if you do that, it's adios, muchachos--the key is gone forever and you're done.
The difference between how the iPhone 5C and the iPhone 6 work is important. In the iPhone 6, all the security is handled by that special chip. In the 5C, it's handled by iOS.
What the FBI is asking Apple to do is write a special version of iOS and put it on the phone. The special version of iOS would be different from "normal" iOS in two regards: it would not make you wait longer and longer times to enter the passcode, and it would not wipe the phone after 10 wrong tries.
On the iPhone 6, iOS does not handle these protections; the Secure Enclave chip does, so nobody, not even Apple, can change that. On this phone, a 5C, the FBI hopes that installing an altered version of iOS will let them try all 10,000 possible four-digit passcodes until they get the right one.
Why doesn't Apple want to do this?
Lots of reasons.
First, it's not clear that it matters. This phone didn't belong to the terrorist--it was a work phone owned by his employer--and he stopped using it before the attacks and destroyed all his other phones, which means there's almost certainly nothing interesting or important on it.
Second, the FBI already has the iCloud backup plus the contact list and call records from the phone, and there was nothing interesting or important in them. Again, that means there's almost certainly nothing interesting or important on the phone.
Third, it would set a precedent: with any older iPhone, any government anywhere in the world could issue a subpoena to Apple saying "load your special rigged software on this phone, because we told you to." That's potentially scary. There's no way to stuff the genie back into the bottle. (In fact, it was precisely to head off this possibility that Apple changed how the Secure Enclave works in later phone models.)
Fourth, if Apple capitulates on this, it will make it that much easier for the FBI to strong-arm other companies (Google? Facebook?) into weakening the encryption they use on their devices. Weakening encryption is potentially a big problem. The thing about encryption is it's just math. Math does not know about good guys or bad guys. Math is math. If there's a mathematical way for the FBI to break encryption or use a back door, there's a mathematical way for anyone--cybercriminals, virus writers, Eastern European organized crime, hostile governments--to do it. Math is math. If it's mathematically possible to break encryption, it can be done. The whole point of encryption is it works because it's NOT mathematically possible to break.
The law under which the FBI is asking Apple to do this is an old one: the All Writs Act of 1789. The All Writs Act lets federal courts issue orders "necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law." The FBI is saying that means Apple has to create a new version of iOS just for them, because creating this new version of iOS is necessary and appropriate in aid of this investigation.
What's not clear is whether the government can force private citizens to work for it. That's a key part of the debate that I feel is being glossed over. Can the government require that a private company like Apple, or private citizens like Apple employees, do work on its behalf if they don't want to? Is that "necessary and appropriate"?
There would be a lot more work involved than just changing some lines of code and hitting Compile. Forensics laws are specific about what has to happen with any forensic software used to extract data from a digital device in evidence. The software must be regression-tested, every part of it must be documented, it must be evaluated by peer review, and it must be tested on hardware identical to that of the target device. This is necessary to preserve the chain of evidence and make sure the forensic software isn't inadvertently modifying the data.
And you can't just clone this iPhone and use the new iOS build on the clone, because cloning the phone does not clone the encryption key locked away inside the Secure Enclave chip.
So they're asking Apple to do a great deal of work and to document all that work under legal procedures, and submit the new version of iOS for peer review, in order to get at the contents of the phone. Is this "necessary and appropriate" to do when the people you're ordering to do it are private citizens and not government employees? The All Writs Act of 1789 was clearly never intended to apply to a situation like this; how could it be?