Originally Posted By: ganbustein
Originally Posted By: Virtual1
But the keys themselves can't be stored exclusively in a register, otherwise when you reset or power down the phone, the keys would be gone.

That's the whole point. When you reset or power down the phone, the keys are gone. The phone needs the passcode to recalculate the keys. The only thing that's permanent is the unique ID, and stuff that is encrypted with keys that can only be recalculated from the passcode.
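For what it's worth, the scheme he's describing boils down to something like this. (A rough sketch only: the KDF, iteration count, and UID handling here are my own guesses, not Apple's actual implementation.)

Code:
import hashlib, os

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # Entangle the passcode with the hardware's unique ID so the key can
    # only be recomputed ON that device, WITH the right passcode.
    # (Sketch only -- Apple's real KDF and parameters are different.)
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),   # the user's passcode
        device_uid,          # burned-in unique ID, used here as the salt
        100_000,             # iteration count, arbitrary for the sketch
        dklen=32,
    )

uid = os.urandom(16)              # stand-in for the hardware unique ID
key = derive_key("1234", uid)     # recomputed at every unlock, never stored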


But this has a very important implication for security: you have to be able to unlock your phone after it's been shut down and restarted.

This means that the entire state of the system must be stored outside the registers. If you wanted to take the really long route, you could shut down the phone, image it, boot it up, and try the first 5 of the 10,000 possible codes (say the phone wipes after 5 failed attempts), then when it fails to unlock, shut it down, restore the image, and try the next five. Nothing prevents you from eventually succeeding.
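In toy form, the restore trick looks like this (a simulation of the idea only; the snapshot/restore and attempt functions are made up for illustration, not real tooling):

Code:
class Phone:
    # Toy model: 4-digit passcode, wipes after 5 failed attempts.
    def __init__(self, passcode):
        self.passcode, self.fail_count, self.wiped = passcode, 0, False

    def snapshot(self):            # "shut it down and image it"
        return self.fail_count

    def restore(self, image):      # "restore it" from the image
        self.fail_count = image

    def try_code(self, code):
        if self.wiped:
            return False
        if code == self.passcode:
            return True
        self.fail_count += 1
        if self.fail_count >= 5:
            self.wiped = True      # wipe counter fires
        return False

phone = Phone("7294")
image = phone.snapshot()           # clean image taken up front
for code in ("%04d" % n for n in range(10000)):
    if phone.try_code(code):
        print("unlocked with", code)
        break
    if phone.fail_count >= 4:      # one failure away from the wipe...
        phone.restore(image)       # ...so roll the counter back and keep going

Nothing in the phone's own bookkeeping survives the rollback, which is the whole point.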

If you have Apple's cooperation (or if you spend the time on it yourself), you could disable the wipe counter AND the retry counter. This would dramatically cut the time required. (There are numerous videos of people building LEGO Mindstorms brute-force breakers for Garmin GPSs.) Imagine if that were done electronically, with the Garmin's 5-second timer removed: each attempt would take well under a second. I see NO evidence of a slow hash being performed during the unlock process, so the delay is nearly 100% artificial and arbitrary.
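Back-of-the-envelope, with a guessed 1 ms of real work per attempt (my number, since there's no sign of a slow hash), the imposed delay accounts for essentially all of the time:

Code:
CODES = 10_000                    # 4-digit passcode space

def worst_case_seconds(per_attempt, artificial_delay):
    # total time = attempts x (real work + imposed delay)
    return CODES * (per_attempt + artificial_delay)

print(worst_case_seconds(0.001, 5.0) / 3600)  # ~13.9 hours with a 5 s timer
print(worst_case_seconds(0.001, 0.0))         # ~10 seconds with it removed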



Originally Posted By: ganbustein
But eventually the whole thing boils down to xkcd/538/.


Only if you have physical access to the user. If he dies in a shootout and you get his phone, that doesn't work.

Originally Posted By: ganbustein
Originally Posted By: Virtual1
That's the sort of technology that CAN break into a smartphone such as the iPhone. You need to have the gear, the technique, and the knowledge of how to apply it. (that last part can be very difficult to obtain... UNLESS you're the company that manufactured it, OR have a badge to flash AT said company)

I can't believe you're comparing the processor in an iPhone to a satellite TV decoder.


I understand that it's older tech. But at THAT time, that method looked bulletproof too. We are here and now; the comparison still applies, it just has to be scaled. At some point you have to stop saying "ok, THIS one can't be broken, ever!" and start giving the guys some credit based on their track record of breaking down your protections.



Originally Posted By: ganbustein
The big problem with TV decoders, and most DRM for that matter, is that they don't use unique keys. They base their security on asymmetric cryptography, thinking they can keep the secret key secret, and then write the secret key into millions of consumer devices, each using the cheapest technology they can find, thinking no one will look. One person does, tells everyone, and pretty soon even the script kiddies know it. Even the script kiddies that don't know how to spell "secret" are in on the secret.


I addressed that previously in some detail with regard to obtaining device keys.





Originally Posted By: ganbustein
I believe Apple when they say they have no record of the unique ID, and cannot even go back and discover it after the fact. Why? For one thing, because it's dead simple to design a system that works that way.


"because it's easy to implement the way they say they did" is very shakey reasoning to believe someone. I prefer to err on the side of "what CAN they do" rather than "what did they TELL me they did" or "what am I HOPING they did". You are also then placing faith on their (A) implementing it correctly and (B) a new powerful, previously unknown explot being developed and used. "Heartbleed" has been in use for years by several of your favorite three letter groups. Perfect example.

I prefer to hit that whole set of assumptions with Occam's razor and simply ask, "why should I believe them?"




Originally Posted By: ganbustein
No sane Apple executive would ever authorize such a program. Part of the design of whatever method is used to generate the unique ID would be the institution of policies guaranteeing that no Apple or Samsung employee, sane or otherwise, could change the method in such a way that IDs could be recorded, without setting off alarms that would reach all the way to the top of both companies.
Apple's security policies, not just for the iPhone but for everything Apple does, are designed so that a rogue employee cannot defeat them.


You might want to review my keychain example in my previous post. It's an excellent counter-example to that belief. It remains to this day a gross security hole on computers that have auto-login enabled. Users were led to believe that the entries in their keychain were unobtainable by (A) unauthorized persons and (B) unauthorized apps. That has been proven beyond all doubt to be false.

It may very well BE that a rogue employee (or cooperation from Apple) is what led to this app that vacuums up keychains. Or they developed it themselves. But if the latter is true, it would have been incredibly difficult to clean-room such an app, and Apple would have an excellent chance of curb-stomping them in court. They are aware of this software. That causes serious problems for your "they wouldn't do it because it could end up being a PR disaster" idea. Why haven't they gone after them? Either they were helping, or they don't think it's worth the Streisand Effect to go after them. Either way, "bad PR" is not working as a deterrent.



Anyway, I am curious to know your thoughts on the methods being used to obtain the keys from the cable boxes. The methods work; they're proven on older tech. Just because the tech is newer doesn't mean it can't be (or hasn't been) adapted. Look at how many rounds of tag that group has already played. Oh, they're doing that? We'll just do THIS. Oh, they defeated that, so we'll do THIS. Oh snap, I didn't realize they could do that; we'll have to start doing THIS. And on and on it goes.



I work for the Department of Redundancy Department