search of iphone
#30396 06/28/14 12:53 PM
Joined: Aug 2009
OP Offline
http://www.dailydot.com/politics/stop-police-phone-search-warrant-warrantless-illegal/

The article says to "lock your phone with a passcode" to prevent searches. I thought I'd read (several times?) that companies make a device that can be plugged into an iPhone (or most any other smartphone) and drain it dry in minutes, regardless of phone settings.

Does anyone know which way this goes? I have read multiple times that police can take an iPhone to Apple and they will dump the data, unless it has already securely wiped itself. But this is obviously much less convenient than a 2-minute session with a gadget in the passenger seat of their patrol car.


I work for the Department of Redundancy Department
Re: search of iphone
Virtual1 #30402 06/28/14 03:57 PM
Joined: Aug 2009
Likes: 8
Online
For what it's worth, the US Supreme Court just ruled that a warrant is required to look at anything on a person's phone.

See this NY Times article.


On a Mac since 1984.
Currently: 24" M1 iMac, M2 Pro Mac mini with 27" BenQ monitor, M2 Macbook Air, MacOS 14.x; iPhones, iPods (yes, still) and iPads.
Re: search of iphone
Virtual1 #30416 06/28/14 08:09 PM
Joined: Aug 2009
Offline
Originally Posted By: Virtual1
The article says to "lock your phone with a passcode" to prevent searches. I thought I'd read (several times?) that companies make a device that can be plugged into an iPhone (or most any other smartphone) and drain it dry in minutes, regardless of phone settings.

Nobody can drain a modern iPhone that is locked with a passcode. Nobody. Not FBI. Not NSA. Not even Apple.

Apple recently (Feb 2014) released an iOS Security white paper that goes into depth about the security of iPhones. The paper focuses on the 5s model, but much of it has been in play from the very beginning. It's a fascinating read. Apple takes security very seriously.

Just as one example: the Touch ID sensor scans your fingerprint, extracts relevant characteristics, and sends them to the Secure Enclave, the device-within-a-device-within-a-device responsible for security. That signal needs to travel through a cable a few centimeters long, and then through the CPU. To prevent someone tapping either the cable or the CPU (through a jailbreak), the data is encrypted using a unique shared key hard-coded into both the sensor and the Secure Enclave during manufacture. Yes, that means they're paired at the factory.

Just as another example: each file on disk (by which I mean the SSD, of course) is encrypted using a randomly generated per-file key, and that key is in turn encrypted with the key for its protection class. Protection class keys are generated using both your passcode and a unique 256-bit ID hard-coded into the phone. Basically, a 4-digit passcode is salted with 256 bits of device-unique data, making rainbow tables useless despite the shortness of the passcode. The passcode itself is stored nowhere; instead, when you enter one, the protection class keys are generated from it, and those keys actually work only if the passcode was correct. The algorithm is designed to take at least 5 seconds on the phone to compute the key, making brute-force attacks time-consuming even if the user has not enabled the wipe-after-10-mistakes option.
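The wrapping scheme described above (a random per-file key, itself wrapped by a class key derived from the passcode plus a device-unique ID) can be sketched in a few lines of Python. This is a toy illustration, not Apple's actual algorithm: the real KDF runs in hardware and its parameters are not public, and every name here is made up.

```python
import hashlib
import os

def derive_class_key(passcode: str, device_uid: bytes) -> bytes:
    # Hypothetical stand-in for Apple's hardware KDF: the device-unique
    # ID acts as a 256-bit salt, and many iterations slow down guessing.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

def wrap_file_key(file_key: bytes, class_key: bytes) -> bytes:
    # Toy key wrap (XOR with a hash of the class key); XOR is its own
    # inverse, so the same function also unwraps.
    stream = hashlib.sha256(class_key).digest()
    return bytes(a ^ b for a, b in zip(file_key, stream))

device_uid = os.urandom(32)   # burned into the silicon at the factory
file_key = os.urandom(32)     # random key generated per file
wrapped = wrap_file_key(file_key, derive_class_key("1234", device_uid))

# The right passcode on the right device recovers the file key...
assert wrap_file_key(wrapped, derive_class_key("1234", device_uid)) == file_key
# ...but a wrong passcode, or the right passcode on another device, does not.
assert wrap_file_key(wrapped, derive_class_key("9999", device_uid)) != file_key
assert wrap_file_key(wrapped, derive_class_key("1234", os.urandom(32))) != file_key
```

Note how a rainbow table is useless here: the table would have to be built per device, against a 256-bit salt that never leaves the phone.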

But encrypting files is not enough. Everything in RAM is encrypted too, with encryption/decryption happening on the bus. That means an intruder who opens the phone and extracts the contents of RAM gets nothing useful.

The different protection classes allow the phone to unlock some files without unlocking everything. For example, mail messages can be received in the background, while the screen is locked, but once the message has been completely received it switches to a protection class that makes it readable only while the screen is unlocked.

Every once in a while there will be a report of a flaw in iOS security that exposes user data. In every case, those flaws derive from the desire to make some data readable even when the user has not recently entered a passcode. For example, every cell phone must be able to make emergency phone calls, even when locked. Users want more and more features available directly from the lock screen, and every app and data file needed by those features must allow that access. The exploits come about because some sequence of touches and button presses fools the system into granting access, even without a passcode, to something that should have been in a higher protection class, and that's possible only if some file was mistakenly placed into the wrong protection class.

Such exploits are quickly closed by Apple. In the meantime, the exposure is limited to the files that need to be accessible from the lock screen: recent photos, for example, but only photos taken from the lock screen. Or too much info from your Contacts database, which needs to be readable to show caller ID.

But there is no way, even using jailbreaking and electronic probes on the circuit board, to get full access to the data on a phone that has been locked with a passcode. (If the owner did the jailbreaking, security might be weaker, but the intruder cannot gain elevated access using after-the-fact jailbreaking.)


Early iPhone models (I think through the second model, the iPhone 3G) did not encrypt the SSD unless you entered a passcode. They may have used whole-disk encryption, not per-file encryption, but I'm not sure. Basically, there was enough unencrypted SSD to hold the code to read the passcode, and entering the passcode let you decrypt the key to decrypt the rest. A device wipe consisted of erasing the encrypted decryption key, so that the passcode only unlocked garbage.
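That "erase the key, not the data" wipe (sometimes called crypto-shredding) is easy to demonstrate. A toy sketch, with a deliberately fake cipher and hypothetical names:

```python
import os
from hashlib import sha256

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream (NOT real crypto), just to illustrate the structure.
    out = b""
    counter = 0
    while len(out) < n:
        out += sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def crypt(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse: the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

disk_key = os.urandom(32)
plaintext = b"user data on the SSD"
ciphertext = crypt(disk_key, plaintext)

# With the key, the data comes right back:
assert crypt(disk_key, ciphertext) == plaintext

# A "wipe" merely destroys the stored key. The ciphertext stays on the
# SSD, but without disk_key it is indistinguishable from garbage.
disk_key = None
assert crypt(os.urandom(32), ciphertext) != plaintext
```

Erasing a 32-byte key is instant, which is why an iPhone wipe takes seconds rather than the hours a full overwrite of flash would need.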


There is a case on record where the DEA got hold of a Colombian drug lord's iPhone. As a non-national, he was fair game for both the CIA and the NSA. Even with Apple's help, and despite the incredible value of the data to be gleaned, they were unable to decrypt the phone.

Nobody is going to be able to pull the data from your iPhone after it has been locked with a passcode. If you didn't lock it, though, Apple provides this nifty little UI software called "iOS" that will give them a very nice interface to browse everything. Your call.

Re: search of iphone
ganbustein #30425 06/28/14 11:26 PM
Joined: Aug 2009
OP Offline
Cellebrite is the app I was thinking of. Check this:
http://thenextweb.com/us/2011/04/20/us-police-can-copy-your-iphones-contents-in-under-two-minutes/

Quote:
Michigan State Police have been using a high-tech mobile forensics device that can extract information from over 3,000 models of mobile phone, potentially grabbing all media content from your iPhone in under two minutes.

The CelleBrite UFED is a handheld device that Michigan officers have been using since August 2008 to copy information from mobile phones belonging to motorists stopped for minor traffic violations. The device can circumvent password restrictions and extract existing, hidden, and deleted phone data, including call history, text messages, contacts, images, and geotags.

In short, it can copy everything on your smartphone in a matter of minutes.


I have read about Cellebrite from numerous sources. This may be old news that is not accurate today, though.

BUT... what it comes down to is this: it doesn't matter if files or memory are encrypted. SOMEWHERE the key has to exist. From what I've seen there are at least two interesting keys. The first is the one being used to encrypt the RAM; it has to BE IN RAM somewhere to be getting used. If I dump all the RAM and look at it, I can find where ASLR has located the code that handles the RAM encryption. From there I can see where it is storing the key. Then I go there and get the key. FireWire and other ports that access the bus directly may be able to dump RAM from a running system. If there's security in place, who's to say that someone (like Apple) wouldn't give LEA the key to open that security?
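For what it's worth, the "find the key in the RAM dump" step is usually done by scanning the image for regions that look statistically random, since key material has far higher entropy than code or text. A minimal sketch of that heuristic (the dump here is simulated, and real tools also verify candidates against key schedules):

```python
import math
import os
from collections import Counter

def shannon_entropy(block: bytes) -> float:
    # Bits per byte of information in the block.
    counts = Counter(block)
    n = len(block)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def find_key_candidates(dump: bytes, window: int = 32, threshold: float = 4.5):
    # Flag 32-byte windows random-looking enough to be key material.
    return [off for off in range(0, len(dump) - window + 1, window)
            if shannon_entropy(dump[off:off + window]) >= threshold]

# Simulated RAM dump: zeros and log text, with a 32-byte "key" at offset 1216.
dump = b"\x00" * 1024 + b"log message " * 16 + os.urandom(32) + b"\x00" * 512
assert find_key_candidates(dump) == [1216]
```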

This should give me access to everything IN ram. It won't give me access to files on the SSD.

THAT, however, is also available, though more difficult to get to. Apple is using a method similar to what the cable boxes use: they "burn in" the key, in the form of software-programmable fuses. They're usually set up so the programming process is write-only, and only the internal engine can read it. Unlike the cable companies, the keys on each iPhone appear to be uniquely and randomly set during manufacture.

But someone with some VERY expensive gear can get that number. I've read some in-depth articles about the "cable box hackers" that buy very purpose-built machines (that WILL get you on a watch list if you purchase one) that can peel those chips off a few atoms at a time until they expose the fuse grid. Then they go in with a microscope and visually identify which gates are burned, and get the code. Very labor intensive, requires costly hardware, but not impossible.

And certainly not beyond the reach of organizations like the NSA. If they're telling you they can't break into it, they're lying, hoping you continue to feel private. Granted, you have to give them a really good reason to want your data, but they can get it.

Let me toss out one more example of how there are steps that can be "skipped" in security. Look at your keychain. It unlocks when you login. But you can't look at any entry in keychain access without entering your password. Why? Because. That's why. Just because. Because the public API requires it.

Did you know you can buy (for 10 grand, and a flash of a badge) software that has a modified security API that gets around that "pesky arbitrary restriction"? I've seen it. If you can launch that app on a computer that has an unlocked keychain, you get a full dump of the keychain. That's an excellent real example of how we think our data is secure, when it's actually only "security through obscurity". Did they involve Apple? probably not. BUT.... if an NSA agent wanted data from a keychain on a logged in computer, and demanded they provide a tweaked copy of their API, they probably would get it, wouldn't you agree? There's the problem. You have to know about these "shortcuts" before you can start to recognize the weakness.

It all boils down to "if the computer can access it, a determined, well-funded person can also access it." It's like Rule 34, there are NO exceptions.

(segue! the FBI and up also have access to special UPSes and wall saws. They will cut a hole around the outlet your computer is plugged into, expose the 120V wires, clamp on the UPS's special power connector, then cut the feed wires. They will then take your computer, wiring and all, powered on, and cart it away. So forget about them "having to log you out or turn your computer off"; they don't have to.)

Last edited by Virtual1; 06/28/14 11:43 PM.

I work for the Department of Redundancy Department
Re: search of iphone
Virtual1 #30439 06/29/14 08:11 AM
Joined: Aug 2009
Offline
Originally Posted By: Virtual1
BUT... what it comes down to is this: it doesn't matter if files or memory are encrypted. SOMEWHERE the key has to exist. From what I've seen there are at least two interesting keys. The first is the one being used to encrypt the RAM; it has to BE IN RAM somewhere to be getting used.

Nope. The key is stored in a special-purpose processor register. It is not stored in RAM.

When you lock the device, the key is discarded. When you enter your passcode, the key is recalculated, as a function of the passcode you entered and a device-specific 256-bit unique identifier burned into the silicon during the manufacturing process.

The code to calculate the key uses special instructions that use the unique identifier without revealing what it is. Even the OS cannot determine the unique id.
Consequently, the unique ID is not and cannot be saved in any backup, sent across any network to anywhere, nor stored in RAM. The unique ID is generated randomly at the factory in a cryptographically secure manner, so even Apple does not and cannot know what IDs have ever been used, let alone on which devices.

The function that computes the key from the passcode MUST run on the phone itself. It cannot be offloaded to a supercomputer or gate array or anything like that, because no such device has access to the unique ID. Even on the phone, the calculation takes on the order of 5 seconds, to discourage brute force attacks. (Apple achieves this by iterating the function several thousand times, with the output of each iteration being fed as input to the next. The passcode goes into the first iteration, and the final key is the output of the final one. The unique ID is used on each iteration.) The final key is then used to decrypt the SSD's encryption key into a machine register.
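The iterated, UID-entangled derivation being described can be sketched abstractly (hypothetical round function and round count; the real computation runs inside the phone's crypto hardware and its details are not public):

```python
from hashlib import sha256

def derive_key(passcode: str, device_uid: bytes, rounds: int = 10_000) -> bytes:
    # Each round mixes the device-unique ID back into the state, so the
    # computation cannot be offloaded to hardware that lacks the UID.
    state = sha256(passcode.encode()).digest()
    for _ in range(rounds):
        state = sha256(state + device_uid).digest()
    return state

uid_a, uid_b = bytes(range(32)), bytes(reversed(range(32)))

# Deterministic: the same passcode on the same device always yields the
# same key, so nothing needs to be stored anywhere.
assert derive_key("1234", uid_a) == derive_key("1234", uid_a)
# The same passcode on a different device yields a completely different key.
assert derive_key("1234", uid_a) != derive_key("1234", uid_b)
```

Tuning the round count is how the designer sets the per-guess cost; Apple picked parameters that land around 5 seconds on the phone itself.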

Even if you made an exact copy of all of RAM and all of the SSD onto another iPhone of the same exact model, and entered the same passcode, it would not work. The other iPhone would have a different unique ID, so the final key would come out wrong, and decrypting would yield only useless garbage.

On the iPhone 5s, Apple pushes this work onto the Secure Enclave, a separate processor (on the same chip as the CPU but not sharing any registers or RAM with it). The Secure Enclave has its own independent on-chip NVRAM where it stores its own secrets (like your fingerprint data). The Secure Enclave talks to the rest of the system only over serial communications lines, and does not divulge any of its secrets. It will encrypt/decrypt data on request, but will not reveal the keys it's using to do that. For extra security, even the Secure Enclave discards keys that are not needed but can be recalculated.

Peeling the Secure Enclave apart an atom at a time would destroy the unique ID; it would not reveal it. Well, it might, if it were really an atom at a time, but "atom at a time" is hyperbole. No one has the capacity to store that much data, let alone interpret it. There are a lot of atoms in even a single logic gate. What's really being described is a process wherein the chip is peeled back in layers, each a few atoms thick, and each newly exposed layer is examined under an electron microscope to try to discern where the transistors are and how they're connected. But if the data is stored as a doping differential, say N-doping for a "0" and P-doping for a "1", there would be nothing for the electron microscope to see.


Re: search of iphone
ganbustein #30440 06/29/14 08:32 AM
Joined: Aug 2009
Likes: 15
Online
Fascinating, but you've got a unique ability to leave me with crossed eyes. tongue


The new Great Equalizer is the SEND button.

In Memory of Harv: Those who can make you believe absurdities can make you commit atrocities. ~Voltaire
Re: search of iphone
ganbustein #30441 06/29/14 12:44 PM
Joined: Aug 2009
OP Offline
Interesting info on the registers.

But the keys themselves can't be stored exclusively in a register; otherwise, when you reset or power down the phone, the keys would be gone.

Also an interesting idea, storing the grid of data using different dopings, but not practical on a mass-production scale. The problem is that you want each phone to use a different key, and a different mask for each phone is totally impractical.

I don't know why they don't like to use eeprom or flash, but they don't. Maybe not reliable enough?

So they use either a fuse array or ROM, both of which can be "peeled" open. I need to look around and see if I can find a link describing the machine. It's one of those very purpose-built things that is very difficult to obtain and gets you on watch lists you don't want to be on when you buy it. The satellite TV companies really hate them because they can't keep their sat box keys secure because of them. (there's a good case for differential doping... wonder why they don't do it? must be a reason)

http://security.stackexchange.com/questi...ng-a-smart-card
http://www.theguardian.com/technology/2002/mar/13/media.citynews

see also "microprobing"

There's an interesting tech war going on between these groups. I recall reading one description of a system where the pirates were opening a card and boring in with a laser. They had knowledge of how the chip was laid out, but not the fuse pattern. So they found how to drill in and cut specific tracks to enable reading of the cells. This led to the manufacturer laying down another layer on top that zigzagged across the entire secured area; that zigzag track was required to connect a few lines that run the chip, so if you drilled down through it to get at the critical places to cut, you'd disable the chip. It was a giant expensive game of tag they were playing. They had posted pics of scans of those chips; it was really an interesting read, but that was a few years ago and I didn't store any of it. It may be difficult or impossible to find again. That's the sort of information that gets taken down. frown

huh. this is pretty good, I think you'll really enjoy it! http://www.youtube.com/watch?v=tnY7UVyaFiQ

That's the sort of technology that CAN break into a smartphone such as the iPhone. You need to have the gear, the technique, and the knowledge of how to apply it. (that last part can be very difficult to obtain... UNLESS you're the company that manufactured it, OR have a badge to flash AT said company)

Anyone that tells me that Apple can't break into their own phone, I am going to have a very difficult time believing. Just throwing out a very generic scenario... the NSA works with Apple and, say, that guy from the video wink OK, now we have the money, the gear, the technique, and the location of the bits we want. The entire device gets dumped, while on, including RAM and flash (but not registers). He goes to work and gets the bits off the trust chip. (we're going to assume Apple doesn't just plain keep this data... who can really say for sure they don't?) Once they have the key from the original, Apple enters the key into a new unit, but in a "not locked" state on a new phone.

Really the key thing here that needs to happen is (A) disabling the wipe option (trivial if you know what you are doing, and not terribly devastating if you have already dumped it) and (B) defeating any timeout if you're going to have to guess the passcode. 10^4 is pretty fast to guess if you've disabled the failed-attempt timer. Apple most CERTAINLY has the ability to fab a modified trust chip with those features disabled. Then use the above guy's skills to snatch the key (assuming Apple doesn't have it), then simply program the modded chip with the code and plug in the SSD. Then just run it through a 2-second sweep of all 10k codes and you're in. This assumes cooperation from Apple, of course, to make things easier, but if you read the articles I posted, you'll see there are groups that pull this off WITHOUT cooperation from the manufacturers. It just takes time.


I work for the Department of Redundancy Department
Re: search of iphone
Virtual1 #30443 06/29/14 08:52 PM
Joined: Aug 2009
Likes: 16
Moderator
Online
Data security and encryption schemes are designed by fallible humans, not omniscient gods, so any scheme that can be devised can, and probably will, eventually be broken. The literature is rife with famous (or infamous, according to your viewpoint) examples of this. So it is safe to say that with electronic data there is no such thing as absolute security. Given sufficient time, money, and resources, or an ingenious breakthrough algorithm, any security scheme will eventually be broken. In the final analysis, the best that you can hope for is to accomplish one of two things:
  1. make the resources required to break the security system so expensive that it costs more, perhaps many times more, than the value of the data to be gained or…
  2. make it take so long to break the scheme that the data is simply no longer relevant.

This is particularly true if you have physical possession of the device containing the data. The weakness in Apple's scheme is, of course, the passcode. If you have the iPhone and the passcode, or you can guess the passcode, there is zero security. If you only have the data, or even the data and the passcode, but you don't have the iPhone itself, the difficulty is increased exponentially. Just because you know the security algorithm and you have broken into the data from one iPhone, the next iPhone is a whole new ballgame.


If we knew what it was we were doing, it wouldn't be called research, would it?

— Albert Einstein
Re: search of iphone
Virtual1 #30445 06/29/14 09:40 PM
Joined: Aug 2009
Offline
Originally Posted By: Virtual1
But the keys themselves can't be stored exclusively in a register; otherwise, when you reset or power down the phone, the keys would be gone.

That's the whole point. When you reset or power down the phone, the keys are gone. The phone needs the passcode to recalculate the keys. The only thing that's permanent is the unique ID, and stuff that is encrypted with keys that can only be recalculated from the passcode.

Originally Posted By: Virtual1
Also an interesting idea, storing the grid of data using different dopings, but not practical on a mass-production scale. The problem is that you want each phone to use a different key, and a different mask for each phone is totally impractical.

I don't know for sure how the unique ID is physically stored, but what's impractical about an ephemeral mask? One step in the process could use the x-ray equivalent of an LCD (metal windows that can be opened/closed, perhaps?) to be the mask for a small region of the chip, just large enough to store 256 bits. Or 256 1-bit masks? Or not use masks at all. Draw the pattern directly on the photo-resist using a focused electron beam. The only reason we use masks at all is so that the whole chip can be imaged in one step, but that's parallelism that isn't needed for only 256 bits. Or cut right to the chase, and dope the silicon directly with a focused beam of ionized dopant. The rest of the chip can be fabricated using normal masking techniques. The point to remember is that just because you know one way to fabricate chips doesn't mean all chips are fabricated that way. What's impractical for one fabrication method may be facile for another.

Or maybe it is stored in flash memory. Not "the flash", of course. Just a 256-bit array on the CPU. Then it's just a pattern of electrical charges, charges that can be made to dissipate when the layer that covers them is removed.

But eventually the whole thing boils down to xkcd #538.

Originally Posted By: Virtual1
That's the sort of technology that CAN break into a smartphone such as the iPhone. You need to have the gear, the technique, and the knowledge of how to apply it. (that last part can be very difficult to obtain... UNLESS you're the company that manufactured it, OR have a badge to flash AT said company)

I can't believe you're comparing the processor in an iPhone to a satellite TV decoder. That TV decoder only had three layers, and the circuit components were huge, large enough to be visible in an optical microscope. The A7 processor in the iPhone 5s uses 28-nm technology. I'd love to see the guy from the video try to tap into that with a paper clip.

The big problem with TV decoders, and most DRM for that matter, is that they don't use unique keys. They base their security on asymmetric cryptography, thinking they can keep the secret key secret, and then write the secret key into millions of consumer devices, each using the cheapest technology they can find, thinking no one will look. One person does, tells everyone, and pretty soon even the script kiddies know it. Even the script kiddies that don't know how to spell "secret" are in on the secret.

Apple isn't that stupid.
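The difference between the two designs is worth making concrete: with one global secret baked into every unit, a single extraction breaks the whole fleet, while per-device keys limit the damage to one device. A toy Python illustration (fake XOR cipher, hypothetical names):

```python
import os
from hashlib import sha256

def crypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR "cipher", for illustration only; same call encrypts/decrypts.
    stream = sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

# DRM-style design: one global key baked into every unit.
global_key = os.urandom(32)
boxes = [crypt(global_key, b"channel key %d" % i) for i in range(3)]

# iPhone-style design: a unique key per unit.
unique_keys = [os.urandom(32) for _ in range(3)]
phones = [crypt(k, b"user data %d" % i) for i, k in enumerate(unique_keys)]

# Extracting the secret from ONE DRM box decrypts ALL of them...
assert all(crypt(global_key, c).startswith(b"channel key") for c in boxes)

# ...but extracting one phone's key opens only that one phone.
opened = [crypt(unique_keys[0], c).startswith(b"user data") for c in phones]
assert opened == [True, False, False]
```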

Originally Posted By: Virtual1
Anyone that tells me that Apple can't break into their own phone, I am going to have a very difficult time believing. Just throwing out a very generic scenario... the NSA works with Apple and, say, that guy from the video wink OK, now we have the money, the gear, the technique, and the location of the bits we want. The entire device gets dumped, while on, including RAM and flash (but not registers). He goes to work and gets the bits off the trust chip. (we're going to assume Apple doesn't just plain keep this data... who can really say for sure they don't?) Once they have the key from the original, Apple enters the key into a new unit, but in a "not locked" state on a new phone.

I'm the one having a very difficult time believing your scenario. Even with the design details of the A7 processor in hand (obtainable by bribing someone from Samsung, the company that actually fabs the chip, but it would have to be someone with access to the vault where the design details are stored), you wouldn't be able to access the unique ID, even by destroying the chip, and certainly not in a way that lets you continue running the chip under power, as you suppose.

I believe Apple when they say they have no record of the unique ID, and cannot even go back and discover it after the fact. Why? For one thing, because it's dead simple to design a system that works that way. Any graduate-level CS student (and any bright undergraduate) could, and probably has, designed such a secure system, probably as a homework exercise. Absolutely secure black boxes may sound impossible, but they're really easy to design and build. (By "secure black box" I mean a tamper-proof box that has secrets it uses to do its thing, but will not reveal those secrets unless tampered with, and you can't tamper with it.)

Another reason to believe Apple is that they have no incentive to keep the unique ID, and many reasons not to. There's nothing Apple can do with the ID. Having said publicly that they do not record it, there would be grave and expensive repercussions if they were ever found to have been lying. Lots of lawsuits, plus damage to their reputation that would translate into lower sales.

No sane Apple executive would ever authorize such a program. Part of the design of whatever method is used to generate the unique ID would be the institution of policies guaranteeing that no Apple or Samsung employee, sane or otherwise, could change the method in such a way that IDs could be recorded, without setting off alarms that would reach all the way to the top of both companies.

Apple's security policies, not just for the iPhone but for everything Apple does, are designed so that a rogue employee cannot defeat them. Apple understands that attacks can come from inside as well as out, and designs accordingly. There may be companies that give interns badges that grant them access to everything. Such companies may be the norm in Hollywood movies, but they're rare in practice.

Not unheard of, unfortunately. Target's security was breached when they gave an HVAC company free passage through the corporate firewall, so the company could monitor HVAC equipment in Target's data centers. Then that HVAC company got hacked, giving the hackers access to the computers where Target did their software development, letting the hackers install malicious software in Target's Windows-based point-of-sale devices. The hackers walked off with debit card numbers and their PINs. Some of those debit cards were also credit cards. Target made two mistakes: the obvious one was punching too big a hole through their firewall, but the important mistake was thinking that attacks could not come from within. Savvy companies know better.

Re: search of iphone
ganbustein #30446 06/29/14 10:12 PM
Joined: Aug 2009
OP Offline
Originally Posted By: ganbustein
Originally Posted By: Virtual1
But the keys themselves can't be stored exclusively in a register; otherwise, when you reset or power down the phone, the keys would be gone.

That's the whole point. When you reset or power down the phone, the keys are gone. The phone needs the passcode to recalculate the keys. The only thing that's permanent is the unique ID, and stuff that is encrypted with keys that can only be recalculated from the passcode.


But this has a very important implication for security: you have to be able to unlock your phone after it's been shut down and restarted.

This means that the entire state of the system must be stored outside the registers. If you wanted to take the really long route, you could shut down the phone, image it, boot it up and try the first 5 of 10,000 codes (say 5 attempts before wipe), then when it obviously failed to unlock, shut it down, restore it, and try the next five. Nothing prevents you from eventually succeeding.

If you have Apple's cooperation (or if you spend time on it yourself), you could disable the wipe counter AND the retry counter. This would dramatically cut the time required. (there are numerous videos of people building a Lego Mindstorms brute-force breaker for Garmin GPSes) Imagine if that was done electronically, with the Garmin 5-second timer removed? It'd be done in well under a second. I see NO evidence of a long hash being performed during the unlock process, so the delay is nearly 100% artificial and arbitrary.
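Whether that ~5-second cost is real key-stretching or an artificial timer is exactly what's in dispute in this thread, but the arithmetic of what it buys is simple either way. A quick back-of-the-envelope (the no-delay guess rate is an assumption for illustration):

```python
code_space = 10 ** 4                  # all 4-digit passcodes

# With a ~5 s/guess key derivation (wipe and retry counters disabled):
hours_with_delay = code_space * 5 / 3600

# If the delay could be stripped and guesses ran at an assumed
# 1,000 per second:
seconds_without_delay = code_space / 1000

assert 13 < hours_with_delay < 14     # roughly 14 hours, worst case
assert seconds_without_delay == 10.0  # ten seconds flat
```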



Originally Posted By: ganbustein
But eventually the whole thing boils down to xkcd/538/.
only if you have physical access to the user. If he dies in a shootout and you get his phone, that doesn't work.

Originally Posted By: ganbustein
Originally Posted By: Virtual1
That's the sort of technology that CAN break into a smartphone such as the iPhone. You need to have the gear, the technique, and the knowledge of how to apply it. (that last part can be very difficult to obtain... UNLESS you're the company that manufactured it, OR have a badge to flash AT said company)

I can't believe you're comparing the processor in an iPhone to a satellite TV decoder.


I understand that it's older tech. But at THAT time, that method looked bulletproof, too. We are in the here and now. The comparison still applies; it just has to be scaled. At some point you have to stop saying "OK, THIS can't be broken, ever!" and start giving the guys some credit based on their track record of breaking down your protection.



Originally Posted By: ganbustein
The big problem with TV decoders, and most DRM for that matter, is that they don't use unique keys. They base their security on asymmetric cryptography, thinking they can keep the secret key secret, and then write the secret key into millions of consumer devices, each using the cheapest technology they can find, thinking no one will look. One person does, tells everyone, and pretty soon even the script kiddies know it. Even the script kiddies that don't know how to spell "secret" are in on the secret.


I addressed that previously in some detail with regard to obtaining device keys.





Originally Posted By: ganbustein
I believe Apple when they say they have no record of the unique ID, and cannot even go back and discover it after the fact. Why? For one thing, because it's dead simple to design a system that works that way.


"Because it's easy to implement the way they say they did" is very shaky reasoning for believing someone. I prefer to err on the side of "what CAN they do" rather than "what did they TELL me they did" or "what am I HOPING they did". You are also then placing faith in (A) their implementing it correctly and (B) no powerful, previously unknown exploit being developed and used against it. "Heartbleed" had been in use for years by several of your favorite three-letter groups. Perfect example.

I prefer to hit that claim with Occam's razor, and simply say "why should I believe them?"




Originally Posted By: ganbustein
No sane Apple executive would ever authorize such a program. Part of the design of whatever method is used to generate the unique ID would be the institution of policies guaranteeing that no Apple or Samsung employee, sane or otherwise, could change the method in such a way that IDs could be recorded, without setting off alarms that would reach all the way to the top of both companies.
Apple's security policies, not just for the iPhone but for everything Apple does, are designed so that a rogue employee cannot defeat them.


You might want to review my keychain example in my previous post. It's an excellent counter-example to that belief. It remains to this day a gross security hole on computers that have auto-login enabled. Users were led to believe that the entries in their keychain were unobtainable by (A) unauthorized persons and (B) unauthorized apps. This has been proven beyond all doubt to be false. It may very well BE that a rogue employee (or cooperation from Apple) is what led to this app that vacuums up keychains. Or they developed it themselves. But if the latter is true, it would be incredibly difficult to clean-room such an app, and Apple would have an excellent chance of curb-stomping them in court. They are aware of this software. This causes serious problems for your "they wouldn't do it because it could end up being a PR disaster" idea. Why haven't they gone after them? Either they were helping, or they don't think it's worth the Streisand Effect to go after them. Either way, "bad PR" is not working as a deterrent.



Anyway, I am curious to know your thoughts on the methods being used to obtain the keys from the cable boxes. The methods work; it's proven on older tech. Just because the tech is newer doesn't mean it can't be (or hasn't been) adapted. Look at how many rounds of tag that group has already played. Oh, they're doing that, we'll just do THIS. Oh, they defeated that, so we'll do THIS. Oh snap, I didn't realize they could do that, we'll have to start doing THIS. And on and on it goes.



I work for the Department of Redundancy Department
