> How much longer will Apple have robust encryption like this? Tim Cook is willing to do anything to keep the king happy. What happens if Donald tells Tim that the Apple tariff exemption will only continue if the Apple encryption is disabled? What will Cook do when presented with the choice between protecting users' privacy and profit / shareholder value...

Is he though? I'll admit I haven't read everything about what's happened, but it seems to me that Tim Apple says nice things about Trump and gives him a (literal) shiny toy to play with as a way of keeping him from meddling too deeply with what Apple does.
> People talk a lot about "this" administration, but the government has been doing similar stuff forever, under every single administration. No matter their political POV, the type of people who get into politics will always support this kind of thing.

Okay, for the sake of argument, I will entertain the idea that every administration does it.
> Just a caution on the suggestion of having a secret wipe mode. Destruction of evidence is a crime in its own right, and you can be convicted of it even if you're convicted of nothing else. The court can also presume you destroyed the data because it was incriminating.

That's why being able to trigger Lockdown Mode would be helpful: that way you don't erase the data.
The biometrics on her work computer aren't a big deal. If they have a warrant to search it they can serve it to the Post and have them unlock it.
My employer strongly discourages storing anything on laptops, it's all supposed to be on corporate servers.
> Location / time / date data.

Without a doubt, that's what they're looking for on the watch, but that is not the Government's highly classified information.
> Is a 4-, 6-, or 8-digit passcode really that secure? Or would that attempt be thwarted by the same increasingly long retry times an end user gets after multiple bad passcode entries?

If you're talking about an iPhone, you can set it to auto-wipe after ten incorrect tries, so that should defeat any attempt unless you set a predictable passcode.
Just curious, for anyone who's in a position to answer: I understand why someone would want to avoid using biometrics because of the legal case. But is a 4-, 6-, or 8-digit passcode really that secure? It seems like it would be possible to brute-force the passcode with some kind of specialized forensic tool, especially if it uses numbers only. Or would that attempt be thwarted by the same increasingly long retry times an end user gets after multiple bad passcode entries?
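For a rough sense of scale: assuming each guess must run on the device and costs roughly 80 ms of key derivation (the figure Apple publishes for its calibrated iteration count), and ignoring the escalating lockouts and the optional ten-try wipe, exhausting a numeric keyspace looks like this. A back-of-envelope sketch, not a statement about any specific forensic tool:

```python
# Back-of-envelope brute-force times for numeric passcodes, assuming a
# ~80 ms on-device key-derivation cost per guess (Apple's documented
# calibration). Escalating lockouts and auto-wipe are ignored here.

ATTEMPT_SECONDS = 0.08  # assumed per-guess floor

def worst_case_seconds(digits: int) -> float:
    """Seconds to try every passcode of the given number of digits."""
    return (10 ** digits) * ATTEMPT_SECONDS

for n in (4, 6, 8):
    s = worst_case_seconds(n)
    print(f"{n} digits: {s:>12,.0f} s  (~{s / 86400:.1f} days)")
```

So a 4-digit PIN falls in minutes even at the 80 ms floor, 6 digits takes about a day, and 8 digits takes months; the retry delays and the wipe threshold are what make short codes survivable in practice.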
> Personally, I don't think biometrics are a problem for most people. Especially on a phone, where there's an easy way to disable them (press and hold Volume Up and Power for five seconds and the phone vibrates) until you enter the passcode. Obviously YMMV, especially if you're a journalist protecting sources, or in the intelligence trade or something. Trickier for laptops.

It's really not. Just turning the laptop off will disable biometrics. You need to enter your password to unlock FileVault before biometrics can be enabled.
> Or would that attempt be thwarted by the same increasingly long retry times an end user gets after multiple bad passcode entries?

My 90-year-old father's shaky hands fudge the password so easily that he's managed to impose on himself some rather lengthy timeouts.
I do wonder how the current push for passkeys instead of passwords will be affected by the potential for situations like this.
I'd subsequently like to know where Passkeys fall under this apparent distinction between biometrics and passwords.
In my (admittedly very limited) understanding, I'd consider a passkey to be more closely aligned to biometrics than a password.
> The software doesn't know the key, so the software can't be hacked to reveal the key. Short of an unknown vulnerability in AES (and I wouldn't put money on anybody finding one of those in 10 years), there's no escaping the need to brute-force the key, and short of a hardware hack to extract the hardware key there's no way to do that off-device.

Putting aside that the human owner is a weak link: there are hardware exploits (hence unpatchable), there are software exploits when the implementation of the crypto isn't perfect (even a mathematically perfect algorithm can have an implementation flaw and an exploitable side channel), and more extreme methods like invasively exploring the innards of the chip.
> Seems like it would be possible to brute-force the passcode with some kind of specialized forensic tool, especially if it used numbers only?

When you have a passcode and want to check it, you can by design only do it on the iPhone itself. You can't do it on some supercomputer.
> There are hardware exploits (hence unpatchable), there are software exploits when the implementation of the crypto isn't perfect (even a mathematically perfect algorithm can have an implementation flaw and an exploitable side channel)...

There's no side channel on something that isn't running. The key simply isn't present to decrypt the data. Side channels are how you get at AFU (after-first-unlock) devices.
> ...and more extreme methods like invasively exploring the innards of the chip.

Gee, if only somebody had already explained that possibility in this thread... who might have said that? Hmm, I think his name starts with "cloud" and ends with "gazer"! And you should have noticed, because I was talking about it in the very post you replied to when starting this discussion with me!
> There's no side channel on something that isn't running. The key simply isn't present to decrypt the data. Side channels are how you get at AFU devices.

The key isn't present where, Cloudgazer? Typing the PIN doesn't invoke a "non-present" key, does it? You probably wanted to say "the key isn't present in RAM", but BFU devices still very much hold the key. If the key is there, a determined attacker with effectively unlimited funds will find a way to get it sooner or later. Wherever the key is stored (on the chip, interacting with firmware), it can have a side channel and be exploited. Ask Yubikey.
> Gee, if only somebody had already explained that possibility in this thread... who might have said that? Hmm, I think his name starts with "cloud" and ends with "gazer"!

But thanks for explaining my point to me. Kudos.
> Wherever the key is stored (on the chip, interacting with firmware), it can have a side channel and be exploited. Ask Yubikey.

No. BFU devices hold a key; they don't hold the key. They hold a hardware key which is cryptographically combined with the PIN to make the actual decryption key. The hardware key alone can't decrypt your private data. There are devices where a biometric unlocks a key that is stored on the device; iPhones aren't those devices, which is why you can't unlock a phone from power-on with a biometric.
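That combination scheme can be illustrated in miniature. The sketch below is a toy model, not Apple's actual key hierarchy: PBKDF2 stands in for the proprietary derivation, and `device_secret` stands in for the fused hardware UID. It just shows why the hardware key alone decrypts nothing and why guesses can't move off-device:

```python
import hashlib
import os

# Toy model: the unlock key only exists transiently, derived from BOTH the
# immutable device secret and the user's passcode. Neither half suffices.
device_secret = os.urandom(32)  # stand-in for the fused hardware UID

def derive_unlock_key(passcode: str) -> bytes:
    # Deliberately slow KDF salted with the device secret: every guess
    # must run where the secret lives, and each guess costs real time.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                               device_secret, 100_000)

assert derive_unlock_key("123456") == derive_unlock_key("123456")
assert derive_unlock_key("123456") != derive_unlock_key("123457")
```

A guess one digit off produces an unrelated 256-bit key, and without `device_secret` the derivation cannot even be attempted on external hardware.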
> The ICE agents aren't Nazis. I'm tired of hearing that. That was about Judaism.

The Nazis' objection to Jews was not Judaism per se (unlike almost all previous persecutions since before the Second Temple) but the supposed biological taint of Jewish cultural identity (and no, that makes no sense to anyone else, not even the Nazi officials categorising Mischlinge). That was an important difference: previously, converts to the approved religion (Roman paganism, the correct Christian denomination, or the correct branch of Islam) were safe, so long as they could convince the authorities that they had actually converted (which wasn't easy, for fairly obvious reasons).
> Soooo, press and hold the power button... That's not really a shortcut. That's just how laptop power buttons work.

A single short press on the power/Touch ID button does an ordinary screen lock, equivalent to Win-L, but it can be unlocked using Touch ID.
> A single short press on the power/ID button does an ordinary screen lock, equivalent to Win-L, but can be unlocked using Touch ID.

You can also clear the Touch ID setup so it just won't be available in any case, but obviously you'd then have to type in your password at every relevant opportunity, which exposes it to others reading or recording it while you keep re-entering it.
I believe you can set up a shell script that disables fingerprint unlock and trigger it from a shortcut action, but I haven't tested whether all the privileges work correctly without further interaction (because the script needs sudo). Cmd-Opt-Ctrl-power doesn't do anything different to just power, at least for me.
> Why is being "secure in their persons, houses, papers, and effects, against unreasonable searches and seizures" a recognized right if the "key to the lock" is a password in one's mind, but not if it's a part of one's person? Is it because everyone knows that for the latter, they could just beat the shit out of you until they get what they need? Am I saying the quiet part out loud?

No, it is because a search warrant signed by a judge is considered "reasonable" in the eyes of the law.
> Or would that attempt be thwarted by the same increasingly long retry times an end user gets after multiple bad passcode entries?

Yes, retry delays are the primary way brute-force PIN attacks are countered.
> I shudder to think what would happen if authorities found some reason to visit my home. I've got collections of computers and computer-adjacent electronics going back to the 1980s.

Don't all phones these days reset or brick the phone after just a handful of wrong PIN entries?
> Don't all phones these days reset or brick the phone after just a handful of wrong PIN entries?

Yeah; and this is the reason that my work devices require a PIN before you get to the Yubikey prompt (which also needs a different PIN), and I never use index fingers or thumbprints for unlocking my personal devices. And I use a full keyboard password instead of a numpad PIN on my phone, so a robo-unlocker is going to have serious difficulties.
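The gap between a numeric PIN and a full-keyboard password is easy to quantify. A hedged sketch: the 95-symbol alphabet below is just printable ASCII, an assumption for illustration, since actual allowed characters vary by platform:

```python
import math

def entropy_bits(alphabet: int, length: int) -> float:
    """Bits of entropy for a uniformly random code drawn from the alphabet."""
    return length * math.log2(alphabet)

# 6-digit numeric PIN vs. a 10-character keyboard password
print(f"6-digit PIN:      {entropy_bits(10, 6):5.1f} bits")
print(f"10-char password: {entropy_bits(95, 10):5.1f} bits")
```

Roughly 20 bits versus 66 bits; since each added bit doubles the search space, the keyboard password pushes worst-case brute force from minutes-at-scale into absurd territory.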
> Yes, retry delays are the primary way brute-force PIN attacks are countered.

It's not the only way, though... even if you evade that, it's slow, at least on iOS.
> Don't all phones these days reset or brick the phone after just a handful of wrong PIN entries?

Erasing to factory conditions can be enabled on an iPhone after ten failed attempts. Same on a Samsung phone, I am told.
> It's not the only way, though... even if you evade that, it's slow, at least on iOS.

Yes, but the entangled device key is exactly the lever that enforces attempts having to be done on the actual device, and on the device the SoC enforces the delay.

> The passcode or password is entangled with the device's UID, so brute-force attempts need to be performed on the device under attack. A large iteration count is used to make each attempt slower. The iteration count is calibrated so that one attempt takes approximately 80 milliseconds. In fact, it would take more than five and one-half years to try all combinations of a six-character alphanumeric passcode with lowercase letters and numbers.

https://help.apple.com/pdf/security/en_US/apple-platform-security-guide.pdf
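Apple's "more than five and one-half years" figure checks out arithmetically from the 80 ms calibration:

```python
# Cross-check of the figure quoted from Apple's Platform Security guide:
# 36 symbols (a-z plus 0-9), length 6, ~80 ms per on-device attempt.
combinations = 36 ** 6
seconds = combinations * 0.08
years = seconds / (365.25 * 24 * 3600)
print(f"{combinations:,} combinations -> {years:.2f} years")
```

About 2.18 billion combinations at 80 ms each works out to roughly 5.5 years of continuous on-device guessing, matching the guide.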
> Yes, but the entangled device key is exactly the lever that enforces attempts having to be done on the actual device, and on the device the SoC enforces the delay. Without the entangled and hidden device key, an attacker could escape the delay enforcement on separate hardware, which would be the main motivation to try getting at the device key, and for Apple to counteract such attempts as securely as possible.

Correct: if you can extract the hardware key, then you can brute-force low-complexity passwords in a reasonable amount of time.
> But you can't get around the 80 ms in software, even if there are implementation errors that might allow you to get around the artificially introduced delays.

Because that delay is enforced by the SoC hardware, and the device key cannot be read out by software to move the brute-forcing to an external system.
> But you can't get around the 80 ms in software, even if there are implementation errors that might allow you to get around the artificially introduced delays.

Years ago there was a bug in iPhones that allowed getting around the delay. You can read the complete encrypted content of the SSD and later write it back.
> Years ago there was a bug in iPhones that allowed getting around the delay. You can read the complete encrypted content of the SSD and later write it back. Now, after five passcode attempts, obviously the number five has to be written somewhere and read again when you reboot the phone. It used to be written to the SSD. So if you made five attempts, shut down the phone, wrote the original SSD contents back with a zero count, and restarted the iPhone, you had five attempts again. Still time consuming, but you could crack 4 digits in a day or two. This was fixed by writing the count to the Secure Enclave, which is much, much harder to hack.

You can get around the programmed delay, assuming there is a suitable exploit, but you can't get around the cryptographic delay, not on-device anyway. That's the entire point of what that quote is saying.
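The rollback trick can be modeled in a few lines. Everything here is illustrative (the class name, the five-attempt threshold); the point is simply that a counter stored on attacker-restorable media enforces nothing:

```python
# Toy model of the old counter-rollback bug: the failed-attempt counter
# lived on the SSD, which an attacker could image and later write back.
class SsdCounter:
    def __init__(self):
        self.failed_attempts = 0

ssd = SsdCounter()
image = ssd.failed_attempts          # attacker snapshots the SSD at zero
guesses_made = 0
for _cycle in range(3):              # three reboot-and-restore cycles
    while ssd.failed_attempts < 5:   # guess until the limit is near
        ssd.failed_attempts += 1
        guesses_made += 1
    ssd.failed_attempts = image      # restore the original image

assert ssd.failed_attempts == 0      # the phone believes nothing happened
assert guesses_made == 15            # yet 15 guesses were actually made
```

Moving the counter into storage the Secure Enclave controls removes the attacker's ability to restore it, which is the nature of the fix.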
> Don't all phones these days reset or brick the phone after just a handful of wrong PIN entries?

Yes. And forensic investigators have discovered that if you reset the phone between each attempt, the counter tracking the number of attempts is cleared. Upside: you have to wait through the entire reboot cycle for each attempt. Downside: you have an infinite number of attempts with a fixed delay.
> Forensic investigators have discovered that if you reset the phone between each attempt, the counter tracking the number of attempts is cleared.

I believe that is no longer correct on modern iPhones; the counter isn't stored in RAM or on the SSD but in special Secure Enclave storage, which has undergone several upgrades over time.
And if this was modified so that the number of attempts was stored unencrypted on disk... well then, DFU mode would allow an attacker to continually reset that value to 0.
> I believe that is no longer correct on modern iPhones; the counter isn't stored in RAM or on the SSD but in special Secure Enclave storage, which has undergone several upgrades over time.

I believe they found a way to leverage debug mode to get around that? And that technique is disabled if the phone is in Lockdown Mode.
Cellebrite can't brute force any device that has the 2nd-gen Secure Enclave storage component.
> I believe they found a way to leverage debug mode to get around that? And that technique is disabled if the phone is in Lockdown Mode.

Not according to their roadmap, which I linked earlier; it shows all devices from the iPhone 12 onwards as only accessible after first unlock.
> Obviously, as she herself must have trained Touch ID to recognize her own finger, and she will have unlocked her Mac routinely with it.

I don't have anything Apple... but I've had that problem frequently all over the place: PCs, Android devices, smart locks, Busch Gardens (which has apparently used fingerprints to verify passes for some years), work ID cards... It usually takes half a dozen tries to get the system to recognize my finger exists at all, and then it's a toss-up, if it saves the "data", whether or not it will work later or come up as a mismatch.
With actual Touch ID on an Apple product? That has been extremely reliable for me across one iPhone, one iPad and one Mac for several years by now.
But it also depends on how well your actual fingerprints can be detected, which can vary individually (also does not work well with wet hands).
> Apple says that Lockdown Mode "helps protect devices against extremely rare and highly sophisticated cyber attacks," and is "designed for the very few individuals who, because of who they are or what they do, might be personally targeted by some of the most sophisticated digital threats."

Well, with Tim Cook sucking Donald Trump's D... kissing his ass last year, I'm betting Apple's historical stance on privacy is a bit wobbly (the CSAM situation in 2018 notwithstanding).
> I don't get how courts have ruled biometrics can be compelled, but passwords not. But here we are. I hope journalists and others who are targets use strong passwords.

The distinction is simple: courts have generally treated a password as testimonial (the contents of your mind, protected against compelled self-incrimination), while a fingerprint or face is treated like physical evidence, akin to a key.
> Well, with Tim Cook sucking Donald Trump's D... kissing his ass last year, I'm betting Apple's historical stance on privacy is a bit wobbly (the CSAM situation in 2018 notwithstanding).

That wasn't kissing his arse, that was playing him. Trump is not the brightest bulb in any room, unless he is on his own. Giving him a shiny trinket was enough to satisfy his desire to be admired, and Tim Cook gave him that trinket to avoid any situation where fighting for privacy is needed.
> That wasn't kissing his arse, that was playing him. Giving him a shiny trinket was enough to satisfy his desire to be admired, and Tim Cook gave him that trinket to avoid any situation where fighting for privacy is needed.

Indeed. There is no evidence of Apple actually having compromised on any of their relevant policies.
> I don't have anything Apple... but I've had that problem frequently all over the place: PCs, Android devices, smart locks, Busch Gardens, work ID cards... It usually takes half a dozen tries to get the system to recognize my finger exists at all. At work I was told they think it's because my fingers are too dry... but then, using moisturizer, my finger is detected reliably, yet it gets on the sensor, and then it can't read the print through the smudges, so it still fails.

Apple's Touch ID is one of the most solid implementations (both in terms of recognizing correct and rejecting incorrect fingerprints), and many others are not quite on the same level. But besides wetness, I've had a bit of trouble with it after some strenuous manual labour which had roughed up my fingertips.