It's not entirely clear how the exploit works. Microsoft says it's investigating.
See full article...
Before I could even consider using Bitlocker, a good friend lost ALL his stuff after he became unable to access/decrypt his drive. Perhaps he's stupid, but some time after his move to Linux the same thing happened AGAIN.
So a Big Nope to encrypted drives for me thank you.
Not that it would be useful.
I have nothing on my drives that needs this kind of protection. EXCEPT MY DS9 BACKUP! Not sure if that's still as precious as it once was though, I should check if there's an updated higher quality version.
> 3. Boot up the device and immediately press and hold down the [Ctrl] key

There are at least two ways to accomplish the third step. One way is to boot into Windows, hold down the [Shift] key, click on the power icon, and click Restart. Another is to power on the device and restart it as soon as Windows starts booting.
> Most employers tend to brick USB thumb drive functionality because of all the things that can now go wrong with them... but are those locks at the OS or the UEFI level?

I suppose that depends on the employer. IIRC, a lot of security tools can lock that at the OS level, or at least notify a SIEM that a USB device has been inserted and take action based on that. Of course, there's nothing stopping an employer from preventing USB access in the BIOS, although that could have unintended downsides. My work laptop relies on USB-C for charging and peripherals (docking station, external monitors, KBM, etc.). Specifically preventing USB boot should be pretty easy, though. They would also need to password-protect the BIOS at that point, or else the employee could just change it back.
> Bitlocker is only supposed to cough up the key if the boot chain is verified to be the correct OS. Somehow WinRE (in Win11 only) got "signed" (used loosely) as being the same as the main OS, and since it isn't as hardened, the other mysterious flaws provide a way to get right to a command prompt after it passes the boot-chain test.

So it's not technically defeating Bitlocker encryption at all, is it? It's just using the TPM to cough up the key, then using the key, so it's defeated the TPM.
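For anyone who hasn't seen how measured boot works under the hood, here's a toy Python model of the "seal the key to the boot chain" idea being discussed. This is not the actual TPM protocol, just the shape of it; the component names and key are made up:

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    # TPM-style PCR extend: new = SHA-256(old_pcr || SHA-256(component))
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

def measure_boot_chain(components):
    pcr = b"\x00" * 32  # PCRs reset to zeros at power-on
    for c in components:
        pcr = extend(pcr, c)
    return pcr

# "Seal" the volume key against the expected boot measurement.
EXPECTED = measure_boot_chain([b"bootmgr", b"winload", b"main OS"])
VOLUME_KEY = b"toy-volume-master-key"

def tpm_unseal(measured_pcr):
    # The TPM releases the key only when the live measurement matches
    # the one the key was sealed against.
    return VOLUME_KEY if measured_pcr == EXPECTED else None

# Booting the expected chain reproduces the measurement, so the key is released...
assert tpm_unseal(measure_boot_chain([b"bootmgr", b"winload", b"main OS"])) == VOLUME_KEY
# ...but a modified chain yields a different PCR and no key.
assert tpm_unseal(measure_boot_chain([b"bootmgr", b"evil loader"])) is None
```

The complaint upthread is the toy equivalent of WinRE measuring the same as `b"main OS"`: the TPM then hands the key to a much less hardened environment that still passes the check.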
YellowKey can be triggered by merely copying some files to a USB stick and rebooting into the Windows Recovery Environment. We tested this ourselves, and sure enough, not only does it work, it bears all the hallmarks of a backdoor, down to the exploit's files disappearing from the USB stick after it's used once.
Other sites report that the exploit files in the FsTx directory are deleted when the access occurs. That makes it look a lot more like a backdoor than a bug. If true, that seems to be quite an oversight in this article?
https://www.tomshardware.com/tech-i...day-exploit-demonstrates-an-apparent-backdoor
> I know the rule is "physical access = game over", but I didn't think we were going to go back to the days where I could fire up chntpw (or equivalent) tools from a live USB and just have open access to a computer's filesystem.

Seconding this. It’s one thing to have that threat model accounted for, but when the actions needed by the malicious user are so quick and relatively trivial to perform … I liken it to getting fragged before you’ve even begun to pick up your first weapon.
> It reliably bypasses default Windows 11 deployments of BitLocker, the full-volume encryption protection Microsoft provides to make disk contents off-limits to anyone without the decryption key

So basically, now is the winter of our disk content.
> I thought that was odd also, but I have no idea why "files disappearing" signifies "backdoor". Sounds like a movie thing.

Because it implies intent. Skipping a security check or flag can be treated as a (severe) bug or oversight (e.g. Apple's goto fail). Bypassing a security check in the presence of a file on another drive, then deleting that file, and handing over unfettered admin access to a computer takes actual work.
> I thought that was odd also, but I have no idea why "files disappearing" signifies "backdoor". Sounds like a movie thing.

It's not a movie thing at all. It's just plain old good OpSec. With few exceptions, you want there to be zero evidence of a penetration. To avoid the human fallibility factor, you automate the cleanup.
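For what it's worth, "automate the cleanup" is trivial to do and as old as the hills. A toy Python sketch, nothing exploit-specific and with invented file names: a payload that does its work and then removes its own file, so there's nothing left to find afterwards.

```python
import os
import subprocess
import sys
import tempfile

# Write a tiny "payload" script that deletes its own file after running.
payload = (
    "import os\n"
    "open('marker.txt', 'w').write('work done')\n"
    "os.remove(__file__)  # leave no file behind once used\n"
)
workdir = tempfile.mkdtemp()
script = os.path.join(workdir, "payload.py")
with open(script, "w") as f:
    f.write(payload)

# Run it once, as an attacker (or a backdoor trigger) would.
subprocess.run([sys.executable, script], cwd=workdir, check=True)

assert os.path.exists(os.path.join(workdir, "marker.txt"))  # the work happened
assert not os.path.exists(script)                           # the payload is gone
```

Which is exactly why "the files vanish after one use" reads as deliberate design to some people rather than a side effect.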
> Other sites report that the exploit files in FsTx directory are deleted when the access occurs. That makes it look a lot more like a backdoor than a bug. If true that seems to be quite an oversight in this article?
> https://www.tomshardware.com/tech-i...day-exploit-demonstrates-an-apparent-backdoor
> It's really a wild one—someone (maybe Kevin Beaumont, actually) speculated that it could be an insider who worked on this and was asked to build a backdoor.

This was the first thing crossing my mind when I read this. If it had been a Chinese system, everyone would have called it a backdoor. Strange indeed.
And I have confirmed it does work just fine on my supposedly secure work Dell.
> I know the rule is "physical access = game over", but I didn't think we were going to go back to the days where I could fire up chntpw (or equivalent) tools from a live USB and just have open access to a computer's filesystem.

Yeah, generally that's the case. So with all the steps this takes to implement an attack, it's not likely to be AS widespread as remote attacks are. But focused penetration of encrypted data also typically requires hands-on access.
> While using BIOS password locks is a good practice, it’s unclear how they provide any protection against this particular exploit.

Doesn't setting a BIOS password prevent the computer from booting past the BIOS? In that case you cannot get to Windows Recovery and therefore the exploit can't even start to work? I assume the exploit doesn't work if you move it to another PC since the TPM keys are stored on the original device, which is still required to decrypt its contents.
> It's not a movie thing at all. It's just plain old good OpSec. With few exceptions, you want there to be zero evidence of a penetration. To avoid the human fallibility factor, you automate the cleanup.
> So basically, now is the winter of our disk content.
> Other sites report that the exploit files in FsTx directory are deleted when the access occurs. That makes it look a lot more like a backdoor than a bug. If true that seems to be quite an oversight in this article?
> https://www.tomshardware.com/tech-i...day-exploit-demonstrates-an-apparent-backdoor

I just tested the exploit. The files on the USB stick do not disappear.
> This was the first thing crossing my mind when I read this. If it had been a Chinese system, everyone would have called it a backdoor. Strange indeed

I’ve not seen anyone call Linux zero-days a backdoor, even though Linux is open source.
While people are immediately jumping to backdoor, isn't YellowKey exploiting transactional NTFS in some way? Wouldn't that imply that the files are possibly just being misinterpreted as some sort of unfinished transaction, and possibly just being "cleaned up" after the transaction is completed? (And, somehow, also causing the exploit to fire off)
> This was the first thing crossing my mind when I read this. If it had been a Chinese system, everyone would have called it a backdoor. Strange indeed

OP was being sarcastic, friend.
> I wonder if it's not a kernel issue—because as far as I was aware, only the kernel holds the certificate to request the TPM key, and nothing else can request it. As for the rest of it ... it's been a long while since I looked at Windows code, but the USN journal has to be stored on the volume it's attached to, except in truly weird circumstances (specifically, I'm only aware of WHS doing something wacky with it, and if you're doing offline volume recovery).

As soon as I saw "transactional NTFS" and "atomicity", yeah, it sounded like an analog to a standard SQL database "begin transaction" log that keeps everything until a "commit" in a memory/disk buffer. If you leave off the commit, power off the database (not a clean exit), and then restart it, its default "recovery" mode is likely to read any transaction logs and "undo" any changes it made, getting back to the consistent "before" state to avoid inconsistent data, THEN erase the log to avoid replaying it AGAIN later, officially roll back those transactions / notify the user, and then begin normal startup mode.
Since this is a FILE-system based transaction log (speculation based on the terms used in the description/article), maybe it injects a fake record saying "I changed file/system_value is_bitlocker_password_required from 0 to 1", so the "undo" operation changes it back to 0 (not literally this, but the Windows file/system equivalent, which only a system-level process can do, and NTFS transaction-log recovery is probably running as one). The system then thinks it already prompted the user for the BitLocker recovery code and starts the normal "get the key from the TPM and apply it" procedure, and voilà.
As to why it works from a different drive? That actually makes a lot of sense: if you run out of disk space while doing an operation, you can't actually write data to the log to store that information, unless there's some kind of Unix-like "5% reserved for root only" setup, which, AFAIK, does not exist for Windows. It totally would not surprise me if it supports a transaction log on a DIFFERENT volume, especially if there is a chance of breaking the original volume.
Edit: my guess is that this is a feature of NTFS that was never fully finished or never found a good use (NTFS came out in 1993, when SQL databases were all the rage) but was also never fully deprecated / disabled / hardened. Of course, all speculation.
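The SQL analogy above can be sketched in a few lines of Python. This is a generic write-ahead undo log, not actual TxF/CLFS behavior, and the flag name is invented; the point is just how "recovery" mechanically restores whatever "old value" it finds in the log and then erases the log:

```python
# Toy write-ahead undo log: before each change the OLD value is logged;
# crash recovery replays the log backwards to restore the pre-crash
# state, then erases the log so records are never replayed twice.
system_state = {"recovery_prompt_required": 1}
undo_log = []

def transactional_set(key, new_value):
    undo_log.append((key, system_state[key]))  # record old value first
    system_state[key] = new_value

def recover():
    while undo_log:                      # uncommitted work found
        key, old_value = undo_log.pop()  # undo newest change first
        system_state[key] = old_value
    # log is now empty, so nothing gets replayed again later

# Normal case: a transaction starts, the "machine" crashes before the
# commit, and recovery rolls the change back.
transactional_set("recovery_prompt_required", 0)
recover()
assert system_state["recovery_prompt_required"] == 1

# The speculated abuse: plant a FORGED record claiming the old value
# was 0, then trigger recovery, which dutifully "restores" it.
undo_log.append(("recovery_prompt_required", 0))
recover()
assert system_state["recovery_prompt_required"] == 0
```

The recovery code has no way to distinguish a forged record from a genuine one; it trusts the log completely, which is the crux of the speculation.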
> x:\ is not visible. c:\ on the machine when it's running.

x:\Users, but that folder is empty, save for a Public folder. None of the actual home folders from the laptop drive are in there.
> Most employers tend to brick USB thumb drive functionality because of all the things that can now go wrong with them... but are those locks at the OS or the UEFI level?

They disable USB storage, not USB entirely; most firmware will allow disabling USB only by shutting down the controller, which borks keyboards etc. that may be in use. That's why USB disables are generally done in the OS, which allows finer-grained control, and that is wide open to the recovery partition.
It is kind of wild to me that this little-documented system folder/DLL has a functioning 0-day exploit. Maybe someone who was laid off from MS in favor of AI, and who knew about this folder/DLL, released it?
> Doesn't setting a BIOS password prevent the computer from booting past the BIOS? In that case you cannot get to Windows Recovery and therefore the exploit can't even start to work? I assume the exploit doesn't work if you move it to another PC since the TPM keys are stored on the original device which is still required to decrypt its contents.

You can usually set a BIOS/UEFI "boot" password, which prevents booting at all, or a BIOS "admin" password, which prevents accessing BIOS/UEFI settings. The "admin" password plus disabling bootable USB / bootable external devices would be enough to render this moot while leaving USB functional as data storage for users. If you just hard-disable USB, it won't work inside Windows for legitimate users either.
Most modern laptops do not remove or reset the password if the BIOS is reset or the battery is removed, based on a Google (AI?) search.
As for when the BIOS password does get reset, maybe this exploit doesn't work if the system is forced to go straight to Bitlocker recovery mode, which a BIOS reset usually triggers anyway?
> plus disabling bootable USB / bootable external devices would be enough to render this moot

I don’t see where USB boot is involved in this exploit.
> So basically, now is the winter of our disk content.

Your coat, sir. Suggest leaving through the back, there's an angry mob out front.