Zero-day exploit completely defeats default Windows 11 BitLocker protections

Most employers tend to brick USB thumb drive functionality because of all the things that can now go wrong with them....but are those locks at the OS or the UEFI level?

It is kind of wild to me that this little-documented system folder/DLL has a functioning 0-day exploit. Maybe someone who knew about this folder/DLL and got laid off from MS in the AI cuts released it?
 
Upvote
63 (63 / 0)
So it's not technically defeating Bitlocker encryption at all, is it?

It's just using the TPM to cough up the key, then using the key, so it's defeated the TPM.

Edit: to clarify, my very limited understanding of the TPM role is that it is meant to hold the keys and release them to decrypt the drive and thus enable boot of a properly signed OS image. (I've had an update break this with openSUSE).
It seems the exploit here is just in allowing a different boot state to be accessed?
 
Last edited:
Upvote
-14 (11 / -25)

Rirere

Ars Centurion
324
Subscriptor++
Before I could even consider the use of Bitlocker, a good friend lost ALL his stuff after he became unable to access/decrypt his drive. Perhaps he's stupid, but some time after his move to Linux the same thing happened AGAIN.
So a Big Nope to encrypted drives for me thank you.

Not that it would be useful.
I have nothing on my drives that needs this kind of protection. EXCEPT MY DS9 BACKUP! Not sure if that's still as precious as it once was though, I should check if there's an updated higher quality version.

You know, with anecdata like that I'm surprised you store things on electronic media at all. After all, I've got plenty of friends who had flash memory die or suffered a head crash on spinning rust. Guess that means none of those new-fangled computers for me!

Seriously, though: while physical access is generally taken to mean game over against a serious adversary, there's no reason to make things easy by leaving your figurative door unlocked in case your computer is ever lost or stolen.
 
Upvote
49 (53 / -4)

mg224

Ars Scholae Palatinae
1,371
Subscriptor
Moonshark, or an Ars editor, might want to check this paragraph

There are at least two ways to accomplish the third step. One way is to boot into Windows, hold down the [Shift] key, click on the power icon, and click restart. Another is to power on the device and restart it as soon as Windows starts booting.

Given that the third step is

3. Boot up the device and immediately press and hold down the [Ctrl] key

The two don't seem coherent.

Edit - fixing the numeral in the quoted list to 3 not 1.
 
Upvote
32 (35 / -3)

r0twhylr

Ars Praefectus
3,457
Subscriptor++
Between this and the Linux CVEs that have come out lately, I can't remember a time when so many critical vulnerabilities went public in such a short amount of time. It kinda makes me want to fire up some BSD or OpenSolaris as my desktop.

Does this have any effect on images of bitlocker-protected devices? IOW, if DHS/ICE/LE imaged someone's device some time in the past, is this game over? It sounds like they still need the physical TPM to be present.

Most employers tend to brick USB thumb drive functionality because of all the things that can now go wrong with them....but are those locks at the OS or the UEFI level?
I suppose that depends on the employer. IIRC, a lot of security tools can lock that at the OS level, or at least notify a SIEM that a USB device has been inserted and take action based on that. Of course, there's nothing stopping an employer from preventing USB access in the BIOS, although that could have unintended downsides. My work laptop relies on USB-C for charging and peripherals (docking station, external monitors, KBM, etc). Specifically preventing USB boot should be pretty easy though. They would also need to password-protect the BIOS at that point, or else the employee could just change it back.
 
Last edited:
Upvote
19 (21 / -2)

Alpha Lupi

Smack-Fu Master, in training
30
So it's not technically defeating Bitlocker encryption at all is it?

It's just using the TPM to cough up the key, then using the key, so it's defeated the TPM.
Bitlocker is only supposed to cough up the key if the boot chain is verified to be the correct OS. Somehow WinRE (in Win11 only) got "signed" (used loosely) to be the same as the main OS, and since it isn't as hardened, the other mysterious flaws provide a way to get it right to a command prompt after making it pass the boot-chain test.

I've read elsewhere that the researcher claims to be withholding a further exploit for tpm+pin.

Reportedly, users can choose to disable WinRE functionality to avoid this exploit until a real fix is made, via this command

reagentc /disable
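
To illustrate the boot-chain check described above, here is a toy Python model of TPM-style measured boot and key sealing. This is purely an analogy: the component names, the single-PCR simplification, and the "VMK" placeholder are all illustrative, not actual BitLocker/TPM internals.

```python
import hashlib

def extend_pcr(pcr, component):
    # TPM-style extend: new PCR = SHA-256(old PCR || SHA-256(component))
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

def measure_chain(components):
    # Each boot component is measured into the PCR in order.
    pcr = b"\x00" * 32
    for c in components:
        pcr = extend_pcr(pcr, c)
    return pcr

def unseal(measured_pcr, expected_pcr, key):
    # The TPM releases the sealed key only if the measurements match.
    return key if measured_pcr == expected_pcr else None

# PCR value recorded when the disk key was sealed to the trusted chain.
expected = measure_chain([b"firmware", b"bootmgr", b"winload"])

# Normal boot reproduces the measurements, so the key is released.
print(unseal(measure_chain([b"firmware", b"bootmgr", b"winload"]), expected, b"VMK"))
# A tampered component changes the PCR, so no key comes out.
print(unseal(measure_chain([b"firmware", b"evilboot", b"winload"]), expected, b"VMK"))
```

The point of the analogy: if WinRE measures the same as the main OS (as described above), the TPM releases the key to it too, and any weakness in WinRE then inherits full access.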
 
Upvote
45 (46 / -1)
Other sites report that the exploit files in the FsTx directory are deleted when the access occurs. That makes it look a lot more like a backdoor than a bug. If true, that seems to be quite an oversight in this article?


YellowKey can be triggered simply by merely copying some files to a USB stick and rebooting to the Windows Recovery Environment. We tested this ourselves, and sure enough, not only does it work, it bears all the hallmarks of a backdoor, down to the exploit's files disappearing from the USB stick after it's used once.

https://www.tomshardware.com/tech-i...day-exploit-demonstrates-an-apparent-backdoor
 
Upvote
25 (28 / -3)
I know the rule is "physical access = game over", but I didn't think we were going to go back to the days where I could fire up chntpw (or equivalent) tools from a live USB and just have open access to a computer's filesystem.
Seconding this. It’s one thing to have that threat model accounted for, but when the actions needed by the malicious user are so quick and relatively trivial to perform … I liken it to getting fragged before you’ve even begun to pick up your first weapon.
 
Upvote
10 (11 / -1)
I thought that was odd also, but I have no idea why "files disappearing" signifies "backdoor". Sounds like a movie thing.
Because it implies intent. Skipping a security check or flag can be treated as a (severe) bug or oversight (e.g. Apple's goto fail). Bypassing a security check in the presence of a file on another drive, then deleting that file, and handing over unfettered admin access to a computer takes actual work.

And it makes sense for these kinds of exploits to delete their traces after use. They are not script-kiddy things to infect random computers on the internet. You don't want to leave traces on the computer or yourself after using them. See for example Stuxnet being designed to delete itself after a certain date. Or the plethora of malware living in RAM only.

Let's see if we get a detailed explanation of how this happened out of Microsoft 🤷‍♂️
 
Upvote
70 (71 / -1)

Nilt

Ars Legatus Legionis
21,828
Subscriptor++
I thought that was odd also, but I have no idea why "files disappearing" signifies "backdoor". Sounds like a movie thing.
It's not a movie thing at all. It's just plain old good OpSec. With few exceptions, you want there to be zero evidence of a penetration. To avoid the human fallibility factor, you automate the cleanup.
 
Upvote
40 (40 / 0)

Sukasa

Ars Centurion
235
Subscriptor++
Other sites report that the exploit files in the FsTx directory are deleted when the access occurs. That makes it look a lot more like a backdoor than a bug. If true, that seems to be quite an oversight in this article?

https://www.tomshardware.com/tech-i...day-exploit-demonstrates-an-apparent-backdoor

While people are immediately jumping to backdoor, isn't YellowKey exploiting transactional NTFS in some way? Wouldn't that imply that the files are possibly just being misinterpreted as some sort of unfinished transaction, and possibly just being "cleaned up" after the transaction is completed? (And, somehow, also causing the exploit to fire off)

E: To add on to this, my understanding is that this requires an external USB key with the files. So why bother erasing the files from that USB if it's an intentional exploit, when a hypothetical attacker is almost certainly not going to leave the incriminating thumb drive behind?
 
Last edited:
Upvote
38 (40 / -2)

gautier

Ars Praetorian
564
Subscriptor++
It's really a wild one—someone (maybe Kevin Beaumont, actually) speculated that it could be an insider who worked on this and was asked to build a backdoor.

And I have confirmed it does work just fine on my supposedly secure work Dell.
This was the first thing that crossed my mind when I read this. If it had been a Chinese system, everyone would have called it a backdoor. Strange indeed.
 
Upvote
10 (15 / -5)

Fatesrider

Ars Legatus Legionis
25,280
Subscriptor
I know the rule is "physical access = game over", but I didn't think we were going to go back to the days where I could fire up chntpw (or equivalent) tools from a live USB and just have open access to a computer's filesystem.
Yeah, generally that's the case. So with all the steps this takes to implement an attack, it's not likely to be AS widespread as remote attacks are. But focused penetration of encrypted data also typically requires hands-on access anyway.

Which usually means you're going to be pwned anyhow.
 
Upvote
-4 (0 / -4)

riri0

Smack-Fu Master, in training
15
While using BIOS password locks is a good practice, it’s unclear how they provide any protection against this particular exploit.
Doesn't setting a BIOS password prevent the computer from booting past the BIOS? In that case you cannot get to Windows Recovery and therefore the exploit can't even start to work? I assume the exploit doesn't work if you move it to another PC since the TPM keys are stored on the original device which is still required to decrypt its contents.

Based on a Google (AI?) search, most modern laptops do not remove or reset the password if the BIOS was reset or the battery was removed.

As for when the BIOS password does get reset, maybe this exploit doesn't work if the system is forced to go straight to Bitlocker recovery mode, which a BIOS reset usually triggers anyway?
 
Upvote
10 (14 / -4)
It's not a movie thing at all. It's just plain old good OpSec. With few exceptions, you want there to be zero evidence of a penetration. To avoid the human fallibility factor, you automate the cleanup.

Makes sense except that it's the hacker's removable USB drive. The way it was noted in the Tom's article without explanation was odd to me. Shrug.
 
Upvote
-1 (3 / -4)
Other sites report that the exploit files in the FsTx directory are deleted when the access occurs. That makes it look a lot more like a backdoor than a bug. If true, that seems to be quite an oversight in this article?

https://www.tomshardware.com/tech-i...day-exploit-demonstrates-an-apparent-backdoor
I just tested the exploit. The files on the USB stick do not disappear.

Edit: Scratch that, I just double-checked. The FsTx folder remains (what I originally checked) but the contents do indeed disappear.
 
Last edited:
Upvote
24 (24 / 0)
While people are immediately jumping to backdoor, isn't YellowKey exploiting transactional NTFS in some way? Wouldn't that imply that the files are possibly just being misinterpreted as some sort of unfinished transaction, and possibly just being "cleaned up" after the transaction is completed? (And, somehow, also causing the exploit to fire off)

As soon as I saw "transactional NTFS" and "atomicity", yeah, it sounded like an analog to a standard SQL database "begin transaction" log that keeps everything until a "commit" in a memory/disk buffer. If you leave off the commit, power off the database (not a clean exit), and then restart it, its default "recovery" mode is likely to read any transaction logs and "undo" any changes it made to get back to the consistent "before" state to avoid any inconsistent data, and THEN erase the log to avoid replaying it AGAIN later, and rollback those transactions officially / notify the user and then begin normal startup mode.

Since this is a FILE-system based transaction log (speculation based on the terms used in the description / article), maybe it injects a fake "I changed file/system_value is_bitlocker_password_required from 0 to 1", so the "undo" operation changes it back to 0 (not directly this, but the windows file/system equivalent, which only a system-level process can do, which NTFS transaction log recovery probably is running as), so the system thinks it already prompted the user for the bitlocker recovery code and starts the normal "get the key from the TPM and apply it" procedure and voila..

As to why it works from a different drive? Actually makes a lot of sense - if you run out of disk space while doing an operation.. You can't actually write data to the log to store that information, unless there's some kind of unix-like "5% reserved for root only" setup, which, AFAIK, does not exist for windows. Totally would not surprise me that it supports a transaction log on a DIFFERENT volume, especially if there is a chance of breaking the original volume.

Edit: my guess is that this is a feature of NTFS that was never fully finished or just never found a good use - NTFS came out in 1993, when SQL databases were all the rage - but was never fully deprecated / disabled / hardened. Of course, all speculation.
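
The rollback sequence speculated above can be sketched as a toy undo log. Everything here is hypothetical, following the post's speculation: the flag name, the log format, and the recovery behavior are made up for illustration, not taken from NTFS internals.

```python
# System state as shipped: the recovery prompt is required.
system_state = {"recovery_prompt_required": True}

# A planted, deliberately uncommitted undo-log entry whose recorded
# "old value" is actually the attacker's desired setting.
undo_log = [("recovery_prompt_required", False)]
committed = False

def crash_recovery(state, log, was_committed):
    """After an unclean shutdown, roll back any uncommitted transaction
    by restoring each recorded 'old' value, then erase the log so it is
    never replayed again."""
    if not was_committed:
        for key, old_value in reversed(log):
            state[key] = old_value
    log.clear()  # the cleanup step: the log files "disappear" after use
    return state

crash_recovery(system_state, undo_log, committed)
print(system_state)  # the "rolled back" value is the attacker's value
print(undo_log)      # the evidence is gone
```

Under this (speculative) model, the file deletion is just ordinary transaction-log hygiene, which is the point the post above is making against the "backdoor" reading.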
 
Last edited:
Upvote
40 (41 / -1)

vnangia

Ars Scholae Palatinae
824
As soon as I saw "transactional NTFS" and "atomicity", yeah, it sounded like an analog to a standard SQL database "begin transaction" log that keeps everything until a "commit" in a memory/disk buffer. If you leave off the commit, power off the database (not a clean exit), and then restart it, its default "recovery" mode is likely to read any transaction logs and "undo" any changes it made to get back to the consistent "before" state to avoid any inconsistent data, and THEN erase the log to avoid replaying it AGAIN later, and rollback those transactions officially / notify the user and then begin normal startup mode.

Since this is a FILE-system based transaction log (speculation based on the terms used in the description / article), maybe it injects a fake "I changed file/system_value is_bitlocker_password_required from 0 to 1", so the "undo" operation changes it back to 0 (not directly this, but the windows file/system equivalent, which only a system-level process can do, which NTFS transaction log recovery probably is running as), so the system thinks it already prompted the user for the bitlocker recovery code and starts the normal "get the key from the TPM and apply it" procedure and voila..

As to why it works from a different drive? Actually makes a lot of sense - if you run out of disk space while doing an operation.. You can't actually write data to the log to store that information, unless there's some kind of unix-like "5% reserved for root only" setup, which, AFAIK, does not exist for windows. Totally would not surprise me that it supports a transaction log on a DIFFERENT volume, especially if there is a chance of breaking the original volume.

Edit: my guess is that this is a feature of NTFS that was never fully finished or just never found a good use - NTFS came out in 1993, when SQL databases were all the rage - but was never fully deprecated / disabled / hardened. Of course, all speculation.
I wonder if it's not a kernel issue—because as far as I was aware, only the kernel holds the certificate to request the TPM key, and nothing else can request it. As for the rest of it ... it's been a long while since I looked at Windows code, but the USN journal has to be stored on the volume it's attached to, except in truly weird circumstances (specifically, I'm only aware of WHS doing something wacky with it, and if you're doing offline volume recovery).
 
Upvote
4 (4 / 0)

Smeghead

Ars Praefectus
4,642
Subscriptor
We're in the middle of something that this is quite relevant to, so I figured I'd give it a try for myself. The short version is that I can only partially reproduce the described results.

I have a laptop that I have basically no rights to other than a standard user account that's been further restricted. I cloned the git repo, I prepared a USB drive on another machine, then copied the magic folder and its contents to the correct path on the USB drive.

I went through the steps to reboot the running laptop into recovery mode with the drive attached, and ended up at the shell as described.

What I don't appear to have, however, is "unrestricted access to the bitlocker protected volume" (direct quote from step 5 of the instructions on the git page). I can drop to x:\Users, but that folder is empty, save for a Public folder. None of the actual home folders from the laptop drive are in there.

Similarly, stuff that should be at the root of x:\ is not visible.

As I mentioned, I don't have the rights to the laptop to (say) put anything in the windows dir to prove one way or another that this providing full access. Same goes for the root of c:\ on the machine when it's running.

Yes, I get that at this level, you could potentially mess with the contents of the windows dir to open things up when rebooted. However, that takes a good bit more effort than boot-the-machine-and-copy-files.

I'm trying to understand what, if anything, we might be able to do differently in the face of this sort of thing. Am I missing something, or is this not quite what it seems for the set of restrictions placed on this system?

Edit: Thank you, bbcode for interpreting colon-backslash. Let me see if I can fix that. :\
 
Last edited:
Upvote
0 (2 / -2)

Negative Entropy

Ars Scholae Palatinae
604
Subscriptor++
Our standard setup includes:
BIOS settings:
1) prevent boot from usb
2) prevent boot from anything except the internal SSD by requiring the BIOS password for said altered boot paths
3) bios visual access and modification requires bios admin password

256 bit AES bitlocker
And, on laptops:
1) bitlocker boot PIN required

So it sounds like our workstations would be susceptible if a user can trigger a "boot into recovery mode" from within Windows, since we leave the recovery partition enabled and the USB is only read for files, not used as a boot drive (no boot PIN required on workstations). Our laptops would not be, due to the boot PIN.
 
Upvote
2 (3 / -1)

rhavenn

Ars Tribunus Militum
1,808
Subscriptor++
Technically, BitLocker + TPM only ever really protected you from the drive being removed from your device, and it still does. This hack / backdoor / bug only works if the USB is plugged into the same device the drive came from. Basically, this is more about bypassing the "login" and getting the TPM to "unlock" the drive in recovery mode.

Personally, my take is that this is obscure enough that I'm 50/50 on whether it's a purpose-built backdoor or just the result of multiple teams working on stuff, meeting "bullet points," and never really communicating about the finished product, which would make it more of a "bug."

That being said; the author of the hack also states that they have a way to get around TPM+PIN and that's a scary one.

Technically, if I'm remembering right, just like LUKS on Linux, the BitLocker volume is locked by an encryption key, and the TPM / TPM + PIN / password just unlocks that encryption key. Possibly, just like with LUKS, there are multiple "slots" for different unlock methods, and this is a backdoor that runs one of those "unlocks": you can also store "admin" recovery keys in AD for BitLocker recovery, and that would just be another "slot" that unlocks the encryption key.

Personally, on Linux, if I had to make it secure, I would use 2 FIDO2 keys in 2 LUKS slots and a really long password printed out and in a safe in a 3rd slot. Keep one key for the "admins" and a user gets their own key.
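
The multi-slot scheme described above can be sketched like this. It's a toy model only: a bare XOR pad stands in for the real passphrase KDF plus AES key wrap that LUKS uses, and all the slot names and secrets are hypothetical.

```python
import hashlib
import os

def wrap(master_key, secret):
    # Toy key wrap: XOR the 32-byte master key with a secret-derived pad.
    # (Real LUKS runs the passphrase through a KDF and uses AES, not XOR.)
    pad = hashlib.sha256(secret).digest()
    return bytes(a ^ b for a, b in zip(master_key, pad))

unwrap = wrap  # XOR is its own inverse

# One volume master key; several independent "slots" can each release it.
master_key = os.urandom(32)
slots = {
    "user_fido2":  wrap(master_key, b"user-token-secret"),
    "admin_fido2": wrap(master_key, b"admin-token-secret"),
    "recovery":    wrap(master_key, b"long-passphrase-printed-and-kept-in-a-safe"),
}

# Any single slot secret recovers the same master key.
print(unwrap(slots["recovery"], b"long-passphrase-printed-and-kept-in-a-safe") == master_key)
print(unwrap(slots["user_fido2"], b"user-token-secret") == master_key)
```

This is why an AD-escrowed recovery key and a user's TPM unlock can coexist: they are separate wrappings of the same underlying volume key, so compromising any one slot compromises the volume.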
 
Last edited:
Upvote
-1 (3 / -4)

Wbd

Seniorius Lurkius
42
Subscriptor++
Most employers tend to brick USB thumb drive functionality because of all the things that can now go wrong with them....but are those locks at the OS or the UEFI level?

It is kind of wild to me this little documented system folder/DLL has a function 0 day exploit. Someone AI laid off from MS who knew about this folder/DLL release it maybe?
They disable USB storage, not USB entirely; most firmware can only disable USB by shutting down the controller, which borks keyboards etc. that may be in use. That's why USB restrictions are generally done in the OS, which allows finer-grained control, and OS-level restrictions do nothing once you're in the recovery partition.
 
Upvote
9 (9 / 0)

rhavenn

Ars Tribunus Militum
1,808
Subscriptor++
Doesn't setting a BIOS password prevent the computer from booting past the BIOS? In that case you cannot get to Windows Recovery and therefore the exploit can't even start to work? I assume the exploit doesn't work if you move it to another PC since the TPM keys are stored on the original device which is still required to decrypt its contents.

Most modern Laptops do no remove or reset the password if the BIOS was reset or the battery was removed based on a Google (AI?) search.

As for when the BIOS password does get reset, maybe this exploit doesn't work if the system is forced to go straight to Bitlocker recovery mode, which a BIOS reset usually triggers anyway?
You can usually set a BIOS / UEFI "boot" password, which prevents booting at all, or a BIOS "admin" password, which prevents access to BIOS / UEFI settings. The "admin" password plus disabling bootable USB / bootable external devices would be enough to render this moot and leave the USB still functional as "data storage" for users. If you just hard-disable USB, it won't work inside Windows for legit users either.
 
Upvote
1 (1 / 0)