I've used CIS benchmarks to do things like harden server operating systems for a very long time now. Some of the stuff was of questionable or borderline utility at best, but a lot of it was solid advice - disable unused services, reduce the number of places you can execute things, make sure you're capturing logs, etc. Over time you borrow/build up a good Ansible playbook or whatever to run during image building, and it ends up being a good investment, especially when you can point to the benchmarks on DDQs, etc.
Now that CIS has gone "cloud native" it all feels completely bonkers. Again, there is good advice in there for sure (don't expose your stuff directly to the internet, use MFA, etc.) but there is just so damn much now, and so much of it applies to multiple pieces of infrastructure. 10 checks per key vault and 68 key vaults in our production environment alone = 680 lines in a report. Even in our relatively small environment we'll easily end up with tens of thousands of benchmark data points.
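To put a number on how fast this explodes, here's a back-of-the-envelope sketch. The per-type check counts and resource counts below (other than the key vault figures from above) are made-up illustrative assumptions, not real CIS control counts:

```python
# Rough sketch of per-resource benchmark report growth:
# rows = (checks per resource type) x (number of resources of that type).
# Only the key_vault numbers come from the example above; the rest are invented.
resource_counts = {"key_vault": 68, "storage_account": 120, "vm": 45}
checks_per_type = {"key_vault": 10, "storage_account": 14, "vm": 30}

rows_per_type = {
    rtype: checks_per_type[rtype] * count
    for rtype, count in resource_counts.items()
}
total_rows = sum(rows_per_type.values())

print(rows_per_type["key_vault"])  # 10 checks x 68 vaults = 680 rows
print(total_rows)
```

Even with just three resource types at these modest counts, you're into the thousands of report lines; scale that across every service in a real subscription and "tens of thousands of data points" arrives quickly.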
Another annoying aspect of this is that many of the recommendations directly increase costs with questionable benefits (thou shalt have Defender for goddamn Key Vault!) - but if you don't adopt the recommendations, there is your NONCOMPLIANT right there in red and white in the Defender for Cloud dashboard and reporting, and everyone hates that, don't they, precious? The flip side of that coin is that clients have started asking for this level of detail on DDQ engagements, so either you dump your report (if it looks ok) or you suffer through insane bespoke spreadsheets, justifications, and evidence requests. It's a brilliant design if you're a cloud provider trying to sell security, but the net effect seems to be a gamified security theater treadmill with a questionable increase in actual security.
(Why yes, I'm old, and I'm especially cranky today, why do you ask?)