The Generative AI Bubble Really Is Going to Pop - Part Deux

As far as the AI bubble popping - I've followed Warren Buffett's often-cited strategy for two decades: constantly put as much as you can into a simple, low-cost S&P 500 index fund, since the vast majority of 'active' investment managers consistently fail to beat the general market in returns. The run-up in the S&P, though, has me second-guessing whether I should get out and go to money market. OTOH, I've stayed in without a single change during every market 'crisis' going back to 2008 (I was just starting to invest during the dot-com bust so it didn't affect me much), and however much I'd wished in retrospect that I'd 'gotten out' after every crash, the market has always come back stronger than ever. Conditions are scary, though. AI unsustainability, along with the current energy shock and inflation, seems to set the stage for a major correction.
As I've mentioned before, one of the original benefits of the S&P 500 was diversification. The S&P 500 now has less diversification than ever before - I'm seeing estimates that ~50% of the overall S&P 500 market value is in the "AI ecosystem".

I've had international diversification for well over a decade, with an asset allocation of 50% US, 50% ex-US. Before that, the S&P 500 starting ~2 decades ago, with a later migration to US Total Stock Market (VTI or equivalent) for a moderate increase in diversification.

More recently I broke from using a single index for the US stock market. My US investment allocation is now 50% large cap, 30% mid cap, 20% small cap.

So, overall: 50% ex-US, 25% US large cap, 15% US mid cap, 10% US small cap.
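For anyone checking the math, the overall weights are just the product of the two splits (an illustrative sketch using the percentages above, not a recommendation):

```python
# Overall portfolio weights: 50% ex-US, with the remaining US half
# split 50/30/20 across large/mid/small cap.
us_share = 0.50
us_caps = {"large": 0.50, "mid": 0.30, "small": 0.20}

overall = {"ex-US": 1.0 - us_share}
for cap, weight in us_caps.items():
    overall[f"US {cap}"] = us_share * weight  # e.g. 0.50 * 0.50 = 0.25

print(overall)
# → {'ex-US': 0.5, 'US large': 0.25, 'US mid': 0.15, 'US small': 0.1}
```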
 

Exordium01

Ars Praefectus
4,323
Subscriptor
The question (IMO) is speed of evolution, and the fairly large amount of low-productivity deadwood in companies today. I'll speak from some recent experience - there is a function in my department (a very large company - over 100K employees) that sends out questionnaires every year to application owners, vets the responses, and does some metrics around the results. There are other functions that take the database of those responses, run control attestations on them, and do metrics around the results. Other functions audit those various metrics.

As a relative newcomer to the company, it's obvious to me that every person in every one of those functions could easily be architected out of a job by an AI. These aren't difficult tasks; they're structured and repetitive, and they don't have to be done perfectly to still be done better than the (still error-prone) human analysts. Collectively, these folks probably make $1MM per year in salary, plus a 30% uplift for benefits. And this is one part of one department - fewer than a dozen people.

There used to be a popular T-shirt for sale some years ago that I wish I'd picked up. The back read: "Go away or I will replace you with a very small shell script."
We’ve been calling them bullshit jobs for a long time. They are a feature, not a bug. They allow our economy to operate. In our push for automation, we forgot that we already knew that the jobs didn’t need to be done.
 
We’ve been calling them bullshit jobs for a long time. They are a feature, not a bug. They allow our economy to operate. In our push for automation, we forgot that we already knew that the jobs didn’t need to be done.

I have not read the book Bullshit Jobs yet, but I do wonder how many so-called BS jobs are actually BS jobs - mainly the 'checker' roles. To me, the whole QC/QA unit is just a big checking unit, and without QC/QA, I do not know if companies could still release products that meet quality requirements.
 

NervousEnergy

Ars Legatus Legionis
11,507
Subscriptor
We’ve been calling them bullshit jobs for a long time. They are a feature, not a bug. They allow our economy to operate. In our push for automation, we forgot that we already knew that the jobs didn’t need to be done.
From a societal perspective, yes. There's no way any rational company management could think that way, however, and keep their jobs. Maximizing shareholder value is what the owners (shareholders) expect, and doing so is a fiduciary requirement. It's not even that the jobs themselves are BS (though some are, of course) - many of these activities produce data that needs to be produced and analyzed, sometimes even for regulatory requirements. It's that you don't need humans to do them anymore.
 
  • Like
Reactions: Danger Mouse

flere-imsaho

Ars Tribunus Angusticlavius
9,933
Subscriptor
I'm pretty certain that plenty of executives know, or at least suspect, that there are lots of people in their corporation doing jobs that don't add value or even don't need to be done. The challenge typically is figuring out where those people are. Also, those people represent headcount, which theoretically can be redistributed for new needs, so executives usually do their utmost to keep people from poking around their fiefdoms. This is all why you see RIFs being "everyone has to cut 10%", as opposed to going through and identifying jobs that don't need to be done anymore.
 
Thou shalt not make a machine in the likeness of a human mind. We need mentats, though, to make it stick.


As far as the AI bubble popping - I've followed Warren Buffett's often-cited strategy for two decades: constantly put as much as you can into a simple, low-cost S&P 500 index fund, since the vast majority of 'active' investment managers consistently fail to beat the general market in returns. The run-up in the S&P, though, has me second-guessing whether I should get out and go to money market. OTOH, I've stayed in without a single change during every market 'crisis' going back to 2008 (I was just starting to invest during the dot-com bust so it didn't affect me much), and however much I'd wished in retrospect that I'd 'gotten out' after every crash, the market has always come back stronger than ever. Conditions are scary, though. AI unsustainability, along with the current energy shock and inflation, seems to set the stage for a major correction.
I'll reiterate my thoughts on the fundamentals of AI stock market bubble and how it will impact individual investors. Please feel free to comment ;)

LLM AI companies are not covering the operating costs required to train LLMs and build out additional data centers; I'll lump inference costs in with those as well. The AI companies are burning through investors' money at a shockingly high rate - tens of billions of dollars every month. There are multiple scenarios that could lead to investors losing confidence in the future profitability of these companies, selling their stock, and triggering a stock market correction, then an economic recession: 1) AI companies don't get to a "good enough" AI product for corporate users and/or backlash against the AI hype kicks in, 2) AI companies can't cover the cost of inference, i.e. the cost of LLMs is too high, 3) Chinese companies undercut the cost of inference, i.e. the profit from LLMs is too low.

So then investors start to sell their AI and AI-related stock, AI company valuations start to drop, panic selling kicks in, and valuations really drop. The stock market as a whole drops. Consumer confidence drops after everyone sees their 401k investments fall, and consumer spending drops as a result. Company profits are impacted. Companies resort to further layoffs. Loans to private equity companies go into default, payments from those loans dry up, and a run starts against private equity funds, which are not regulated. The "D" team in the White House doesn't have a clue or a strategy. The Fed has few options because they have no money as a result of the tax cut for the 1%. The downward spiral continues. Happy Monday!
 
I'm pretty certain that plenty of executives know, or at least suspect, that there are lots of people in their corporation doing jobs that don't add value or even don't need to be done. The challenge typically is figuring out where those people are. Also, those people represent headcount, which theoretically can be redistributed for new needs, so executives usually do their utmost to keep people from poking around their fiefdoms. This is all why you see RIFs being "everyone has to cut 10%", as opposed to going through and identifying jobs that don't need to be done anymore.
Jack Welch of GE invented a strategy whereby the top 15% performers get a bonus and the bottom 15% (or whatever) get cut. This happens on a regular basis where I work, at a company with 30,000 employees. It's kind of already built into US capitalism.
 
  • Like
Reactions: timezon3

JiveTurkeyJerky

Ars Legatus Legionis
10,426
Subscriptor
I'm pretty certain that plenty of executives know, or at least suspect, that there are lots of people in their corporation doing jobs that don't add value or even don't need to be done. The challenge typically is figuring out where those people are. Also, those people represent headcount, which theoretically can be redistributed for new needs, so executives usually do their utmost to keep people from poking around their fiefdoms. This is all why you see RIFs being "everyone has to cut 10%", as opposed to going through and identifying jobs that don't need to be done anymore.
I think another part of the problem is that their systems aren't well defined. Their processes rely on humans using workarounds or "just figuring it out." Current AI agents need well-defined structures and systems to execute accurately. It can be a lot of work to fix broken systems, doubly so for an executive who might not have come up in those systems and so doesn't truly know how they work under the hood.

I feel like that's the reason a lot of AI experiments are failing, because most companies throw people into broken systems and they just find ways to patch it together. The AI Agent isn't good enough to do that. Yet.

But if you do have a well mapped & defined system? Current AI is already better than most humans for those right now. At some point they'll be good enough to be dropped into broken systems, but they're not there yet (I'm wondering if Mythos is the first to approach that line?).
 

w00key

Ars Tribunus Angusticlavius
8,982
Subscriptor
It's also a problem of bad internal IT. The bigger you are the more likely you are stuck with Salesforce, SAP or one of the Oracle products. Any change or upgrade always runs late and at a multiple of the budget.

Trying to duct-tape AI onto that is pretty doomed, or the AI layer has to be good enough to work around all the legacy bullshit. I wonder if that works.


We will see new, lighter companies appear that will eat legacy, crusty companies' lunch, even if the only difference is IT and overhead, or meatbags fighting the dumb ERP. AI-native, of course, with homemade systems free of the enterprisey nonsense. I haven't seen many support tickets that a bot can't help with; heck, with a good bot that preps the chat, gathers all the info, and escalates, I can get results much faster than via the phone queue.
 

Soriak

Ars Legatus Legionis
12,853
Subscriptor
We’ve been calling them bullshit jobs for a long time. They are a feature, not a bug. They allow our economy to operate. In our push for automation, we forgot that we already knew that the jobs didn’t need to be done.
Companies don't subsidize "bullshit" jobs because they want to promote the greater good of the economy. They just see labor differently from how most employees think about it. If I pay someone $4k/month and they only need to do real work for 8 hours per month, but that generates more than $4k in value, then I'll keep them around. People don't need to be productive every day all day to be worth their salary. In fact, someone might only be productive during a small part of the year but that still justifies their annual salary. You can't just temp hire someone for a month to do a task that requires a specialized skillset.

That's also why AI won't replace all jobs. AI doesn't figure out what needs to be done. You need a human expert to know the "what" and "why." Then, AI takes care of the "how". That's a problem if your entire job is "how" to do something, but that's rarely a good job anyway and has always been the first thing that gets outsourced. Everyone else gets a lot more valuable, because they can very quickly implement the "what".
 
  • Like
Reactions: AndrewZ

Shavano

Ars Legatus Legionis
69,071
Subscriptor
I have not read the book bullshit jobs yet, but I do wonder how many so call BS jobs are actually BS job. Mainly on the checker. To me the whole QC/QA unit are just a big checker unit. Without QC/QA, I do not know if companies can still release products that meet quality requirements.
Bullshit Jobs, as defined in the book by David Graeber, are jobs that, according to those who hold them, do not "make a meaningful contribution to the world." He's concerned about several aspects of that. First, because people (especially Americans and Brits) define their self-worth according to their jobs, he believes these jobs are harmful to the people who have them: their morale is undermined by the belief that they are doing a worthless task, and they are often bullied into it by being denied the opportunity to do more meaningful work. Second, he thinks people who hold these jobs resent those whose jobs are meaningful. That resentment plays out politically and contributes to the situation where the most useful jobs (he uses examples of trash collectors and teachers) get low pay and status.

Graeber's most interested in the judgment of those who hold them because he thinks few if any know better than those who do them whether their jobs are bullshit, and because of the demoralizing effect he thinks they have on the workers. He thinks most people want to work and that they want to be able to see their work doing something that benefits others.

Some bullshit jobs actually do harm rather than just being useless. Take as an example people whose work is denying claims at the insurance company for bullshit reasons. But they're not necessarily useless in the eyes of those who employ them. Management at the insurance company and stockholders have a stake in the insurance claim deniers doing their jobs zealously.

He goes on to discuss all kinds of policy implications and advocates for a basic income, which would both eliminate many of the bullshit jobs and remove the need for them.
 

hanser

Ars Legatus Legionis
43,061
Subscriptor++
Jack Welch of GE invented a strategy whereby the top 15% performers get a bonus and the bottom 15% (or whatever) get cut. This happens on a regular basis where I work, at a company with 30,000 employees. It's kind of already built into US capitalism.
You can be an outstanding performer at a job function that is antithetical to the interests of the company. Performance like this is always stack-ranked within a given function; stack ranking is literally blind to the "this job shouldn't be done at all" problem.

I regularly joke that someone probably got a promotion by adding an especially hare-brained question to a "diligence" questionnaire. "Look at me! I'm making things more secure!"

No you're not. You're wasting your department's and partner's time. You are literally just gumming up the works for no benefit.

Great, let's hire people to create friction, and then hire more people to deal with the friction we created for ourselves. ♻️
 

Shavano

Ars Legatus Legionis
69,071
Subscriptor
Jack Welch of GE invented a strategy whereby the top 15% performers get a bonus and the bottom 15% (or whatever) get cut. This happens on a regular basis where I work, at a company with 30,000 employees. It's kind of already built into US capitalism.
It's a bad strategy. If you carried out such a strategy year after year, you'd either run out of employees who were actually bad at their jobs in 2 years or less, or you'd fire good people just to make the cut. That would demoralize the people you didn't fire, because they know damn well Dan was doing his job and you fucked him over to make your arbitrary number, and there's a 15% chance you'll fuck them over next year. The only way it makes any sense at all is if you're terrible at hiring (you hire 3 bad employees who can't be managed into useful employees out of every 20) or your management is so bad at their jobs they can't even identify or train employees to do their jobs acceptably.

So if you're even considering firing 15% of your workforce 2 years in a row, it should be the managers, because they're the ones who weren't doing their jobs acceptably to get into a situation where 15% of the workforce was not performing acceptably.
 

Shavano

Ars Legatus Legionis
69,071
Subscriptor
You can be an outstanding performer at a job function that is antithetical to the interests of the company. Performance like this is always stack-ranked within a given function; stack ranking is literally blind to the "this job shouldn't be done at all" problem.

I regularly joke that someone probably got a promotion by adding an especially hare-brained question to a "diligence" questionnaire. "Look at me! I'm making things more secure!"

No you're not. You're wasting your department's and partner's time. You are literally just gumming up the works for no benefit.

Great, let's hire people to create friction, and then hire more people to deal with the friction we created for ourselves. ♻️
I'll add to that, if someone has a bullshit job, that is always the fault of management. Either because you've assigned a person to do something that doesn't need to be done, or you've failed to make it possible/efficient to do what needs to be done, or you've failed to explain to the person why their job is important.
 
  • Like
Reactions: keltorak

Shavano

Ars Legatus Legionis
69,071
Subscriptor
It's a bad strategy. If you carried out such a strategy year after year, you'd either run out of employees who were actually bad at their jobs in 2 years or less, or you'd fire good people just to make the cut. That would demoralize the people you didn't fire, because they know damn well Dan was doing his job and you fucked him over to make your arbitrary number, and there's a 15% chance you'll fuck them over next year. The only way it makes any sense at all is if you're terrible at hiring (you hire 3 bad employees who can't be managed into useful employees out of every 20) or your management is so bad at their jobs they can't even identify or train employees to do their jobs acceptably.

So if you're even considering firing 15% of your workforce 2 years in a row, it should be the managers, because they're the ones who weren't doing their jobs acceptably to get into a situation where 15% of the workforce was not performing acceptably.
Expanding on this: I worked for a while at a company that forced managers to do stack ranking of all their employees so they could compare across the company and "figure out" who was underperforming or overperforming. Nothing could more exactly be described as a box-checking exercise that did actual harm to the company.
 
It's a bad strategy. If you carried out such a strategy year after year, you'd either run out of employees who were actually bad at their jobs in 2 years or less, or you'd fire good people just to make the cut. That would demoralize the people you didn't fire, because they know damn well Dan was doing his job and you fucked him over to make your arbitrary number, and there's a 15% chance you'll fuck them over next year. The only way it makes any sense at all is if you're terrible at hiring (you hire 3 bad employees who can't be managed into useful employees out of every 20) or your management is so bad at their jobs they can't even identify or train employees to do their jobs acceptably.

So if you're even considering firing 15% of your workforce 2 years in a row, it should be the managers, because they're the ones who weren't doing their jobs acceptably to get into a situation where 15% of the workforce was not performing acceptably.
As you hire replacements, you generally can't tell if they will be good or bad. You can generally spot certain traits that point towards someone likely being a good hire, but it's not 100%.

Also, some people lose motivation over time for a variety of reasons. And some "low performers" may suddenly land in a project that perfectly aligns with their skills and interests and they take off.

It is still demoralizing in the sense that you are always worried about being in that bottom 15%, but from the company's viewpoint, this motivates people and makes sure that they always have the best of the best in terms of employees.

The managers may deserve some blame if they aren't trying to manage their employees to boost their value, but there's the saying about leading a horse to water.

Also, ++ to AndrewZ... I work in a place like this. The cycle is generally more on a 2-3 year basis, seems to come up randomly, and is always explained away as being related to some event in the news ("Due to political instability/war in <wherever>/downturn in the economy/<insert reason here>, we're aligning our workforce to better meet the needs of the business and the customer...", blah blah blah).

Everyone that's been here for a while knows what this means: "We're cutting the people that aren't wanted and getting some new hires."
 

ramases

Ars Tribunus Angusticlavius
8,703
Subscriptor++
If you tell me I need to fire 15% of my reports each year, what you are really telling me is that I should strive to hire 80% competent and 20% incompetent candidates ... pardon, "buffer hires".

It is 20% instead of 15% because in such a political environment it is always useful to have a number of designated victims that can be thrown under the bus to satisfy unthinking demand for consequences for this or that without impacting the organization's ability to deliver.
 

Vince-RA

Ars Praefectus
5,324
Subscriptor++
"We're cutting the people that aren't wanted and getting some new hires."
This might be a useful exercise if they actually cut only low performers, but in my experience, the people who end up getting cut are more often expensive (senior) and/or don't have protection from a politically connected executive. Good people get cut, and lower paid idiots in the right place in the org end up living on to do stupid things another day. Too cynical a take?
 
If you tell me I need to fire 15% of my reports each year, what you are really telling me is that I should strive to hire 80% competent and 20% incompetent candidates ... pardon, "buffer hires".

It is 20% instead of 15% because in such a political environment it is always useful to have a number of designated victims that can be thrown under the bus to satisfy unthinking demand for consequences for this or that without impacting the organization's ability to deliver.
We called them "red shirts". :)
 

Davidson09

Smack-Fu Master, in training
33
This might be a useful exercise if they actually cut only low performers, but in my experience, the people who end up getting cut are more often expensive (senior) and/or don't have protection from a politically connected executive. Good people get cut, and lower paid idiots in the right place in the org end up living on to do stupid things another day. Too cynical a take?
No, not too cynical.

The most amusing thing about the process is that the survivors - those who didn't get fired in the first few rounds they experienced - then start defending the system as if it works well. They can't accept that randomly firing people would have been just as effective, and that they'd still feel successful if they survived.

Success only has to do with the lack of success of your coworkers, so sabotaging their projects is a strategy that's more achievable than always succeeding on projects you only control a tiny piece of.
 
If you fire some percentage of your least valuable employees and re-hire to replace them, things should generally improve for a while. At some point you will hit stasis, where the new hires are no better than the people you just let go. This is when you'd stop. Or at least think about what you are doing.

An analogy that would be more appropriate is to consider how farmers cull their livestock. They aren't friends with the cattle and the cattle are unaware that they are being graded. This presents the ideal environment for this sort of thing to work. Remove the least productive members of the herd, keep the good ones, buy better stock to improve your own herd and over time...things generally improve. At some point, any cattle you can buy to bring into the herd are no better than what you already have.

But when done in the workforce, it becomes a political exercise (managers keeping their friends, for example), or you have employees sabotaging other employees in order to stay around; the whole thing falls apart and the value is lost.

Almost always, the exercise turns into the latter instead of the former, and this is why it doesn't work in practice. I was presenting the value in a perfect world, not a realistic one. :)
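The improve-then-stall dynamic is easy to see in a toy simulation (my own illustrative sketch, assuming performer "quality" is a fixed trait and replacements are drawn from the same talent pool as the people let go):

```python
import random
import statistics

random.seed(42)

def cull_cycle(team, cut_frac=0.15):
    """Cut the bottom cut_frac of performers, then hire replacements
    drawn from the same talent pool (standard-normal 'quality')."""
    survivors = sorted(team)[int(len(team) * cut_frac):]
    hires = [random.gauss(0, 1) for _ in range(len(team) - len(survivors))]
    return survivors + hires

team = [random.gauss(0, 1) for _ in range(1000)]
yearly_mean = []
for year in range(15):
    team = cull_cycle(team)
    yearly_mean.append(statistics.mean(team))

# Early cycles raise average quality sharply; later cycles barely move it,
# because new hires are no better than the people just let go (stasis).
print([round(m, 2) for m in yearly_mean])
```

Under these (idealized, apolitical) assumptions, the average jumps in the first few cycles and then flattens out at exactly the stasis point described above.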
 

NervousEnergy

Ars Legatus Legionis
11,507
Subscriptor
This might be a useful exercise if they actually cut only low performers, but in my experience, the people who end up getting cut are more often expensive (senior) and/or don't have protection from a politically connected executive. Good people get cut, and lower paid idiots in the right place in the org end up living on to do stupid things another day. Too cynical a take?
Relevant to the topic: if a company cuts human headcount that can at least mostly (if not entirely) be replaced by AI-driven automation and doesn't backfill the humans at all, then whether or not it's cynical comes down to company performance with AI handling those roles. If financial reporting shows companies are able to reduce headcount and the associated employee expenses by using AI for those positions while keeping all other things equal, then it isn't an AI bubble, at least to the degree that EPS is improved by the adoption of AI.

Companies reduce headcount for a variety of reasons, though, no matter what the press reports say, so getting accurate information across the landscape on how much bottom-line benefit AI is providing to income statements is difficult. From what I've seen, the jury is very much out on whether the current run-up in AI stock valuations, and in companies professing to use AI, is a bubble or a fundamentals-backed rise due to employee expense reduction. My guess is the latter is unlikely to be strong enough to keep the market from being a bubble that will eventually pop, at least partially. The big ask is looking at earnings reports, separating out EPS rises driven by AI spending on yet more AI, and analyzing what's left to see whether real productivity increases are actually occurring to justify valuations.