The RAM shortage’s silver lining: Less talk about “AI PCs”


C.M. Allen

Ars Tribunus Angusticlavius
6,048
Wouldn't that require a tremendous amount of investment in telecom infrastructure to handle everyone streaming everything all the time? (Not to mention people stuck on rural DSL or satellite with limited bandwidth and/or terrible latency.)

If this RAM cartel nonsense keeps up, hopefully someone will come up with a way to import grey-market Chinese RAM once they get some more fabs spun up.
ISPs would LOVE the inexhaustible slush fund they'd be able to charge to build it. So you're only further cementing reasons why they'd want this....
 
Upvote
10 (10 / 0)

buback

Ars Scholae Palatinae
764
Gotta admit, the ultimate use of AI for me would be better voice-to-computer interfaces. There's a few other uses, but that's the main one.

They seem to get used for everything but that.
Yeah, a VUI (instead of a GUI). I've been learning Proxmox, which has meant: google command, enter command, cross fingers. (It's called a homelab because it's an experiment.)
A VUI that you could just tell "mount that USB drive I just plugged in into the truenas vm" would be great. Just something to translate my intents into command prompts.
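For what it's worth, the "translate my intents into command prompts" part is the easy half; here's a toy Python sketch of just that last step. The Proxmox `qm set ... -usb0 host=...` syntax for USB passthrough is real, but the VM id (101) and the vendor:product device id are invented placeholders:

```python
# Toy intent-to-command lookup. The hard parts of a real VUI (speech
# recognition, intent parsing) are omitted; this only shows the final
# mapping from a recognized intent to a shell command.
# VM id 101 and device id 0781:5583 are made-up placeholders.
INTENTS = {
    "pass my usb drive to the truenas vm":
        "qm set 101 -usb0 host=0781:5583",  # Proxmox: attach host USB device to VM 101
}

def intent_to_command(intent: str) -> str:
    """Map a normalized intent phrase to a canned shell command."""
    key = intent.lower().strip()
    if key not in INTENTS:
        raise ValueError(f"no command known for intent: {intent!r}")
    return INTENTS[key]
```

A real assistant would generate rather than look up these commands, but the lookup makes the plumbing concrete.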
 
Last edited:
Upvote
-2 (0 / -2)

corvusman

Smack-Fu Master, in training
11
So basically, one way or another, consumers are going to be forced to pay for AI. Either by buying an “AI PC” or by shelling out extra for a device with decent RAM. Nobody asked for it, but hey, our corporate overlords made a business decision … so shut up and jump. I don't want to live on this planet anymore (c)
 
Upvote
22 (22 / 0)
You could just use a Linux Distro for that, I am running Linux on a 17 year old machine with just 4 GB of ram and it works fine for what I use it for.
Linux Ubuntu on a 6 year old machine with 64 gigs.* Windows 7 Pro for a very long time, then Linux for me. Tried to like 10 and 11 but, nope. AI diligently filtered out, but it's like crabgrass, always some somewhere.

*Slower RAM was cheap at the time and COVID was getting underway, so I wanted to experiment.
 
Upvote
8 (8 / 0)
I want the AI hype/forced use train to go away as soon as possible please, because I want a small, quiet gaming PC similar to a Steam Machine and not have to pay 100 trillion clams.

Also, AI tech is hacky shit that has minimal appeal apart from the occasional convenience of search summaries.

Also, it is creating the worst slop of internet content imaginable.
 
Last edited:
Upvote
17 (18 / -1)
(quoted image: birthday-happy.gif)
"Yayyyyy..."
 
Upvote
0 (1 / -1)

Mardaneus

Ars Tribunus Militum
2,041
When Robert R. Taylor bought the entire supply of hand pumps it was not because he needed that much supply for his product, it was in order to maintain exclusivity in the liquid soap market. He knew that if he bought the entire supply, his competitors would be hamstrung getting competing products to market, giving him an advantage.

I sincerely doubt that OpenAI needs the 40% of global DRAM wafers (not chips, packages, modules, or cards) that it has bought.
AFAIK they didn't buy; they signed letters of intent, basically stating "we want to buy this sometime in the future at this price." OpenAI doesn't have the billions needed to buy 900k wafers a month. The rest is the market panicking.
 
Upvote
19 (19 / 0)

C.M. Allen

Ars Tribunus Angusticlavius
6,048
I'd bet good money they still won't build it though.
Sure. Just as slowly and expensively as they can possibly make it. I mean, why charge once when you can just keep charging to build the same infrastructure over and over? Eventually they deliver, but only enough to keep the money flowing without ever realistically finishing.
 
Upvote
5 (5 / 0)

Eldorito

Ars Tribunus Angusticlavius
7,928
Subscriptor
Apple still uses external DRAM suppliers (Samsung, SK Hynix and Micron). The DRAM dies are stacked onto the Apple silicon during manufacture.

So Apple is as vulnerable as anyone else. However, it's possible they have long-term supply arrangements in place that could insulate them from the immediate pricing shock.

Apple would have already been building stock for their mass release of products in March/April, so I suspect they’re good. They’re renowned for their supply chain for a reason.

Much as there haven't been GPU price increases yet, because Nvidia and AMD bundle the memory with their GPUs. However, Nvidia has already hinted that is ending, and I'm sure GPU prices will follow RAM prices in skyrocketing.
 
Upvote
5 (5 / 0)

Fatesrider

Ars Legatus Legionis
24,977
Subscriptor
The increased costs have implications for the concept of “AI PCs,” a term OEMs have used consistently over the last two years as they sought to leverage the growth of generative AI chatbots to drive computer sales.

Shoppers, however, have been either too reluctant or too savvy to buy into manufacturer-made AI PC hype.

“PC OEMs had trouble selling the on-device AI message even before the memory shortages,” Ubrani said.

And this boys and girls is why I build my own systems. I can upgrade as I go, and reuse RAM from previous builds that will work fine in what I build. The people who will be hit the hardest are the bleeding edgers, and they are used to egregious exsanguination (or should be) WRT costs and such.

Computers pretty much reached a level of performance where you need a stopwatch to tell the difference years ago. Side by side with a modern system, sure, there are often very noticeable differences. But the days when you turned on your computer, went to make coffee and a sandwich, ate the latter and drank the former, and then signed in when the login prompt finally appeared are long past.

So the NEED for the latest and greatest falls on those who - generally - can most afford it. The wants are going to be only loosely aligned with that, and that's going to hit those who want, but can't afford, requiring them to sacrifice, or settle.

That's up to the individual builder, of course.

With the useful life of a computer reaching beyond half a decade now, there's not a lot of rush to upgrade.

And there will be a lot less when prices skyrocket over a bullshit move on the part of the tech industry who are convinced, with no credible evidence, that their latest drug for the masses will make a profit (when it hasn't since the drug hit the market).

THEN great used parts will be available dirt cheap. The best news is that could be later this month.
 
Upvote
2 (8 / -6)
Computers pretty much reached a level of performance where you need a stopwatch to tell the difference years ago. Side by side with a modern system, sure, often very noticeable differences.
::wiggles hands::

It's not just about performance. It's also about energy. I replaced my 2010 model Mac Pro with a Mac Studio in 2022 - basically put the order in pretty much as soon as I could, if I remember rightly. This was largely driven by the Mac Pro no longer running a supported operating system - Mojave went out of support in October 2021, and I held on until there was a decent Apple Silicon option to replace the 5,1.

Performance? Noticeably improved. The 5,1 was good enough, but the Studio was a very clear improvement over it, even running x86 software in Rosetta. Power? Hugely improved - the 5,1 put out a lot of heat, although I didn't really recognise it until I was using the Studio on a daily basis.

So yes, you can probably get by with an older system, but the power savings in upgrading are not to be sneezed at.
 
Upvote
29 (29 / 0)
I don't want to have to rent those few services that I find genuinely useful from some feudal lord's datacenter.
AI in an increasingly subscription based O/S can get in the sea, but a lack of research into and capability in the end user hardware will only hurt open versions of solutions that I would prefer to use in the long run.
 
Upvote
10 (10 / 0)
if AI PCs were struggling to sell even before the RAM shortage, is that really a silver lining here? The product line just wasn't working.

It seems to me like the main problem is that the value proposition is really hard to articulate (aka it's bad). For someone who doesn't want AI slop on their computer, it's purely a negative. For someone who likes AI, cloud models are much more powerful and functional, so you're now aimed at someone who's deep enough in the weeds that hardware AI is something they want... but it seems like anyone approaching AI from that perspective would rather use their own hardware and have better control over the software than a PC with a built in gimmick chip.

So between general resistance to AI, local LLMs not being very good yet, and hobbyists probably preferring to work more independently, it's really hard for me to tell who these products were even for.
 
Upvote
26 (27 / -1)

clb2c4e

Wise, Aged Ars Veteran
145
Yes. Yes it is.

Why would it be a good thing? It's a marketing fad based on software that won't exist in a few years. Remember Cortana?

It's a waste of space on the silicon. You could replace that stuff with more cache.
Came here to say exactly this; it is most definitely a bad thing.

It takes the advantage of a PC as a general tool and makes it needlessly specialized. Saying hardware adds "AI" is as vague as saying it adds "science": what kind, for what use? Still waiting on compelling answers for day-to-day life.
 
Upvote
8 (9 / -1)

85mm

Ars Scholae Palatinae
1,056
Subscriptor++
This is a typical case of trying to sell an idea, but with no product to back it up. I don't see much in the way of end user friendly local AI apps available, certainly not anything that your average user would know about and demand. The only way to try and sell this stuff is fear of missing out on the next big thing and that's a much harder proposition.
 
Upvote
8 (8 / 0)

MrWalrus

Ars Tribunus Militum
1,709
Satya Nadella had sent an email to engineering heads expressing disappointment with the consumer version of Copilot. He reportedly wrote that tools for connecting Copilot with Outlook and Gmail “for [the] most part don’t really work” and are “not smart.”

"Doesn't really work" and "not smart" are, by remarkable coincidence, also my top criticisms of Satya Nadella, and yet Microsoft still allows him to connect with Outlook and Gmail for some reason.
 
Upvote
24 (24 / 0)
I keep testing LLMs on areas that might be considered "tricky" where I'm an actual expert.

I'm seeing if they get the answers right, or are very confidently wrong and try to gaslight me on this.

So far I'm seeing about a 25% failure rate, as of a month ago.
As this is a post about AI PCs, I assume you're talking about local models. It's awesome that you're trying out local LLMs, but most "AI PCs" can't run anything but the smallest local models. These have very limited use cases, and certainly can't compete in world knowledge stakes.

You shouldn't be surprised if you find Qwen-3-30B-A3B, GPT-OSS-20B or Gemma 3 4b failing expert edge-case knowledge questions.
 
Upvote
2 (2 / 0)

phuzz

Ars Centurion
270
Subscriptor
You could just use a Linux Distro for that, I am running Linux on a 17 year old machine with just 4 GB of ram and it works fine for what I use it for.
I'm guessing you don't have many tabs open? The amount of memory the OS itself uses these days is small compared to how much a web browser needs. Switching from Windows to Mint might save you a few hundred MB of RAM; opening one modern web page will eat all that RAM back up.
 
Upvote
21 (21 / 0)
When Robert R. Taylor bought the entire supply of hand pumps it was not because he needed that much supply for his product, it was in order to maintain exclusivity in the liquid soap market. He knew that if he bought the entire supply, his competitors would be hamstrung getting competing products to market, giving him an advantage.

I sincerely doubt that OpenAI needs the 40% of global DRAM wafers (not chips, packages, modules, or cards) that it has bought.
"Bought."

To most of us, that word means "exchanging cash for goods."

In this case it seems more like "using hype and spin to secure equity investments, then using those as leverage for letters of intent to negotiate purchase contracts, to lock up supply of items not yet manufactured, for use in data centres not yet built, to run software not yet written, to support a market not yet established and that still turns a net financial loss on every operation, chasing profit that can likely never exist in sufficient quantity to justify the above."

And yes. If I were an exec at any chipmaker right now, I would have my staff tripping over themselves to lock up these AI contracts (and using the threat of them limiting supply to hike prices on the other customers), because what hardware company doesn't love the prospect of a couple of grossly overcapitalized high-visibility customers buying half your total global output for five years at three times the profit margin you normally make?
 
Upvote
28 (28 / 0)
Unless you absolutely need a new PC to replace a bricked or truly ancient machine, 2026 is not the year to upgrade. Why overpay for less memory? You'll just be forced to upgrade again as soon as prices deflate, since I sincerely doubt software is going to suddenly get less greedy with system memory in the next 12 months, or ever. If you buy lower specs you will feel it, immediately, and that feeling will only get worse, not better. Really sucks if you're forced into a purchase this year, though. My condolences.
 
Upvote
7 (7 / 0)

Wheels Of Confusion

Ars Legatus Legionis
75,398
Subscriptor
I am slightly hopeful that the one-two gut punches of W11's hardware floor and the devastating state of RAM/storage pricing for the immediate foreseeable future prompts more people to get Linux on an older machine and extends that device's useful life a little.

But that would be microscopic compensation for (gestures vaguely) all of this.
 
Upvote
11 (11 / 0)

Myself

Ars Centurion
232
Subscriptor++
What amazes me is that OpenAI is a money furnace that has been burning capital like there's no tomorrow and yet somehow SK Hynix, Samsung, and Micron seem to think that they're good for the money. Even companies that are building data centers are starting to question whether OpenAI will be able to pay when the bill comes.
Hypothetically, if they're selling 40% of their capacity to a major customer, who never pays them a cent, but the other 60% goes to a market with prices 500% higher than last year for the exact same product... might they still come out ahead?

It has the same effect as a price-fixing cabal, but without legally being a cabal because an "external" actor made it happen. Just market forces, nothing to see here!

If this model works, expect the business world to repeat it.
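The arithmetic in that hypothetical does check out. A quick back-of-envelope sketch, using the 40% share and "500% higher" figures from the comment above (everything else is normalized and illustrative):

```python
# Back-of-envelope: can a DRAM maker eat a 40% write-off and still win?
# Figures taken from the hypothetical above; prices/capacity normalized.
baseline_price = 1.0   # last year's price per wafer (normalized)
total_capacity = 1.0   # total wafer output (normalized)

# Before: everything sold at the old price.
old_revenue = total_capacity * baseline_price

unpaid_share = 0.40    # capacity promised to the big customer who never pays
markup = 6.0           # "500% higher" than last year = 6x the old price

# After: only 60% of output actually sold, but at 6x the price.
new_revenue = (total_capacity - unpaid_share) * baseline_price * markup

print(new_revenue / old_revenue)  # 3.6 -> 3.6x the old revenue despite the write-off
```

Even writing off the entire 40% as a dead loss, revenue more than triples, which is why the "external actor" framing is so convenient for everyone selling DRAM.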
 
Upvote
18 (18 / 0)

Eldorito

Ars Tribunus Angusticlavius
7,928
Subscriptor
Feels like a silver lining in the same way mass starvation decreases obesity rates (yes, overly dramatic, but still…).

I gave up my shiny new PC and stuck a 9070 XT in my AM4 one, because it had 128 GB of RAM and I use it. Felt bad though; I can't wait for this to be over. I found 256 GB of DDR5 for $2,500 Australian (so about $1,600 US) but couldn't stomach it. Will wait this insanity out, yet again.
 
Upvote
11 (11 / 0)

Diplodocidae_Guy

Smack-Fu Master, in training
81
From the article:

" “Everyone from IT decision makers to professionals and everyday users are looking at on-device AI to help drive productivity and creativity,” he said."

" “… what we’ve learned over the course of this year, especially from a consumer perspective, is they’re not buying based on AI,” Terwilliger said..."

Useless C-Suite sock puppet, pulling down more in a year than most people will make in their lifetime for this garbage. Sounds like Dell needs a new VP and GM of Commercial, Consumer, and Gaming PCs. That's a pretty bad call, shifting the entire strategy and manufacturing stream away from your popular products to those that nobody wants or needs. Meanwhile, the janitor was late one too many times and was shown the door.
 
Upvote
12 (13 / -1)
That's only an issue on Windows, Linux can use as much memory as you're willing to give it.

Source? From my understanding they're fine for this quarter but next quarter could see them hit with large price hikes. They're still in negotiations for 2026 orders: https://www.digitimes.com/news/a20260107VL211/apple-dram-2026-nand-production.html
Now you'll find your machine is more powerful than you need.

The only problem is, the commercials have been booked for the year, so YouTube is full of AI Dells doing all sorts of fun things I never knew I needed to do.
 
Upvote
0 (0 / 0)
Yes. Yes it is.

Why would it be a good thing? It's a marketing fad based on software that won't exist in a few years. Remember Cortana?

It's a waste of space on the silicon. You could replace that stuff with more cache.
Clippy popping up and reminding you to run the update for your PC BIOS and Windows isn't a bad thing. It's bad when "we didn't ask for this!" And if Cortana became the "Cortana" of Halo's Master Chief, well, ahem... I would love that as an AI. Until the part where her memory corrupted...
 
Upvote
-4 (0 / -4)
Find a way to seize up credit markets to stop the incestuous finance deals between AI firms and the musical chairs will all stop cold along with this garbage.

While I advocate on-device generative and LLM usage for commercial purposes and less so for consumer usage, marketing specific to this purpose should be abhorred and isn't possible while AI data centers are gobbling all the hardware up with no profit in sight.

The longer the musical chairs plays for the AI bubble, the harder the crash will be at the end, and there's even less financial regulation for this type of crash hitting the banking sector than there was for the 2006-2008 housing market crash...
 
Upvote
5 (6 / -1)
I'm sure there are 'think tanks' working out how to dismantle the personal computer market, to replace it with 'computing as a service' cloud systems, so customers own nothing. Ever. It's what big software and game publishers have wanted all along. In addition to massively increasing the barrier to entry for the software/game development business, allowing the existing big players to more easily monopolize the market. At every end of this push, the 'big players' win, and everyone else loses, which is precisely what makes that the outcome those big players will definitely want to try.
They are getting there. The last place I worked pretty much had their entire infrastructure running in azure. They claimed it was cheaper than running everything on prem. Riiiiiggghhhtt.
 
Upvote
6 (6 / 0)

SailingSailing

Wise, Aged Ars Veteran
117
Subscriptor
Linux Ubuntu on a 6 year old machine with 64 gigs.* Windows 7 Pro for a very long time, then Linux for me. Tried to like 10 and 11 but, nope. AI diligently filtered out, but it's like crabgrass, always some somewhere.

*Slower RAM was cheap at the time and COVID was getting underway, so I wanted to experiment.
Ditto.

Wife's Windows 10 machine and my iMac were getting a bit problematic; they're now both happily running Mint, and the users are happier, too. I still have to boot macOS for AutoCAD but am working on switching to FreeCAD. The rest is done, and we won't be switching back.
 
Upvote
4 (4 / 0)

Gnothe

Wise, Aged Ars Veteran
185
Subscriptor
New normal for a consumer PC: 8 GB of RAM (we were getting to 16).

In corporations, 32 GB was already pretty much standard for developers etc., but we are probably going back to 16 GB, right?
I think there is significant hype around the need for RAM in (most) consumer PCs; most voices I hear would have you believe that 32 GB is the bare minimum for Windows and 64 GB is okay for now but not future-proof. I built a new PC in September and was curious about how that held up. RAM was relatively cheap then, but I still got just one 16 GB stick, intending to get a second one later and test how much better things got. I never hit the ceiling, though, and since it was DDR5 I won't be getting that second stick this year, at least.

Even getting a relatively performant CPU feels like a waste; I rarely hit more than 20% load while gaming. If you're actually in it for performance and money is a factor, RAM won't be the weak point of penny-pinching pre-builts (an underpowered GPU will be). Or, as the article suggests, the issue will be being able to buy a new budget/mid-range machine at all.
 
Upvote
3 (4 / -1)

PBG4 Dude

Wise, Aged Ars Veteran
142
The optimist in me wants RAM pricing to bring in a year of performance optimizations to get some bloat out of software. If systems are shipping with less RAM, it sure would be nice to see OSs, applications, and websites optimize for more efficiency.
Between this and the coming GPU apocalypse, game companies will definitely feel the heat to optimize, IMHO.
 
Upvote
1 (1 / 0)
It's not really a silver lining when the only thing that's changing is 16GB of RAM is getting kicked back up to being 'premium'.
I bought a laptop with 32 GB of RAM in 2020. That was not the most you could get even then (though perhaps the most in a thin and light convertible like the one I bought). I certainly think 32 GB is a minimum for a professional software development machine, and has been for years.
 
Upvote
5 (6 / -1)