The RAM shortage’s silver lining: Less talk about “AI PCs”

jhodge

Ars Tribunus Angusticlavius
8,661
Subscriptor++
My Copilot experiments have been unimpressive. Even trying to use it to construct PowerShell scripts or KQL queries has been more trouble than help. Part of the problem is that my PC now has like 17 Copilot icons and tools, so I'm probably using the wrong one, but that's also mostly not a "me" problem.
 
Upvote
182 (188 / -6)
The optimist in me wants RAM pricing to bring in a year of performance optimizations to get some bloat out of software. If systems are shipping with less RAM, it sure would be nice to see OSs, applications, and websites optimize for more efficiency.
They'll optimize for lock-in and recurring income streams, and you'll enjoy your bug-ridden stuttering bloatfest.
 
Upvote
150 (150 / 0)

Resistance

Wise, Aged Ars Veteran
417
As Ars Technica has reported, the growing demands of data centers, fueled by the AI boom, have led to a shortage of RAM and flash memory chips, sending prices skyrocketing.

When Robert R. Taylor bought the entire supply of hand pumps, it was not because he needed that much supply for his product; it was to maintain exclusivity in the liquid soap market. He knew that if he bought the entire supply, his competitors would be hamstrung in getting competing products to market, giving him an advantage.

I sincerely doubt that OpenAI needs the 40% of global DRAM wafers (not chips, packages, modules, or cards) that it has bought.
 
Upvote
250 (252 / -2)

Marlor_AU

Ars Tribunus Angusticlavius
7,669
Subscriptor
What this really means is that companies will keep pushing LLM-based workflows, but the option of running them locally will be diminished. If huge portions of the RAM supply are being reserved for datacenters, that's where the processing will occur.

I'm not as anti-LLM as some people. There are plenty of places where more capable natural language processing will help solve real problems (when integrated sensibly into existing application workflows). But I'm more excited by the possibilities of ever-more capable and efficient open, locally hosted models than I am about sending all my data to a third-party service. Ultimately, the RAM shortage will stunt the advancement and utility of these local models, providing a leg up for the large service providers and further entrenching language models as something run as a service, not as a local software component.
 
Last edited:
Upvote
125 (125 / 0)

C.M. Allen

Ars Tribunus Angusticlavius
6,048
On the one hand: "RAM prices have soared, which is bad news for people interested in buying, building, or upgrading a computer this year..."

But at least we can rest easy, because now: "why sell you the means only once when they can sell you the 'service' forever!"

It's just greed, all the way down.
I'm sure there are 'think tanks' working out how to dismantle the personal computer market and replace it with 'computing as a service' cloud systems, so customers own nothing. Ever. It's what big software and game publishers have wanted all along. It would also massively increase the barrier to entry for the software/game development business, letting the existing big players more easily monopolize the market. At every end of this push, the 'big players' win and everyone else loses, which is precisely why that's the outcome those big players will definitely want to try for.
 
Upvote
109 (109 / 0)
The optimist in me wants RAM pricing to bring in a year of performance optimizations to get some bloat out of software. If systems are shipping with less RAM, it sure would be nice to see OSs, applications, and websites optimize for more efficiency.

You could just use a Linux distro for that. I'm running Linux on a 17-year-old machine with just 4 GB of RAM, and it works fine for what I use it for.
 
Upvote
-4 (29 / -33)
What many don't understand is that even IF the "AI Bubble" were to completely collapse, prices will never be as cheap as they used to be. When stuff like this happens, companies discover new price ceilings that allow them to maximize margins vs. volume. It starts from the top and it trickles down.

That is the same reason why the 5090 launched at such an absurd price, vs. all the previous generations. NVIDIA and partners realized that yes, people will, in fact, gladly pay thousands for a top of the line GPU.
 
Upvote
115 (120 / -5)
What many don't understand is that even IF the "AI Bubble" were to completely collapse, prices will never be as cheap as they used to be. When stuff like this happens, companies discover new price ceilings that allow them to maximize margins vs. volume. It starts from the top and it trickles down.

That is the same reason why the 5090 launched at such an absurd price, vs. all the previous generations. NVIDIA and partners realized that yes, people will, in fact, gladly pay thousands for a top of the line GPU.
Bingo. Prices never go down. If it does ever “pop,” these companies will just funnel their marketing budget into propping up the AI house of cards, keeping it all afloat to justify max prices.
 
Upvote
38 (43 / -5)

SomeChemist

Seniorius Lurkius
40
Subscriptor
When Robert R. Taylor bought the entire supply of hand pumps it was not because he needed that much supply for his product, it was in order to maintain exclusivity in the liquid soap market. He knew that if he bought the entire supply, his competitors would be hamstrung getting competing products to market, giving him an advantage.

I sincerely doubt that OpenAI needs the 40% of global DRAM wafers (not chips, packages, modules, or cards) that it has bought.
What amazes me is that OpenAI is a money furnace that has been burning capital like there's no tomorrow and yet somehow SK Hynix, Samsung, and Micron seem to think that they're good for the money. Even companies that are building data centers are starting to question whether OpenAI will be able to pay when the bill comes.
 
Upvote
145 (145 / 0)
NZXT tried "rental PCs" in 2024 and is now being sued for bait-and-switch, on top of a year's worth of rental costing more than the actual value of the hardware

https://www.extremetech.com/gaming/...ory-pc-rental-scam-issues-pointless-statement
They are being sued due to misrepresenting configurations and other shady nonsense. See this GamersNexus video for details:
https://www.youtube.com/watch?v=0pomC1CfpC0
Renting PCs is absolutely legal, even at absurd prices. It's been a common practice in the U.S. both via "rent-to-own" and other programs since at least the late 90s.
 
Upvote
63 (63 / 0)

Nop666

Ars Praefectus
3,862
Subscriptor++
I keep testing LLMs on areas that might be considered "tricky" where I'm an actual expert.

I'm seeing if they get the answers right, or are very confidently wrong and try to gaslight me on this.

So far I'm seeing about a 25% failure rate, as of a month ago.
Only 25%? Anecdotally, the failure rate for LLM-based tools seems to be upwards of 90%.
 
Upvote
-11 (14 / -25)

Ozy

Ars Tribunus Angusticlavius
7,448
What many don't understand is that even IF the "AI Bubble" were to completely collapse, prices will never be as cheap as they used to be. When stuff like this happens, companies discover new price ceilings that allow them to maximize margins vs. volume. It starts from the top and it trickles down.

That is the same reason why the 5090 launched at such an absurd price, vs. all the previous generations. NVIDIA and partners realized that yes, people will, in fact, gladly pay thousands for a top of the line GPU.
You're using the launch cost of a top-end GPU at the height of the AI bubble as evidence that prices will remain high when the AI bubble collapses? How does that work?
 
Upvote
-7 (17 / -24)

cadence

Ars Scholae Palatinae
1,002
Subscriptor++
Models that can be run locally are currently quite weak compared to the large models that require substantial datacenter infrastructure. As a result, they are of limited practical use today. While smaller models may become significantly more capable in a few years, at present it often feels like a poor use of money to run them locally. So a true “AI PC” has always seemed very strange to me, since the cost doesn't seem to justify the results.

Services such as OpenRouter offer access to powerful models without logging user prompts, which provides sufficient privacy for the vast majority of users. Those who need absolute privacy, don't mind weak LLMs, and have lots of cash on hand can buy an "AI PC," sure. But how many people like that are out there?
 
Upvote
7 (14 / -7)

C.M. Allen

Ars Tribunus Angusticlavius
6,048
NZXT tried "rental PCs" in 2024 and is now being sued for bait-and-switch, on top of a year's worth of rental costing more than the actual value of the hardware

https://www.extremetech.com/gaming/...ory-pc-rental-scam-issues-pointless-statement
Not really the same thing. I'm talking about 'PCs' being replaced with glorified terminals. Everything exists and is handled 'in the cloud.' All the end user gets is a keyboard and mouse for input and a screen to see the result. The end user never actually has a computer. Or any of the software, games, movies, music, etc. they 'buy.' It's all just streamed video from 'the cloud.'

One: it means that even if the AI bubble pops, all these data centers remain relevant. Two: it dovetails with pricing computer hardware out of the range of the home computing market, because it won't negatively impact profit margins for hardware manufacturers. Three: it makes DRM irrelevant, because the media never leaves the data centers. Four: rights holders can revoke or change 'access' to their products whenever they want, including requiring new 'payment plans' to continue accessing their products, because customers never have access to the media. I could keep going, but I'm pretty sure by now you can see why this would appeal to all the big players. It's all about control -- of the price, the product, the customers, and the market. They get to retain control of everything, which is everything they've ever wanted, from the hardware manufacturers all the way to the ISPs and developers. They can squeeze as much as they want, and the customers have no recourse, no alternatives.
 
Upvote
65 (66 / -1)

jhodge

Ars Tribunus Angusticlavius
8,661
Subscriptor++
Not really the same thing. I'm talking about 'PCs' being replaced with glorified terminals. Everything exists and is handled 'in the cloud.' All the end user gets is a keyboard and mouse for input and a screen to see the result. The end user never actually has a computer. Or any of the software, games, movies, music, etc. they 'buy.' It's all just streamed video from 'the cloud.'

One: it means that even if the AI bubble pops, all these data centers remain relevant. Two: it dovetails with pricing computer hardware out of the range of the home computing market, because it won't negatively impact profit margins for hardware manufacturers. Three: it makes DRM irrelevant, because the media never leaves the data centers. Four: rights holders can revoke or change 'access' to their products whenever they want, including requiring new 'payment plans' to continue accessing their products, because customers never have access to the media. I could keep going, but I'm pretty sure by now you can see why this would appeal to all the big players. It's all about control -- of the price, the product, the customers, and the market. They get to retain control of everything.
Tell the truth: you work for Citrix, right? ;)
 
Upvote
41 (42 / -1)

Lexus Lunar Lorry

Ars Scholae Palatinae
846
Subscriptor++
What many don't understand is that even IF the "AI Bubble" were to completely collapse, prices will never be as cheap as they used to be. When stuff like this happens, companies discover new price ceilings that allow them to maximize margins vs. volume. It starts from the top and it trickles down.

That is the same reason why the 5090 launched at such an absurd price, vs. all the previous generations. NVIDIA and partners realized that yes, people will, in fact, gladly pay thousands for a top of the line GPU.
I wonder if Xi Jinping will save us here. His push for made-in-China silicon will probably spin off some chip manufacturers that decide to go into memory instead of CPUs/GPUs. They can then dump their products onto the Western world just like they did with solar panels and EVs. And to be honest, given the anti-consumer behavior of our domestic firms, I'm not sure that I'd shed any tears for them.

Perhaps in the future, the only affordable RAM for gamers will be from QAFIJKL brands on Amazon. The only downside will be that the chips will automatically replace any Tank Man pictures in your memory with Tiananmen Square tourist photos.
 
Upvote
39 (42 / -3)

SportivoA

Ars Tribunus Militum
1,527
I'm curious: how does this affect SoC shops like Apple? Are they being charged more by the fabs because of the AI RAM demand? Or are they mostly unaffected because they're not using the same pipelines?
RAM and logic (processors) are entirely different processes and fabrication facilities, with different optimizations and potential margins. Memory goes boom/bust with commodity, built-to-an-interchangeable-standard products, while cutting-edge (or even decent) logic has a long list of customers after the new performance for differentiation of custom designs. Apple, of course, gets to be The Biggest Customer for a lot of these places (also power semiconductors, screens, and batteries) and will have existing preferential contracts that reduce their uncertainty in this absurd market on either set of silicon processes. Of course, if this keeps up long enough, even Apple might run out of margin on their RAM/storage upsells.
 
Upvote
41 (41 / 0)

Ganz

Ars Scholae Palatinae
757
I love that even my 82 year old mom recognizes (without me telling her) that "AI" is a buzzword to avoid in sales situations, like "blockchain" and "NFT" and "crypto" and (in 2019) "5G".
I sure wish more people would remember when "being first" to "5G" was hyped as a full-blown national security emergency. Did we "win"? What prizes did we win or lose?
 
Upvote
81 (82 / -1)

Marlor_AU

Ars Tribunus Angusticlavius
7,669
Subscriptor
I'm curious: how does this affect SoC shops like Apple? Are they being charged more by the fabs because of the AI RAM demand? Or are they mostly unaffected because they're not using the same pipelines?
Apple still uses external DRAM suppliers (Samsung, SK Hynix and Micron). The DRAM dies are stacked onto the Apple silicon during manufacture.

So Apple is as vulnerable as anyone else. However, it's possible they have long-term supply arrangements in place that could insulate them from the immediate pricing shock.
 
Upvote
72 (73 / -1)

pond-iridium.2q

Ars Centurion
253
Subscriptor
Nevermind "AI PCs", if the RAM shortage doesn't end soon, Apple's prices to upgrade their computers beyond the base configuration memory will start to look almost reasonable. And I believe that's on the list of portents along with dogs and cats living together.
I compared a Strix and a Framework PC with 128GB of RAM against a Studio M4 Max with 128GB of RAM… only a $400 difference. Considering the Mac can use all of that RAM (minus the OS's needs) for what the LLM needs from the GPU cores, while the others can only address 96GB for the GPU, I'd say that's a fair price difference.
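For anyone weighing that trade-off, here's a rough back-of-envelope sketch of how much memory a local model wants at a given size and quantization. The 20% runtime overhead is an assumed ballpark for KV cache and buffers, not a measured figure, and the example model sizes are purely illustrative:

```python
def llm_ram_gb(params_billion: float, bits_per_weight: int,
               overhead_frac: float = 0.2) -> float:
    """Rough memory estimate for loading an LLM's weights.

    params_billion: model size in billions of parameters
    bits_per_weight: quantization level (16 = fp16, 8 = int8, 4 = 4-bit)
    overhead_frac: assumed headroom for KV cache and runtime buffers
    """
    weight_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits ~ 1 GB
    return weight_gb * (1 + overhead_frac)

# A hypothetical 100B-parameter model at 8-bit quantization needs
# roughly 120 GB: it fits in a 128 GB unified-memory pool but not
# in a 96 GB GPU-addressable window.
print(llm_ram_gb(100, 8))
```

Whether the unified-memory premium is "fair" then comes down to whether the models you want to run land in that gap between 96 and ~128 GB.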
 
Upvote
16 (19 / -3)

pond-iridium.2q

Ars Centurion
253
Subscriptor
I'm curious: how does this affect SoC shops like Apple? Are they being charged more by the fabs because of the AI RAM demand? Or are they mostly unaffected because they're not using the same pipelines?
Apple has long term contracts that will protect their price points and supply for a few more years. I think I read through the G6.
 
Upvote
10 (10 / 0)
Not really the same thing. I'm talking about 'PCs' being replaced with glorified terminals. Everything exists and is handled 'in the cloud.' All the end user gets is a keyboard and mouse for input and a screen the see the result. The end user never actually has a computer. Or any of the software, games, movies, music, etc they 'buy.' It's all just streamed video from 'the cloud.'
Wouldn't that require a tremendous amount of investment in telecom infrastructure to handle everyone streaming everything all the time? (Not to mention people stuck on rural DSL or satellite with limited bandwidth and/or terrible latency.)

if this RAM cartel nonsense keeps up, hopefully someone will come up with a way to import grey market Chinese RAM once they get some more fabs spun up
 
Upvote
5 (7 / -2)
I compared a Strix and a Framework PC with 128GB of RAM against a Studio M4 Max with 128GB of RAM… only a $400 difference. Considering the Mac can use all of that RAM (minus the OS's needs) for what the LLM needs from the GPU cores, while the others can only address 96GB for the GPU, I'd say that's a fair price difference.
That's only an issue on Windows; Linux can use as much memory as you're willing to give it.
Apple has long term contracts that will protect their price points and supply for a few more years. I think I read through the G6.
Source? From my understanding they're fine for this quarter but next quarter could see them hit with large price hikes. They're still in negotiations for 2026 orders: https://www.digitimes.com/news/a20260107VL211/apple-dram-2026-nand-production.html
 
Upvote
22 (24 / -2)