The idea of AI PCs isn’t inherently a bad thing.
> The optimist in me wants RAM pricing to bring in a year of performance optimizations to get some bloat out of software. If systems are shipping with less RAM, it sure would be nice to see OSs, applications, and websites optimize for more efficiency.

They'll optimize for lock-in and recurring income streams, and you'll enjoy your bug-ridden, stuttering bloatfest.
As Ars Technica has reported, the growing demands of data centers, fueled by the AI boom, have led to a shortage of RAM and flash memory chips, sending prices skyrocketing.
> On the one hand: "RAM prices have soared, which is bad news for people interested in buying, building, or upgrading a computer this year..."
> But at least we can rest easy, because now: "why sell you the means only once when they can sell you the 'service' forever!"
> It's just greed, all the way down.

I'm sure there are 'think tanks' working out how to dismantle the personal computer market and replace it with 'computing as a service' cloud systems, so customers own nothing. Ever. It's what big software and game publishers have wanted all along. It would also massively increase the barrier to entry for the software/game development business, allowing the existing big players to more easily monopolize the market. At every end of this push, the 'big players' win and everyone else loses, which is precisely why it's the outcome those big players will try for.
The optimist in me wants RAM pricing to bring in a year of performance optimizations to get some bloat out of software. If systems are shipping with less RAM, it sure would be nice to see OSs, applications, and websites optimize for more efficiency.
> I'm sure there are 'think tanks' working out how to dismantle the personal computer market, to replace it with 'computing as a service' cloud systems, so customers own nothing. Ever.

NZXT tried "rental PCs" in 2024 and is now being sued for bait-and-switch, on top of a year's worth of rental costing more than the actual value of the hardware.
> What many don't understand is that even IF the "AI Bubble" were to completely collapse, prices will never be as cheap as they used to be. When stuff like this happens, companies discover new price ceilings that allow them to maximize margins vs. volume. It starts from the top and it trickles down.
> That is the same reason why the 5090 launched at such an absurd price, vs. all the previous generations. NVIDIA and partners realized that yes, people will, in fact, gladly pay thousands for a top of the line GPU.

Bingo. Prices never go down. If it ever does "pop," these companies will just funnel their marketing budget into propping up the AI house of cards, keeping it all afloat to justify max prices.
> When Robert R. Taylor bought the entire supply of hand pumps, it was not because he needed that much supply for his product; it was to maintain exclusivity in the liquid soap market. He knew that if he bought the entire supply, his competitors would be hamstrung getting competing products to market, giving him an advantage.
> I sincerely doubt that OpenAI needs the 40% of global DRAM wafers (not chips, packages, modules, or cards) that it has bought.

What amazes me is that OpenAI is a money furnace that has been burning capital like there's no tomorrow, and yet somehow SK Hynix, Samsung, and Micron seem to think it's good for the money. Even companies that are building data centers are starting to question whether OpenAI will be able to pay when the bill comes.
> NZXT tried "rental PCs" in 2024 and is now being sued for bait-and-switch, on top of a year's worth of rental costing more than the actual value of the hardware.
> https://www.extremetech.com/gaming/...ory-pc-rental-scam-issues-pointless-statement

They are being sued for misrepresenting configurations and other shady nonsense. See this GamersNexus video for details:
> I keep testing LLMs on areas that might be considered "tricky" where I'm an actual expert.
> I'm seeing if they get the answers right, or are very confidently wrong and try to gaslight me on this.
> So far I'm seeing about a 25% failure rate, as of a month ago.

Only 25%? Anecdotally, the failure rate for LLM-based tools seems to be upwards of 90%.
> What many don't understand is that even IF the "AI Bubble" were to completely collapse, prices will never be as cheap as they used to be. When stuff like this happens, companies discover new price ceilings that allow them to maximize margins vs. volume. It starts from the top and it trickles down.
> That is the same reason why the 5090 launched at such an absurd price, vs. all the previous generations. NVIDIA and partners realized that yes, people will, in fact, gladly pay thousands for a top of the line GPU.

You're using the launch cost of a top-end GPU at the height of the AI bubble as evidence that prices will remain high when the AI bubble collapses? How does that work?
> NZXT tried "rental PCs" in 2024 and is now being sued for bait-and-switch, on top of a year's worth of rental costing more than the actual value of the hardware.
> https://www.extremetech.com/gaming/...ory-pc-rental-scam-issues-pointless-statement

Not really the same thing. I'm talking about 'PCs' being replaced with glorified terminals. Everything exists and is handled 'in the cloud.' All the end user gets is a keyboard and mouse for input and a screen to see the result. The end user never actually has a computer. Or any of the software, games, movies, music, etc. they 'buy.' It's all just streamed video from 'the cloud.'
> Not really the same thing. I'm talking about 'PCs' being replaced with glorified terminals. Everything exists and is handled 'in the cloud.' All the end user gets is a keyboard and mouse for input and a screen to see the result. The end user never actually has a computer. Or any of the software, games, movies, music, etc. they 'buy.' It's all just streamed video from 'the cloud.'

Tell the truth: you work for Citrix, right?
One: it means that even if the AI bubble pops, all these data centers remain relevant.
Two: it works with pricing computer hardware out of the range of the home computing market, because it won't negatively impact profit margins for hardware manufacturers.
Three: it makes DRM irrelevant, because the media never leaves the data centers.
Four: rights holders can revoke or change 'access' to their products whenever they want, including requiring new 'payment plans' to continue accessing their products, because customers never have access to the media.

I could keep going, but I'm pretty sure by now you can see why this would appeal to all the big players. It's all about control -- of the price, the product, the customers, and the market. They get to retain control of everything.
> What many don't understand is that even IF the "AI Bubble" were to completely collapse, prices will never be as cheap as they used to be. When stuff like this happens, companies discover new price ceilings that allow them to maximize margins vs. volume. It starts from the top and it trickles down.
> That is the same reason why the 5090 launched at such an absurd price, vs. all the previous generations. NVIDIA and partners realized that yes, people will, in fact, gladly pay thousands for a top of the line GPU.

I wonder if Xi Jinping will save us here. His push for made-in-China silicon will probably spin off some chip manufacturers that decide to go into memory instead of CPUs/GPUs. They can then dump their products onto the Western world just like they did with solar panels and EVs. And to be honest, given the anti-consumer behavior of our domestic firms, I'm not sure that I'd shed any tears for them.
> Tell the truth: you work for Citrix, right?

I'm sure as hell not getting paid like that!
> I'm curious: how does this affect SoC shops like Apple? Are they being charged more by the fabs because of the AI RAM demand? Or are they mostly unaffected because they're not using the same pipelines?

RAM and logic (processors) are entirely different processes and fabrication facilities, with different optimizations and potential margins. Memory goes boom/bust with commodity, built-to-an-interchangeable-standard products, while cutting-edge (or even decent) logic has a long list of customers after the new performance for differentiation of custom designs. Apple, of course, gets to be The Biggest Customer for a lot of these places (also power semiconductors, screens, and batteries) and will have existing preferential contracts that reduce their uncertainty in this absurd market on either set of silicon processes. Of course, if this keeps up long enough, even Apple might run out of margin on their RAM/storage upsells.
> I love that even my 82-year-old mom recognizes (without me telling her) that "AI" is a buzzword to avoid in sales situations, like "blockchain" and "NFT" and "crypto" and (in 2019) "5G".

I sure wish more people would remember when "being first" to "5G" was hyped as a full-blown national security emergency. Did we "win"? What prizes did we win or lose?
> I'm curious: how does this affect SoC shops like Apple? Are they being charged more by the fabs because of the AI RAM demand? Or are they mostly unaffected because they're not using the same pipelines?

Apple still uses external DRAM suppliers (Samsung, SK Hynix, and Micron). The DRAM dies are stacked onto the Apple silicon during manufacture.
> Nevermind "AI PCs", if the RAM shortage doesn't end soon, Apple's prices to upgrade their computers beyond the base configuration memory will start to look almost reasonable. And I believe that's on the list of portents along with dogs and cats living together.

I compared a Strix and a Framework PC with 128GB of RAM against a Studio M4 Max with 128GB of RAM… only a $400 difference, and considering the Mac can use all of that RAM (minus the OS needs) for what the LLM needs from the GPU cores, while the others can only address 96GB for the GPU, I'd say that's a fair price difference.
> I'm curious: how does this affect SoC shops like Apple? Are they being charged more by the fabs because of the AI RAM demand? Or are they mostly unaffected because they're not using the same pipelines?

Apple has long-term contracts that will protect their price points and supply for a few more years. I think I read through the G6.
> Not really the same thing. I'm talking about 'PCs' being replaced with glorified terminals. Everything exists and is handled 'in the cloud.' All the end user gets is a keyboard and mouse for input and a screen to see the result. The end user never actually has a computer. Or any of the software, games, movies, music, etc. they 'buy.' It's all just streamed video from 'the cloud.'

Wouldn't that require a tremendous amount of investment in telecom infrastructure to handle everyone streaming everything all the time? (Not to mention people stuck on rural DSL or satellite with limited bandwidth and/or terrible latency.)
> I compared a Strix and a Framework PC with 128GB of RAM against a Studio M4 Max with 128GB of RAM… only a $400 difference, and considering the Mac can use all of that RAM (minus the OS needs) for what the LLM needs from the GPU cores, while the others can only address 96GB for the GPU, I'd say that's a fair price difference.

That's only an issue on Windows; Linux can use as much memory as you're willing to give it.
> Apple has long-term contracts that will protect their price points and supply for a few more years. I think I read through the G6.

Source? From my understanding they're fine for this quarter, but next quarter could see them hit with large price hikes. They're still in negotiations for 2026 orders: https://www.digitimes.com/news/a20260107VL211/apple-dram-2026-nand-production.html