Nvidia fixes the 8GB RAM problem with one of its GPUs—if you can pay for it

quamquam quid loquor

Ars Tribunus Militum
2,901
Subscriptor++
Just a note on my unrelated experience with the Framework Desktop AI Max+ 395, their graphics drivers are absolutely trash.

My Windows desktop locked up every single day when using Framework's official driver package: sparkles on the screen, then a reboot into 1080p with a driver failure. Solved by replacing them with AMD's official graphics drivers.

Laptop drivers are even more finicky than desktop ones in my experience, so proceed with caution.
 
Upvote
28 (31 / -3)

AdamM

Ars Praefectus
5,935
Subscriptor
I'm not completely grasping the implication. Does the $1200 reflect the inflated price we should expect from now on for other GPUs, desktop or mobile, or is it just this one instance?
Basically, prices are up because suppliers are charging more. Short of that changing, the same will happen with other GPUs.
 
Upvote
16 (16 / 0)
I can't believe anyone is willingly buying nvidia gpus at this point. AMDs prices are inflated too but still cost about half as much as nvidia. Between that and the DLSS5 slop it was an easy choice to go AMD in my new system I had to build to replace my 10 year old rig
DLAA is, subjectively, the best-looking anti-aliasing, and that is just DLSS without any upscaling. New games have increasingly been leaning on AI upscaling just to hit acceptable performance, so DLSS/FSR/XeSS are probably here to stay.
 
Upvote
6 (14 / -8)

torque2k

Ars Praetorian
497
Subscriptor++
I can't believe anyone is willingly buying nvidia gpus at this point. AMDs prices are inflated too but still cost about half as much as nvidia. Between that and the DLSS5 slop it was an easy choice to go AMD in my new system I had to build to replace my 10 year old rig
Remember, we're not all gamers out here in this RAM-starved world. Running local AI models helps keep our data from the grabby paws of Big AI, and Nvidia has the processing power AND the RAM to make it happen better (right now). Going with a 12GB laptop GPU vs. 8GB means I can run a 9B higher-quant version of Qwen 3.5 (still having issues with 3.6 here) with room to drive dual 4K external displays at my desk vs. a 4B parameter version of the model. MUCH better coding assistant.

Mind, unified RAM like Apple has (and AMD has been working on bettering) is a better option for AI model size concerns, but not for true speed. Can't wait until this all gets figured out magically! :)
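For readers wondering how the 12GB-vs-8GB math above shakes out, here's a rough back-of-the-envelope VRAM estimator. The 1.2x overhead factor (KV cache, runtime buffers) is an assumption for illustration; real usage varies by runtime, quant format, and context length.

```python
# Rough VRAM estimate for a quantized LLM: weight bytes plus a fudge
# factor for KV cache and runtime overhead. Ballpark only, not exact.

def vram_estimate_gb(params_billions: float, bits_per_weight: int,
                     overhead_factor: float = 1.2) -> float:
    """Approximate VRAM (GB) needed to run a model at a given quantization."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead_factor / 1e9

# A 9B model at 8-bit wants ~10.8 GB (needs the 12GB card); at 4-bit,
# ~5.4 GB. A 4B model at 8-bit is ~4.8 GB -- comfortable on 8GB.
for params, bits in [(9, 8), (9, 4), (4, 8)]:
    print(f"{params}B @ {bits}-bit: ~{vram_estimate_gb(params, bits):.1f} GB")
```

This is why the jump from 8GB to 12GB moves you up a whole model-size tier rather than just adding headroom.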
 
Upvote
22 (30 / -8)
Remember, we're not all gamers out here in this RAM-starved world. Running local AI models helps keep our data from the grabby paws of Big AI, and Nvidia has the processing power AND the RAM to make it happen better (right now). Going with a 12GB laptop GPU vs. 8GB means I can run a 9B higher-quant version of Qwen 3.5 (still having issues with 3.6 here) with room to drive dual 4K external displays at my desk vs. a 4B parameter version of the model. MUCH better coding assistant.

Mind, unified RAM like Apple has (and AMD has been working on bettering) is a better option for AI model size concerns, but not for true speed. Can't wait until this all gets figured out magically! :)
As someone forced to use AI for work, I cannot comprehend wanting to run it at home, even locally, especially at the price tag that Nvidia is demanding for it.
 
Upvote
45 (50 / -5)
DLAA is, subjectively, the best-looking anti-aliasing, and that is just DLSS without any upscaling. New games have increasingly been leaning on AI upscaling just to hit acceptable performance, so DLSS/FSR/XeSS are probably here to stay.
The AA and upscaling is fine. I don't even mind frame generation, but saying that will start a massive flame war.

DLSS5 isn't that. DLSS5 is the "yasify" filter they showed off recently.
 
Upvote
47 (48 / -1)
Just a note on my unrelated experience with the Framework Desktop AI Max+ 395, their graphics drivers are absolutely trash.
Fixed. Framework's driver support (ultimately their entire software side in general, drivers and BIOS) is just plain bad. Bought the 7840U and it was like 6 months before that was stable. Bought the VRR 120Hz display and it was like 9 months before Framework's drivers officially supported variable refresh rates in Windows.

And as Framework continues to bring out more hardware and stretch their limited resources (compared to someone on the level of Dell, Asus, etc.) further it makes me wonder how much longer this can work.

I'm half convinced this is part of why they're pushing Linux so hard, because then they can just rely on the kernel drivers.
 
Upvote
9 (12 / -3)

quamquam quid loquor

Ars Tribunus Militum
2,901
Subscriptor++
Fixed. Framework's driver support (ultimately their entire software side in general, drivers and BIOS) is just plain bad. Bought the 7840U and it was like 6 months before that was stable. Bought the VRR 120Hz display and it was like 9 months before Framework's drivers officially supported variable refresh rates in Windows.

And as Framework continues to bring out more hardware and stretch their limited resources (compared to someone on the level of Dell, Asus, etc.) further it makes me wonder how much longer this can work.

I'm half convinced this is part of why they're pushing Linux so hard, because then they can just rely on the kernel drivers.
At least with the desktop I got a relative "steal" on 128gb unified RAM. Buying their laptops at these prices just feels like robbery.

Dell is the only windows laptop I trust to be stable and up to date.
 
Upvote
2 (4 / -2)
I have an aging 3070 FE in my gaming rig, and at those prices it will be the last Nvidia card I own. I'm not expecting anything to improve until the AI bubble pops. AI isn't going away, but billions in speculative cash being pumped into direct competitors just to hedge bets can't go on forever, and neither can the 'we're tying up all our future products for future GPUs that will go in future data centers that aren't getting built because they haven't got the power infrastructure available' routine.
 
Upvote
3 (5 / -2)
FWIW, in Linux land there's an effort to manage VRAM better by more quickly evicting video data allocated to processes that aren't running in the foreground. The details are here. From my own experience, Windows has a similar problem with poor management of VRAM contents, causing the same stuttering symptoms but in a different way. Linux land is sometimes more sensitive to resource-utilization bottlenecks, and to the market forces that cause scarcity while most of the money gets shoved into features that largely help the hyperscalers. I mean, compared to the broad Linux user base, only a few entities care whether a single Linux instance scales past 512 cores or handles more than a TB of system RAM (numbers are illustrative only; I don't know what the current scaling limits are).

That said, for AMD GPU owners: unless you've got a rolling-release distro like CachyOS, with its extensive back-porting of game-relevant bug patches into its kernels, you're possibly going to be plagued by AMDGPU driver bugs that have accumulated in the Linux driver over the past 19 or so months (since kernel v6.12.x). I can't recommend Linux if you have an AMD GPU until this mess has been sorted out and the patch sets are fully accepted, then trickle down through the various distros (which can take months, and there's no guarantee new bugs won't crop up in the meantime). There are multiple bugs, and they manifest in different ways. I have a game that would trigger the page-flip race bug every time it hit a cinematic. Others on laptops have frequent issues with APU power management.

I don't hit those bugs in every 3D program/game, and it's only a couple of games I play that they've affected. But there have been plenty of complaints, and only recently did a concerned citizen with the skill and motivation to fix the problem show up. AMD hasn't been assed to do so, chasing after AI unicorns and high-end hardware profits (which apparently these bugs don't directly affect? shrug They seem to be more desktop-environment specific.)
 
Upvote
6 (6 / 0)
At least with the desktop I got a relative "steal" on 128gb unified RAM. Buying their laptops at these prices just feels like robbery.

Dell is the only windows laptop I trust to be stable and up to date.
When's the last time you used a Dell laptop? Their XPS line is fucking awful.
 
Upvote
9 (9 / 0)
At least with the desktop I got a relative "steal" on 128gb unified RAM. Buying their laptops at these prices just feels like robbery.

Dell is the only windows laptop I trust to be stable and up to date.
... boy have I the Dell horror stories... (and I'll leave it at that because every vendor has them, no OEMs deserve any kind of trust like that)
 
Upvote
12 (12 / 0)

Fred Duck

Ars Tribunus Angusticlavius
7,282
I don't even mind frame generation, but saying that will start a massive flame war.
Frame War started!

Employer: G. Pu! You're here! I've paid a pretty penny for you.
G. Pu: I'm the best there is at what I do.
Employer: Fantastic! Here are a set of maths problems! Please solve them.
G. Pu: Done. Well, I did a few of them and just made up the rest.
Employer: Wait, what am I paying you for?
G. Pu: If you look at them fast enough, you won't even notice which are fake.
Employer: T_T

Dell is the only windows laptop I trust to be stable and up to date.
Wild Stallions Rule!.jpg
 
Upvote
6 (7 / -1)
A 12GB 4070 is a 4070ti (or really a 4080) so this kinda feels like the 4070ti is on the outs and this is their far overpriced solution.
Only the amount of memory has changed. The memory bus is the same, the GPU is the same. They're just using denser GDDR7 packages to squeeze a little extra memory in there. Performance when VRAM isn't exhausted will be the same.

It very much is not a 5070 Ti. Don't give NV ideas; they already get cute with product names that suggest two products are more different than they are, or hide how much worse one is than another (they do it both ways).
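The capacity arithmetic behind "denser packages, same bus" is simple: GDDR chips each sit on a 32-bit channel, so a 128-bit bus hosts four chips, and total VRAM is just chip count times per-chip density. A minimal sketch, assuming the standard 2GB (16Gbit) and 3GB (24Gbit) GDDR7 densities:

```python
# VRAM capacity from bus width and chip density. Each GDDR chip uses a
# 32-bit interface, so the bus width fixes the chip count; capacity then
# scales only with how dense each chip is.

def vram_capacity_gb(bus_width_bits: int, chip_capacity_gb: int) -> int:
    chips = bus_width_bits // 32      # one 32-bit channel per chip
    return chips * chip_capacity_gb

print(vram_capacity_gb(128, 2))   # 8  -> original 8GB config
print(vram_capacity_gb(128, 3))   # 12 -> denser-chip refresh, same bus
```

Same four chips, same 128-bit bus, same bandwidth; only the density changed.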
 
Upvote
16 (16 / 0)

Jivejebus

Ars Scholae Palatinae
737
Subscriptor++
I can't believe anyone is willingly buying nvidia gpus at this point. AMDs prices are inflated too but still cost about half as much as nvidia. Between that and the DLSS5 slop it was an easy choice to go AMD in my new system I had to build to replace my 10 year old rig
I'm right there with you. At a friend's over the weekend a buddy was showing off a system he built for another friend with a 5070ti. Of all the GPUs to buy that's got to be by far the worst value. Looking at my local Microcenter, which he could have used, a 9070xt would have been $300 cheaper and another $200 would have gotten a 5080. $1000 for a 5070ti is crazy town
 
Upvote
11 (12 / -1)
I can't believe anyone is willingly buying nvidia gpus at this point. AMDs prices are inflated too but still cost about half as much as nvidia. Between that and the DLSS5 slop it was an easy choice to go AMD in my new system I had to build to replace my 10 year old rig
For gaming, sure. I agree. But for compute Nvidia's CUDA is the only game in town that's both generally reliable and turn-key. Intel is still too new and working through various problems related to that newness in both the hardware and software stack. They're catching up fast on the software side with oneAPI, though.

AMD... sigh ROCm is such a fucking mess beyond the kernel driver bugs I pointed out above.
 
Upvote
14 (15 / -1)

Kebba

Ars Scholae Palatinae
972
Subscriptor
I'm not completely grasping the implication. Does the $1200 reflect the inflated price we should expect from now on for other GPUs, desktop or mobile, or is it just this one instance?
I would guess that it might hit different manufacturers differently, and at different times. Somebody with really large volumes and a well-timed contract might be better shielded than a smaller player, or than somebody negotiating for increased supply. Granted, if everybody else raises prices, a manufacturer with OK pricing might still increase along with the rest of the market. So yeah, costs are up across the board, but I could imagine Framework being hit particularly badly since they're relatively small.

For reference, my employer is a quite small consumer of RAM chips, and I heard that price increases were 10-20X, assuming you could even get the chips regardless of what you were willing to pay.
 
Upvote
4 (4 / 0)
I'm right there with you. At a friend's over the weekend a buddy was showing off a system he built for another friend with a 5070ti. Of all the GPUs to buy that's got to be by far the worst value. Looking at my local Microcenter, which he could have used, a 9070xt would have been $300 cheaper and another $200 would have gotten a 5080. $1000 for a 5070ti is crazy town
And boy, the absurd prices from AMD and Nvidia are probably the best thing to happen to Intel's Arc in terms of opportunity. They get to be the strong value player and hopefully stay in the game.
 
Upvote
0 (0 / 0)
I can't believe anyone is willingly buying nvidia gpus at this point. AMDs prices are inflated too but still cost about half as much as nvidia. Between that and the DLSS5 slop it was an easy choice to go AMD in my new system I had to build to replace my 10 year old rig
They aren't half the price. The 9070XT is about $700 and the 5070TI is about $1000. That's the worst case comparison I could see. The 16GB 9060XT is about $450 and the 16GB 5060TI is about $500.
 
Upvote
6 (7 / -1)
One could think that, for gaming, this all means a resurgence of console gaming, but consoles were first to announce price hikes and delays because of the AI crisis.
So... perhaps a resurgence in analog gaming vs. digital?
I'll wipe the dust off my Fighting Fantasy book series and tabletop RPG collection. :p
 
Upvote
3 (4 / -1)

evan_s

Ars Tribunus Angusticlavius
7,410
Subscriptor
I'm right there with you. At a friend's over the weekend a buddy was showing off a system he built for another friend with a 5070ti. Of all the GPUs to buy that's got to be by far the worst value. Looking at my local Microcenter, which he could have used, a 9070xt would have been $300 cheaper and another $200 would have gotten a 5080. $1000 for a 5070ti is crazy town

The 5070 Ti and 5080 both aren't great values. From a perf-per-$ perspective, I don't think a $1200 5080 is any better than a $1000 5070 Ti. That's a 15-20% improvement in performance for a 20% increase in price. At best you're staying flat while overpaying for the performance, and at worst the value is going down.
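To make the flat-value point concrete, a quick sketch using the figures from the post (the prices and the 15-20% uplift range are the post's numbers, not benchmark results):

```python
# Perf-per-dollar comparison: a 5080 that is 15-20% faster than a
# 5070 Ti for 20% more money is flat-to-worse value.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    return relative_perf / price

p5070ti  = perf_per_dollar(1.00, 1000)   # baseline card
p5080_lo = perf_per_dollar(1.15, 1200)   # 15% uplift case
p5080_hi = perf_per_dollar(1.20, 1200)   # 20% uplift case

print(p5080_lo / p5070ti)   # ~0.958 -> worse value than the 5070 Ti
print(p5080_hi / p5070ti)   # ~1.0   -> at best breaking even
```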
 
Upvote
2 (3 / -1)

SirDanglyBits

Smack-Fu Master, in training
15
When's the last time you used a Dell laptop? Their XPS line is fucking awful.
Their support is also not the best. I had a Precision mobile workstation die a couple of months ago due to some sort of hardware failure; that laptop was only a month old. It wasn't right from day one. The touchpad was stuck on maximum sensitivity and no matter what I tried I couldn't change that. Even booting it into Linux it behaved the same, so that ruled out a software issue. The fingerprint reader only worked when it wanted to, which was about half the time. The fans constantly ran at full speed for no apparent reason, so it was really loud. Sometimes the screen would just totally glitch out, similar to what you get with a bad display ribbon. However, a reboot always brought the display back, so it obviously wasn't a bad display ribbon. Then it started randomly turning itself off. It didn't matter if I was using it or if it was just sitting there idle, it would just randomly turn itself off. Finally it died completely and wouldn't even power on.

It was a work computer and we have premium support, so they sent a tech to my house to fix it. The tech was unable to fix it and said I had to mail it in for repair. So I mailed it to them, and a week later they sent it back, still completely dead; it wouldn't even power on. So I sent it to Dell again, telling them they forgot to fix it. They sent it back a week later and this time it was actually fixed. Since then it has worked fine. But I lost a lot of time and productivity going through all that.
 
Upvote
2 (2 / 0)
The 5070ti and 5080 both aren't great values. On Perf per $ perspective I don't think a $1200 5080 is any better than a $1000 5070ti. That's a 15-20% improvement in performance for a 20% improvement in price. At best you are staying flat and being over priced for the performance and at worst it's going down.
GPU prices are insane right now so it's hard to call anything a "great value", but going higher has always had diminishing returns in the "performance/$" metric.

I get why people use that as a metric, but I kinda hate it. It's arguably useful in certain contexts, but as an absolute it's pretty useless. A GPU might score high in "performance/$" because it's genuinely good, or because it's cheap. So really it's only useful if you're either talking about comparable GPUs, in which case you're just saying one is cheaper than the other and expressing it as a ratio doesn't help anyone, or if you want to demonstrate how expensive the top end is.

But at the end of the day, if you need a 10090XTiX to run the games you want to play, at the quality you're after, at the resolution and framerate of your choice, then how much better the bottom-end card scores in perf/$ is entirely irrelevant. 100X better perf/$ is a waste of money if that means a 10fps slideshow.
 
Upvote
5 (5 / 0)
So has there ever been a confirmation of the actual silicon shortage? Eight months later, and there's still a "shortage"? I'm a little skeptical on the length of this so called shortage.

There's some silicon suppliers definitely getting rich out there.
The AI bubble will consume every last drop of memory it can get its hands on. These are massively complex manufacturing processes; output can't just increase tenfold overnight.
 
Upvote
6 (6 / 0)
Their support is also not the best. I had a precision mobile workstation die a couple months ago due to some sort of hardware failure, that laptop was only a month old. It wasn't right from day one, the touchpad was stuck on maximum sensitivity and no matter what I tried I couldn't change that. Even booting it to Linux it behaved the same, so that ruled out a software issue. The finger print reader only worked when it wanted to, which was about half the time. The fans constantly ran at full speed for no aparent reason, so it was really loud. Sometimes the screen would just totally glitch out, similar to what you get when you have a bad display ribbon. However a reboot always brought the display back, so it ovbiously wasn't a bad display ribbon. Then it started just randomly turning itself off. It didn't matter if I was using it or if it was just sitting there idle, it would just randomly turn itself off. Finally it died completely and wouldn't even power on. It was a work comupter and we have premium support, so they sent a tech to my house to fix it. The tech was unable to fix it and said I had to mail it in for repair. So I mailed it to them and a week later they sent it back, still completely dead, wouldn't even power on. So I sent it to Dell again, telling them they forgot to fix it. They sent it back a week later and this time it was actually fixed. SInce then it has worked fine. But I lost a lot of time and productivity going through all that.
Last time I had to talk to Dell customer support I was like "okay, the backlight is definitely dead. I can see things on the screen if I hold a flashlight up to it, but it's not making any of its own light" and the guy was like "all right, we have to go through the troubleshooting steps though" and ninety minutes later after having me try every possible thing on Earth he was like [chipper] "okay, it looks like your screen's backlight is out! We'll have to schedule a technician to repair it!"
 
Upvote
12 (12 / 0)
GPU prices are insane right now so it's hard to call anything a "great value", but going higher has always had diminishing returns in the "performance/$" metric.

I get why people use that as a metric, but I kinda hate it. It's arguably useful in certain contexts, but as an absolute it's pretty useless. A GPU might score high in "performance/$" because it's genuinely good, or because it's cheap. So really it's only useful if you're either talking about comparable GPUs, in which case you're just saying one is cheaper than the other and expressing it as a ratio doesn't help anyone, or if you want to demonstrate how expensive the top end is. But at the end of the day, if you need a 10090XTiX to run the games you want to play, at the quality you're after, at the resolution and framerate of your choice, then how much better the bottom end card scores in perf/$ is entirely irrelevant. 100X better perf/$ is a waste of money if that means a 10fps slideshow.
Yeah. I only upgrade GPUs when I can no longer play some AAA games I really want to play at a suitable graphics quality. Whenever that happens I generally just check benchmarks/reviews and find the cheapest GPU that will play whatever I'm looking to play at pretty high settings, maybe with a little overhead budget permitting.

So anyway I haven't bought a GPU since about Baldur's Gate 3.
 
Upvote
3 (3 / 0)

HiWire

Ars Scholae Palatinae
762
Yes, paying that much for a mobile 5070 with a 128-bit bus is stupid. Not recommended at all, and Framework's pricing should be warning enough.

The mobile 5070's bus has a theoretical bandwidth of 384 GB/s, while the mobile 5070 Ti's is 672 GB/s (with a 192-bit bus width). The performance gain from going to 12GB of VRAM is so small relative to the price that it's ludicrous.

https://en.wikipedia.org/wiki/GeForce_RTX_50_series#Mobile

Let's take a slightly unfair comparison: the desktop AMD Radeon RX 9060 XT 16GB ($349 USD) vs. the mobile Nvidia RTX 5070. The 9060 XT is 5-40% (an aggregate 25%) faster for a lot less money:

https://technical.city/en/video/GeForce-RTX-5070-mobile-vs-Radeon-RX-9060-XT-16GB
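The bandwidth figures above follow directly from bus width times per-pin data rate. The data rates below (24 and 28 Gbps) are the ones implied by the quoted 384 and 672 GB/s numbers, not official spec citations:

```python
# Peak memory bandwidth: bus width (bits, converted to bytes) times the
# per-pin data rate. Reproduces the figures quoted above.

def bandwidth_gb_per_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_per_s(128, 24))   # 384.0 -> mobile 5070 (128-bit)
print(bandwidth_gb_per_s(192, 28))   # 672.0 -> mobile 5070 Ti (192-bit)
```

Which is the point about the 12GB refresh: more capacity on the same 128-bit bus does nothing for bandwidth.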
 
Last edited:
Upvote
0 (0 / 0)
Ow, and that's just for laptop GPU performance. I think I'll stick with integrated for now. Both AMD and Intel have made pretty good strides with integrated graphics, and they perform well enough for all of the great indie games out there. These are dark times for AAA gamers (and AAA game sales as well).
 
Upvote
1 (1 / 0)

SirDanglyBits

Smack-Fu Master, in training
15
Last time I had to talk to Dell customer support I was like "okay, the backlight is definitely dead. I can see things on the screen if I hold a flashlight up to it, but it's not making any of its own light" and the guy was like "all right, we have to go through the troubleshooting steps though" and ninety minutes later after having me try every possible thing on Earth he was like [chipper] "okay, it looks like your screen's backlight is out! We'll have to schedule a technician to repair it!"
Yeah, they wanted to troubleshoot with me as well. I told them it doesn't power on. They said, okay, well, let's troubleshoot that. They had me try all kinds of dumb stuff, but regardless, it would not power on. Like, what part of "it doesn't power on" do you not understand? LOL!
 
Upvote
0 (2 / -2)