> I'm not completely grasping the implication. Does the $1200 reflect the inflated price we should expect from now on for other GPUs, desktop or mobile, or just this one instance?

Basically, prices are up because suppliers are charging more. Short of that changing, the same will happen with other GPUs.
> Basically, prices are up because suppliers are charging more. Short of that changing, the same will happen with other GPUs.

Thanks.
> I can't believe anyone is willingly buying Nvidia GPUs at this point. AMD's prices are inflated too, but still cost about half as much as Nvidia's. Between that and the DLSS5 slop, it was an easy choice to go AMD in the new system I had to build to replace my 10-year-old rig.

DLAA is, subjectively, the best-looking anti-aliasing, and it is just DLSS without any upscaling. New games have increasingly been leaning on AI upscaling just to hit acceptable performance, so DLSS/FSR/XeSS are probably here to stay.
> I can't believe anyone is willingly buying Nvidia GPUs at this point. AMD's prices are inflated too, but still cost about half as much as Nvidia's. Between that and the DLSS5 slop, it was an easy choice to go AMD in the new system I had to build to replace my 10-year-old rig.

Remember, we're not all gamers out here in this RAM-starved world. Running local AI models helps keep our data from the grabby paws of Big AI, and Nvidia has the processing power AND the RAM to make it happen better (right now). Going with a 12GB laptop GPU instead of 8GB means I can run a higher-quant 9B version of Qwen 3.5 (still having issues with 3.6 here), with room left over to drive dual 4K external displays at my desk, instead of a 4B-parameter version of the model. MUCH better coding assistant.
> Remember, we're not all gamers out here in this RAM-starved world. Running local AI models helps keep our data from the grabby paws of Big AI, and Nvidia has the processing power AND the RAM to make it happen better (right now). Going with a 12GB laptop GPU instead of 8GB means I can run a higher-quant 9B version of Qwen 3.5 (still having issues with 3.6 here), with room left over to drive dual 4K external displays at my desk, instead of a 4B-parameter version of the model. MUCH better coding assistant.

As someone forced to use AI for work, I cannot comprehend wanting to run it at home, even locally, especially at the price tag Nvidia is demanding for it.
Mind, unified RAM like Apple has (and AMD has been working to better) is a better option for AI model-size concerns, but not for raw speed. Can't wait until this all gets figured out magically!
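As a back-of-the-envelope illustration of the 12GB-vs-8GB point above: a quantized model's VRAM footprint is roughly parameter count times bits per weight, plus headroom for the KV cache and runtime. The quantization widths and the 1.5 GB overhead figure below are illustrative assumptions, not measured numbers for any particular Qwen build.

```python
# Rough VRAM estimate for a quantized local LLM.
# bits_per_weight and the overhead allowance are illustrative assumptions.

def model_vram_gb(params_billion: float, bits_per_weight: float,
                  overhead_gb: float = 1.5) -> float:
    """Approximate VRAM use: quantized weights plus a flat allowance
    for KV cache, activations, and runtime overhead."""
    # 1e9 params * bits / 8 bits-per-byte / 1e9 bytes-per-GB
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

print(model_vram_gb(9, 5))  # 9B model at ~5-bit quant: ~7.1 GB, fits 12 GB with headroom
print(model_vram_gb(4, 4))  # 4B model at ~4-bit quant: ~3.5 GB, roughly what 8 GB can spare
```

The gap widens further once a couple of 4K framebuffers and desktop compositing also have to live in the same VRAM pool.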
> DLAA is, subjectively, the best-looking anti-aliasing, and it is just DLSS without any upscaling. New games have increasingly been leaning on AI upscaling just to hit acceptable performance, so DLSS/FSR/XeSS are probably here to stay.

The AA and upscaling are fine. I don't even mind frame generation, but saying that will start a massive flame war.
> Just a note on my unrelated experience with the Framework Desktop AI Max+ 395: their graphics drivers are absolutely trash.

Fixed. Framework's driver support (ultimately their entire software side in general, drivers and BIOS) is just plain bad. Bought the 7840U and it was like 6 months before that was stable. Bought the VRR 120Hz display and it was like 9 months before Framework's drivers officially supported variable refresh rates in Windows.
> Fixed. Framework's driver support (ultimately their entire software side in general, drivers and BIOS) is just plain bad. Bought the 7840U and it was like 6 months before that was stable. Bought the VRR 120Hz display and it was like 9 months before Framework's drivers officially supported variable refresh rates in Windows.

At least with the desktop I got a relative "steal" on 128GB of unified RAM. Buying their laptops at these prices just feels like robbery.
And as Framework continues to bring out more hardware, stretching their limited resources (compared to someone on the level of Dell, Asus, etc.) ever further, it makes me wonder how much longer this can work.
I'm half convinced this is part of why they're pushing Linux so hard, because then they can just rely on the kernel drivers.
>> At least with the desktop I got a relative "steal" on 128GB of unified RAM. Buying their laptops at these prices just feels like robbery.
>
> Dell is the only Windows laptop I trust to be stable and up to date.

When's the last time you used a Dell laptop? Their XPS line is fucking awful.
>> At least with the desktop I got a relative "steal" on 128GB of unified RAM. Buying their laptops at these prices just feels like robbery.
>
> Dell is the only Windows laptop I trust to be stable and up to date.

...boy, do I have Dell horror stories... (and I'll leave it at that, because every vendor has them; no OEM deserves that kind of trust)
> Buy some Micron stock; sell it when you next need to buy RAM.

At this pace, RAM prices are going up more than their stock.
> I don't even mind frame generation, but saying that will start a massive flame war.

Frame War started!
> A 12GB 4070 is a 4070 Ti (or really a 4080), so this kinda feels like the 4070 Ti is on the outs and this is their far-overpriced solution.

Only the amount of memory has changed. The memory bus is the same, the GPU is the same. They're just using denser GDDR7 packages to squeeze a little extra memory in there. Performance when VRAM isn't exhausted will be the same.
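To make the "same bus, same performance" point concrete: peak memory bandwidth is bus width times per-pin data rate, and capacity never enters the formula. The 192-bit bus and 28 Gbps GDDR7 rate below are illustrative assumptions, not confirmed specs for this part.

```python
# Peak memory bandwidth: (bus width in bits / 8) * per-pin rate in Gbps
# gives GB/s. Note that total VRAM capacity does not appear anywhere.

def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

# Denser packages change capacity, not the bus, so bandwidth is identical
# whether 2 GB or 3 GB chips hang off the same 192-bit bus:
print(peak_bandwidth_gbs(192, 28))  # 672.0 GB/s
```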
> I can't believe anyone is willingly buying Nvidia GPUs at this point. AMD's prices are inflated too, but still cost about half as much as Nvidia's. Between that and the DLSS5 slop, it was an easy choice to go AMD in the new system I had to build to replace my 10-year-old rig.

I'm right there with you. At a friend's place over the weekend, a buddy was showing off a system he built for another friend with a 5070 Ti. Of all the GPUs to buy, that's got to be by far the worst value. Looking at my local Microcenter, which he could have used, a 9070 XT would have been $300 cheaper, and another $200 would have gotten him a 5080. $1000 for a 5070 Ti is crazy town.
> I can't believe anyone is willingly buying Nvidia GPUs at this point. AMD's prices are inflated too, but still cost about half as much as Nvidia's. Between that and the DLSS5 slop, it was an easy choice to go AMD in the new system I had to build to replace my 10-year-old rig.

For gaming, sure, I agree. But for compute, Nvidia's CUDA is the only game in town that's both generally reliable and turn-key. Intel is still too new and working through various problems related to that newness in both the hardware and software stack. They're catching up fast on the software side with oneAPI, though.
> I'm not completely grasping the implication. Does the $1200 reflect the inflated price we should expect from now on for other GPUs, desktop or mobile, or just this one instance?

I would guess that it might hit different manufacturers differently and at different times. Somebody with really large volumes and a well-timed contract might be better shielded than a smaller player, or than somebody negotiating for increased supply. Granted, if everybody else raises prices, a manufacturer with OK pricing might still increase along with the rest of the market. So yeah, costs are up across the board, but I could imagine Framework being hit particularly badly, as they are relatively small.
> I'm right there with you. At a friend's place over the weekend, a buddy was showing off a system he built for another friend with a 5070 Ti. Of all the GPUs to buy, that's got to be by far the worst value. Looking at my local Microcenter, which he could have used, a 9070 XT would have been $300 cheaper, and another $200 would have gotten him a 5080. $1000 for a 5070 Ti is crazy town.

And boy, are the absurd prices from AMD and Nvidia probably the best thing to happen to Intel's Arc in terms of opportunity. They get to be the strong value player and hopefully stay in the game.
> I can't believe anyone is willingly buying Nvidia GPUs at this point. AMD's prices are inflated too, but still cost about half as much as Nvidia's. Between that and the DLSS5 slop, it was an easy choice to go AMD in the new system I had to build to replace my 10-year-old rig.

They aren't half the price. The 9070 XT is about $700 and the 5070 Ti is about $1000; that's the worst-case comparison I could see. The 16GB 9060 XT is about $450 and the 16GB 5060 Ti is about $500.
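Taking the street prices quoted above at face value (the comment's numbers, not tracked market data), the ratios work out to roughly 70% and 90%, not 50%:

```python
# Price ratios for the comparisons above, using the prices as quoted.
pairs = [
    ("9070 XT vs 5070 Ti", 700, 1000),
    ("9060 XT 16GB vs 5060 Ti 16GB", 450, 500),
]
for label, amd_price, nvidia_price in pairs:
    ratio = amd_price / nvidia_price
    print(f"{label}: AMD is {ratio:.0%} of the Nvidia price")
```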
> When's the last time you used a Dell laptop? Their XPS line is fucking awful.

Their support is also not the best. I had a Precision mobile workstation die a couple months ago due to some sort of hardware failure; that laptop was only a month old. It wasn't right from day one: the touchpad was stuck on maximum sensitivity, and no matter what I tried I couldn't change that. Even booting it to Linux it behaved the same, so that ruled out a software issue. The fingerprint reader only worked when it wanted to, which was about half the time. The fans constantly ran at full speed for no apparent reason, so it was really loud. Sometimes the screen would just totally glitch out, similar to what you get with a bad display ribbon. However, a reboot always brought the display back, so it obviously wasn't a bad display ribbon. Then it started just randomly turning itself off. It didn't matter if I was using it or if it was just sitting there idle; it would just randomly turn itself off. Finally it died completely and wouldn't even power on.

It was a work computer and we have premium support, so they sent a tech to my house to fix it. The tech was unable to fix it and said I had to mail it in for repair. So I mailed it to them, and a week later they sent it back, still completely dead; it wouldn't even power on. So I sent it to Dell again, telling them they forgot to fix it. They sent it back a week later, and this time it was actually fixed. Since then it has worked fine. But I lost a lot of time and productivity going through all that.
> The 5070 Ti and 5080 both aren't great values. From a perf-per-$ perspective, I don't think a $1200 5080 is any better than a $1000 5070 Ti. That's a 15-20% improvement in performance for a 20% increase in price. At best you're staying flat while being overpriced for the performance, and at worst it's going down.

GPU prices are insane right now, so it's hard to call anything a "great value", but going higher has always had diminishing returns on the "performance/$" metric.
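The arithmetic in the quoted comment checks out. Treating the $1000 5070 Ti as the baseline and assuming the 5080 is 15-20% faster at $1200 (the comment's figures, not benchmark results):

```python
# Perf-per-dollar comparison using the comment's own numbers.
def perf_per_dollar(relative_perf: float, price: float) -> float:
    return relative_perf / price

baseline = perf_per_dollar(1.00, 1000)  # 5070 Ti at $1000
worst = perf_per_dollar(1.15, 1200)     # 5080 at $1200, +15% perf
best = perf_per_dollar(1.20, 1200)      # 5080 at $1200, +20% perf

print(worst / baseline)  # ~0.96: value actually goes down
print(best / baseline)   # ~1.0: at best exactly flat, never better
```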
> So has there ever been confirmation of an actual silicon shortage? Eight months later, and there's still a "shortage"? I'm a little skeptical about the length of this so-called shortage.

The AI bubble will consume every last drop of memory it can get its hands on. These are massively complex manufacturing processes that can't just increase output tenfold overnight.
There are definitely some silicon suppliers getting rich out there.
> Their support is also not the best. I had a Precision mobile workstation die a couple months ago due to some sort of hardware failure; that laptop was only a month old. It wasn't right from day one: the touchpad was stuck on maximum sensitivity, and no matter what I tried I couldn't change that. Even booting it to Linux it behaved the same, so that ruled out a software issue. The fingerprint reader only worked when it wanted to, which was about half the time. The fans constantly ran at full speed for no apparent reason, so it was really loud. Sometimes the screen would just totally glitch out, similar to what you get with a bad display ribbon. However, a reboot always brought the display back, so it obviously wasn't a bad display ribbon. Then it started just randomly turning itself off. It didn't matter if I was using it or if it was just sitting there idle; it would just randomly turn itself off. Finally it died completely and wouldn't even power on. It was a work computer and we have premium support, so they sent a tech to my house to fix it. The tech was unable to fix it and said I had to mail it in for repair. So I mailed it to them, and a week later they sent it back, still completely dead; it wouldn't even power on. So I sent it to Dell again, telling them they forgot to fix it. They sent it back a week later, and this time it was actually fixed. Since then it has worked fine. But I lost a lot of time and productivity going through all that.

Last time I had to talk to Dell customer support, I was like, "okay, the backlight is definitely dead. I can see things on the screen if I hold a flashlight up to it, but it's not making any of its own light," and the guy was like, "all right, we have to go through the troubleshooting steps though," and ninety minutes later, after having me try every possible thing on Earth, he was like [chipper] "okay, it looks like your screen's backlight is out! We'll have to schedule a technician to repair it!"
> GPU prices are insane right now, so it's hard to call anything a "great value", but going higher has always had diminishing returns on the "performance/$" metric.

Yeah. I only upgrade GPUs when I can no longer play some AAA games I really want to play at a suitable graphics quality. Whenever that happens, I generally just check benchmarks/reviews and find the cheapest GPU that will play whatever I'm looking to play at pretty high settings, maybe with a little overhead, budget permitting.
I get why people use that as a metric, but I kinda hate it. It's arguably useful in certain contexts, but as an absolute it's pretty useless. A GPU might score high in "performance/$" because it's genuinely good, or just because it's cheap. So really it's only useful if you're talking about comparable GPUs, in which case you're just saying one is cheaper than the other and expressing it as a ratio doesn't help anyone, or if you want to demonstrate how expensive the top end is.

But at the end of the day, if you need a 10090XTiX to run the games you want to play, at the quality you're after, at the resolution and framerate of your choice, then how much better the bottom-end card scores in perf/$ is entirely irrelevant. 100X better perf/$ is a waste of money if it means a 10fps slideshow.
> Last time I had to talk to Dell customer support, I was like, "okay, the backlight is definitely dead. I can see things on the screen if I hold a flashlight up to it, but it's not making any of its own light," and the guy was like, "all right, we have to go through the troubleshooting steps though," and ninety minutes later, after having me try every possible thing on Earth, he was like [chipper] "okay, it looks like your screen's backlight is out! We'll have to schedule a technician to repair it!"

Yeah, they wanted to troubleshoot with me as well. I told them it doesn't power on. They said, okay, well, let's troubleshoot that. They had me try all kinds of dumb stuff, but regardless, it would not power on. Like, what part of "it doesn't power on" do you not understand? LOL!