> huh. and here I am, the asshole who never opens the software because I assumed it's all just bloated bullshit and all I wanted was the drivers

Adrenaline is ok. At least it doesn't require a fucking account to use.
> Yeesh, I remember paying $299 for a 3D Prophet II and thinking that was miserably expensive... Now $499 is undercutting the competition. How times change.

$299 in 2001 is equivalent in purchasing power to about $516 today. So they're giving you a discount!
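The inflation adjustment here is just a ratio of consumer price index values. A quick sketch, using approximate annual-average US CPI-U figures (the index values are assumptions from memory, so treat the result as ballpark):

```python
# Rough inflation adjustment via a CPI ratio.
# CPI-U annual averages are approximations: ~177.1 (2001), ~304.7 (2023).
CPI_2001 = 177.1
CPI_2023 = 304.7

def adjust_for_inflation(price: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a historical price by the ratio of price indices."""
    return price * (cpi_now / cpi_then)

print(round(adjust_for_inflation(299, CPI_2001, CPI_2023)))  # prints 514
```

With these index values you get roughly $514, in line with the ~$516 quoted; the exact figure depends on which CPI series and month you pick.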
> nVidia cards are far more compatible with older titles, whereas the AMD ones are not. Being forced to use the Software Renderer makes the title very ugly to play.

That doesn't make any sense.
> Maybe that is a wise strategy in the strictest shrewd penny-pinching sense that weighs margin against volume to the 2nd decimal place, but it's absolutely terrible from a mindshare perspective.

hardware sounds really hard. i mean, trying to run a company in any competitive industry is hard, but especially hardware. because you gotta make massive investments in the next generation of hardware and whatever production capabilities that means, and who knows if those decisions pay off (cough intel's rough path towards 7nm), whereas for me the "massive investments" tend to amount to "should i buy this book on X language or concept, or invest some time in understanding this new skunkworks project on github or wherever."
> AMD continues to insist on giving only marginal performance-per-dollar advantages against Nvidia, instead of trying to actually take market share.
>
> Maybe that is a wise strategy in the strictest shrewd penny-pinching sense that weighs margin against volume to the 2nd decimal place, but it's absolutely terrible from a mindshare perspective.
>
> Blowing the current narrative of rough parity wide open and attracting a whole bunch of value-oriented customers over to team red isn't even a strategy you would need to maintain forever. Even just doing it for this singular generation which has been terribly maligned from a consumer perspective (and is indeed the worst-selling generation of consumer GPUs in history) would give AMD a big reputation boost.
>
> People still talk fondly about the value proposition of the RX480/580 and that was a business decision embarked upon 7 whole years ago.
>
> The worst part about all of this is that the price cuts for these products are inevitable. And it won't even be that long, I have no doubt. AMD just doesn't want to enjoy the good press of putting out a fantastic value product, for some ridiculous reason.

Yeah, I can't really tell if AMD has to do this because of production costs having gone up (fab space is quite scarce and these GPUs are not small), if they want to do this to avoid appearing as the less good, "value" proposition brand, or if it's about keeping good margins ("greed").
> But if it's about the perception of being "value"-oriented as a brand, AMD should be honest/realistic about the world today: RT and CUDA are significant advantages on the Nvidia side... It pains me to say this, because I've had GPUs from both brands in the last decade, but this time I went with Nvidia because I wanted to test RT and DLSS (purchase was when FSR 1 was significantly behind DLSS), and also to play around with AI.

i think it's an interesting "revealed preference" that AMD can get away with pretty much dodging RT - that is, a lot of consumers value RT at near-0, so AMD doesn't really have to discount their products that much (or any) as a result of having worse/non-RT and still make a "value" proposition.
> Yeah, I can't really tell if AMD has to do this because of production costs having gone up (fab space is quite scarce and these GPUs are not small), if they want to do this to avoid appearing as the less good, "value" proposition brand, or if it's about keeping good margins ("greed").
>
> But if it's about the perception of being "value"-oriented as a brand, AMD should be honest/realistic about the world today: RT and CUDA are significant advantages on the Nvidia side... It pains me to say this, because I've had GPUs from both brands in the last decade, but this time I went with Nvidia because I wanted to test RT and DLSS (purchase was when FSR 1 was significantly behind DLSS), and also to play around with AI.

Totally agree with your 2nd paragraph.
> Yeah, I can't really tell if AMD has to do this because of production costs having gone up (fab space is quite scarce and these GPUs are not small), if they want to do this to avoid appearing as the less good, "value" proposition brand, or if it's about keeping good margins ("greed").
>
> But if it's about the perception of being "value"-oriented as a brand, AMD should be honest/realistic about the world today: RT and CUDA are significant advantages on the Nvidia side... It pains me to say this, because I've had GPUs from both brands in the last decade, but this time I went with Nvidia because I wanted to test RT and DLSS (purchase was when FSR 1 was significantly behind DLSS), and also to play around with AI.

Agreed. I've used AMD/ATI cards only from 2003 to 2019. But since then, as I've had more income and no longer need to penny-pinch when it comes to graphics cards, I switched back to Nvidia as I started to care more about features / power consumption / noise.
> That doesn't make any sense.
>
> I've literally never found a title that my RX580 couldn't play, and it runs the exact same drivers as any modern AMD card.
>
> Are you talking about games you need to run in Windows 95 compatibility mode, or DOSBox or something?

Nope, just older Windows games. When I get home I will look some up. This is why we have no AMD left at the house.
> Nope, just older Windows games. When I get home I will look some up. This is why we have no AMD left at the house.

Interesting. Anecdotally, I had more gamebreaking graphics compatibility issues within 2 weeks of installing my 3060 Ti (Borderlands 3) than in 5 full years of playing games on an RX580. I have nothing but praise for AMD's driver stability based on my personal experience.
> My gaming PC is just for gaming. Why would I be doing work on my personal computer?
>
> If work needs me to have a $500 GPU to do my job, they can buy me one.

Some people have fun with both gaming and AI. Can't see much downside in a gaming GPU also being good for AI experiments.
> Some people have fun with both gaming and AI. Can't see much downside in a gaming GPU also being good for AI experiments.

Certainly. There's no downside, unless of course you pay a lot more money for those capabilities and don't ever use them.
> If not RT, then DLSS, or other features. Whatever the reasons, consumers overall have shown they will pay more for NVidia cards, so AMD has to offer a discount.

I think you're missing the point of what I said.
> Some people have fun with both gaming and AI. Can't see much downside in a gaming GPU also being good for AI experiments.

Again, that's fine. This person said you need to buy NV GPUs because it'll benefit your job (improve your calls with your boss).
> I think you're missing the point of what I said.
>
> RT performance in AMD is abysmal and AMD is not exactly pushing for Nvidia-matching or -beating RT AFAICT, and they are barely punished at all for that. If there existed a hypothetical Nvidia card that had the exact same raster performance as existing cards but no RT, and it was cheaper, it would not be accurately describable as a value card. You're paying less and getting equivalently less. It's the exact same value.
>
> AMD RT is so bad that it might as well not exist, but they don't get punished for it. At least last time I was on the market (6000 series) the discount was not huge and AMD was still an authentically value product. Because consumer revealed preferences are that raster performance is overwhelmingly the priority, RT barely matters, and the DLSS-FSR and other feature gaps are relatively small.
>
> Like, if Nvidia issued an identical line with 0 RT capabilities, how much cheaper would their cards have to be to maintain the same value proposition? I would say not very much, actually.

AMD's 6000 series RT was pretty useless (much like NV's 20 series for that matter). The 7000 series is much improved. They don't have a 4090 competitor, but the 7900XTX can do some decent RT. It varies by the game of course. Some NV RTX games perform pretty terribly in RT on AMD, but there are others where the 7900XTX does pretty well. Especially games that just use ray tracing for some effects rather than trying to ray trace basically anywhere they can.
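For what it's worth, the "exact same value" argument is just ratio arithmetic. A toy sketch, with entirely made-up numbers (these are not benchmarks of any real card):

```python
# Toy perf-per-dollar comparison. All figures are hypothetical, not benchmarks.
def value_per_dollar(raster_fps: float, price_usd: float) -> float:
    """Raster frames per second per dollar, treating RT as worth zero."""
    return raster_fps / price_usd

existing = value_per_dollar(raster_fps=120, price_usd=600)  # hypothetical card with RT
no_rt    = value_per_dollar(raster_fps=120, price_usd=580)  # same raster, no RT, small discount

# To a buyer who values RT at zero, the RT-less card is already the
# better deal with even a small discount; at equal price the ratio is identical.
print(no_rt > existing)  # prints True
```

Under that (admittedly crude) model, stripping RT only requires a discount proportional to however much the buyer actually valued RT, which for many buyers appears to be close to zero.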
> AMD's 6000 series RT was pretty useless (much like NV's 20 series for that matter). The 7000 series is much improved. They don't have a 4090 competitor, but the 7900XTX can do some decent RT. It varies by the game of course. Some NV RTX games perform pretty terribly in RT on AMD, but there are others where the 7900XTX does pretty well. Especially games that just use ray tracing for some effects rather than trying to ray trace basically anywhere they can.
>
> Of course if you're shooting for high refresh rate, high resolution, it doesn't really matter. Ray tracing is still such a huge performance hit for any GPU that if those are your priorities, it's a big cost either way.

I've seen reference to the 7000 series being in line with the RTX 3xxx series ray tracing performance, which seems sufficient to me for most practical situations.
> I've seen reference to the 7000 series being in line with the RTX 3xxx series ray tracing performance, which seems sufficient to me for most practical situations.

I'd say that's probably fair. They are behind the 40 series in ray tracing, but ahead in rasterization.
> If not RT, then DLSS, or other features. Whatever the reasons, consumers overall have shown they will pay more for NVidia cards, so AMD has to offer a discount.

I'm not really sure RT and DLSS have had too much effect on that. Nvidia has had absurdly dominant market share for well before those technologies came out. Maybe it really does all boil down to CUDA? I have no idea. I have owned slightly more AMD cards in my life, but plenty of Nvidia cards, and I don't see why exactly Nvidia has 80% market share based entirely on how their actual GPUs compare to AMD's.
> I've seen reference to the 7000 series being in line with the RTX 3xxx series ray tracing performance, which seems sufficient to me for most practical situations.

I haven't paid too close attention, but is it merely in line with RT performance, or does it also have DLSS + RT level performance? Not too much experience here, but I've anecdotally heard that it's DLSS in conjunction with RT that really gives Nvidia the RT capabilities.
> I'm not really sure RT and DLSS have had too much effect on that. Nvidia has had absurdly dominant market share for well before those technologies came out. Maybe it really does all boil down to CUDA? I have no idea. I have owned slightly more AMD cards in my life, but plenty of Nvidia cards, and I don't see why exactly Nvidia has 80% market share based entirely on how their actual GPUs compare to AMD's.

That's true. I also never understood the exact reasons. Could be that Nvidia GPUs are more commonly available on pre-built desktops and on laptops than AMD dGPUs? Probably also people being swayed by complaints about AMD drivers on forums etc., which I've always felt was unfair, and I've owned cards of both brands.
> That's true. I also never understood the exact reasons. Could be that Nvidia GPUs are more commonly available on pre-built desktops and on laptops than AMD dGPUs? Probably also people being swayed by complaints about AMD drivers on forums etc., which I've always felt was unfair, and I've owned cards of both brands.
>
> I can't imagine enough people care about CUDA (and esp. until the last year or so) to really sway sales on the gaming side.

By now, I think it's largely self-perpetuating: NV is dominant, so developers invest time in supporting NV-specific features, so gamers want NV, so NV maintains its dominance. AMD is going to need to deliver a much better (not comparable, not close-but-cheaper) GPU with good availability at a good price to start taking back market share.
> I'm not really sure RT and DLSS have had too much effect on that. Nvidia has had absurdly dominant market share for well before those technologies came out. Maybe it really does all boil down to CUDA? I have no idea. I have owned slightly more AMD cards in my life, but plenty of Nvidia cards, and I don't see why exactly Nvidia has 80% market share based entirely on how their actual GPUs compare to AMD's.

I think CUDA helps quite a bit, at least in my circles. I don't care, I just game. But it probably helps contribute to the mind-share/market-share.
> $299 in 2001 is equivalent in purchasing power to about $516 today. So they're giving you a discount!

Eh. For decades, computer component prices dropped over the years in actual dollars, never mind inflation adjusted ones. There's a limit to how far that can continue, of course, but there's no denying that some of the current "inflation" affecting video card prices is greedy companies expecting bigger and bigger profits. Look at the RAM and SSD markets by comparison.
> Eh. For decades, computer component prices dropped over the years in actual dollars, never mind inflation adjusted ones. There's a limit to how far that can continue, of course, but there's no denying that some of the current "inflation" affecting video card prices is greedy companies expecting bigger and bigger profits.

The thing when applying "greedflation" to something like this is that it doesn't pass the sniff test.
well i dated myself by remarking on people complaining about $60 video games when I was paying the same amount in the 90s, so i've already done my quota of yelling at kids on my lawn for today.
> Yeesh, I remember paying $299 for a 3D Prophet II and thinking that was miserably expensive... Now $499 is undercutting the competition. How times change.

And that was 20+ years ago. What's $299 in 2023 dollars? I'll bet it's more than $499.
Nvidia — by virtue of their record profits and greed — have done AMD so many favors this generation in keeping team red relevant. Had Nvidia sacrificed huge profits for more reasonable profits (still a profit, mind you!) they could have put AMD in a chokehold.
Really rooting for Intel to keep the pressure on and be the affordable option.
Some of it is explainable just by inertia. To change it up, AMD'd really have to blow open the gates if they want to meaningfully eat into nvidia's share. And if you're AMD you really have to wonder - is it worth it? they're in the business of maximizing profits, current and future - they either have to really out-innovate nvidia somehow [extremely hard, if it was really that simple they'd have done it] or cut margins a lot and hope that increased sales outweigh the decreased per-unit $$$ (now and into the future).