AMD’s Radeon RX 7700 XT and RX 7800 XT fill the gaps in its next-gen GPU lineup

Turbofrog

Ars Tribunus Militum
2,467
Nvidia cards are far more compatible with older titles than AMD cards are. Being forced to use a software renderer makes those titles very ugly to play.
That doesn't make any sense.

I've literally never found a title that my RX580 couldn't play, and it runs the exact same drivers as any modern AMD card.

Are you talking about games you need to run in Windows 95 compatibility mode, or DOSBox or something?
 
Upvote
27 (27 / 0)

thelee

Ars Tribunus Militum
1,902
Subscriptor
Maybe that is a wise strategy in the strictest shrewd penny-pinching sense that weighs margin against volume to the 2nd decimal place, but it's absolutely terrible from a mindshare perspective.
hardware sounds really hard. i mean, trying to run a company in any competitive industry is hard, but especially hardware, because you gotta make massive investments in the next generation of hardware and whatever production capabilities that means, and who knows if your decisions pay off correctly (cough intel's rough path towards 7nm), whereas for me the "massive investments" tend to amount to "should i buy this book on X language or concept or invest some time in understanding this new skunkworks project on github or wherever."

that is to say, given the end of moore's law and the massive capital needed to innovate in hardware these days, i firmly see the logic in nvidia and amd maximizing margins as much as they can now while they can instead of necessarily chasing commodity-level prices and market share. even any promised competitive disruption from intel (who has the kind of capital resources that could provide disruption in this day and age) is slow coming.
 
Upvote
12 (12 / 0)

aexcorp

Ars Praefectus
3,314
Subscriptor
AMD continues to insist on giving only marginal performance-per-dollar advantages against Nvidia, instead of trying to actually take market share.

Maybe that is a wise strategy in the strictest shrewd penny-pinching sense that weighs margin against volume to the 2nd decimal place, but it's absolutely terrible from a mindshare perspective.

Blowing the current narrative of rough parity wide open and attracting a whole bunch of value-oriented customers over to team red isn't even a strategy you would need to maintain forever. Even just doing it for this singular generation which has been terribly maligned from a consumer perspective (and is indeed the worst-selling generation of consumer GPUs in history) would give AMD a big reputation boost.

People still talk fondly about the value proposition of the RX480/580 and that was a business decision embarked upon 7 whole years ago.

The worst part about all of this is that the price cuts for these products are inevitable. And it won't even be that long, I have no doubt. AMD just doesn't want to enjoy the good press of putting out a fantastic value product, for some ridiculous reason.
Yeah, I can't really tell if AMD has to do this because of production costs having gone up (fab space is quite scarce and these GPUs are not small), if they want to do this to avoid appearing as the less good, "value" proposition brand, or if it's about keeping good margins ("greed").

But if it's about the perception of being "value"-oriented as a brand, AMD should be honest/realistic about the world today: RT and CUDA are significant advantages on the Nvidia side... It pains me to say this, because I've had GPUs from both brands in the last decade, but this time I went with Nvidia because I wanted to test RT and DLSS (purchase was when FSR 1 was significantly behind DLSS), and also to play around with AI.
 
Upvote
7 (9 / -2)

thelee

Ars Tribunus Militum
1,902
Subscriptor
But if it's about the perception of being "value"-oriented as a brand, AMD should be honest/realistic about the world today: RT and CUDA are significant advantages on the Nvidia side... It pains me to say this, because I've had GPUs from both brands in the last decade, but this time I went with Nvidia because I wanted to test RT and DLSS (purchase was when FSR 1 was significantly behind DLSS), and also to play around with AI.
i think it's an interesting "revealed preference" that AMD can get away with pretty much dodging RT - that is, a lot of consumers value RT at near-0, so AMD doesn't really have to discount their products that much (or any) as a result of having worse/non-RT and still make a "value" proposition.

to wit, i don't know how typical a gamer I am, but I think only one game I played in the past couple of years even offered RT as an option, and frankly I did not care - I'm much more in the business of chasing 120fps at 4K than slightly prettier lights. to that end, FSR2 is great (BG3 and Diablo 4 with FSR 2 on the quality setting are almost indistinguishable to me from native) and i'm looking forward to whatever FSR3 can pull off, so AMD has effectively retained a bit of the value argument here for me even if it's not quite DLSS.
 
Upvote
10 (13 / -3)

Turbofrog

Ars Tribunus Militum
2,467
Yeah, I can't really tell if AMD has to do this because of production costs having gone up (fab space is quite scarce and these GPUs are not small), if they want to do this to avoid appearing as the less good, "value" proposition brand, or if it's about keeping good margins ("greed").

But if it's about the perception of being "value"-oriented as a brand, AMD should be honest/realistic about the world today: RT and CUDA are significant advantages on the Nvidia side... It pains me to say this, because I've had GPUs from both brands in the last decade, but this time I went with Nvidia because I wanted to test RT and DLSS (purchase was when FSR 1 was significantly behind DLSS), and also to play around with AI.
Totally agree with your 2nd paragraph.

For the first one, I really struggle to imagine that AMD's feeling any sort of pinch on the production cost side. They are one of TSMC's largest customers, especially for this particular node, and so are undoubtedly enjoying preferential pricing. They're on the slightly older N5 node (instead of N4 for Lovelace) for RDNA3, and more significantly, only about 60% of the die area of their new GPUs is on N5 thanks to their chiplet design, with the remaining MCD area on the very cheap N6 node. Nvidia needs big monolithic dies on an even more expensive process, so at least at the high end, Lovelace is probably quite expensive to make (hence the >$1,000 price tags to maintain 60% gross margins, I guess).

All of that should translate to very, very good economics for these cards. Maybe not quite as good as the xx50 class hardware that Nvidia is now selling for $300-500, but not far off it. It seems like everyone is making absolutely killer margins on this generation and they just don't care if they barely sell any cards as a result.
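The chiplet cost argument can be made concrete with a back-of-envelope sketch. The die areas below are approximate published Navi 32 figures, and the N5-vs-N6 per-mm² cost ratio is a made-up placeholder, not actual TSMC pricing:

```python
# Back-of-envelope comparison of chiplet vs. monolithic silicon cost.
# Die areas are approximate Navi 32 figures; the per-mm^2 cost ratio
# between N5 and N6 is an illustrative assumption, not real pricing.

GCD_AREA_MM2 = 200.0        # graphics compute die on N5 (approximate)
MCD_AREA_MM2 = 37.5         # each memory/cache die on N6 (approximate)
NUM_MCDS = 4

N5_COST_PER_MM2 = 1.0       # normalized: N5 is the baseline
N6_COST_PER_MM2 = 0.6       # assumption: mature N6 silicon is much cheaper

chiplet_cost = (GCD_AREA_MM2 * N5_COST_PER_MM2
                + NUM_MCDS * MCD_AREA_MM2 * N6_COST_PER_MM2)

# Hypothetical monolithic equivalent: the same total area, all on N5.
total_area = GCD_AREA_MM2 + NUM_MCDS * MCD_AREA_MM2
monolithic_cost = total_area * N5_COST_PER_MM2

savings = 1 - chiplet_cost / monolithic_cost
print(f"N5 share of area: {GCD_AREA_MM2 / total_area:.0%}")
print(f"chiplet silicon is {savings:.0%} cheaper than monolithic")
```

Under these assumed numbers roughly 57% of the area sits on N5 (consistent with the "about 60%" above), and the chiplet layout shaves a double-digit percentage off the silicon cost before yield advantages of smaller dies are even counted.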
 
Upvote
13 (14 / -1)

fluctuationEM

Wise, Aged Ars Veteran
106
Subscriptor
Yeah, I can't really tell if AMD has to do this because of production costs having gone up (fab space is quite scarce and these GPUs are not small), if they want to do this to avoid appearing as the less good, "value" proposition brand, or if it's about keeping good margins ("greed").

But if it's about the perception of being "value"-oriented as a brand, AMD should be honest/realistic about the world today: RT and CUDA are significant advantages on the Nvidia side... It pains me to say this, because I've had GPUs from both brands in the last decade, but this time I went with Nvidia because I wanted to test RT and DLSS (purchase was when FSR 1 was significantly behind DLSS), and also to play around with AI.
Agreed. I used AMD/ATI cards exclusively from 2003 to 2019. But since then, as I've had more income and no longer need to penny-pinch when it comes to graphics cards, I switched back to Nvidia as I started to care more about features, power consumption, and noise.

Nowadays I would still gladly go to team Red, but only if it packs significantly more value.
 
Upvote
4 (6 / -2)
That doesn't make any sense.

I've literally never found a title that my RX580 couldn't play, and it runs the exact same drivers as any modern AMD card.

Are you talking about games you need to run in Windows 95 compatibility mode, or DOSBox or something?
Nope, just older Windows games. When I get home I will look some up. This is why we have no AMD left at the house.
 
Upvote
-12 (0 / -12)

Turbofrog

Ars Tribunus Militum
2,467
Nope, just older Windows games. When I get home I will look some up. This is why we have no AMD left at the house.
Interesting. Anecdotally, I had more gamebreaking graphics compatibility issues within 2 weeks of installing my 3060 Ti (Borderlands 3) than in 5 full years of playing games on an RX580. I have nothing but praise for AMD's driver stability based on my personal experience.
 
Upvote
15 (15 / 0)

ScifiGeek

Ars Legatus Legionis
18,971
i think it's an interesting "revealed preference" that AMD can get away with pretty much dodging RT - that is, a lot of consumers value RT at near-0, so AMD doesn't really have to discount their products that much (or any) as a result of having worse/non-RT and still make a "value" proposition.

If not RT, then DLSS, or other features. Whatever the reasons, consumers overall have shown they will pay more for NVidia cards, so AMD has to offer a discount.
 
Upvote
0 (1 / -1)
My gaming PC is just for gaming. Why would I be doing work on my personal computer?

If work needs me to have a $500 GPU to do my job, they can buy me one.
Some people have fun with both gaming and AI. Can't see much downside in a gaming GPU also being good for AI experiments.
 
Upvote
6 (6 / 0)

thelee

Ars Tribunus Militum
1,902
Subscriptor
If not RT, then DLSS, or other features. Whatever the reasons, consumers overall have shown they will pay more for NVidia cards, so AMD has to offer a discount.
I think you’re missing the point of what I said.

RT performance in AMD is abysmal and AMD is not exactly pushing for Nvidia matching or beating RT AFAICT, and they are barely punished at all for that. If there existed a hypothetical Nvidia card that had the exact same raster performance as existing cards but no RT, and it was cheaper, it would not be accurately describable as a value card. You’re paying less and getting equivalently less. It’s the exact same value.

AMD RT is so bad that it might as well not exist but they don’t get punished for it. At least last time I was on the market (6000 series) the discount was not huge and AMD was still an authentically value product. Because consumer revealed preferences are that raster performance is overwhelmingly the priority, RT barely matters, and DLSS-FSR and other feature gaps are relatively small.

Like, if Nvidia issues an identical line with 0 RT capabilities, how much cheaper would their cards have to be to maintain the same value proposition? I would say not very much actually.
 
Upvote
4 (8 / -4)

ERIFNOMI

Ars Legatus Legionis
17,194
Some people have fun with both gaming and AI. Can't see much downside in a gaming GPU also being good for AI experiments.
Again, that's fine. This person said you need to buy NV GPUs because it'll benefit your job (improve your calls with your boss).

My point was what my job requires from my hardware is my employer's problem. My personal desktop is much better than my work laptop, but I'm sure as fuck not using my hardware for work. If I need something better for work, work is paying for it.

I did play around with AI image generation on my old 2080 Super, but the paltry 8GB of VRAM was a significant limitation.
 
Upvote
12 (12 / 0)

ERIFNOMI

Ars Legatus Legionis
17,194
I think you’re missing the point of what I said.

RT performance in AMD is abysmal and AMD is not exactly pushing for Nvidia matching or beating RT AFAICT, and they are barely punished at all for that. If there existed a hypothetical Nvidia card that had the exact same raster performance as existing cards but no RT, and it was cheaper, it would not be accurately describable as a value card. You’re paying less and getting equivalently less. It’s the exact same value.

AMD RT is so bad that it might as well not exist but they don’t get punished for it. At least last time I was on the market (6000 series) the discount was not huge and AMD was still an authentically value product. Because consumer revealed preferences are that raster performance is overwhelmingly the priority, RT barely matters, and DLSS-FSR and other feature gaps are relatively small.

Like, if Nvidia issues an identical line with 0 RT capabilities, how much cheaper would their cards have to be to maintain the same value proposition? I would say not very much actually.
AMD's 6000 series RT was pretty useless (much like NV's 20 series, for that matter). The 7000 series is much improved. They don't have a 4090 competitor, but the 7900 XTX can do some decent RT. It varies by game, of course. Some NV RTX games perform pretty terribly in RT on AMD, but there are others where the 7900 XTX does pretty well, especially games that just use ray tracing for some effects rather than trying to use it basically everywhere they can.

Of course, if you're shooting for high refresh rates at high resolution, it doesn't really matter. Ray tracing is still such a huge performance hit on any GPU that if those are your priorities, you'll probably leave it off anyway.
 
Upvote
10 (10 / 0)
AMD's 6000 series RT was pretty useless (much like NV's 20 series, for that matter). The 7000 series is much improved. They don't have a 4090 competitor, but the 7900 XTX can do some decent RT. It varies by game, of course. Some NV RTX games perform pretty terribly in RT on AMD, but there are others where the 7900 XTX does pretty well, especially games that just use ray tracing for some effects rather than trying to use it basically everywhere they can.

Of course, if you're shooting for high refresh rates at high resolution, it doesn't really matter. Ray tracing is still such a huge performance hit on any GPU that if those are your priorities, you'll probably leave it off anyway.
I've seen reference to the 7000 series being in line with the RTX 3xxx series ray tracing performance, which seems sufficient to me for most practical situations.
 
Upvote
3 (4 / -1)
If not RT, then DLSS, or other features. Whatever the reasons, consumers overall have shown they will pay more for NVidia cards, so AMD has to offer a discount.
I'm not really sure RT and DLSS have had too much effect on that. Nvidia has had absurdly dominant market share for well before those technologies came out. Maybe it really does all boil down to CUDA? I have no idea. I have owned slightly more AMD cards in my life, but plenty of Nvidia cards, and I don't see why exactly Nvidia has 80% market share based entirely on how their actual GPUs compare to AMDs.
 
Upvote
12 (12 / 0)

thelee

Ars Tribunus Militum
1,902
Subscriptor
I've seen reference to the 7000 series being in line with the RTX 3xxx series ray tracing performance, which seems sufficient to me for most practical situations.
I haven’t paid too close attention, but is it merely in line on raw RT performance, or does it also match DLSS + RT performance? I don't have much experience, but I've anecdotally heard that it's DLSS in conjunction with RT that really gives Nvidia its RT edge.
 
Upvote
1 (1 / 0)

TexasDex

Smack-Fu Master, in training
50
There seems to be a bit of an error in the article. The sentence "the new Radeon RX 7800 XT and RX 7700 XT, both advertised as 1440p graphics cards and available starting at $449 and $499, respectively." would imply that the 7800XT is $449 and the 7700XT is $499, which contradicts the byline. I assume the two model numbers were accidentally swapped.
 
Upvote
2 (2 / 0)

aexcorp

Ars Praefectus
3,314
Subscriptor
I'm not really sure RT and DLSS have had too much effect on that. Nvidia has had absurdly dominant market share for well before those technologies came out. Maybe it really does all boil down to CUDA? I have no idea. I have owned slightly more AMD cards in my life, but plenty of Nvidia cards, and I don't see why exactly Nvidia has 80% market share based entirely on how their actual GPUs compare to AMDs.
That's true. I also never understood the exact reasons. Could it be that Nvidia GPUs are more commonly available in pre-built desktops and in laptops than AMD dGPUs? Probably also people being swayed by complaints about AMD drivers on forums etc., which I've always felt was unfair; I've owned cards of both brands.

I can't imagine enough people care about CUDA (and esp. until the last year or so) to really sway sales on the gaming side.
 
Upvote
2 (4 / -2)

jhodge

Ars Tribunus Angusticlavius
8,663
Subscriptor++
That's true. I also never understood the exact reasons. Could it be that Nvidia GPUs are more commonly available in pre-built desktops and in laptops than AMD dGPUs? Probably also people being swayed by complaints about AMD drivers on forums etc., which I've always felt was unfair; I've owned cards of both brands.

I can't imagine enough people care about CUDA (and esp. until the last year or so) to really sway sales on the gaming side.
By now, I think it's largely self-perpetuating: NV is dominant, so developers invest time in supporting NV-specific features, so gamers want NV, so NV maintains its dominance. AMD is going to need to deliver a much better (not comparable, not close-but-cheaper) GPU with good availability at a good price to start taking back market share.

In CPUs, it took AMD implementing a better 64-bit architecture than Intel to start making real gains, and then they blew it with the Bulldozer core.

When you're #2, "almost as good" isn't good enough.
 
Upvote
6 (6 / 0)

thelee

Ars Tribunus Militum
1,902
Subscriptor
I'm not really sure RT and DLSS have had too much effect on that. Nvidia has had absurdly dominant market share for well before those technologies came out. Maybe it really does all boil down to CUDA? I have no idea. I have owned slightly more AMD cards in my life, but plenty of Nvidia cards, and I don't see why exactly Nvidia has 80% market share based entirely on how their actual GPUs compare to AMDs.
I think CUDA helps quite a bit, at least in my circles. I don't care; I just game. But it probably helps contribute to the mindshare/market share.

Some of it is explainable just by inertia. To change it up, AMD'd really have to blow open the gates if they want to meaningfully eat into nvidia's share. And if you're AMD you really have to wonder - is it worth it? they're in the business of maximizing profits, current and future - they either have to really out-innovate nvidia somehow [extremely hard, if it was really that simple they'd have done it] or cut margins a lot and hope that increased sales outweigh the decreased per-unit $$$ (now and into the future).

I think it'd take the combination of a major unforced error on nvidia's part and opportunistic AMD action to meaningfully shake up the GPU market, and unfortunately it's more likely that AMD will mess up - just by size they have less room for error. but it happened in the cpu space. i had intel chips for the longest time and could not comprehend using anything else, then some of my friends started doing AMD setups, and then I switched over around the ~2000 series, and then the 5800X3D came out and it basically blew the doors off anything intel could do at the price point (for gaming). and all that mindshare change for us required not just amd doing solid work but also intel's own stagnation and strategic missteps.
 
Upvote
2 (2 / 0)

Abulia

Ars Tribunus Angusticlavius
8,388
Nvidia — by virtue of their record profits and greed — have done AMD so many favors this generation in keeping team red relevant. Had Nvidia sacrificed huge profits for more reasonable profits (still a profit, mind you!) they could have put AMD in a chokehold.

Really rooting for Intel to keep the pressure on and be the affordable option.
 
Upvote
4 (5 / -1)

ScifiGeek

Ars Legatus Legionis
18,971
That's true. I also never understood the exact reasons. Could it be that Nvidia GPUs are more commonly available in pre-built desktops and in laptops than AMD dGPUs? Probably also people being swayed by complaints about AMD drivers on forums etc., which I've always felt was unfair; I've owned cards of both brands.

I can't imagine enough people care about CUDA (and esp. until the last year or so) to really sway sales on the gaming side.

Multiple factors are at play, leaving room for different people to have different reasons.

Gaming related:

Halo cards: NVidia has had the halo card for many generations, and AMD only competed against the lower end. Almost every time for the past decade, CPU reviews have featured an NVidia card in there, because it will minimize the gaming bottleneck. So you get multiple generations of reinforcement that NVidia simply makes the best cards.

New features: NVidia tend to push new features more, so they look more forward looking (love them or hate them because they are often proprietary). Hairworks, G-Sync, PhysX, DLSS, RTX...

Non Gaming related:

Media encode/decode: NVidia has usually had the better media section on the card (some of AMD's have been terrible in comparison).

Compute with CUDA, RTX making it into 3D render applications, and now Deep Learning with Tensor Cores.
 
Upvote
3 (3 / 0)

barich

Ars Legatus Legionis
10,742
Subscriptor++
$299 in 2001 is equivalent in purchasing power to about $516 today. So they're giving you a discount!
Eh. For decades, computer component prices dropped over the years in actual dollars, never mind inflation adjusted ones. There's a limit to how far that can continue, of course, but there's no denying that some of the current "inflation" affecting video card prices is greedy companies expecting bigger and bigger profits. Look at the RAM and SSD markets by comparison.

Personally, I just bought a used Radeon RX 5700 XT because it has by far the best price per FPS available at the moment. It can play everything at 1080p and reasonable quality settings for $135.
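For reference, the quoted "$299 in 2001 ≈ $516 today" figure is just the standard CPI ratio; a minimal sketch, using approximate annual-average CPI-U values (assumed, and sensitive to the exact month and series chosen):

```python
# Sanity check of an inflation-adjusted price via the CPI ratio method.
# CPI values are approximate annual-average CPI-U figures (assumption).

CPI_2001 = 177.1
CPI_2023 = 305.0

def adjust_for_inflation(price, cpi_then, cpi_now):
    """Scale a historical price by the ratio of CPI levels."""
    return price * cpi_now / cpi_then

adjusted = adjust_for_inflation(299, CPI_2001, CPI_2023)
print(f"${adjusted:.0f}")  # roughly $515 with these CPI values
```

With these inputs the result lands around $515, so the ~$516 figure above is in the right ballpark regardless of the exact CPI month used.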
 
Upvote
5 (8 / -3)
Too many offerings from AMD and Nvidia have muddled the product space. Nvidia now owns the AI buzz and controls the high-end product lines. AMD may have been better served by cutting some other middling products and bringing these new cards in at $349 and $399. They could get ahead of Intel's own refresh, which has a good chance of closely matching these new AMD cards in performance at similar or lower price points.

Tough right now deciding on mid-tier CPU builds since there aren't many comparably priced GPU offerings. If you're not completely low-end or high-end, it's just frustrating for the boring masses in the middle.
 
Upvote
1 (1 / 0)

thelee

Ars Tribunus Militum
1,902
Subscriptor
Eh. For decades, computer component prices dropped over the years in actual dollars, never mind inflation adjusted ones. There's a limit to how far that can continue, of course, but there's no denying that some of the current "inflation" affecting video card prices is greedy companies expecting bigger and bigger profits.
The problem with applying "greedflation" to something like this is that it doesn't pass the sniff test.

If it were just pure rent-seeking, then it should be pretty easy (for some definition of easy) for a new firm to enter the industry and disrupt it by charging lower prices for the same product.

The fact that intel is years into a struggle to enter the GPU market and is only hitting the low end right now means the sniff test fails. There is a legitimate case to be made that hardware has gotten astronomically harder than it was back in the 00s. The end of Moore's Law is a real phenomenon: it takes much more effort to attain the same relative gains as before, and eventually it'll be impossible. (Arguably it already is practically impossible, given how weaksauce some recent GPU generations have been.)

It's probably just the much simpler case that nvidia and amd need ever more capital to invest in the next generation of chips, plus significant buffer room in case they make strategic errors or their very cyclical industry hits a downturn. They will absolutely maximize profits where they can, but they aren't complete idiots and won't do something like take a $1,000 margin and sell 1 unit when they could take a $500 margin and sell 10 units.
 
Upvote
-6 (3 / -9)

glutto

Ars Scholae Palatinae
749
Subscriptor++
well, i dated myself by remarking on people complaining about $60 video games when i was paying the same amount in the 90s, so i've already done my quota of yelling at kids on my lawn for today.

I remember when they were asking for $120 around here for Phantasy Star 2 in 1990. That was a jaw-dropper.
 
Upvote
0 (0 / 0)

khumak50

Ars Tribunus Militum
1,533
I don't think we'll start to see reasonably priced gaming GPUs until/unless Intel's Arc lineup moves up the chain a bit in performance so they can put something up against the 80-class offerings from AMD and Nvidia. Both AMD and especially Nvidia still seem to be in price-gouging mode. Intel is much more consumer-friendly in the GPU space right now, but they really only have low-end to, at best, midrange offerings so far.
 
Upvote
5 (5 / 0)

Secondfloor

Ars Praefectus
3,257
Subscriptor
Nvidia — by virtue of their record profits and greed — have done AMD so many favors this generation in keeping team red relevant. Had Nvidia sacrificed huge profits for more reasonable profits (still a profit, mind you!) they could have put AMD in a chokehold.

Really rooting for Intel to keep the pressure on and be the affordable option.

Ah, I was waiting for a Ngreedia post.
 
Upvote
-11 (1 / -12)
Some of it is explainable just by inertia. To change it up, AMD'd really have to blow open the gates if they want to meaningfully eat into nvidia's share. And if you're AMD you really have to wonder - is it worth it? they're in the business of maximizing profits, current and future - they either have to really out-innovate nvidia somehow [extremely hard, if it was really that simple they'd have done it] or cut margins a lot and hope that increased sales outweigh the decreased per-unit $$$ (now and into the future).

There's a consideration that is often overlooked here - the apparently considerable stock of new previous generation cards that carried unreasonable, but initially tenable pandemic/semiconductor shortage-era MSRPs.

Add to this the glut of used previous-generation crypto cards, and you have a recipe for high prices on new product.

It should be lost on no one that the suggested 7700XT pricing makes discount 6700 series pricing appear even more attractive, which is helpful to retailers and board partners. Likewise, the hilariously bad price structure on current 4000 series GPUs will be helpful to liquidate old 3000 series cards.

The two big GPU manufacturers therefore have numerous, explicitly non-inflationary reasons to keep MSRPs high at the current moment:
-Existing stock on shelves and in warehouses, some still carrying impossible pandemic prices.
-Finer process fab allocations will be reserved for the greatest number of the most highly profitable SKUs; those are not GPUs, but AI/compute/server chips, currently in high demand.
-AMD sells the vast majority of their GPUs in console SoC form, and the (RDNA3ish) PS5 Pro &/or XBoxSNextONE are not announced.
-Credible near-term competition in this space is incredibly unlikely, given the absolutely enormous barriers to entry in this market, both on the production and the software/driver side. Intel is an engineering behemoth and poached some of the top talent, yet Arc isn't fast or backwards-compatible enough to compete much above $300.

Well-priced 7000 series midrange cards therefore risk ire from AMD's biggest customers (Sony, MS), as well as board/retail partners stuck with old stock they paid a premium for, all while taking fab space from more currently profitable products (Instinct, Epyc).

Their competitors are either printing money and deliberately hobbling their consumer GPUs to induce still more up-selling for compute (nvidia), or need an overpowered product to compensate for their lack of software/driver optimization, while relying on the same TSMC fabs as everyone else (intel, any new entrant).

So what incentive does AMD really have to do anything other than what they're doing?

Predictable "the magic of the marketplace will fix it" and "it's just inflation" hand-waving therefore appears quite insufficient to explain continued pricing at odds with Moore's Law, for those reasons and more.
 
Upvote
9 (10 / -1)