
AMD: Radeon 7900XTX ($999) & 7900XT ($899) Announced | Available December 13th

OZ9000

Banned
Bro, the 7900 XTX is not a gamer-class GPU. It's AMD's super-high-end GPU, equivalent to a Titan or a 4090/3090. $1,000 for that is pretty fair.
The 7800 XT will likely be at least $300 cheaper and will be the more reasonable gaming-class GPU.
He does have a point.

Pricing for GPUs is getting worse and worse each generation.
 

Sanepar

Member
Most of the games offering RT also offer DLSS/FSR. And most people use them because they work awesome and basically give free FPS.
Sure, if you're some kind of PC snob who won't lower yourself to using DLSS (because of reasons), then that's your problem. But actually many people can make use of RT today.
I can't see the point of paying $1,200 to use DLSS for RT when, in the majority of games, it amounts to reflections you can only pay attention to in static situations.

Current RT doesn't make any real difference in most games.

And many games come out without RT, like CoD MW2 or A Plague Tale.

Like Spider-Man: no one will see the reflection in a window while in motion. It's stupid.

The only game where I see a real difference with RT is Metro Exodus.
 

thuGG_pl

Member
I can't see the point of paying $1,200 to use DLSS for RT when, in the majority of games, it amounts to reflections you can only pay attention to in static situations.

Current RT doesn't make any real difference in most games.

And many games come out without RT, like CoD MW2 or A Plague Tale.

Like Spider-Man: no one will see the reflection in a window while in motion. It's stupid.

The only game where I see a real difference with RT is Metro Exodus.

But what's wrong with DLSS? I don't get your argument. DLSS greatly improves performance with almost no hit to quality. In my book you're mad if you're passing on free performance.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Bro, the 7900 XTX is not a gamer-class GPU. It's AMD's super-high-end GPU, equivalent to a Titan or a 4090/3090. $1,000 for that is pretty fair.
The 7800 XT will likely be at least $300 cheaper and will be the more reasonable gaming-class GPU.
But how cut down will the 7800 XT be? The 7900 XT (-13% CU) is already cut down more than the 6800 XT (-10% CU).

If the 7800 XT ends up as the rumored 60-CU part, then the top SKU will have 60% more CUs. Terrible.
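A quick back-of-the-envelope check of those percentages (a sketch in Python; the 7900- and 6900-series CU counts are public specs, while 60 CUs for the 7800 XT is only the rumor):

```python
# Sanity-check the CU cut-down percentages cited above.
# 7900 XTX/XT and 6900/6800 XT CU counts are official specs;
# 60 CUs for the 7800 XT is speculation.
pairs = {
    "7900 XT vs 7900 XTX": (84, 96),
    "6800 XT vs 6900 XT":  (72, 80),
}
for label, (cut, full) in pairs.items():
    print(f"{label}: {100 * (1 - cut / full):.1f}% fewer CUs")  # 12.5%, 10.0%

# Gap between the top SKU and the rumored 60-CU 7800 XT:
print(f"7900 XTX vs 7800 XT (rumored): {100 * (96 / 60 - 1):.0f}% more CUs")  # 60%
```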
 

HoofHearted

Member
Bro, the 7900 XTX is not a gamer-class GPU. It's AMD's super-high-end GPU, equivalent to a Titan or a 4090/3090. $1,000 for that is pretty fair.
The 7800 XT will likely be at least $300 cheaper and will be the more reasonable gaming-class GPU.
Except it’s not though… AMD has already stated that the 7900 cards are equivalent to the 4080… not the 4090…

Assuming AMD is working on a 7950 - that would most likely be comparable to the 4090…
 

iQuasarLV

Member
The market is largely responsible for this.
We kept buying higher and higher priced products since the original Titan launched. The upward trend never stops.
Naw, blame the miners.
Gamers spoke with their wallets during the RTX 2xxx series and Nvidia took a step back the next release. However, a pandemic and miners buying stock up by the pallet got the decision-makers going absolutely batshit insane, thinking we're going to embrace a $1,000+ market for cards.
 

poppabk

Cheeks Spread for Digital Only Future
Coming from a 3060 Ti, I'd say I got a great deal on my 6800 XT.*

* Also, don't be a fanboy clown. It doesn't look good on you or anyone else, lol
Didn't even realize that there were Nvidia or AMD fanboys. I just switch depending on what works for me: Matrox to Voodoo to Nvidia to ATI to Nvidia to AMD... and on and on.
 

Nvzman

Member
Except it’s not though… AMD has already stated that the 7900 cards are equivalent to the 4080… not the 4090…

Assuming AMD is working on a 7950 - that would most likely be comparable to the 4090…
They're speaking strictly performance-wise. In terms of actual market position and pricing, it's definitely the "halo" AMD card, which is what the former Titan and the RTX 3090/4090 are.
 
He does have a point.

Pricing for GPUs is getting worse and worse each generation.

No doubt. Both AMD and Nvidia jumped the base prices for their respective models last gen (though Nvidia still offered some great value when the first cards launched), and Nvidia looks to be pushing up even further. It makes it somewhat refreshing that AMD at least didn't push past the $1K mark for their lineup.
 
How does Intel Arc RT compare to the RTX 40 series? Is it better than RDNA 2? From what I've heard, it handles RT better than RDNA 2 but not as well as RTX 30 and RTX 40 (correct me if I'm wrong).
How does Intel Arc rasterization compare to the RTX 40 series? Is it better than RDNA 2? From what I've heard, it's decent but not close to RDNA 2, RTX 30, or RTX 40 (correct me if I'm wrong).
How does Intel XeSS upscaling compare to the RTX 40 series? Is it better than RDNA 2? From what I've heard, it's better than FSR 2.0 but not quite at the level of DLSS 2.2 (correct me if I'm wrong).

Is it safe to assume that RDNA 3 RT performance will be like Arc RT performance at minimum?
Is it safe to assume that FSR 2.2/3.0 will land between XeSS and DLSS 3.0?
 

twilo99

Member
We'll see what happens when the new GPUs launch and we get drivers for them, but there's nothing wrong with the current AMD drivers. I've had my 6800 XT for over a year with no issues on the driver side.
 

supernova8

Banned
I expect that the 7900 XTX will review well but that reviewers will still end up saying:

"If you want the absolute best, then get the 4090," because let's face it, anyone even contemplating spending $1,000 on a GPU has "fuck you" levels of money and can afford to just get a 4090 if they want the best. They're probably spending around $1,000 on a CPU alone, plus a bunch of money on an AM5 motherboard and the accompanying DDR5 memory, and they're not going to miss (or give a shit about) the extra $600 required to get the 4090 over the 7900 XTX.

I think AMD is barking up the wrong tree trying to market the 7900 XTX as better value to a subset of consumers who, clearly, don't really care about "value" and just want the best (no matter how much it costs).
 

Panajev2001a

GAF's Pleasant Genius
I expect that the 7900 XTX will review well but that reviewers will still end up saying:

"If you want the absolute best, then get the 4090," because let's face it, anyone even contemplating spending $1,000 on a GPU has "fuck you" levels of money and can afford to just get a 4090 if they want the best. They're probably spending around $1,000 on a CPU alone, plus a bunch of money on an AM5 motherboard and the accompanying DDR5 memory, and they're not going to miss (or give a shit about) the extra $600 required to get the 4090 over the 7900 XTX.

I think AMD is barking up the wrong tree trying to market the 7900 XTX as better value to a subset of consumers who, clearly, don't really care about "value" and just want the best (no matter how much it costs).
Then at least they should comment on the real total cost: case upgrades, a new PSU, improved cooling. All things the XTX avoids. Much smaller, much cheaper, and much lower power consumption need to be part of the picture, unless selling nVIDIA cards is your main priority.
 

lukilladog

Member
But what's wrong with DLSS? I don't get your argument. DLSS greatly improves performance with almost no hit to quality. In my book you're mad if you're passing on free performance.

Nah, I can instantly see a substantial drop in texture quality and in the sharpness of objects. To me it just makes the problem of having to run games at a lower resolution bearable.
 
I would take a 10% performance loss for a substantial price cut.

The pricing of the 4000 series is an utter joke in the UK.

I'll be curious how AMD's pricing translates over here.

Yeah, over 2 grand for the 4090 is a joke. At least AMD is coming in with something decent at a price some would accept.
 

GHG

Member
Then at least they should comment on the real total cost: case upgrades, a new PSU, improved cooling. All things the XTX avoids. Much smaller, much cheaper, and much lower power consumption need to be part of the picture, unless selling nVIDIA cards is your main priority.

The fact that it doesn't come with a power-connector mystery lottery and the accompanying anxiety is a pretty big deal.
 

twilo99

Member
Whoa… now downgraded from a 4090 competitor to a 4080 competitor to a 3080 competitor. A few more green-site news cycles and maybe it'll be an XSS competitor, eh ;)?

Well, it's basically what he's saying here



I thought the "900" in the name would suggest a 4090-type card, but I guess not...
 

Panajev2001a

GAF's Pleasant Genius
The fact that it doesn't come with a power-connector mystery lottery and the accompanying anxiety is a pretty big deal.
Apparently not :LOL:.

That's the power of mindshare, as well as the halo effect of people who want the best PC at all costs and all risks, and who defend that choice vigorously and sometimes in very misleading ways.
 

Sanepar

Member
Now that we know the 4080 sucks hard, the only option will be the 7900 XTX. The problem will be how many AMD will have available.
 

supernova8

Banned
Then at least they should comment about the real total cost: case updates, new PSU, improved cooling. All things that the XTX avoids. Much smaller, much cheaper, and much lower power consumption needs to be part of the picture unless you are trying to sell nVIDIA cards as your main priority.
That's partly true but:
(1) Most people buying a 4090 have probably already seen the news about all that stuff (I could be wrong, but I like to think the cohort of people routinely dropping $1,000+ on GPUs isn't that stupid/ignorant).
(2) People buying a 4090 probably had a 3090/Ti and thus likely already have a large case anyway (and if not, cases are hardly an expensive part of a PC build relative to other components).
(3) Back to my original point: people in the market for a card at this price probably don't mind paying for a new PSU anyway.

Plus, to be clear, I'm not interested in the 4090 (or the 7900 XTX) or any GPU in that price range. It's just obvious/common sense if you consider (generally) the demographic most likely to buy a 4090. And if they're professionals doing graphics-related work, then they're 150% getting a 4090 regardless, because of CUDA and AMD's lack of anything to compete with it.

I'm definitely not defending my purchase or anyone else's, just pointing out that trying to preach about value to people who don't care about value is a recipe for defeat.
 

RoboFu

One of the green rats
Well, it's basically what he's saying here



I thought the "900" in the name would suggest a 4090-type card, but I guess not...

He's saying in price… but it would be dumb to assume a $1,000 card would compete with a $1,600 card. 😵‍💫

Though I am expecting it to be better than a 4080.

The 4090 is a halo PR product, like how a Shelby Mustang is used to sell the $30K four-cylinder Mustang. No matter what, fanboys will use the 4090 as their comparison point.
 
Whoa… now downgraded from a 4090 competitor to a 4080 competitor to a 3080 competitor. A few more green-site news cycles and maybe it'll be an XSS competitor, eh ;)?

The only area where I can see them being at 3080 Ti level is maybe some of the RT modes that don't run well on AMD. But yeah, the media likes to spin for the more popular brand; that's where they figure the most clicks are.
 
Yup, exactly. $1,000 for their flagship “Titan” competitor is not bad. I’m more disappointed that neither they nor Nvidia have anything (yet) priced for the mainstream or even the high end. This must be the first GPU gen where they both launched with only the enthusiast tier.
Well, when put that way, yeah, against the Titan-class cards it makes sense. Don't know why Nvidia stopped using that moniker, and AMD never had one. For that segment, the best of the best, this price looks normal.

That being said, it does kinda suck in how it gives them an out to price the next card in line near this cost. In the past the highest-priced cards were the gaming line, and the production cards like Quadro were a separate naming scheme and market, usually at way higher price points. This changed with the Titan-class cards, but back then "Titan"-class cards didn't launch alongside the gaming lineup.

For example, going back to the 8800 GTX at $600 and the 8800 GTS at $400: there was no Titan-class card until the 8800 Ultra, which was priced near $1,000, came out later, and was rightly called out by gamers as not being a good enough value for the high price (it wasn't even called a Titan, as that term wasn't around yet). Now they release the Titan-class cards first and the rest fall in line from there. Gamers quickly buy them anyway. It sucks for consumers, but it's probably profitable for Nvidia/AMD.

I remember when the official "Titan" card came out, people thought it was too expensive then too. Although I think those cards had something that set them apart from gaming-grade cards: either 2x the memory or precision, or faster RAM, something that justified the high price.
 

PaintTinJr

Member
The thing is that the 6800 XT performs the same as the 3080 in the Matrix demo. Epic's Lumen seems to be designed around the console implementation and doesn't give a boost to RTX cards like other ray-tracing games do.
I've been thinking about that and checked my suspicions against TechPowerUp's Rapid Packed Math specs for each of the cards: Nvidia gets nothing extra, while the AMD cards get double the FP16 flops (just like Cerny said with the PS4, IIRC). So there's every likelihood that even the 7900 XT will run comparatively well against the 4090 in UE5.

https://www.techpowerup.com/gpu-specs/geforce-rtx-3080.c3621
https://www.techpowerup.com/gpu-specs/radeon-rx-6950-xt.c3875
https://www.techpowerup.com/gpu-specs/geforce-rtx-4090.c3889
https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xtx.c3941

On older games, I suspect the ability to batch all the independent deferred render passes to run in parallel on these monster cards results in high frame rates. But where RT on an Nvidia card can run in parallel to that work too, and be ready earlier for the gather pass that produces the final frame, on the AMD cards the BVH accelerator is part of the CU, so without an excess of CUs for both rendering and RT, that work probably gets done a pass later, hence why the RTX cards have consistently beaten them. With UE5, the Rapid Packed Math boost on AMD probably means Nanite takes far less time, leaving headroom to do the Lumen pass just as fast without dedicated RT cores, while the RTX cards probably can't use their RT cores until Nanite finishes, meaning those cores likely sit idle through Nvidia's slower processing of Nanite. Or at least that's what I would expect, and I now think that long term AMD will see massive gains on all UE5 games.
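To make the Rapid Packed Math point concrete, here's a rough sketch using approximate TFLOPS figures from the TechPowerUp pages linked above (treat the exact numbers as ballpark, not gospel):

```python
# FP16 vs FP32 throughput: RDNA 2/3 run FP16 at 2:1 via Rapid Packed Math,
# while Ampere/Ada run FP16 at 1:1 with FP32. Figures are approximate,
# from the TechPowerUp spec pages linked above.
cards = {
    "RTX 3080":    (29.8, 29.8),   # (FP32 TFLOPS, FP16 TFLOPS)
    "RX 6950 XT":  (23.7, 47.3),
    "RTX 4090":    (82.6, 82.6),
    "RX 7900 XTX": (61.4, 122.8),
}

for name, (fp32, fp16) in cards.items():
    print(f"{name}: FP16/FP32 = {fp16 / fp32:.1f}x ({fp16:.0f} FP16 TFLOPS)")
```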
 

iQuasarLV

Member
I'm thinking, if the used market unloads its 6950 XTs in December, what's the best website for exploring the idea of picking one up for around $550?
 

SlimySnake

Flashless at the Golden Globes
I've been thinking about that and checked my suspicions against TechPowerUp's Rapid Packed Math specs for each of the cards: Nvidia gets nothing extra, while the AMD cards get double the FP16 flops (just like Cerny said with the PS4, IIRC). So there's every likelihood that even the 7900 XT will run comparatively well against the 4090 in UE5.

https://www.techpowerup.com/gpu-specs/geforce-rtx-3080.c3621
https://www.techpowerup.com/gpu-specs/radeon-rx-6950-xt.c3875
https://www.techpowerup.com/gpu-specs/geforce-rtx-4090.c3889
https://www.techpowerup.com/gpu-specs/radeon-rx-7900-xtx.c3941

On older games, I suspect the ability to batch all the independent deferred render passes to run in parallel on these monster cards results in high frame rates. But where RT on an Nvidia card can run in parallel to that work too, and be ready earlier for the gather pass that produces the final frame, on the AMD cards the BVH accelerator is part of the CU, so without an excess of CUs for both rendering and RT, that work probably gets done a pass later, hence why the RTX cards have consistently beaten them. With UE5, the Rapid Packed Math boost on AMD probably means Nanite takes far less time, leaving headroom to do the Lumen pass just as fast without dedicated RT cores, while the RTX cards probably can't use their RT cores until Nanite finishes, meaning those cores likely sit idle through Nvidia's slower processing of Nanite. Or at least that's what I would expect, and I now think that long term AMD will see massive gains on all UE5 games.
That's very interesting. There have been some AMD-sponsored RT games that have offered relatively similar RT performance. I think Far Cry 6 had very similar benchmarks.

It will be interesting to see how RT performs in the new Star Wars Jedi game (Survivor). It's using full RT and it's a UE5 game. However, they don't seem to be using Lumen, but the standard UE ray-tracing support.

But yeah, I think with everyone going UE5, including CD Projekt, who had by far the worst-performing RT game on AMD GPUs, it bodes well for AMD going forward. I just don't know when that will be. Cyberpunk is still the go-to benchmark for ray-tracing games, and AMD will lose every benchmark there. And aside from Jedi Survivor, next year is as barren as this year when it comes to next-gen-only games that aren't next-gen in name only.
 

PaintTinJr

Member
That's very interesting. There have been some AMD-sponsored RT games that have offered relatively similar RT performance. I think Far Cry 6 had very similar benchmarks.

It will be interesting to see how RT performs in the new Star Wars Jedi game (Survivor). It's using full RT and it's a UE5 game. However, they don't seem to be using Lumen, but the standard UE ray-tracing support.

But yeah, I think with everyone going UE5, including CD Projekt, who had by far the worst-performing RT game on AMD GPUs, it bodes well for AMD going forward. I just don't know when that will be. Cyberpunk is still the go-to benchmark for ray-tracing games, and AMD will lose every benchmark there. And aside from Jedi Survivor, next year is as barren as this year when it comes to next-gen-only games that aren't next-gen in name only.
I'm going to go out on a limb and say that at native 4K through Proton (to eliminate any dodgy paid Windows throttling for Nvidia) the 7900 XTX will run Jedi Survivor with RT equal to or better than the 4090, based on the FP16 flops and pixel-rate specs :)
 

Buggy Loop

Member
The thing is that the 6800 XT performs the same as the 3080 in the Matrix demo. Epic's Lumen seems to be designed around the console implementation and doesn't give a boost to RTX cards like other ray-tracing games do.

It... also performs the same at 4K, 1440p, and even 1080p with a compiled version, indicating that maybe it's an incomplete demo/engine? To the point where the global illumination seems to be calculated on the CPU? Nobody uses the Matrix demo to benchmark, because it has all the red flags of being CPU-limited while the utilization indicates it's not. Something is broken or limited in the background.

Extrapolating anything out of it, on an engine that is not ready for production... I mean, yeah, I hope I don't have to explain further? Even the dev community on the Unreal Engine forum is encountering a ton of problems


I'm going to go out on a limb and say that at native 4K through Proton (to eliminate any dodgy paid Windows throttling for Nvidia)...

[The Office "What?" GIF]
 

Crayon

Member
That's partly true but:
(1) Most people buying a 4090 have probably already seen the news about all that stuff (I could be wrong, but I like to think the cohort of people routinely dropping $1,000+ on GPUs isn't that stupid/ignorant).
(2) People buying a 4090 probably had a 3090/Ti and thus likely already have a large case anyway (and if not, cases are hardly an expensive part of a PC build relative to other components).
(3) Back to my original point: people in the market for a card at this price probably don't mind paying for a new PSU anyway.

Plus, to be clear, I'm not interested in the 4090 (or the 7900 XTX) or any GPU in that price range. It's just obvious/common sense if you consider (generally) the demographic most likely to buy a 4090. And if they're professionals doing graphics-related work, then they're 150% getting a 4090 regardless, because of CUDA and AMD's lack of anything to compete with it.

I'm definitely not defending my purchase or anyone else's, just pointing out that trying to preach about value to people who don't care about value is a recipe for defeat.

And someone who has $1,000 can also pay someone else to rebuild their PC with all that stuff they bought if they don't want to. This is getting ridiculous. Being willing to pay $1,000 does not automatically mean the buyer has a money-no-object attitude. $1,000 does not equal $1,600. $600 does not equal $0. The $300 for a PSU and a case is not effectively zero dollars because someone has a thousand. Actually, why spend $300 on an adequate case and PSU when you can spend $500 for the good stuff? After all, they had $1,000, so there's got to be plenty more where that came from. They're a thousandaire, after all.
 

SlimySnake

Flashless at the Golden Globes
Interesting results with max settings + RT. So much for the "bad AMD RT" talk.
[benchmark screenshot]
So 50% better would get us to around 45 fps? Not too bad. PaintTinJr, another next-gen demo that isn't really giving Nvidia the boost we've come to expect.

Those 6600 XT and 6700 XT results should give us an idea of what performance to expect from consoles. My tests with RT off were pretty shitty. I had to go all low settings with DLSS set to Balanced at 1440p to get around 65-85 fps, which would put consoles in the 30-40 fps performance range.
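The scaling here is simple, but as a sketch (the 50% uplift and the console-at-roughly-half-PC factor are the post's assumptions, not benchmarks):

```python
# Back-of-the-envelope scaling from the post above. Both the ~50% uplift
# and the console-at-roughly-half-PC factor are assumptions, not measurements.
base_fps = 30                  # benchmarked figure from the screenshot
print(f"after a 50% uplift: ~{base_fps * 1.5:.0f} fps")   # ~45 fps

pc_lo, pc_hi = 65, 85          # 1440p, low settings, DLSS Balanced
print(f"console at ~half PC perf: ~{pc_lo // 2}-{pc_hi // 2} fps")  # ~32-42 fps
```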
 

PaintTinJr

Member
It... also performs the same at 4K, 1440p, and even 1080p with a compiled version, indicating that maybe it's an incomplete demo/engine? To the point where the global illumination seems to be calculated on the CPU? Nobody uses the Matrix demo to benchmark, because it has all the red flags of being CPU-limited while the utilization indicates it's not. Something is broken or limited in the background.

Extrapolating anything out of it, on an engine that is not ready for production... I mean, yeah, I hope I don't have to explain further? Even the dev community on the Unreal Engine forum is encountering a ton of problems




[The Office "What?" GIF]
Sorry, it was a weak parody of when Carmack mentioned, around the launch of one of his games (I think it was Rage), that the game ran better on WINE than natively on Windows when he tested it.
 

DenchDeckard

Moderated wildly
Just let this thing be as good as a 4080 pleeeease.

The fact that the Founders Edition 4080 is £1,269 in the UK is some kind of sick joke, and fuck NVIDIA. They need a stomping.
 