
What's the maximum Nvidia tax you'd pay?

dgrdsv

Member
It's not that far off
performance-matchup-rtx-5070-ti.png
"Rasterization" should not matter to anyone buying a new GPU in the year 2025.
Once you go into RT territory it deteriorates quickly as the RT workload increases, and it often ends up at 5070 level instead of 5070 Ti.
The price AMD has set on these cards reflects their own expectation of how people will compare them to the competition. The 9070 XT sitting between the 5070 and 5070 Ti in price is a rather perfect illustration of its relative perf/features positioning.
In other words, there is no "Nvidia tax" in this comparison. Those who buy a more expensive Nvidia card get more performance and features out of it. Whether these matter to you personally is a completely different question.

They are direct competitors.
Are they?

tbm4neqe.png


See above for why AMD has priced them as they did - a nice change from the previous two generations, where they basically assumed that no one cares about RT for some reason.
The cards are competitive precisely because they come at a discount compared to Nvidia this time. Thank god AMD has finally figured this out on the third attempt.
 
I mean, if you're upgrading now it means you were on the lower end of the 30-series (or older). The problem most likely is VRAM (still stuck at 8/10GB).

So you're looking at the 5070 Ti/9070 XT. They're the only two "good value" cards.

The best buy would still be a used 40-series. But the stock situation has fucked used prices too.
 

pudel

Member
My 4090 is still pretty fine. Not even sure if I would want/need to upgrade to a 60xx gen. By the time a 70xx gen is out I will have a look around, and if the situation is still as ridiculous as right now... I will definitely consider an AMD card... yep. But that's a long time to go. I just feel sorry for everyone who is currently looking for a new rig/video card. :messenger_pensive:
 
Prices change frequently in my country. In the first 1-3 days after launch, the RX 9070XT was only $100 cheaper than the 5070ti, but prices of the 9070XT have dropped and the difference is now $175. For me personally, the DLSS package alone (DLSS SR, RR, DLDSR, FG) is well worth the $175 price difference.

AMD improved FSR image reconstruction, but FSR FG is still unusable due to judder (motion isn't smooth) and input latency (aiming with the mouse has a weighty feel even at a high base fps). DLSS FG offers perfectly smooth motion and extremely low input lag (I measured just 1-4ms in Cyberpunk with the latest DLL). It's easy to forget that I'm playing with frame gen, that's how good Nvidia's FG is. Being able to boost the framerate 2x while still having a perfectly responsive experience helps immensely in the most demanding games.

The 9070XT is competitive in raster, and has much better RT performance even compared to the XTX, but the RTX 5070ti still has the upper hand in RT games, and especially in PT. Those who opt for the 5070ti will be paying $175 more for a better product with more features, which will certainly extend the life of the GPU.


glmdlf1.jpeg


InJKSuM.jpeg


21fps 9070XT vs 166fps 5070ti

QVZQuup.jpeg
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Are they?

tbm4neqe.png


See above for why AMD has priced them as they did - a nice change from the previous two generations, where they basically assumed that no one cares about RT for some reason.
The cards are competitive precisely because they come at a discount compared to Nvidia this time. Thank god AMD has finally figured this out on the third attempt.

Yes they are.
It wins some, it loses some, but on average they are in the same class performance-wise.
Price wise the 9070XT wins outright.
In games with excessive RT it loses, no one is denying that, but as I said, on average - because no one ONLY plays heavy RT games and most games aren't RT heavy - they are ~5% apart at 1440p.

With Unreal Engine being the most popular engine going forward, they are basically the same, so yes, they are direct competitors:
black-myth-wukong-2560-1440.png
 
Yes they are.
It wins some, it loses some, but on average they are in the same class performance-wise.
Price wise the 9070XT wins outright.
In games with excessive RT it loses, no one is denying that, but as I said, on average - because no one ONLY plays heavy RT games and most games aren't RT heavy - they are ~5% apart at 1440p.

With Unreal Engine being the most popular engine going forward, they are basically the same, so yes, they are direct competitors:
black-myth-wukong-2560-1440.png
Software Lumen does not use RT cores. That's why the 9070XT is so competitive. Here is the same game, but with hardware RT:


VO2Dz1L.jpeg
 

kiphalfton

Member
Moore's Law Is Dead dropped a truth bomb a while ago:

People want "competition" against Nvidia not because they want to buy AMD or Intel, but because they want to buy Nvidia at a lower price.

Jensen has everyone by the balls.

You're giving MLID too much credit; he's a chump and just parroting what everybody has been saying for the past however many years.
 

CrustyBritches

Gold Member
Nvidia: imaginary MSRP, imaginary performance, and imaginary stock.

They completely abandoned the customer base that built their company up over decades. They also managed to make AMD products like the 9070 series and the PS5 Pro look like world-beaters.
 

xenosys

Member
Hypothetically, if both the AMD and Nvidia GPUs offered similar raster and RT performance, and the same amount of VRAM, probably an extra 50 bucks for the upscaler quality and multi-FG advantages.
 

John Wick

Member
That's $4,853 (US) for a graphics card.
Don't worry about it. PC gamers enjoy paying more to Nvidia. Ever since the GTX 970 I've never managed to keep an Nvidia GPU. Every time I bought one I managed to sell it without even trying. I used to feel guilty selling them for far more than I bought them for, but not anymore, as I've come to the realisation that PC gamers perversely enjoy paying Nvidia's excessive prices. Otherwise they'd stop buying them to send a message to Nvidia. I've managed to get a 2080 Ti quite cheap second-hand for my needs. Selling those cards has basically paid for my consoles, my subs, and my 2080 Ti.
 

Fess

Member
If AMD has better stuff out I'll buy it without thinking twice, just like I did when moving from Intel to AMD on CPUs.

Currently AMD is competitive on pricing and sometimes raster, but not ray tracing.
So, not yet. I'll pay the Nvidia tax I need to pay until there are alternatives.

Availability is a bigger problem. Money can be acquired by planning ahead 2-3 years, saving up, and investing in the stock market. But if I can't buy something because it's not available, then I'm left out in the cold with no solution.
 

Fess

Member
21fps 9070XT vs 166fps 5070ti

QVZQuup.jpeg
How is 13.9GB VRAM usage and 166fps possible with maximum settings and full path-tracing on a 5070ti when I climbed over 16GB on a 4080 Super and ended up with 10-20fps or lower?

1080p?
A patch has lowered memory consumption?

🤔
 

dgrdsv

Member
With Unreal Engine being the most popular engine going forward, they are basically the same, so yes, they are direct competitors:
This isn't true at all.
Even in TPU's results you will find UE5 games where the 9070 XT underperforms without any HWRT in use (Stalker 2, for example, or Silent Hill 2).
And once you turn HWRT on you get this:

kqxml8xu.png


No matter how much you want them to be "direct competitors", the fact is they are not: the 9070 XT is slower on average even without RT, and with RT it just falls off a cliff.
The price AMD has set on these cards is a solid choice based on how they perform against Nvidia's new parts.
And the two-month delay AMD took after the announcement of the 50 series was to lower the originally planned price, so you can thank Nvidia for that.
 

Hudo

Gold Member
This is the only correct answer.

Realistically, I would pay $180 at most for a 5060 card. I would pay $950 at most for a 5090 card. Everything else needs to fall in between. Charging >=$1000 for a fucking consumer card is just Nvidia being greedy dipshits.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I told Nvidia to fuck off this gen. I received a priority access email from Nvidia for an FE 5090 and checked to see if it was real; I had the option to add it to the cart and said nope.
2REror5.png


Between the ROPs, the high power draw, the melting cable issues (yes, it's rare, but it still happens), the bad drivers (all of you praising Nvidia drivers just aren't paying attention), and basically no architectural improvements (whatever % performance increase you get is offset by the same % power increase)... I just said no, knowing full well I could have flipped the 5090 at a decent profit, or kept the 5090 and sold my 4090 for nearly $2,000.

Instead I am not bothering. I am keeping my 4090, and I upgraded my living room Bazzite/SteamOS PC from a 7800XT to a 9070XT. Even without being fully optimized for Linux, I have already gotten pretty nice frame rate improvements, and it's only going to get better as the Bazzite team implements the Mesa drivers.

And as far as I'm concerned, FSR4 is absolutely fucking amazing compared to what came before. It's so damn good that it can almost rival DLSS4. Not saying it's better, so don't come at me with that shit, but DLSS sure as hell isn't worth paying a massive premium for. I expected FSR4 to be at or near DLSS3, so that was a huge win, and now AMD must work with developers to get FSR4 implemented into older games and newer ones. Again, DLSS4 is still the best, but FSR4 is pretty close. I am a bit disappointed in RT performance, I felt it could have been a little better, but that is largely being skewed by RT-heavy Nvidia-sponsored titles.
 
Comparing fps with multi frame gen on for one and off for the other is pretty silly, no? Why are they using FSR3 anyway?
I also showed results without FG (17fps vs 53fps). As for the gameplay screenshot, both cards used FG in that test, but the 9070XT only supports FGx2 (at least on paper, because FSR FG is crap and I consider it unusable).


How is 13.9GB VRAM usage and 166fps possible with maximum settings and full path-tracing on a 5070ti when I climbed over 16GB on a 4080 Super and ended up with 10-20fps or lower?

1080p?
A patch has lowered memory consumption?

🤔
It's 1440p. At 4K, cards with 16GB VRAM need to reduce the texture streaming pool from maximum to very high (texture quality still looks the same), otherwise performance will drop to single digits.
 

Three

Member
I also showed results without FG (17fps vs 53fps). As for the gameplay screenshot, both cards used FG in that test, but the 9070XT only supports FGx2 (at least on paper, because FSR FG is crap and I consider it unusable).
MFG is what's crap. One is using MFG and the other is not. That comparison of fps is meaningless.
 
MFG is what's crap. One is using MFG and the other is not. That comparison of fps is meaningless.
Youtuber Jansn Benchmarks tested the full potential of both cards in this particular test. FGx4 is perfectly usable on Nvidia cards because it renders motion smoothly without judder (flip metering ensures perfect frame pacing) and with minimal input latency, while FGx2 on the AMD card cannot render motion smoothly (even at a 60fps base framerate FSR FG has judder) and has an input lag problem.

Nvidia lied when they compared the 5070's performance to the 4090, but MFG is still an awesome feature to have. Being able to boost the framerate into 200fps territory certainly improves the experience.

Even the biggest AMD diehard fan on all of YouTube likes MFG (Frogboy even received a free 9070XT signed by AMD's CEO), although in this particular video he should have capped his framerate to 237fps to avoid judder with FGx4 on his 240Hz monitor.
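
For anyone wondering where a cap like 237fps comes from, here is a rough sketch of the arithmetic. The 240Hz monitor and the 237fps figure are from the video; the ~3fps of headroom to stay inside the VRR window is a common rule of thumb, not an official Nvidia number.

# Rough arithmetic for capping MFG x4 output on a 240 Hz display (Python).
# Assumption: ~3 fps of headroom below the refresh rate keeps the output
# inside the VRR window - a common rule of thumb, not an official figure.
refresh_hz = 240
headroom_fps = 3
output_cap = refresh_hz - headroom_fps      # 237 fps displayed
rendered_fps = output_cap / 4               # with FG x4 only every 4th frame is rendered -> ~59 fps
frame_interval_ms = 1000 / output_cap       # ~4.2 ms between displayed frames

print(f"cap at {output_cap} fps -> ~{rendered_fps:.0f} rendered fps, "
      f"{frame_interval_ms:.1f} ms per displayed frame")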

 

MikeM

Member
Youtuber Jansn Benchmarks tested the full potential of both cards in this particular test. FGx4 is perfectly usable on Nvidia cards because it renders motion smoothly without judder (flip metering ensures perfect frame pacing) and with minimal input latency, while FGx2 on the AMD card cannot render motion smoothly (even at a 60fps base framerate FSR FG has judder) and has an input lag problem.

Nvidia lied when they compared the 5070's performance to the 4090, but MFG is still an awesome feature to have. Being able to boost the framerate into 200fps territory certainly improves the experience.

Even the biggest AMD diehard fan on all of YouTube likes MFG (Frogboy even received a free 9070XT signed by AMD's CEO), although in this particular video he should have capped his framerate to 237fps to avoid judder with FGx4 on his 240Hz monitor.


I can’t use FG. Anything that adds latency is useless.

I don't think anyone disputes that Nvidia reigns supreme in heavy RT, but price differences can reflect that. My Asus TUF 9070 XT cost $1,059 CAD ($735 US), but the equivalent 5070 Ti in TUF form is 37% more expensive at $1,449.
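
For reference, that gap works out roughly like this; it's just the arithmetic on the prices above, with the USD conversion being the rate implied by the $735 figure, not an official exchange rate.

# Price gap between the two TUF cards quoted above (Python).
tuf_9070xt_cad = 1059
tuf_5070ti_cad = 1449
usd_per_cad = 735 / 1059                                    # ~0.694, rate implied by "$735 US"

premium_pct = (tuf_5070ti_cad / tuf_9070xt_cad - 1) * 100   # ~36.8%
gap_usd = (tuf_5070ti_cad - tuf_9070xt_cad) * usd_per_cad   # ~$271 US

print(f"The 5070 Ti TUF costs {premium_pct:.0f}% more, about ${gap_usd:.0f} US extra")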
 

Three

Member
Youtuber Jansn Benchmarks tested the full potential of both cards in this particular test. FGx4 is perfectly usable on Nvidia cards because it renders motion smoothly without judder (flip metering ensures perfect frame pacing) and with minimal input latency, while FGx2 on the AMD card cannot render motion smoothly (even at a 60fps base framerate FSR FG has judder) and has an input lag problem.

Nvidia lied when they compared the 5070's performance to the 4090, but MFG is still an awesome feature to have. Being able to boost the framerate into 200fps territory certainly improves the experience.

Even the biggest AMD diehard fan on all of YouTube likes MFG (Frogboy even received a free 9070XT signed by AMD's CEO), although in this particular video he should have capped his framerate to 237fps to avoid judder with FGx4 on his 240Hz monitor.


I have a 5090; I don't need a video review to tell me what it does or tell me it's usable. Using MFG in a framerate comparison is meaningless though, since those are not actual frames and not indicative of performance. It would be like enabling upscaling and then comparing it to native res, saying one is 1080p and the other 4K.
 

JCK75

Member
$0 - AMD was lagging behind and that's why I went Nvidia; AMD was just not acceptable until now. Now it's just as good, and I'll go with the best bang for the buck regardless of who makes it.
 
I can’t use FG. Anything that adds latency is useless.

I don't think anyone disputes that Nvidia reigns supreme in heavy RT, but price differences can reflect that. My Asus TUF 9070 XT cost $1,059 CAD ($735 US), but the equivalent 5070 Ti in TUF form is 37% more expensive at $1,449.
At the moment, the price difference between the 9070XT and 5070ti in my country is $175 USD (currency converted). A few days ago, the price difference was only $100, but now the 9070XT is getting cheaper and cheaper. I know that $175 is a lot for some people, but from my perspective, paying an additional $175 just to have a better gaming experience for the next couple of years is worth it. With the Nvidia card, I would not have to worry about whether games that use neural rendering features will work properly. Performance in PT will also be acceptable. My card has comparable performance to the 5070ti and I get 110-170fps in PT games at 1440p DLSSQ + FGx2; that's a playable experience in my book. On the AMD card I would need to turn PT off.

Older games also look and run better on Nvidia cards simply because they were optimized for Nvidia hardware. Many RT games rely on Nvidia's ray reconstruction. Try playing Alan Wake 2 with RT on an AMD card and you will get an excessive amount of noise that will certainly affect your experience. There is little to no noise on the Nvidia card. Also, many games only support DLSS. Some RT games support FSR 3.1, so you can try modding FSR4 into the game, but according to youtuber Terra Ware the quality is not as good as a native FSR4 implementation.



I do not know if I would want to spend so much time modifying so many RT games just to get FSR4 working properly. On my Nvidia card I don't need to do that, and if I want to use the latest DLSS I only need to set the DLSS settings globally in NV Inspector and that's it. I get amazing clarity in DLSS games, and the quality continues to improve with each new DLSS update.

DLSS FG usually adds input lag (there are games where DLSS FG actually reduces latency if the game does not natively support Reflex), but not a lot. In Cyberpunk I measured an additional 9-10ms of lag with the old FG (let's call it FG ver 1.0), but the latest FG (running fully on tensor cores) decreased the added latency to 1-4ms with FGx2 on my RTX 4080S, which is literally placebo territory. Artefacts aren't noticeable even if I look for them (it's easier to notice LCD ghosting than DLSS FG artefacts), and 1-4ms is not a big price to pay for a HUGE boost to motion clarity (on sample-and-hold displays, higher fps drastically improves motion clarity). I also found that my aiming is even easier with DLSS FG. It feels like the latest DLSS FG started predicting mouse movement, and my eyes can track moving targets much more easily at a higher refresh rate.

1440p DLAA, psycho RT, 70fps and 31.3ms latency (OSD in the right corner of my screen)

20250313-124432.jpg


With FGx2 I get 124fps and 33.8ms.

20250313-124256.jpg


FG can destroy input latency with the wrong settings: for example, if you play with vsync on, or cap the framerate with the default "async" framerate limiter in RivaTuner, you will get an additional 100ms, and that will certainly affect the gaming experience. If you instead use the "reflex" framerate limiter in RivaTuner, there's no additional input lag even with the framerate cap. That's how I play, and I don't have any issues with input lag.
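
To put numbers on my screenshots above, here's a quick back-of-the-envelope comparison; the figures are the ones from my overlay, the code just does the subtraction.

# Sanity check on the FG x2 numbers above (Python):
# 70 fps / 31.3 ms without FG, 124 fps / 33.8 ms with DLSS FG x2.
fps_off, latency_off_ms = 70, 31.3     # measured without frame generation
fps_on, latency_on_ms = 124, 33.8      # measured with DLSS FG x2

added_latency_ms = latency_on_ms - latency_off_ms   # ~2.5 ms extra click-to-photon latency
frametime_off_ms = 1000 / fps_off                   # ~14.3 ms between displayed frames
frametime_on_ms = 1000 / fps_on                     # ~8.1 ms between displayed frames

print(f"+{added_latency_ms:.1f} ms latency for a "
      f"{frametime_off_ms - frametime_on_ms:.1f} ms shorter frame interval")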

Right now I'm playing RoboCop: Rogue City (awesome game BTW), and FGx2 improved my experience a lot, even though I had around 90fps at 1440p DLAA even without FG. Smoothness with DLSS FG is insane and my aiming is perfect; I don't miss any targets even when enemies are moving. I also tested FSR FG, but as always, motion is no longer smooth (judder) and I start missing targets even when they don't move and stay still; that's how big the difference is between the Nvidia and AMD frame generators. DLSS FG is perfectly synced with mouse movement because Nvidia thought about everything (frame pacing is handled by hardware flip metering on the RTX 50 series and in software on the 40 series), while FSR FG can't provide smooth frame delivery (judder) and mouse movement is no longer synced correctly.


Robo-Cop-Win64-Shipping-2025-03-12-16-52-13-735.jpg


To me, DLSS FG feels like a free performance boost, so I think this feature alone is worth the $175 price difference between the 9070XT and 5070ti. This frame generation technology, if implemented properly, would revolutionize gaming on consoles. The PS5 can only use the crappy FSR FG, so the experience is terrible, but I think Mark Cerny will build the PS6 with hardware-based FG in mind, so the experience will be comparable to DLSS FG. Console gamers would be able to play games at 120fps instead of 60fps (much higher motion clarity and smoothness), and on a gamepad even people extremely sensitive to input lag would not notice any difference between a real 120fps and an AI-generated 120fps.
 
I have a 5090, I don't need a video review to tell me what it does or tell me it's usable. Using MFG in a framerate comparison is meaningless though, those are not actual frames and not indicative of performance. It would be like enabling upscaling then comparing to native res. Saying one is 1080p the other 4k.
It's true that real frames and AI-generated frames are not the same, so reviewers should focus on comparing raw performance, but DLSS FG certainly improves motion smoothness and clarity. That's not a meaningless difference.

DLSS FG works amazingly well and can make a huge difference in the real gaming experience. If the generated 170fps looks comparable to a real 170fps frame rate, then I don't care if I'm playing with fake frames, because that knowledge doesn't affect my gaming experience. This feature improves my experience and I would not want to buy a GPU that cannot run FG well. AMD's FSR FG is nowhere near as good as DLSS FG, and from my perspective it makes sense to spend more money on an Nvidia GPU just to have a proper FG experience. I'm only willing to turn FG off when my GPU can max out my monitor's refresh rate, because a real framerate increase always reduces latency (DLSS FG adds very little input latency on top of the base framerate, but it does not reduce it the way a higher real framerate does). Thanks to DLSS FG, I no longer get upset when the game dips below 60fps from time to time, because I always see smooth motion. If I were to run games at 4K without FG or DLSS, 4K gaming would be extremely expensive, and even the 5090 does not have the raw power to run the most demanding games at 4K high refresh rates without these AI features.

As for your DLSS SR comment, 4K DLSS and native 4K TAA both output a 4K image. DLSS does not upscale the image in the traditional sense (upscaling just resizes the image and cannot add any new detail), but reconstructs the 4K image based on REAL data from previous frames. Nvidia engineers refer to DLSS as super resolution, or image reconstruction, and they are absolutely right, because that's exactly what DLSS does. DLSS lowers the render resolution in the first step to improve performance (some people focus on this step and think they are playing at a lower resolution), but it also increases resolution in the second step based on temporal data, which somehow gets forgotten. Youtubers and neogafers refer to DLSS as upscaling, but that's an inaccurate and oversimplified way to look at it.
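
To illustrate the "lowers the render resolution in the first step" part, here are the commonly cited internal render resolutions for the standard DLSS presets at a 4K output. The scale factors are the usual defaults (Quality ~2/3, Balanced ~0.58, Performance 0.5, Ultra Performance ~1/3); individual games can override them, so treat this as a rough sketch.

# Approximate DLSS internal render resolutions at a 3840x2160 output (Python).
# Scale factors are the commonly cited preset defaults; games can override them.
output_w, output_h = 3840, 2160
presets = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

for name, scale in presets.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{name:17s}: renders ~{w}x{h}, reconstructs to {output_w}x{output_h}")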
 

SHA

Member
I'm a $4,000 PC build guy; no build more than that makes sense to me. That's just my opinion.
 