
GeForce RTX 5090 is $1,999, 5080 $999, 5070 Ti $749, 5070 $549 (Availability Starting Jan 30 for RTX 5090 and 5080)

T4keD0wN

Member
Can someone tl;dr me, without the DLSS and AI bullshit: what is the raw performance improvement from 4080 to 5080?

Or better yet, someone calculate 3080 vs 5070 Ti vs 5080.

[Chart: geforce-rtx-5080-perf-chart-outline.svg - RTX 5080 vs RTX 4080 relative performance]

Look at Far Cry 6: green is the 5080, grey the 4080. It's tested with RT, and the RT cores are better (different RT core counts complicate it further), so the true raw performance improvement should be even lower than this. It's barely an improvement in that regard; it's all about AI bullshit now.

The 4080 is about 149% of a 3080. 4080 to 5080 is a smaller improvement: ~35% with RT and probably less than 30% without RT. So the 5080 should be slightly less than double a 3080 in raster, assuming VRAM doesn't hold it back, while with RT you should be looking at 200%+ going from 3080 to 5080.

Worth noting that Far Cry 6's ray tracing is very light, so it shouldn't play a big role here; there's also a possibility of a CPU bottleneck, since the Dunia Engine doesn't have the best multithreading. My pessimistic estimate for raster 3080 to 5080 is ~1.90x (if I presume the better RT cores inflate the chart), optimistic ~2.00x if they don't.
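
If anyone wants to sanity-check that compounding, here's a quick back-of-the-envelope sketch; the multipliers are just the rough estimates above, not benchmarks:

```python
# Back-of-the-envelope compounding of the per-generation uplifts quoted above.
# The multipliers are this post's rough estimates, not measured benchmarks.
def compound(*uplifts: float) -> float:
    """Multiply per-generation uplifts into one cumulative multiplier."""
    total = 1.0
    for u in uplifts:
        total *= u
    return total

print(f"3080 -> 5080, RT chart:     {compound(1.49, 1.35):.2f}x")  # ~2.01x
print(f"3080 -> 5080, raster guess: {compound(1.49, 1.30):.2f}x")  # ~1.94x
```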
 

rofif

Can’t Git Gud
There are other techniques in effect to lower input lag, and they are the reason why Nvidia frame gen works well. If there weren't, it would be pretty worthless.

Wrong. That's what Reflex is for, and now with Reflex 2 it will be even lower.
If you start with 30fps, the input lag is the 30fps frame time plus something.
Reflex only delivers frames as they're ready, without double or triple buffering, plus some reduction.
 

proandrad

Member
If you start with 30fps, the input lag is the 30fps frame time plus something.
Reflex only delivers frames as they're ready, without double or triple buffering, plus some reduction.
I'm not an expert, but that sounds right. Frame gen is best used if you are already well over 60fps. Nvidia's video isn't just using frame gen to go from 20fps to 200; it's using DLSS to upscale and then applying frame gen. DLSS upscaling probably gets it above 60fps before the frame gen.
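
Rough numbers to illustrate that pipeline (all of these are assumptions for the example; Nvidia hasn't published the exact breakdown):

```python
# Illustrative sketch of the DLSS-then-frame-gen pipeline described above.
# Every number here is an assumption for the example, not a published figure.
def frame_time_ms(fps: float) -> float:
    """Render time per frame in milliseconds."""
    return 1000.0 / fps

native_fps = 20.0    # assumed native render rate
upscaled_fps = 65.0  # assumed rate after DLSS upscaling
fg_multiplier = 4    # assumed multi-frame-generation factor

displayed_fps = upscaled_fps * fg_multiplier
print(f"native:    {native_fps:.0f} fps = {frame_time_ms(native_fps):.1f} ms/frame")
print(f"upscaled:  {upscaled_fps:.0f} fps = {frame_time_ms(upscaled_fps):.1f} ms/frame")
print(f"displayed: {displayed_fps:.0f} fps, input still sampled at ~{upscaled_fps:.0f} fps")
```

The takeaway: the generated frames raise what you see, but responsiveness is still tied to the pre-frame-gen rate.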
 

Fess

Member
I said the same thing about STALKER 2 and Cyberpunk recently, and people gaslit me, saying it was a memory leak rather than my card tapping out at 4K. It's why I was hoping AMD was going to make good on their promises.
In Indiana Jones they show how much VRAM is in use and also give suggestions on which settings to lower, so no reason to gaslight there. But having to lower settings on a brand-new graphics card isn't optimal, so I wonder if they have some other tech in place to sort it out. The only card here with more than 16GB is the 5090. Board reviewers should test the cards on Indiana Jones.
 
Well, yes, and so what? Should I forget that $2 chip in my 1080p pre-OLED TV could inflate frames just fine?


The increased lag is not a side effect of inferior tech; it's an inherent part of frame inflation.
There is no future in which it starts to make sense, bar filthy marketing purposes.

I'm agreeing with you on both points, just chipping in my two cents. I don't think all of these advancements can co-exist and still be a good experience for gaming, when the input lag cost is so high.
 

dcx4610

Member
Didn't expect those prices. Good on Nvidia. I mean they aren't CHEAP but they are better than I thought they'd be.

Good luck finding a 5070...

I hope it's not a repeat of the 4000 series where you can't even buy the card before the 6000 series releases
 

analog_future

Resident Crybaby
Man, I really hope I can nab the 5090 FE. Love the aesthetics of it:

[Image: geforce-rtx-50series-og-5090.jpg - RTX 5090 Founders Edition]

Realistically I will take whatever I can get though.
 

Astray

Member
These prices (and the DLSS enhancements for past-gen cards) are an absolute killshot to AMD. Even for the (upper) midrange cards they're supposedly focusing on, their new cards will not only have to compete with new-gen Nvidia cards, but also with the old cards that just got new life.

If I'm Intel, I'm absolutely salivating at the prospect of fighting it out with AMD, especially given how much weakness they signaled in the lead-up to this CES. They are absolutely there for the taking when it comes to GPUs.

Initial hype gone, new hype in 2 years. Probably hang on to my 4090 and get a 6090. I think.
Yeah, I don't really regret my 4090 purchase, especially given that I got it at a good discount.
 

dcx4610

Member
I have a 2070 so I'm definitely down for a 5070. I wonder if it's worth the extra cash for the Ti to get 16GB of memory though...
 
I think the rumors of higher prices were on purpose.

Disagree. I think NV messed up so badly that even they realized they had to cut prices. And then they ripped off Lossless Scaling to double down on Fake Frames as a marketing tool.
 

poppabk

Cheeks Spread for Digital Only Future
I have a 2070 so I'm definitely down for a 5070. I wonder if it's worth the extra cash for the Ti to get 16GB of memory though...
That's the goal. It's always worth the extra cash for the next model up.
 
I know people are complaining about the small jump in raw shader performance and rasterisation, but going back to what Mark Cerny said last month, and what many others have been saying for a while now: going forward we'll likely see more advanced use of silicon space rather than just increasing shader and core counts. As Nvidia is doing here, and has been doing steadily, we'll see a bigger focus on machine learning and ray tracing; a major area will be ML-rendered geometry and textures.

Personally I welcome it.
 

Delt31

Member
Catching up on all of this. Here is my only question: where can I find the difference in results of REAL FPS? I see these crazy high FPS numbers based on AI/inserted/fake frames. What about the real raw performance of 4090 vs 5090? Any intel on that yet?
 

FingerBang

Member
Catching up on all of this. Here is my only question: where can I find the difference in results of REAL FPS? I see these crazy high FPS numbers based on AI/inserted/fake frames. What about the real raw performance of 4090 vs 5090? Any intel on that yet?
You cannot. We have not been given that information. All the comparisons were done using DLSS4 + RT + Frame Generation. The estimate seems to be around 25 to 35%, though I'd expect it to be bigger for the 5090.
 

HeisenbergFX4

Gold Member
Sales of the product will not reflect this sentiment
I think most aren't impressed with it, but most are accepting this as our reality and buying anyway. Short term, given the new pipeline reduces VRAM utilization, I expect the VRAM bottleneck to be alleviated. We'll just revisit this issue with more and more games a few years down the road.
The vast majority will not spend the $2k for a GPU
 



Oof! Wake me up when we get to the 6090. 28fps at native 4K isn't the jump I expect from a $2000 GPU. The 5090 will be amazing with DLSS, but I expect more of a raw performance increase between card generations. Hopefully I'll see that real increase when we get to the 6090 and 7090, which is when I'll be ready to upgrade my 4090.
 

StereoVsn

Gold Member
I was just answering his question asking if everyone was OK with spending $2k to get above 16GB of VRAM.

The 5090 will 100% sell out though
Thing is, I can see a lot of folks (relatively speaking) grabbing the 5090 as an AI workload card. 32GB isn't too shabby, and $2K is a lot better than $6-8K.

Edit: Not talking about MS putting these in Azure, but for AI devs and data scientists. We have been talking to some of our engineers and our procurement is discussing things with a distributor.

You will see quite a lot of that. So a lot of those sales might go outside of gaming.
 

Zathalus

Member
If you start with 30fps, the input lag is the 30fps frame time plus something.
Reflex only delivers frames as they're ready, without double or triple buffering, plus some reduction.
Don't start with 30fps; even Nvidia recommends against it. Base fps (with upscaling but before frame generation) should be higher than 40fps, ideally higher than 50fps. With Reflex 2, latency will be below that of a regular 60fps game, and the output frame rate will be the base times whatever multiplier you select.
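
A sketch of that rule of thumb (the 40/50fps thresholds are the ones quoted above; treating output as a simple base-times-multiplier is my assumption):

```python
# Sketch of the rule of thumb above. The 40/50 fps thresholds are the ones
# quoted in this post; output fps is assumed to be base fps x multiplier.
def frame_gen_outlook(base_fps: float, multiplier: int) -> str:
    output_fps = base_fps * multiplier
    if base_fps < 40:
        return f"{output_fps:.0f} fps shown, but the base is too low: latency will suffer"
    if base_fps < 50:
        return f"{output_fps:.0f} fps shown, workable; 50+ base is ideal"
    return f"{output_fps:.0f} fps shown from a comfortable base"

print(frame_gen_outlook(30, 4))  # below the recommended floor
print(frame_gen_outlook(55, 4))  # 220 fps from a healthy base
```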
 

HeisenbergFX4

Gold Member



Oof! Wake me up when we get to the 6090. 28fps at native 4K isn't the jump I expect from a $2000 GPU. The 5090 will be amazing with DLSS, but I expect more of a raw performance increase between card generations. Hopefully I'll see that real increase when we get to the 6090 and 7090, which is when I'll be ready to upgrade my 4090.

28 FPS standing still and no combat

 

thuGG_pl

Member
Catching up on all of this. Here is my only question: where can I find the difference in results of REAL FPS? I see these crazy high FPS numbers based on AI/inserted/fake frames. What about the real raw performance of 4090 vs 5090? Any intel on that yet?
Nowhere, you need to wait for reviews, probably at the end of the month.
 

Puscifer

Member
Bought a 4080 Super not too long ago

Ffffffffffffffffffff

I’m fine
I’m fine
Wait till the raster results come. I'm not even sure I want a 5070 Ti anymore, because their results are from DLSS Performance mode; they didn't even bother with Quality, let alone Balanced. Wait till the dust settles.
 

iQuasarLV

Member
Catching up on all of this. Here is my only question: where can I find the difference in results of REAL FPS? I see these crazy high FPS numbers based on AI/inserted/fake frames. What about the real raw performance of 4090 vs 5090? Any intel on that yet?
Look at the DF x2/x3/x4 comparison. The x2 latency hit is based on what the initial frame rate was. The starting frame rate in Cyberpunk 2077 was 20fps, based on the x2 latency hit. If you were to take the x4 hit, it's running at a 17fps-equivalent latency. Bleck!

FPS to MS converter
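
For reference, the conversion that kind of tool does is just this (a minimal sketch):

```python
# Minimal fps <-> ms conversion, the same math an FPS-to-ms converter does.
def fps_to_ms(fps: float) -> float:
    """Frame time in milliseconds at a given frame rate."""
    return 1000.0 / fps

def ms_to_fps(ms: float) -> float:
    """Frame rate implied by a given frame time."""
    return 1000.0 / ms

print(fps_to_ms(20))  # 50.0 ms -- the Cyberpunk starting rate cited above
print(fps_to_ms(17))  # ~58.8 ms -- the x4 latency-equivalent rate
```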
 