
RTX 5090 Review Thread

peish

Member


DLSS 4 MFG in the close-up CP2077 footage looks good enough; we won't spot a difference unless you're pixel counting.

After over 2 years, the future is neural AI insertion to fill in the gaps.
 

FingerBang

Member
Overclockers, which is a fairly large UK retailer for PC parts, have said their allocation is in the single digits.
That's insane. From the article:

"The launch of the RTX 5090 will be the worst when it comes to availability. Already being told to expect it to be that way for the first 3 months."

At least they have a few hundred 5080s, so it's likely they'll last a few minutes/hours.
 
Last edited:

nkarafo

Member
575 watts... Plus another 200-ish for the rest of the system.

I just can't think of any scenario that is worth that kind of consumption. But I guess if you have money to buy this card you aren't worried about bills.
 

dEvAnGeL

Member
Can't wait for the 4090s to drop in price.
With Nvidia stopping production of the 4090, the already known limited stock of the 5090, and the performance increase being "disappointing" for some, it might be a while. The 5080 could change that: if it somehow ends up on par with the 4090, or at least not far behind (5-10% less raster performance) for $1k, I can see that being a catalyst for the second-hand 4090 market being affected.
 

SmoothBrain

Member
The 5080 could change that: if it somehow ends up on par with the 4090, or at least not far behind (5-10% less raster performance) for $1k, I can see that being a catalyst for the second-hand 4090 market being affected.
That "if" is doing a lot of heavy lifting, though. Multi frame gen sounds good on paper, but playing with it didn't feel good, at least on a 4090 or 4080. You get the pretty frames, but the input is clearly lagging behind. The faster the game, the more noticeable it is.
 

FingerBang

Member
With Nvidia stopping production of the 4090, the already known limited stock of the 5090, and the performance increase being "disappointing" for some, it might be a while. The 5080 could change that: if it somehow ends up on par with the 4090, or at least not far behind (5-10% less raster performance) for $1k, I can see that being a catalyst for the second-hand 4090 market being affected.
I'm hoping for the 5080 to be about 5 to 10% slower than the 4090. The rumors put it about 15-20% faster than the 4080, which should land it more or less in that bracket (the 4090 is 25-30% faster than the 4080).

It's a meh jump coming from a 4080, but still a better price/performance gain overall than going from a 4090 to a 5090.
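A quick back-of-the-envelope check of that bracket (a sketch in Python; the multipliers are just the rumored ranges above, not confirmed figures):

# Where would a 5080 land relative to a 4090 if the rumored uplifts over the
# 4080 hold? All multipliers here are assumptions taken from the post above.
rtx4090_over_4080 = (1.25, 1.30)   # 4090 is ~25-30% faster than a 4080
rtx5080_over_4080 = (1.15, 1.20)   # rumored 5080 uplift over the 4080

worst = rtx5080_over_4080[0] / rtx4090_over_4080[1]  # 1.15 / 1.30 ≈ 0.88
best = rtx5080_over_4080[1] / rtx4090_over_4080[0]   # 1.20 / 1.25 = 0.96

print(f"5080 at ~{worst:.0%} to ~{best:.0%} of 4090 performance")
# -> roughly 88% to 96%, i.e. about 4-12% slower than a 4090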
 

Zathalus

Member
I don't think I'm crazy when I say the 5070 is not going to outperform the 4070 Super, right? 17% fewer CUDA cores and 33% more memory bandwidth probably means the 5070 will do better in memory-bandwidth-heavy scenarios, but will lose out when raw CUDA performance is needed. It basically seems like just a $50 price cut with MFG applied.
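Purely as an illustration of that split, a naive sketch using only the two ratios above (clocks, cache and architectural changes are ignored, so treat it as an assumption, not a prediction):

# Naive bound-based comparison of a 5070 vs a 4070 Super from the quoted specs.
relative_compute = 1.00 - 0.17    # ~0.83x the CUDA core count
relative_bandwidth = 1.00 + 0.33  # ~1.33x the memory bandwidth

print(f"Compute-bound games:   ~{relative_compute:.0%} of a 4070 Super")
print(f"Bandwidth-bound games: ~{relative_bandwidth:.0%} of a 4070 Super")
# Whichever resource a given game leans on harder decides which card comes out ahead.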
 
Last edited:

Most comparisons there are absurd. You can't compare 30 FPS pushed to 120 FPS with 4x FG against 120 FPS native; of course the latter is better, but you'd need a nearly 4x more powerful GPU, like going from a 3060 to a 4080 or something like that. No shit you're getting better results.

Artifacts affect image quality, but so does motion clarity, a lot actually.

If anything, this video shows how solid 2x frame gen is even from 30 FPS, which adds many more artifacts than starting from 60 FPS would. It's definitely more than worth the extra 7-10 ms and a few artifacts for the huge motion clarity you gain.

Also, those latency comparisons at the end of the video, 30 FPS 4x FG vs 120 FPS native... derp
 

FingerBang

Member
I don't think I'm crazy when I say the 5070 is not going to outperform the 4070 Super, right? 17% fewer CUDA cores and 33% more memory bandwidth probably means the 5070 will do better in memory-bandwidth-heavy scenarios, but will lose out when raw CUDA performance is needed. It basically seems like just a $50 price cut with MFG applied.
This is basically it. For anyone who's waiting to buy a new card and doesn't need/want a 5090, it's probably best to wait until they're all out (Nvidia and AMD). If you need a card now, I would get whatever is on sale at the price point you need. You won't get any real performance bump for the same price (unless AMD goes crazy and releases the 9070 XT for $500-600).
 

LiquidMetal14

hide your water-based mammals
Nothing on the Gigabyte Gaming OC variant? I expect those to be the 2200 models. That's the 4090 model I had, and I've had a great experience with it.

I wish they would announce pricing on the full AIB lineup.
 
Last edited:

daninthemix

Member
Most comparisons there are absurd. You can't compare 30 FPS pushed to 120 FPS with 4x FG against 120 FPS native; of course the latter is better, but you'd need a nearly 4x more powerful GPU, like going from a 3060 to a 4080 or something like that. No shit you're getting better results.

Artifacts affect image quality, but so does motion clarity, a lot actually.

If anything, this video shows how solid 2x frame gen is even from 30 FPS, which adds many more artifacts than starting from 60 FPS would. It's definitely more than worth the extra 7-10 ms and a few artifacts for the huge motion clarity you gain.

Also, those latency comparisons at the end of the video, 30 FPS 4x FG vs 120 FPS native... derp
Yeah, they are cherry-picking the worst scenario for MFG (starting at 30 fps); ironically, the MFG output is still preferable to 30 fps. Starting from a more sensible 60 fps (with far fewer artifacts as a result) and automagically going to 240 fps for free is where this technology shines.
 

FingerBang

Member
That looks disgusting.

Motion clarity has already gone to shit thanks to TAA, this looks like shit.
The problem is not the technology. The problem is Nvidia and their lies. (Your 5070 will be as fast as a 4090!)

If you have a 4K 240Hz display and your GPU can only push up to, let's say, 120 frames, then this tech can be fantastic for maxing out the framerate. It's the same for something like 480Hz or, in the future, stuff like 1000Hz.
That's because the generated frames sit between "real" frames. If the frame rate is already high, you will not notice many artifacts, and the image will be extremely smooth and pleasant. It can be useful for pushing 90 frames to 120Hz and avoiding HDR VRR flickering.

This tech does not make a 30fps game a 60fps one. Artifacts are visible, and the latency is even worse than before. It's not garbage, but it's in no way "the future" they're trying to sell us.

The future will arrive when we can keep shrinking transistors without the ridiculous prices they're asking. The idea that we've reached some limit on where we can push graphics is incorrect.
 
Looks horrendous. There are already visible artefacts and degradation in motion quality with 2x FG, but 4x FG is even worse.
Hardware Unboxed shows a base framerate of 30fps upscaled to 120fps, and they also slow it down to the point where you can see individual generated frames. Of course in such a scenario people will see artefacts. During gameplay, however, it's way more difficult to see problems if you have around 60 real fps and generated frames are only displayed 50% of the time (FGx2).

I tried to look for DLSS FGx2 artefacts but it still wasn't easy to spot them. On a few occasions I saw artefacts during text scrolling, or during very fast movement on grey colors (a semi-transparent outline of the character), but these artefacts could be mistaken for LCD trailing artefacts. IMO such extremely small artefacts are not a big price to pay when you consider the overall improvement in motion clarity. On sample-and-hold displays, higher refresh rates increase motion resolution (even "fake" frames, or even totally black frames). FG improves motion sharpness, not the other way around like you suggest. Of course, real frames will always be better than generated frames, but given the choice between 60fps and generated 100-120fps, I will always choose the latter.
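A quick sketch of where that 50% figure comes from (assuming evenly paced output frames, which is a simplification):

# Share of on-screen time taken by generated frames for an N-x frame-gen factor:
# only 1 out of every N output frames is natively rendered.
def generated_share(factor: int) -> float:
    return (factor - 1) / factor

for factor, output_fps in [(2, 120), (4, 120)]:
    base_fps = output_fps / factor
    print(f"{factor}x FG at {output_fps} fps output: base {base_fps:.0f} fps, "
          f"generated frames shown {generated_share(factor):.0%} of the time")
# 2x: base 60 fps, 50% generated; 4x: base 30 fps, 75% generated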




DLSS FG looks no better than Lossless Scaling's latest update, which is hilarious and brilliant
I can easily see Lossless Scaling (LSFG) artefacts, unlike DLSS FG, so from my perspective comparing DLSS FG to LSFG is a joke.
 
Last edited:
Hardware Unboxed shows a base framerate of 30fps upscaled to 120fps, and they also slow it down to the point where you can see individual generated frames. Of course in such a scenario people will see artefacts. During gameplay, however, it's way more difficult to see problems if you have around 60 real fps and generated frames are only displayed 50% of the time (FGx2).

I tried to look for DLSS FGx2 artefacts but it still wasn't easy to spot them. On a few occasions I saw artefacts during text scrolling, or during very fast movement on grey colors (a semi-transparent outline of the character), but these artefacts could be mistaken for LCD trailing artefacts. IMO such extremely small artefacts are not a big price to pay when you consider the overall improvement in motion clarity. On sample-and-hold displays, higher refresh rates increase motion resolution (even "fake" frames, or even totally black frames). FG improves motion sharpness, not the other way around like you suggest. Of course, real frames will always be better than generated frames, but given the choice between 60fps and generated 100-120fps, I will always choose the latter.





I can easily see Lossless Scaling (LSFG) artefacts, unlike DLSS FG, so from my perspective comparing DLSS FG to LSFG is a joke.
There are visible artefacts even in slow-moving scenes in the video, especially with FGx4.

It seems that if you want to use FG, the game needs to render at a minimum of 60 FPS to minimise artefacts and latency (as FG will drop the 'native' rendering FPS compared to baseline).
 
Last edited:
575 watts... Plus another 200-ish for the rest of the system.

I just can't think of any scenario that is worth that kind of consumption. But I guess if you have money to buy this card you aren't worried about bills.
I mean realistically how much more money per year is this in electricity bills?
 
I mean realistically how much more money per year is this in electricity bills?
I did some quick maths:
A 5090 system (600W total power consumption) playing 2 hours a day works out to about £9 a month (which in fairness is quite cheap, but will obviously climb notably if you play much more than that). In comparison, a 4080 system (~300W) will only cost you about £4.50 a month at 2 hours a day.
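For reference, the arithmetic behind those figures (the ~£0.25/kWh unit price is my inference from the numbers quoted, not something stated here):

# Monthly electricity cost estimate; the unit price is an assumed ~£0.25/kWh.
PRICE_GBP_PER_KWH = 0.25
HOURS_PER_DAY = 2
DAYS_PER_MONTH = 30

for label, watts in [("5090 system", 600), ("4080 system", 300)]:
    kwh_per_month = watts / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
    cost = kwh_per_month * PRICE_GBP_PER_KWH
    print(f"{label}: {kwh_per_month:.0f} kWh/month, about £{cost:.2f}/month")
# 5090 system: 36 kWh ≈ £9.00; 4080 system: 18 kWh ≈ £4.50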

Overall I do prefer lower energy consumption, since it's the more environmentally conscious option, plus there's less heat generated in the room.
 
Last edited:
The electricity consumption isn't the issue, it's the damn heat that it creates. Absolutely brutal in summertime.
This is my biggest issue.

Even my RTX 4080 system, which only draws 300W in gaming, heats the room by an additional 2-3°C in the summer. This is significant.

I can't imagine what it's like playing games with an even more power-hungry GPU, TBH.
 
Last edited:

nkarafo

Member
I mean realistically how much more money per year is this in electricity bills?
Depends on the country.

In a third-world country like Greece, where we have the most expensive electricity in Europe, I wouldn't even touch this card. Mine is 160W, which is 40W higher than the previous one I had, and it gives me anxiety.
 

Bojji

Member
Depends on the country.

In a third-world country like Greece, where we have the most expensive electricity in Europe, I wouldn't even touch this card. Mine is 160W, which is 40W higher than the previous one I had, and it gives me anxiety.

We are fucked:

[image]


It's good that with an undervolt the 4070 Ti Super takes ~220W on average.
 
Last edited:
I mean realistically how much more money per year is this in electricity bills?
The FE RTX 5090 can pull 575W (the MSI Suprim even 595W) at 99% GPU usage. Add other components and it should be around 750W for the PC alone, and about 900W with a 32-inch QD-OLED monitor.

575W running for 4 hours equals 2.30 kWh per day and 839.5 kWh per year. In my country I would have to pay 864 zł for this (about $215).

575W running for 8 hours equals 4.60 kWh per day and 1679 kWh per year. In my country I would have to pay 1729 zł for this (about $431).

900W running for 4 hours equals 3.60 kWh per day and 1314 kWh per year. In my country I would have to pay 1353 zł for this (about $337).

900W running for 8 hours equals 7.20 kWh per day and 2628 kWh per year. In my country I would have to pay 2706 zł for this (about $675).
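The arithmetic behind those four scenarios, for anyone who wants to plug in their own numbers (the ~1.03 zł/kWh rate and the ~4 zł per dollar conversion are back-calculated from the figures above, so treat them as assumptions):

# Yearly electricity cost for a given sustained draw; rate and FX are assumed.
PRICE_PLN_PER_KWH = 1.03  # back-calculated from the zł figures above
PLN_PER_USD = 4.0         # rough conversion implied by 864 zł ≈ $215

def yearly_cost(watts: float, hours_per_day: float):
    kwh_per_year = watts / 1000 * hours_per_day * 365
    pln = kwh_per_year * PRICE_PLN_PER_KWH
    return kwh_per_year, pln, pln / PLN_PER_USD

for watts, hours in [(575, 4), (575, 8), (900, 4), (900, 8)]:
    kwh, pln, usd = yearly_cost(watts, hours)
    print(f"{watts} W x {hours} h/day: {kwh:.0f} kWh/year, about {pln:.0f} zł (~${usd:.0f})")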

My 7800X3D + OC'ed 4080S pulls 430W max in the worst case, and usually under 400W (I know because I have a watt meter). If I run games with PS5-level settings, however, I see about 160-180W for the whole PC.
 
Last edited:

StereoVsn

Gold Member
Looks horrendous. There are already visible artefacts and degradation in motion quality with 2x FG, but 4x FG is even worse.
I can see a use for 2x FG, and maybe even the older version, if you can get the base to 60+ FPS. 4x FG looks very sketchy. I'm sure Nvidia will improve it, but that's an awful lot of frames to predict.
 
I can see a use for 2x FG, and maybe even the older version, if you can get the base to 60+ FPS. 4x FG looks very sketchy. I'm sure Nvidia will improve it, but that's an awful lot of frames to predict.
You'd have a more compelling case with G-Sync + DLSS Performance (which is now equal to the old DLSS Quality in IQ) and skipping frame gen entirely.

The YouTube comment nailed it:
"I love how it works best when you don't need it, and worse when you do"
 
Last edited:
We are fucked:

[image]


It's good that with an undervolt the 4070 Ti Super takes ~220W on average.
I tried undervolting my RTX 4080S. Without UV I saw between 275-300W (depending on the game, at 99% GPU usage). With UV it was 50-60W less, but I also saw around 3-5 fps worse performance. OC gave me 7-10 fps without increasing power usage that much (my GPU cannot pull more than 315W anyway), so I concluded UV didn't make much sense: compared to the OC result I would lose about 10-15 fps, and IMO 50-60W less isn't worth 10-15 fps.




4070 Ti Super: 91 fps stock, 112 fps OC'ed. If I had the 4070 Ti Super I would not undervolt it either. Too much performance loss for too little reduction in power consumption.
 
Last edited:

peish

Member
These prices are significantly cheaper than the UK. We pay about €0.30 per kWh for electricity.

Electricity is basically free in Europe (!)

It's really the heat: at 580W, add another 400W from the rest of the PC, TV and speakers, and it will get unbearable in your room quickly.

Now people may ask wtf Nvidia has been doing for over 2 years going from Ada to Blackwell. We should also question wtf is up at TSMC!

We need Intel to save us. At this point Nvidia, AMD, Tesla, Qualcomm, MS and Broadcom should pay Intel big money!
 

StereoVsn

Gold Member
It's really the heat: at 580W, add another 400W from the rest of the PC, TV and speakers, and it will get unbearable in your room quickly.

Now people may ask wtf Nvidia has been doing for over 2 years going from Ada to Blackwell. We should also question wtf is up at TSMC!

We need Intel to save us. At this point Nvidia, AMD, Tesla, Qualcomm, MS and Broadcom should pay Intel big money!
Well, if Samsung got their shit in gear that would also be good, so TSMC would finally get some competition.

Limiting power draw on the 5090 looks promising, though: 80% power draw results in under 10% performance loss in general, it looks like.
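A quick way to read that trade-off as efficiency (the 80% and 10% figures are the ones from the post above, taken as given):

# Performance-per-watt change when power-limiting a 5090 to 80%,
# assuming the ~10% performance loss quoted above.
power_fraction = 0.80
perf_fraction = 0.90

perf_per_watt_gain = perf_fraction / power_fraction - 1
print(f"Roughly {perf_per_watt_gain:.1%} better performance per watt at the 80% limit")
# -> 12.5% better perf/W for ~10% less performance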
 
Well, if Samsung got their shit in gear that would also be good, so TSMC would finally get some competition.

Limiting power draw on the 5090 looks promising, though: 80% power draw results in under 10% performance loss in general, it looks like.
What would really interest me is a card that performs as well as the 5090 but at half the TDP. Maybe with the 6000 series? Or am I dreaming...
 
Last edited: