
RTX 5090 Review Thread


This is the video that burst my bubble…and I have a 240hz 4K monitor

If you already own the RTX 4090, you've probably tried playing games with DLSS FG x2, so you should know what to expect. Hardware Unboxed's video exaggerates a lot.

The RTX 5090 doesn't offer the Ampere vs. Ada generation gap, but it's still noticeably faster than the RTX 4090. The FE RTX 5090 is 35% faster than the RTX 4090 at 4K native across the 25 games tested (the MSI RTX 5090 Suprim is 39% faster). The RTX 5090 can run Cyberpunk 2077 with PT at 4K at 60-70 fps with just DLSS Quality, while the RTX 4090 isn't even close to 60 fps (around 45 fps).


average-fps-3840-2160.png


And bear in mind that the gap can be wider in newer games. Recent games like Black Myth: Wukong show a much bigger difference between Ampere and Ada Lovelace.

Screenshot-20250119-135323-You-Tube-2.jpg


Neural rendering (mega geometry) should be much faster on the RTX 50 series compared to RTX40.

But the biggest reason to upgrade to the RTX5090 is the redesigned power connector :p. Keeping the RTX4090 is risky.
 

CuNi

Member
I tried undervolting my RTX 4080S. Without UV I saw between 275-300W (depending on the game, at 99% GPU usage). With UV it was 50-60W less, but I also saw around 3-5 fps worse performance. OC, however, gave me 7-10 fps without increasing power usage by much (my GPU cannot pull more than 315W anyway), so I concluded UV didn't make much sense: compared to the OC profile I would be losing about 10-15 fps, and IMO 50-60W less isn't worth 10-15 fps.

I don't know if the 4000 series UVs differently, but on my 3080, I went as low as 750mV and -300 MHz on the core clock, but increased memory by 1000 MHz.
It performs around 10% worse, but power draw went from 400W (I had it on a 120% power target) to 270W.
Some reviewers have also already UV'd the 5090 and found that at 875mV and a 2.3GHz clock it only lost 12% performance while drawing around 40% less than at stock settings.
I bet that depending on the silicon and with more tweaking (especially of the clock) you can get it to lose only 5-7% while pulling 40% less power.
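
If anyone wants to sanity-check their own undervolt, here's a rough monitoring sketch (assuming the nvidia-ml-py package, imported as pynvml, is installed; the 60-second window is an arbitrary choice) that logs power, core clock and temperature so you can compare a stock run against a UV run:

[code]
# Minimal GPU telemetry logger to compare stock vs. undervolted runs.
# Assumes the nvidia-ml-py package (import name: pynvml) is installed.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

samples = []
try:
    for _ in range(60):                      # ~60 s of 1 Hz sampling
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0                      # mW -> W
        core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)     # MHz
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)  # deg C
        samples.append((watts, core, temp))
        print(f"{watts:6.1f} W  {core:4d} MHz  {temp:3d} C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()

avg = [sum(col) / len(samples) for col in zip(*samples)]
print(f"avg: {avg[0]:.1f} W, {avg[1]:.0f} MHz, {avg[2]:.0f} C")
[/code]

Run it once at stock and once with the UV curve applied while the same benchmark loops; the clock column makes it obvious whether the card is actually holding its target frequency or throttling.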
 

Ironbunny

Member
The electricity consumption isn't the issue, it's the damn heat that it creates. Absolutely brutal in the summer.

This. Running a 14900K with a 4090 in summer is already on the line. I may need to undervolt the 5090 for summer. There seems to be a big efficiency drop at certain clock speeds, so you might actually shave off a lot of watts doing it.
 

Bojji

Member
These prices are significantly cheaper than in the UK. We pay 30 euro cents per kWh for electricity.

Electricity is basically free in Europe (!)

Countries with nuclear power have it basically for free. Germany was stupid to shut down their nuclear plants, and Poland made an even dumber decision back in the 90s.


We are building plant(s) right now but it will still take xx years and it's not enough.

I tried undervolting my RTX 4080S. Without UV I saw between 275-300W (depending on the game, at 99% GPU usage). With UV it was 50-60W less, but I also saw around 3-5 fps worse performance. OC, however, gave me 7-10 fps without increasing power usage by much (my GPU cannot pull more than 315W anyway), so I concluded UV didn't make much sense: compared to the OC profile I would be losing about 10-15 fps, and IMO 50-60W less isn't worth 10-15 fps.




The 4070 Ti Super gets 91 fps stock, 112 fps OC'd. If I had the 4070 Ti Super I would not undervolt it either. Too much performance loss for too little reduction in power consumption.


I have pretty much the same performance with UV (1) and stock (2):


bezzOqd.jpeg

OpDxFMy.jpeg


io4lfBo.jpeg
yE7CQiw.jpeg


While the GPU runs cooler and needs less work from the fans ^

I based my UV on the clock the GPU was hitting at stock (2800 MHz), then found the lowest voltage stable at that clock (0.975V).
 
Countries with nuclear power have it basically for free. Germany was stupid to shut down their nuclear plants, and Poland made an even dumber decision back in the 90s.


We are building plant(s) right now but it will still take xx years and it's not enough.



I have pretty much the same performance with UV (1) and stock (2):


bezzOqd.jpeg

OpDxFMy.jpeg


io4lfBo.jpeg
yE7CQiw.jpeg


While the GPU runs cooler and needs less work from the fans ^

I based my UV on the clock the GPU was hitting at stock (2800 MHz), then found the lowest voltage stable at that clock (0.975V).
RT at 4K eats up a lot of memory bandwidth, so the GPU clock can make little difference in this scenario. In games like Rise of the Tomb Raider you would see a massive difference though. This Swedish guy had 91 fps without OC and 112 fps with OC on his 4070 Ti Super.
 

Bojji

Member
RT at 4K eats up a lot of memory bandwidth, so the GPU clock can make little difference in this scenario. In games like Rise of the Tomb Raider you would see a massive difference though. This Swedish guy had 91 fps without OC and 112 fps with OC on his 4070 Ti Super.

UV done wrong? There should be no performance regression when you're aiming for the stock clock, raster SN:


UV even has higher performance. Why? Because stock is power limited (~290W) and sometimes drops clocks, while UV keeps that 2800 MHz 100% of the time.

Bh3MUh9.jpeg
i1vzwie.jpeg
 
I don't know if the 4000 series UVs differently, but on my 3080, I went as low as 750mV and -300 MHz on the core clock, but increased memory by 1000 MHz.
It performs around 10% worse, but power draw went from 400W (I had it on a 120% power target) to 270W.
Some reviewers have also already UV'd the 5090 and found that at 875mV and a 2.3GHz clock it only lost 12% performance while drawing around 40% less than at stock settings.
I bet that depending on the silicon and with more tweaking (especially of the clock) you can get it to lose only 5-7% while pulling 40% less power.
Yes, RTX 40 series undervolts differently.

In my case I saw 50-60W lower power usage with UV, and that's not a big difference to me. Also, in certain games like Metro Exodus (at 4K with RT), not even UV reduced my GPU power usage (for some strange reason). Only a 70% power limit did, but then my performance went from 85 fps to 63 fps, so I said to myself fu** that, I'm not going to limit the performance of my PC that much for just 50-60W.

What's interesting is that my power bills haven't even increased compared to my previous PC (GTX 1080), and I was sure they would. The RTX 4080S can draw up to 315W, but it's usually well under 300W. Some games draw as little as 230W even at 99% usage (GTA4). I often play older games, so my RTX 4080S can sometimes draw only 50-60W when running games with the same settings as my old GTX 1080 OC at 220W. The RTX 4080S is more power hungry, but also much faster than my old GTX 1080, so it doesn't need to use its full power budget as often.
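
To put rough numbers on that, here's a back-of-the-envelope sketch using the €0.30/kWh rate quoted earlier in the thread (the hours per day and the wattages are illustrative assumptions, not measured usage):

[code]
# Back-of-envelope GPU running cost. Price per kWh taken from the thread (EUR 0.30);
# the gaming hours and power-draw figures are illustrative assumptions.
PRICE_PER_KWH = 0.30   # EUR
HOURS_PER_DAY = 3      # assumed gaming time

for label, watts in [("GTX 1080 OC", 220), ("RTX 4080S typical", 280), ("RTX 5090 peak", 575)]:
    kwh_per_month = watts / 1000 * HOURS_PER_DAY * 30
    cost = kwh_per_month * PRICE_PER_KWH
    print(f"{label:>18}: {kwh_per_month:5.1f} kWh/month ~ EUR {cost:.2f}")
[/code]

Even the 575W worst case works out to roughly €15 a month under those assumptions, so it's easy to see why the bill barely moves when most games sit well below the GPU's power limit.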
 

YeulEmeralda

Linux User
Yes, RTX 40 series undervolts differently.

In my case I saw 50-60W lower power usage with UV, and that's not a big difference to me. Also, in certain games like Metro Exodus (at 4K with RT), not even UV reduced my GPU power usage (for some strange reason). Only a 70% power limit did, but then my performance went from 85 fps to 63 fps, so I said to myself fu** that, I'm not going to limit the performance of my PC that much for just 50-60W.

What's interesting is that my power bills haven't even increased compared to my previous PC (GTX 1080), and I was sure they would. The RTX 4080S can draw up to 315W, but it's usually well under 300W. Some games draw as little as 230W even at 99% usage (GTA4). I often play older games, so my RTX 4080S can sometimes draw only 50-60W when running games with the same settings as my old GTX 1080 OC at 220W. The RTX 4080S is more power hungry, but also much faster than my old GTX 1080, so it doesn't need to use its full power budget as often.
That's an interesting point. How often do you max out your GPU?

I'm currently emulating a PSP game and my GPU is basically idling.
 

Bojji

Member
Damnit, also having a 4K 240Hz monitor, I ain't watching that shit

I am not listening to them.

I TOLD YOU I AINT WATCHING IT

Hitting play now

Bryan Cranston Reaction GIF

They are for sure showing one of the worst use cases for FG. 30 fps as a base? WTF.

I know they have technical limitations when capturing it, but MOST people won't play with a 30 fps base for sure, more like 60, and with that it will look better.

That said, I'm not the biggest fan of frame gen; between 2022 and now only some games have looked good with it (minimal artifacts), and in many it looks like shit.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.

This is the video that burst my bubble…and I have a 240hz 4K monitor


The video is slightly misleading because all of the examples were created from a game synced to just 30 FPS. That was a necessity because their capture cards are limited to 4K 120 Hz. But the higher the base frame rate, the fewer visual problems you'll see, because there's more image data for the frame generation algorithm to use and the differences between each rendered frame become smaller and smaller.

If you use multi frame generation to artificially push the framerate from 80 fps to 240 fps (x3), you'll see far fewer visual issues than if you pushed 30 fps to 120 fps.
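
The arithmetic behind that is easy to sketch out; the multipliers below are just the x2/x3/x4 modes being discussed, and the "gap" is the stretch of motion the generator has to invent frames across:

[code]
# Frame generation arithmetic: for a given output frame rate and FG multiplier,
# how many real frames are rendered, and how far apart (in ms) are they?
def fg_breakdown(output_fps: float, multiplier: int) -> tuple[float, float]:
    base_fps = output_fps / multiplier   # frames actually rendered by the GPU
    gap_ms = 1000.0 / base_fps           # time between two real frames
    return base_fps, gap_ms

for out_fps, mult in [(240, 3), (120, 2), (120, 4)]:
    base, gap = fg_breakdown(out_fps, mult)
    print(f"{out_fps} fps output at x{mult}: {base:.0f} fps base, "
          f"{gap:.1f} ms between rendered frames")
[/code]

Going from 80 fps to 240 fps the generator only has to bridge 12.5 ms of motion, versus 33.3 ms when pushing 30 fps to 120 fps, which is why the low-base-rate captures look so much worse.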
 
They are for sure showing one of the worst use cases for FG. 30 fps as a base? WTF.

I know they have technical limitations when capturing it, but MOST people won't play with a 30 fps base for sure, more like 60, and with that it will look better.

That said, I'm not the biggest fan of frame gen; between 2022 and now only some games have looked good with it (minimal artifacts), and in many it looks like shit.
They used several different base frame rates to demonstrate the effect it has.
 
This. Running a 14900K with a 4090 in summer is already on the line. I may need to undervolt the 5090 for summer. There seems to be a big efficiency drop at certain clock speeds, so you might actually shave off a lot of watts doing it.
A proper case with proper airflow helps a lot. These fish tank things with fans blowing against the glass instead of over the components are fucking stupid.
 

simpatico

Member
Why can't Nvidia bin another 2 cards between the 5090 and 5070? The perf gap is laughable. IDK how all the 'consumer rights' enjoyers in the gaming world don't call this out.
 
That's an interesting point. How often do you max out your GPU?

I'm currently emulating a PSP game and my GPU is basically idling.
When I upgraded my PC in June I was mostly playing new games.

I do play a lot of older games now though, as my backlog has grown to around 50 games in the Steam library alone, and I want to finish them before I start buying new games. But I also try to play at least one modern game a week, so it's still hard to say how often my GPU is maxed out.

I'm currently playing Darksiders. On my old GTX 1080 I would probably still be playing this game at 1440p, because I did not have DLDSR x2.25 and DSR x4 was too much for the GTX 1080. At 1440p (let's call it GTX 1080 settings) my GPU draws around 46W in this game, while the whole PC draws 167W.

Darksiders-1440p.jpg


20250126-181252-2.jpg



At 4K downscaled to my monitor's native resolution of 1440p, my RTX 4080S draws around 101W. The whole PC draws 218W (7800X3D, 32GB DDR5 6000MHz, 5x ARGB fans, NVMe, 2x HDD, PCIe sound card, M+K).


I had RDR1 on the PS3 but I have never played it more than one hour, so I want to finally complete it now. At 4K 144fps (engine limit) GPU power draw is around 220W (347W for the whole PC)


At 1440p 60fps (settings that I would use on my old PC) GPU consumes 65W (172W for the whole PC).


But when playing modern games, I often see GPU power consumption of 300W at 4K with an uncapped framerate.

RDR2 isn't exactly new, but even this game at 4K uses nearly 300W of GPU power (430W for the whole PC).


At 1440p with 60fps cap the game consumes around 95W GPU power and 218W for the whole PC. My old GTX1080 would not even run this game at 60fps at 1440p with maxed out settings.


UV done wrong? There should be no performance regression when you're aiming for the stock clock, raster SN:


UV even has higher performance. Why? Because stock is power limited (~290W) and sometimes drops clocks, while UV keeps that 2800 MHz 100% of the time.

I followed this UV guide. I saw a 50-60W reduction in GPU power, so I think the UV worked. Only in certain games (Metro Exodus) was my GPU power draw the same (300-315W) despite UV. For some strange reason this game just ignores the UV settings and pulls the max amount of power. I saw the same behavior in YouTube videos, so I don't think my UV settings are to blame.
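
For reference, that kind of power limit is just a driver-level cap that can be scripted; a rough sketch of applying the same 70% cap via nvidia-smi (needs admin rights; power.default_limit and power.min_limit are standard query fields, and the exact limits depend on the board):

[code]
# Apply a percentage power cap through nvidia-smi (requires admin/root).
import subprocess

def query_watts(field: str, gpu: int = 0) -> float:
    """Read a single power-related field from nvidia-smi, in watts."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         f"--query-gpu={field}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    return float(out.stdout.strip())

default_w = query_watts("power.default_limit")
min_w = query_watts("power.min_limit")
target_w = int(max(min_w, default_w * 0.70))   # 70% cap, clamped to the board minimum

subprocess.run(["nvidia-smi", "-i", "0", "-pl", str(target_w)], check=True)
print(f"default {default_w:.0f} W -> capped at {target_w} W")
[/code]

A hard cap like that just lets the card clock down wherever it hits the limit, which is why the fps drop is so much uglier than with a proper undervolt.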



I expected to see the same performance (after all, the GPU clock was the same) but it wasn't, and I saw a 3-5 fps reduction. I don't know why; maybe the GPU internally reduced the performance of some of its components.

I have also UV'd my CPU and have not noticed any performance degradation, in fact my 7800X3D is even faster because it can maintain max boost clock for longer.

As for Cyberpunk, here are my built-in benchmark results. I tried to match your settings, but IDK exactly what preset you used (Ultra or High, maybe even modified?)

Ultra preset

KNxzSGV.jpeg


High preset

L52oIYL.jpeg
 
The video is slightly misleading because all of the examples were created from a game synced to just 30 FPS. That was a necessity because their capture cards are limited to 4K 120 Hz. But the higher the base frame rate, the fewer visual problems you'll see, because there's more image data for the frame generation algorithm to use and the differences between each rendered frame become smaller and smaller.

If you use multi frame generation to artificially push the framerate from 80 fps to 240 fps (x3), you'll see far fewer visual issues than if you pushed 30 fps to 120 fps.
The original DLSS FG x2 only worked well with a base of at least 40 fps. Below that I noticed judder, stuttering, and increased artefacts. That's why I don't think the Hardware Unboxed video is a good representation of how this technology works during real gameplay.

zWORMz uploaded his RTX5090 gameplay and said that the FGx4 artefacts are very small and not a real problem. Timestamped 8m14s:

 

Bojji

Member
When I upgraded my PC in June I was mostly playing new games.

I do play a lot of older games now though, as my backlog has grown to around 50 games in the Steam library alone, and I want to finish them before I start buying new games. But I also try to play at least one modern game a week, so it's still hard to say how often my GPU is maxed out.

I'm currently playing Darksiders. On my old GTX 1080 I would probably still be playing this game at 1440p, because I did not have DLDSR x2.25 and DSR x4 was too much for the GTX 1080. At 1440p (let's call it GTX 1080 settings) my GPU draws around 46W in this game, while the whole PC draws 167W.

Darksiders-1440p.jpg


20250126-181252-2.jpg



At 4K downscaled to my monitor's native resolution of 1440p, my RTX 4080S draws around 101W. The whole PC draws 218W (7800X3D, 32GB DDR5 6000MHz, 5x ARGB fans, NVMe, 2x HDD, PCIe sound card, M+K).


I had RDR1 on the PS3 but I have never played it more than one hour, so I want to finally complete it now. At 4K 144fps (engine limit) GPU power draw is around 220W (347W for the whole PC)


At 1440p 60fps (settings that I would use on my old PC) GPU consumes 65W (172W for the whole PC).


But when playing modern games, I often see GPU power consumption of 300W at 4K with an uncapped framerate.

RDR2 isn't exactly new, but even this game at 4K uses nearly 300W of GPU power (430W for the whole PC).


At 1440p with 60fps cap the game consumes around 95W GPU power and 218W for the whole PC. My old GTX1080 would not even run this game at 60fps at 1440p with maxed out settings.




I followed this UV guide. I saw a 50-60W reduction in GPU power, so I think the UV worked. Only in certain games (Metro Exodus) was my GPU power draw the same (300-315W) despite UV. For some strange reason this game just ignores the UV settings and pulls the max amount of power. I saw the same behavior in YouTube videos, so I don't think my UV settings are to blame.



I expected to see the same performance (after all, the GPU clock was the same) but it wasn't, and I saw a 3-5 fps reduction. I don't know why; maybe the GPU internally reduced the performance of some of its components.

I have also UV'd my CPU and have not noticed any performance degradation, in fact my 7800X3D is even faster because it can maintain max boost clock for longer.

As for Cyberpunk, here are my built-in benchmark results. I tried to match your settings, but IDK exactly what preset you used (Ultra or High, maybe even modified?)

Ultra preset

KNxzSGV.jpeg


High preset

L52oIYL.jpeg



I don't get why you got performance degradation; this guy's video is good.

I've undervolted all my GPUs in the last few years and pretty much kept 99-100% of the performance with reduced power consumption. The cards are also cooler and less noisy.
 
It looks like the 5070 will be no better than the 4070, in that it won't be able to use ray tracing without significant temporal slop. Strangled memory bandwidth. At the very best the 4070 and 5070 should have been xx60 cards, if not xx50 Tis.

Not sure people are ready for $600 x60 cards, heh.
 

Bojji

Member
It looks like the 5070 will be no better than the 4070, in that it won't be able to use ray tracing without significant temporal slop. Strangled memory bandwidth. At the very best the 4070 and 5070 should have been xx60 cards, if not xx50 Tis.

The 5070 has almost 700GB/s of memory BW. Memory BW is not the problem for this card, memory amount is...
 
If you already own the RTX 4090, you've probably tried playing games with DLSS FG x2, so you should know what to expect. Hardware Unboxed's video exaggerates a lot.

The RTX 5090 doesn't offer the Ampere vs. Ada generation gap, but it's still noticeably faster than the RTX 4090. The FE RTX 5090 is 35% faster than the RTX 4090 at 4K native across the 25 games tested (the MSI RTX 5090 Suprim is 39% faster). The RTX 5090 can run Cyberpunk 2077 with PT at 4K at 60-70 fps with just DLSS Quality, while the RTX 4090 isn't even close to 60 fps (around 40-45 fps).


average-fps-3840-2160.png
I'm quite surprised TBH. The performance gains for the 5090 in raster are better than I thought; shame about the massive power draw, otherwise I'd be very interested in the GPU. Now I can only hope for a brand new process node for the 6000 series!
 
Does anyone smell a rat here? These new cards are being tested with DLSS 4 against current cards on DLSS 3. It's feasible that if you hold out until DLSS 4 releases, the differences may be even smaller.
 

Ulysses 31

Member
Does anyone smell a rat here? These new cards are being tested with DLSS 4 against current cards on DLSS 3. It's feasible that if you hold out until DLSS 4 releases, the differences may be even smaller.
3090(ti) and 4090 take a bigger performance hit % wise when using DLSS 4 than the 5090 does.
 

FingerBang

Member
Price for performance is insane in a bad way. I thought paying $1600 for the 4090 was too much.
The 4090 was a stupidly expensive card, but it offered a lot more compared to any card that came before. It also looked amazing compared to what Nvidia was offering lower in the stack.

We're here more than 2 years later and we only have an extremely expensive and power hungry card to replace it.
Every other card made is inferior to the 4090 in every way. It's highly likely that even with the next generation the 4090 will still be more performant than most cards (maybe the 6070ti might match it? Who knows).

It's easily going to be a 6+ year card. It's aging much better than anyone could have expected. Best PC hardware purchase ever.

As a matter of fact, I'm really pissed I can't buy another one for around the same price. Nvidia stopped producing them because they knew it would make their current lineup look bad.
 

Brigandier

Member
My new build has been sitting waiting for a GPU for a few weeks now, and I've been waiting for the 5000 cards, but honestly I think I may just snag a cheap 4070 Ti Super and a new 1440p OLED monitor instead of a 5080.

Budget constraints, and I'm wondering if there will be a really big jump between the 4070 Ti Super and the 5070 Ti. On the fence whether to pull the trigger and get myself gaming ASAP or continue to hold.

Decisions Decisions.....
 

TrebleShot

Member
As we draw closer to the launch, I am tempted, but as others have said the uptick in perf seems pointless.

I am trying to convince myself not to buy it.

What I really want to do is get a 45" ultrawide from LG, and that thing is currently 1440p; the new 5K2K ones are far too heavy for high frame rates on the 4090/5090.
 

Kenpachii

Member
The 4090 was a stupidly expensive card, but it offered a lot more compared to any card that came before. It also looked amazing compared to what Nvidia was offering lower in the stack.

We're here more than 2 years later and we only have an extremely expensive and power hungry card to replace it.
Every other card made is inferior to the 4090 in every way. It's highly likely that even with the next generation the 4090 will still be more performant than most cards (maybe the 6070ti might match it? Who knows).

It's easily going to be a 6+ year card. It's aging much better than anyone could have expected. Best PC hardware purchase ever.

As a matter of fact, I'm really pissed I can't buy another one for around the same price. Nvidia stopped producing them because they knew it would make their current lineup look bad.

Honestly I was waiting for the 5080 to basically be a 4090 but Blackwell; instead it's probably slower and has less VRAM, kinda disappointing.
 

FingerBang

Member
Honestly I was waiting for the 5080 to basically be a 4090 but Blackwell; instead it's probably slower and has less VRAM, kinda disappointing.
That makes two of us. I think, at best, it will be 10% slower than a 4090; at worst, it will probably be 15-20% slower. It will depend on the resolution and game.

I recently built a small form factor PC under my 4K/120hz TV and moved my 4090 to it. Now I need another card for my desktop PC, and all the options suck:

The 5080 will be better than the 4080 but much slower than the 4090 for anything non-gaming related. That, along with 16GB of RAM. I think it would still perform well on my Ultrawide 1440p, but not well enough for 4K/120hz. At that point, I'd rather get a discounted 4080.
The 5090 is honestly a waste of money. I'm not putting it in the small PC because 575W is insane, and I don't want to spend £2000+ for a second GPU.
I like what I've seen from AMD: 4080 performance, FSR4, and much better RT for, possibly, a much lower price. Too bad it's now coming out in March, and AMD will still fuck up, making it too expensive for anyone to care (plus, only 16GB of RAM).
I've been looking at a 7900XTX for £800, but it bothers me that it lacks FSR4, and the RT is still subpar. That said, raster performance for the price is currently unbeatable, and it has 24GB of RAM. Too bad it seems to be meh in Blender and other GPU-intensive tasks.

I thought the previous generation was crap, but at least the 4090 was fantastic, with no compromise. With this gen, Nvidia has moved the top-end further away, leaving everything below unchanged. I suspect they will release a 5080ti in a year or so with performance between the 4090 and the 5090, and that might be the sweet spot. But for now, they all just suck.
 

TrebleShot

Member
That makes two of us. I think, at best, it will be 10% slower than a 4090; at worst, it will probably be 15-20% slower. It will depend on the resolution and game.

I recently built a small form factor PC under my 4K/120hz TV and moved my 4090 to it. Now I need another card for my desktop PC, and all the options suck:

The 5080 will be better than the 4080 but much slower than the 4090 for anything non-gaming related. That, along with 16GB of RAM. I think it would still perform well on my Ultrawide 1440p, but not well enough for 4K/120hz. At that point, I'd rather get a discounted 4080.
The 5090 is honestly a waste of money. I'm not putting it in the small PC because 575W is insane, and I don't want to spend £2000+ for a second GPU.
I like what I've seen from AMD: 4080 performance, FSR4, and much better RT for, possibly, a much lower price. Too bad it's now coming out in March, and AMD will still fuck up, making it too expensive for anyone to care (plus, only 16GB of RAM).
I've been looking at a 7900XTX for £800, but it bothers me that it lacks FSR4, and the RT is still subpar. That said, raster performance for the price is currently unbeatable, and it has 24GB of RAM. Too bad it seems to be meh in Blender and other GPU-intensive tasks.

I thought the previous generation was crap, but at least the 4090 was fantastic, with no compromise. With this gen, Nvidia has moved the top-end further away, leaving everything below unchanged. I suspect they will release a 5080ti in a year or so with performance between the 4090 and the 5090, and that might be the sweet spot. But for now, they all just suck.
Got any pics of your SFF?
Interested in going down this route. I run a long HDMI atm, but it's a bit annoying.
 

Kilau

Gold Member
The power draw is a big no no for me. Will wait for 6000 series and perhaps a 6080 that performs as well as 5090 but at much much lower power.
The power and heat are certainly big issues with this GPU. The uplift over the 4090 scales pretty well with the wattage increase and RAM speed.

This is obviously not sustainable in the consumer market so hopefully the next series will see these improvements.
Sweatin’ to the oldies.
 
That makes two of us. I think, at best, it will be 10% slower than a 4090; at worst, it will probably be 15-20% slower. It will depend on the resolution and game.

I recently built a small form factor PC under my 4K/120hz TV and moved my 4090 to it. Now I need another card for my desktop PC, and all the options suck:

The 5080 will be better than the 4080 but much slower than the 4090 for anything non-gaming related. That, along with 16GB of RAM. I think it would still perform well on my Ultrawide 1440p, but not well enough for 4K/120hz. At that point, I'd rather get a discounted 4080.
The 5090 is honestly a waste of money. I'm not putting it in the small PC because 575W is insane, and I don't want to spend £2000+ for a second GPU.
I like what I've seen from AMD: 4080 performance, FSR4, and much better RT for, possibly, a much lower price. Too bad it's now coming out in March, and AMD will still fuck up, making it too expensive for anyone to care (plus, only 16GB of RAM).
I've been looking at a 7900XTX for £800, but it bothers me that it lacks FSR4, and the RT is still subpar. That said, raster performance for the price is currently unbeatable, and it has 24GB of RAM. Too bad it seems to be meh in Blender and other GPU-intensive tasks.

I thought the previous generation was crap, but at least the 4090 was fantastic, with no compromise. With this gen, Nvidia has moved the top-end further away, leaving everything below unchanged. I suspect they will release a 5080ti in a year or so with performance between the 4090 and the 5090, and that might be the sweet spot. But for now, they all just suck.
The RTX 40 series was expensive, but certainly not crap. It offered big performance gains over Ampere (especially in PT games) and a huge reduction in power consumption. DLSS FG was also an extremely useful feature. People who bought RTX 40 series cards (except for the RTX 4060 8GB) have had an awesome experience for the last two years, and these cards will last like no other generation before (especially the 4090 with 24GB, but also models with 16GB VRAM), because TSMC hit the node wall and it will take a few years before we see a GPU 2x faster than the RTX 4090.

My RTX 4080S wasn't even a true high-end card (the 4090 was the true high-end card), yet it runs the vast majority of my games at 4K 120fps, with the exception of UE5 games or games with heavy RT. Those games require DLSS, and sometimes FG on top of that (Black Myth: Wukong with PT), to hit 100-120fps. I've been gaming for over 25 years and no GPU generation before has impressed me that much (and I had legendary cards like the 1080 Ti and 8800 Ultra).

The RTX 5080 will be slower than the RTX 4090, but not by that much, especially with OC. Even my 4080S with OC is sometimes just a little bit slower (8%) than the stock RTX 4090 (for example in Black Myth: Wukong, the RTX 4090 gets 111fps, while my card gets 103fps with exactly the same settings). Not bad considering the cheapest RTX 4090 was twice as expensive when I bought my card.

As for the 7900 XTX, it seems this AMD card is slowing down a little bit 😂. One year ago it was 1fps faster than the 4080S, now it's 1fps slower on average (TechPowerUp average fps based on 25 games tested).

average-fps-3840-2160.png
 

FingerBang

Member
The RTX 40 series was expensive, but certainly not crap. It offered big performance gains over Ampere (especially in PT games) and a huge reduction in power consumption. DLSS FG was also an extremely useful feature. People who bought RTX 40 series cards (except for the RTX 4060 8GB) have had an awesome experience for the last two years, and these cards will last like no other generation before (especially the 4090 with 24GB, but also models with 16GB VRAM), because TSMC hit the node wall.

My RTX 4080S wasn't even a true high-end card (the 4090 is the true high-end card), yet it runs the vast majority of my games at 4K 120fps, with the exception of UE5 games or games with heavy RT. Those games require DLSS, and sometimes FG on top of that (Black Myth: Wukong with PT), to hit 100-120fps. I've been gaming for over 25 years and no GPU generation before has impressed me that much (and I had legendary cards like the 1080 Ti and 8800 Ultra).

The RTX 5080 will be slower than the RTX 4090, but not by that much, especially with OC. Even my 4080S with OC is sometimes just a little bit slower (8%) than the stock RTX 4090 (for example in Black Myth: Wukong, the RTX 4090 gets 111fps, my card 103fps with exactly the same settings). Not bad considering the cheapest RTX 4090 was twice as expensive when I bought my card.

As for the 7900 XTX, it seems this AMD card is slowing down a little bit 😂. One year ago it was 1fps faster than the 4080S, now it's 1fps slower on average (TechPowerUp average fps based on 25 games tested).

average-fps-3840-2160.png
I'm glad to hear you're happy with the 4080. I've always seen it as a great card, but it was too expensive at launch. I jumped from a 3080 to a 4090, and the leap was insane. Maybe it's true that Ampere spoiled us. The 3080 was very close in performance to the 3090, making it a no-brainer. I think the Super price cut made it a compelling product. I might end up getting a 5080, but I'm pissed at the 16GB of RAM.

The issue is that we used to get a similar generational leap for each one of the tiers. In rough terms, you would get the previous top-tier performance with the x70 card, which used to cost way less. The 4070 did not outmatch the 3090, and the 3080 -> 4080 jump was much smaller than 3090 -> 4090. Each card ended up being more or less the performance of a tier above, but more expensive. Inflation and the cost of producing wafers are real, but the cards are still disappointing, and Nvidia just renamed their tiers in a shameful way (by price and not by performance). The 3060 -> 4060 was embarrassing.

The 50 series is basically even worse than the 20 series.

Got any pics of your SFF?
Interested in going down this route. I run a long HDMI atm, but it's a bit annoying.
I haven't, but I can share some later! Went with a 9800X3D and the Cooler Master NR200P Max, which comes with an AIO cooler and an 850W PSU. It's really crammed in there, but the temperatures seem fine for now (I've also put it in a cabinet under the TV, so I'm monitoring everything constantly). I will try to undervolt both the CPU and GPU.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I'm hoping for the 5080 to be about 5 to 10% slower than the 4090. The rumors say it's about 15-20% faster than the 4080, which should put it more or less in that bracket (the 4090 is 25-30% faster).

It's a meh jump coming from a 4080, but still, overall, a better price/performance gain than going from a 4090 to a 5090.
Don’t expect the 5080 to have much better availability than the 5090, but who knows.

However, I had a similar thought, in that it might be worth downgrading my 4090 to a 5080. Once the 5090s sell out, the 4090 will end up in insanely high demand. I could easily see them going for $2000.

They will still be the second fastest card on the market.

Plus, I could use multi frame gen to make up for the visual smoothness I'd lose.

But I’ll likely keep my 4090 as I expect 5080s to actually be going for $1500.

But let's live in a fantasy world and say I could acquire a 5080 for around $1100 and sell my 4090 for around $2K; I'd have to strongly consider that.
 

V1LÆM

Gold Member
The real winners are gonna be those who kept their 4090s.

The real losers will end up being those who sold their 4090s to try and get a 5090. It ain’t happening, when even MicroCenter might get less than 10 GPUs a store
Yeah seems like this is just a beefed up 4090. A 4090 Ti perhaps. That extra VRAM would be nice though if you were into AI stuff.

I would love to get a 5090. I do have a 4080 which is great but it seems like it would still be a good upgrade for me. I recently got a 360Hz monitor so if I can pump out some more frames to make better use of it then I'm happy.

That said.... one store is only getting 10x 5090s. Not sure about Scan.co.uk. I don't know if Overclockers do FE cards. Scan is where Nvidia directs you when you buy an FE. Maybe they'll have more... but it's still probably not going to be much better/easier.

I do like fucking with AI stuff so double the VRAM would be awesome but on the other hand with DLSS 4 it seems I'll still get something of a performance boost with the new model.

I'll try to get a 5090 FE, but if not, it's no big deal.
 

HeisenbergFX4

Gold Member
Don’t expect the 5080 to have much better availability than the 5090, but who knows.

However, I had a similar thought, in that it might be worth downgrading my 4090 to a 5080. Once the 5090s sell out, the 4090 will end up in insanely high demand. I could easily see them going for $2000.

They will still be the second fastest card on the market.

Plus, I could use multi frame gen to make up for the visual smoothness I'd lose.

But I’ll likely keep my 4090 as I expect 5080s to actually be going for $1500.

But let's live in a fantasy world and say I could acquire a 5080 for around $1100 and sell my 4090 for around $2K; I'd have to strongly consider that.
Would be very tempting if you could put a few hundred in your pocket and pick up a new card with a warranty, even if it is a slight step down; a couple of in-game settings tweaks would fix that.
 