NVIDIA RTX 4090 is 300% Faster than AMD’s RX 7900 XTX in Cyberpunk 2077: Phantom Liberty Overdrive Mode, 500% Faster with Frame Gen


Yes, this is with RT Overdrive, the new full path tracing lighting system in Cyberpunk 2077. It's the same path tracing tech previously seen in Portal with RTX, now made to run in a considerably more detailed and modern game.

The new Ray Reconstruction tech added in DLSS 3.5 helps both RTX 4000 and 3000 cards, but of course Frame Generation only works on RTX 4000 cards. That said, a last-gen RTX 3080 can still push past 60 fps at 1080p with RR enabled, even with this extremely hardware-intensive path tracing. The RTX 4090 can get more than 60 fps at 4K with both RR and FG enabled, which is pretty impressive.

With nothing more than shitty-ass FSR 2.0, no RR, and no FG, the 7900 XTX is only half as fast as the previous-gen 3080, and it is absolutely destroyed by the 4090 at 1080p. There is just no hope for the 7900 XTX with RT Overdrive enabled. 7900 XTX owners should probably stick with the older, non-path-traced RT Ultra mode or just turn RT off entirely.
 
Cyberpunk takes good advantage of ray tracing, and if that is a be-all, end-all feature for you, Nvidia is the only option.

Cyberpunk rasterized still looks great, though.

My 4090 with DLSS 3 actually handles path tracing pretty well, and I am enjoying the lighting. I may play Phantom Liberty in its entirety this way.
 
Certainly looking to upgrade my card when the 5000 series hits to take advantage of all this; for now, I'm happy for my RTX 3080 to keep chugging along as is.
 
Why is this marked NSFW lol.

Are the high framerates from the 4090 gonna make me jizz all over my PC?

..... well, you'd be damn right. Why do you have to be so good at your job, Nvidia....
 
Wait. How is a 3080 going past 60 fps at 1080p with path tracing enabled when my 3090 only does 30-40 fps? :pie_thinking: Ray Reconstruction must have a wild performance boost.
 
I've found a near-perfect sweet spot with my 4070 Ti: 4K, almost locked 60 (above that indoors), with all of Nvidia's neat RTX/DLSS bells and whistles turned on. It really does look gorgeous and runs shockingly well for an £800 GPU.
 
Wait. How is a 3080 going past 60 fps at 1080p with path tracing enabled when my 3090 only does 30-40 fps? :pie_thinking: Ray Reconstruction must have a wild performance boost.
Same. It's with DLSS set to Performance or Quality, which means 540p or 720p internal resolution, at which point the game looks like shit with all the ghosting and blurriness.

The funny thing about these benchmarks is that you need a 4080 or 4090 to truly appreciate the path tracing visuals without severe artifacting.
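For reference, here's a minimal sketch of how those internal resolutions work out, assuming Nvidia's standard DLSS preset scale factors (individual games can override these, so treat it as an approximation):

```python
# Minimal sketch: approximate DLSS internal render resolutions, assuming
# Nvidia's standard preset scale factors (individual games can override these).
DLSS_SCALE = {
    "Quality": 2 / 3,            # 1080p output -> 720p internal
    "Balanced": 0.58,            # 1080p output -> ~626p internal
    "Performance": 0.5,          # 1080p output -> 540p internal
    "Ultra Performance": 1 / 3,  # 1080p output -> 360p internal
}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS mode."""
    scale = DLSS_SCALE[mode]
    return round(width * scale), round(height * scale)

print(internal_resolution(1920, 1080, "Quality"))      # (1280, 720)
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
```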
 
Hardly surprising when this mode kicks the bollocks clean off a 4090, let alone AMD cards. It's a mode for the future, to show what's to come.
 
Hardly surprising when this mode kicks the bollocks clean off a 4090, let alone AMD cards. It's a mode for the future, to show what's to come.
With DLSS 3 enabled, path tracing mode is very playable. If you have a 4070 Ti or above, I recommend turning on path tracing at least to see it for yourself.
 
Even if RDNA 3 to 4 brings a 1.5x RT improvement, and then RDNA 4 to 5 another 1.5x, they would only just catch up to the 4090 in path tracing.

The 2080 Ti outperforms the 7900 XTX from two generations behind, despite being way gimped in transistors, bandwidth, cache, and clocks compared to the RDNA 3 flagship.

And this is starting from the baseline that Cyberpunk 2077 rasterization performs really well on AMD.

They have to throw the hybrid RT pipeline in the trash can. Intel's 2nd iteration will be very dangerous for AMD if they don't step it up.
 
Getting between 90 and 120 fps with Frame Gen, absolutely everything maxed: 4K DLSS Quality, Ray Reconstruction, all RT/PT enabled, etc. So I'm not seeing how this is bringing my 4090 to its knees, unless we want to start whinging about non-native, that is. Native hasn't really been the true benchmark since 4K became a thing: checkerboard, VRS, etc. DLSS and Frame Gen are just more weapons in the reconstruction/4K arsenal. Damn fine job they do, too.

Game is a sight to behold whacked all the way up and my 4090 is handling it beautifully.
 
No regrets getting a 4080 at launch. Cyberpunk looks incredible... of course, I just ran into a stupid glitch where I need to download a mod to fix it, though...
 
"Competition"

[GIF: dumb Jim Carrey]


Time for Intel to start bringing the heat, because this is shambolic.
 
Wtf has AMD been doing? Are they just going to completely ignore ray tracing for this entire generation?

If they somehow, by the grace of God, do catch up to where Nvidia is right now, by that point everyone will have moved on to some new groundbreaking tech and AMD will be left in the dust again.

So sad.
 
Wait. How is a 3080 going past 60 fps at 1080p with path tracing enabled when my 3090 only does 30-40 fps? :pie_thinking: Ray Reconstruction must have a wild performance boost.

Same. It's with DLSS set to Performance or Quality, which means 540p or 720p internal resolution, at which point the game looks like shit with all the ghosting and blurriness.

The funny thing about these benchmarks is that you need a 4080 or 4090 to truly appreciate the path tracing visuals without severe artifacting.
It's 1080p using DLSS Quality.
 
Wonder if HUB will put Cyberpunk Overdrive and Alan Wake full RT in their multi-game benchmarks twice like they did with raster CoD lol... yeah, probably not. Just can't figure out why.
 
I've found a near-perfect sweet spot with my 4070 Ti: 4K, almost locked 60 (above that indoors), with all of Nvidia's neat RTX/DLSS bells and whistles turned on. It really does look gorgeous and runs shockingly well for an £800 GPU.

While I get what you're saying, I can't believe we've reached a point where it's shocking that an £800 GPU runs a game well.
 
Also helps that 2077 is the best-looking game hands down with path tracing on. Surely we'll get some Sony boys coming in saying Uncharted 4 looks better lmao.
 
I can't believe we've reached a point where it's shocking that an £800 GPU runs a game well.
The above video shows 1080p performance with a 4070 Ti.

How are they going to get ray tracing without Nvidia? Or any of the AI/DLSS upscaling techniques and frame gen they're using?
The new Xbox already hints at AI for you. I hate to disappoint you, but AMD also has RT. Or do you still think AMD will stay the same? :messenger_tears_of_joy:
 
Cyberpunk's been Nvidia's showpiece game, and AMD lacks frame generation and suffers with ray tracing. Plus, it's Nvidia's top $1,600 card that has no AMD equivalent.

Can't say I'm surprised.

That's the beauty of PC gaming, though. You can always slap a different GPU in your rig if you're not satisfied.
 
The game looks fantastic even without RT, so don't think you're missing out on a must-have feature. It's nice to have, but I only really recommend it to people with a 4070 Ti or above.
 

[Chart: path tracing performance, 1920×1080]
[Chart: path tracing performance, 3840×2160]


RDNA 2 → RDNA 3 = 1.5~1.6x
Turing → Ampere = 2.1~2.4x
Ampere → Ada = 1.8~1.9x (not including frame gen)

The 7900 XTX is 14.5 fps at 1080p.
14.5 × 1.5 = 21.8 fps
21.8 × 1.5 = 32.7 fps

You're right, I was wrong; it doesn't even catch up to the 4090. They're royally fucked.
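If anyone wants to play with the assumptions, here's a minimal sketch of that compounding math; the 1.5x per-generation uplift and the 14.5 fps baseline are just the figures from the chart and this post, not confirmed roadmap numbers:

```python
# Minimal sketch of compounding hypothetical per-generation RT uplifts.
# The 1.5x multiplier and 14.5 fps baseline are assumptions from this post,
# not confirmed roadmap numbers.
def project_fps(base_fps: float, per_gen_uplift: float, generations: int) -> float:
    """Compound a per-generation performance multiplier over N generations."""
    return base_fps * per_gen_uplift ** generations

BASE = 14.5  # 7900 XTX, 1080p path tracing (from the chart above)
for gens in (1, 2):
    print(f"After {gens} gen(s): {project_fps(BASE, 1.5, gens):.1f} fps")
# After 1 gen(s): 21.8 fps
# After 2 gen(s): 32.6 fps
```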
 
You're right, I was wrong; it doesn't even catch up to the 4090. They're royally fucked.
Man, do you think RT will stay the same? Since RDNA 4 will get a traversal engine, the last piece AMD needed, don't be surprised if the RX 8700 XT ends up around 4070/4070 Ti level in PT.
 
[Chart: path tracing performance, 1920×1080]
[Chart: path tracing performance, 3840×2160]

RDNA 2 → RDNA 3 = 1.5~1.6x
Turing → Ampere = 2.1~2.4x
Ampere → Ada = 1.8~1.9x (not including frame gen)

The 7900 XTX is 14.5 fps at 1080p.
14.5 × 1.5 = 21.8 fps
21.8 × 1.5 = 32.7 fps

You're right, I was wrong; it doesn't even catch up to the 4090. They're royally fucked.
Keep regurgitating that 1.5x number; nothing makes it true.

PC gamers have to be some of the dumbest people in existence, actively celebrating one company dominating the GPU industry while it proactively builds a walled garden.
 
Keep regurgitating that 1.5x number; nothing makes it true.

PC gamers have to be some of the dumbest people in existence, actively celebrating one company dominating the GPU industry while it proactively builds a walled garden.
How "smart" do you have to be to interpret simple benchmark numbers, really simple technological facts, as "celebration"?
Facts are neutral. The issue is you.
 
Keep regurgitating that 1.5x number; nothing makes it true.

PC gamers have to be some of the dumbest people in existence, actively celebrating one company dominating the GPU industry while it proactively builds a walled garden.

It's not consumers' fault that AMD has fallen so far behind. They've been making GPUs for a very long time, and they have proven they can't keep up.

And what does walled garden mean to you? Because PC is literally the only option in gaming if you don't want to participate in one.
 
While I get what you're saying, I can't believe we've reached a point where it's shocking that an £800 GPU runs a game well.

Yeah, I should clarify: shockingly well *with the overbearing RTX stuff turned on. I don't even bother with RTX for most things, but it does look crispy in a neon-soaked city.
 
Yeah, I should clarify: shockingly well *with the overbearing RTX stuff turned on. I don't even bother with RTX for most things, but it does look crispy in a neon-soaked city.
The issue is that some games have shit non-RT lighting because the devs don't bother to spend the work hours on it. And of course vice versa.
 
Even if RDNA 3 to 4 brings a 1.5x RT improvement, and then RDNA 4 to 5 another 1.5x, they would only just catch up to the 4090 in path tracing.

The 2080 Ti outperforms the 7900 XTX from two generations behind, despite being way gimped in transistors, bandwidth, cache, and clocks compared to the RDNA 3 flagship.

And this is starting from the baseline that Cyberpunk 2077 rasterization performs really well on AMD.

They have to throw the hybrid RT pipeline in the trash can. Intel's 2nd iteration will be very dangerous for AMD if they don't step it up.
AMD won't release any high-end cards with RDNA 4.
 
At least the 7900 XTX has DisplayPort 2.1 or whatever... take that, Nvidia!
Yeah, it's great that the market leader dropped DisplayPort over USB-C, meaning no one will support it, even though it has useful applications like VR.
 
Keep regurgitating that 1.5x number; nothing makes it true.

PC gamers have to be some of the dumbest people in existence, actively celebrating one company dominating the GPU industry while it proactively builds a walled garden.

And what do you propose consumers do exactly?

Purchase substandard products that don't run the games they want to play well because...?

I'm genuinely interested to hear the answer to this.
 
And what do you propose consumers do exactly?

Purchase substandard products that don't run the games they want to play well because...?

I'm genuinely interested to hear the answer to this.

It's not like this is the standard performance delta between the 7900 XTX and the 4090. It's literally one game, one game that has always performed uncharacteristically well on Nvidia hardware and poorly on AMD hardware.


For the other 99% of games the 7900 XTX holds up just fine and actually provides a much better cost/performance ratio compared to the 4090.
 
And what do you propose consumers do exactly?

Purchase substandard products that don't run the games they want to play well because...?

I'm genuinely interested to hear the answer to this.
But this game is designed with Nvidia hardware in mind, so Cyberpunk should run best on Nvidia GPUs. I'm not sure how well path tracing would run on AMD GPUs if AMD devised equivalent tech, but I can't say for certain that AMD GPUs are bad.
 