Black Myth: Wukong Only Hits 29 FPS On RTX 5090 At 4K Without DLSS 4

People arguing over the 29fps while I'm sat here thinking: since when does DLSS, even on Performance mode, get you nearly 10 times the performance?
29fps to 240fps is insane. I guess maybe they are using framegen too?
It's with the new frame gen.
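
Quick back-of-the-envelope on how 29 fps could plausibly become ~240 fps. All multipliers below are assumptions for illustration (the real upscaling gain and frame-gen overhead vary per game), not measured values:

```python
# Rough sketch: DLSS Performance upscaling + 4x Multi Frame Generation.
# Multipliers are assumed ballpark figures, not benchmark data.

native_fps = 29          # native 4K path tracing (from the article)

upscale_gain = 2.1       # assumed: DLSS Performance (1080p -> 4K) roughly doubles fps
mfg_factor = 4           # 4x MFG: 1 rendered frame + 3 generated frames
mfg_efficiency = 0.95    # assumed: small overhead, so not a perfect 4x

rendered_fps = native_fps * upscale_gain
displayed_fps = rendered_fps * mfg_factor * mfg_efficiency

print(f"rendered: {rendered_fps:.0f} fps, displayed: {displayed_fps:.0f} fps")
# 29 * 2.1 * 4 * 0.95 is roughly 231, in the ballpark of the claimed 240
```

So most of the headline multiplier comes from frame generation, not faster rendering.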
 
Between the 4090 and 5090 at native, you have a gap of two 7900 XTXs.

[Chart: performance-pt-3840-2160.png]



 
We could save a lot of time bickering about this if we just had comparison benchmarks for a decent sampling of games. Ten or so graphics-heavy games should be reasonable.
 
Kinda shocked you guys think 20-40% in a year isn't a significant improvement; what other tech continually gains in steps this big or bigger?
 
I'm not impressed, and this is why making performance claims with upscaling and frame gen on should be forbidden by law and punishable by death... Or a small fine.

Either way, it's more the original claim than the actual result that's the big surprise. Jensen claimed years ago that in the future most of the graphics heavy lifting would be AI-generated images as opposed to rendered graphics.

So Nvidia brings us where the CEO said they would.

The next step is to make games that use a custom-trained AI model to generate the world's graphics, in part or in full, while you play, to achieve more realistic lighting, physics, and general appearance, using the actual 3D pipeline only to render basic metadata about the world and object placement.

That kind of tech has been in the works for years, and like all AI tech it has only gotten better over time. What remains to be seen is which generation of cards will be fast enough to do this in real time, at a good resolution, with an AI model that makes sense.
 
Considering you can get the 4090 used for half the price of a 5090, I wouldn't be surprised if people decided to pick that up. (Hell, I am debating now whether I should go 4090 or 5090.)

It's just too much to justify the price difference now.


Still, there is a magic in owning new tech, and the 5090 is no exception.

It's just that I think the 6090, if they hit a lower node, will destroy the 5000 series.
And the 4090 is obviously still a monster of a card through this next generation, and by far the smartest thing to buy if you can find one at a decent price.

If my 4090 hadn't died recently, I likely would have held off at least until real benchmarks came out.

I sure don't blame anyone holding off, just like I don't blame anyone for jumping on the 5090, even coming from a 4090.
 
And the 4090 is obviously still a monster of a card through this next generation, and by far the smartest thing to buy if you can find one at a decent price.

If my 4090 hadn't died recently, I likely would have held off at least until real benchmarks came out.

I sure don't blame anyone holding off, just like I don't blame anyone for jumping on the 5090, even coming from a 4090.
I just got my 4070 Ti Super. I am going to skip the 50xx gen and go straight to 6070 Ti, especially if it comes with 24 GB of VRAM.
 
I just got my 4070 Ti Super. I am going to skip the 50xx gen and go straight to 6070 Ti, especially if it comes with 24 GB of VRAM.
That's a good card. I have the 4070 Super in a mini-ITX hooked up to my TV, and it more than does the job.
 
Also, just like Ada's SER and other optimizations, the raw power these cards have won't be obvious on day 0.

When/if Cyberpunk 2077 updates to Neural Radiance Cache path tracing, and when Alan Wake 2 upgrades to Mega Geometry, hopefully Wukong too (Nanite is basically a handshake with Mega Geometry), then you'll start to see why Blackwell has so much focus on NPUs.
 
Wukong ain't no god damn Crysis. Why we even comparing the two? Also there ain't no way we can't get real frames from NVIDIA like we could in the past. They holding out on us.
 
The real performance jump between a 4090 and a 5090 is very small, but this is where GPUs are headed: given the impossibility of reducing the node and consumption, the focus will be more on higher TOPS for AI than on simple raster power and TFLOPS.
 
WTF are people talking about? PT is insanely demanding at native 4K in almost all games; the 4090 gets massacred as well.
No one who wants to play with PT is using native 4K. You need DLSS SR to bring FPS to 60+. Frame gen is not needed.
The GPU is still ~40% faster than the 4090, so what are we talking about?
Meanwhile the AMD card gets 5 FPS at these settings, so the PS5 Pro would get like 2.5 FPS?

[Chart: pt-3840-2160.png]
So what I'm hearing here is PC Gamer is fine with 29fps 😐
 
I bought my 4070 Ti Super about two months ago for basically €1,100 in Greece, which was more or less the average market price for new cards.

With my income, giving 3K every two years for a GPU is WAY too much money. I could never keep the most powerful GPU in my PC every two years.
The 5070 Ti should be a bit cheaper since the MSRP is lower. That said, there is no point in upgrading if you already have a 4070 Ti S, since you are getting all the DLSS 4 improvements besides enhanced FG.

And the performance difference, other than the new FG, is going to be pretty small.
 
And the 4090 is obviously still a monster of a card through this next generation, and by far the smartest thing to buy if you can find one at a decent price.

If my 4090 hadn't died recently, I likely would have held off at least until real benchmarks came out.

I sure don't blame anyone holding off, just like I don't blame anyone for jumping on the 5090, even coming from a 4090.
One thing to consider is that a used 4090 is going for almost $2K on eBay nowadays. It should drop at the 5090's release, but it's nowhere near half the cost.

And even after the release of the 5090, I don't see the 4090 dropping below $1,500 or so, at least for the first few months.
 
Bro, that's not a reflection on the tech. That continues to expose the mediocre dev talent at these shops. The dev talent can't keep pace with the advancement in tech, and unfortunately, most shops have to just throw raw tech power at the problem to offset weak dev teams.
 
So what I'm hearing here is PC Gamer is fine with 29fps 😐

What do you expect from the most demanding form of RT available to gamers?

No one is fine with 29fps, but also no one sane plays with settings like that. Drop DLSS to Performance and you will see 60+ without frame gen.

Crysis "DLC" was running like this on most powerful GPU available:

[Chart: Enthusiast_01.png]


And you had no DLSS to improve it without a major hit to IQ, like right now.
 
Did you sleep the rest of the years?

Fair point, but I think understandable given I'm not a gullible fool who buys every new product that gets a digit bump on its branding.

Also, 2022 I had vastly more important stuff to deal with in my life than tracking video-card releases, like bereavement, life-threatening health issues, and ongoing emotional consequences as a result of Covid.

That being said, it's still a single product cycle, and it should be obvious to anyone with half a brain cell that the significance of a ratio-based improvement needs to be weighed against the baseline value it's being measured against. Ultimately, even completely linear advancement will result in a steady decline in relative performance gain as optimal efficiency is approached.

Hence what we're seeing is improvement gained via a different paradigm: AI generation of frame data is the new primary axis of measurable improvement. Because contrary to what a lot of smooth-brains seem to think, this is science, not magic.
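
The diminishing-ratio point can be shown with a trivial sketch (illustrative numbers only, not real GPU figures): if each generation adds the same fixed amount of performance, the percentage gain still shrinks every cycle.

```python
# Sketch: constant absolute gains produce shrinking percentage gains.
# Baseline and step are arbitrary illustrative units.

perf = 100.0         # baseline performance
step = 40.0          # fixed absolute improvement per generation

for gen in range(1, 6):
    new_perf = perf + step
    pct_gain = (new_perf - perf) / perf * 100
    print(f"gen {gen}: {perf:.0f} -> {new_perf:.0f} (+{pct_gain:.0f}%)")
    perf = new_perf
# +40%, +29%, +22%, +18%, +15%: same absolute step, shrinking ratio
```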
 
The real performance jump between a 4090 and a 5090 is very small, but this is where GPUs are headed: given the impossibility of reducing the node and consumption, the focus will be more on higher TOPS for AI than on simple raster power and TFLOPS.

Yup, an almost 600W GPU can only be up to 30% faster than the previous generation.
While solutions to this problem have been studied (graphene, carbon nanotubes, gate-all-around transistors, stacked 3D dies), it seems the short-to-mid-term solution will be to use AI, adding relatively simple 8-bit integer calculation units to improve things perceptually and assist traditional rendering techniques.
 
The real headline is how many FPS the Pro would manage in the same scenario - 4k maxed without upscaling. 5? 6?
 
About 40-50fps maxed out 4K no DLSS 4090.

Something is wrong.
Nonsense!

My 4090 with Cinematic settings at native 4K (i.e. DLAA) gives 23fps in the first levels.

Maybe in the menu you get 40 to 50, but not anywhere else I've seen.

I finished it on Performance (i.e. 1080p to 4K DLSS) in Cinematic, and I got over 60fps that way. It felt like 30fps, though I got somewhat used to it. I think I tried Ultra Performance too on bosses, as it gave much more FPS.
 
People are saying that node shrinks don't make much difference anymore, but isn't the 5000 series using pretty much the same process node as the 4000 series?
 
People are saying that node shrinks don't make much difference anymore, but isn't the 5000 series using pretty much the same process node as the 4000 series?

Without node shrinks you can't have much difference, as seen with the 4090 and 5090. They were only able to pack 33% more cores onto it.

The difference between the 3090 Ti and 4090 was like 60% more cores, and on top of that a 50% clock increase.
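
Treating throughput as roughly cores × clock (a crude first-order model; real scaling is worse because of memory and IPC effects), the two generational jumps compare like this, using the approximate ratios quoted above:

```python
# Crude first-order model: relative throughput ~ core_ratio * clock_ratio.
# Ratios are the approximate figures from the thread, not exact specs.

def scaling(core_ratio: float, clock_ratio: float) -> float:
    """Upper-bound generational speedup, ignoring memory/IPC effects."""
    return core_ratio * clock_ratio

ada_over_ampere = scaling(1.6, 1.5)      # 4090 vs 3090 Ti: ~60% more cores, ~50% clock
blackwell_over_ada = scaling(1.33, 1.0)  # 5090 vs 4090: ~33% more cores, similar clock

print(f"4090 vs 3090 Ti upper bound: {ada_over_ampere:.1f}x")
print(f"5090 vs 4090 upper bound: {blackwell_over_ada:.2f}x")
# roughly 2.4x vs 1.33x: why the same-node jump feels small
```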
 
I think that card is the sweet spot for gaming, and just my 2 cents, you are set through this gen.
I just need to update my RAM and CPU; I have an i7-9700K and DDR3 RAM (I think; I may have DDR4). So it's updating everything else except PSU/GPU this year, getting a few NVMe drives in 2026, and a 6070 Ti Super in 2027.
The 5070 Ti should be a bit cheaper since the MSRP is lower. That said, there is no point in upgrading if you already have a 4070 Ti S, since you are getting all the DLSS 4 improvements besides enhanced FG.

And the performance difference, other than the new FG, is going to be pretty small.
Has there been any detailed analysis of how much better DLSS 4 is than DLSS 3 on a non-50xx card? I'd REALLY like to see some statistics without multi frame generation 'distorting' the data.
 
Hey Nvidia, I'll take the RTX 5090, here is $500 (real money), the last 75% will be fake just like your resolutions and frames.
 
I want to know what it runs at with all the bells and whistles at max but without RT. But yeah, it should have been at least a better jump than that. Doesn't the 4090 get like 20-something too with the same setup?
 
that using AI generation of frame data is the new primary axis of measurable improvement.
Which is why no one likes it. Do we have a choice? Probably not; none of us can do anything except not buy their stuff. But game devs and AAA game companies are hand in hand with hardware manufacturers, so unless we also stop buying most big games as well as hardware, they won't change.
 