Black Myth: Wukong PC benchmark tool available on Steam

[Kenan and Kel reaction GIF]


Let's see what my aged 3080 can do
 
8GB benchmark!

Edit-RTX 4070 Intel 13700F
Using the same settings as Lecture Master above
67FPS average
35FPS low :(
I saw lots of artifacts in the sky that looked like Vsync strobing. Driver issue?

edit edit-I guess China now has all my PC's info. :pie_worried:
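For anyone eyeballing numbers like the 67 avg / 35 low above, converting fps to frametimes makes the gap more tangible. A quick sketch (plain Python, nothing game-specific):

```python
def frametime_ms(fps: float) -> float:
    """Convert frames per second to the time each frame takes, in milliseconds."""
    return 1000.0 / fps

# The 67 fps average corresponds to ~14.9 ms per frame,
# while the 35 fps low is ~28.6 ms -- nearly double the frame cost,
# which is why the lows feel so much worse than the average suggests.
print(f"avg: {frametime_ms(67):.1f} ms, low: {frametime_ms(35):.1f} ms")
```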
 
edit edit-I guess China now has all my PC's info. :pie_worried:
I'm sure they had it already.
 
Yeah, I could not tell a difference in performance either when turning ray tracing on or off.
You need to restart the game.

My fps dropped from an average of 55 to 20 fps when I turned on RT on very high.

RTX 3080. Even Cyberpunk's path tracing doesn't hit that hard.
 
I don't know how useful it will be. As far as I've seen, there were no fights or heavy effects. You can probably subtract another 10fps for the main game.
 
Well, here's my results. Resizable Bar enabled, Windows 11 memory protection and other needless shit turned off. Locked FPS to 30 via Riva, turned AA all the way down to low cuz it's completely pointless since DLSS is doing all the work.

[benchmark result screenshot]


I honestly can't believe the game runs so well (a perfectly straight frametime line, zero dips or stutters whatsoever) with this kind of visuals while only eating up 6.5 GB of VRAM at 88% of 4K, 99% of settings maxed, and no RT (with my GPU at 4K, even with DLSS, there's no point bothering, since RT is optimized for Frame Gen and much better CPUs). I mean, I'm considering this level of performance on my rig, with my CPU and GPU at max settings and 88% of 4K, as great performance, so don't at me :messenger_beaming:

But somehow I don't think this is indicative of real-world performance during actual gameplay, with lots of alpha effects and particles on screen and such, so I still won't be pre-ordering the game and will wait for PC reviews and impressions.
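Worth noting what "88% of 4K" actually works out to. A small sketch, assuming the slider scales each axis (which is how UE5's resolution scale typically behaves; some games scale total pixel count instead):

```python
def scaled_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    """Apply a per-axis resolution scale (assumed UE5-style slider behavior)."""
    return round(width * scale), round(height * scale)

w, h = scaled_resolution(3840, 2160, 0.88)
native = 3840 * 2160
# 88% per axis means the GPU only shades ~77% of native 4K's pixels,
# which helps explain the modest VRAM footprint.
print(f"88% of 4K: {w}x{h} ({w * h / native:.0%} of native pixels)")
```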
 
Maxed out at 3440x1440p doesn't seem at all feasible on my rig (13700k + 4080) after restarting the application. I got in the 40s with very high Ray Tracing and DLAA + Frame Gen. This is pre-driver and pre-release though.

There's something weird with post-processing. The image doesn't look clean even maxed out on DLAA; there are a lot of artifacts around branches, for instance. Both smudgy and oversharpened at the same time. The game is supposed to also have Ray Reconstruction, even though I didn't see it in the settings; maybe that's why there's some smudginess and artifacting like in Cyberpunk and Alan Wake.

EDIT: To be a bit more specific, I got 46 fps with DLAA and Ray Tracing: Very High + Frame Gen. With the same settings and DLSS Quality I got 78 fps, which is a huge jump.
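That jump lines up with the pixel math. A rough sketch of the DLAA-vs-DLSS-Quality comparison, assuming the usual ~2/3 per-axis render scale for DLSS Quality (the exact factor can vary per title):

```python
# DLSS Quality typically renders at ~2/3 of output resolution per axis
# (assumption -- not confirmed for this specific game).
DLSS_QUALITY_SCALE = 0.667

out_w, out_h = 3440, 1440  # the ultrawide output resolution from the post above
render_w = round(out_w * DLSS_QUALITY_SCALE)
render_h = round(out_h * DLSS_QUALITY_SCALE)

pixel_ratio = (out_w * out_h) / (render_w * render_h)
fps_ratio = 78 / 46  # the reported DLSS Quality vs DLAA jump

print(f"internal: {render_w}x{render_h}, {pixel_ratio:.2f}x fewer pixels than DLAA")
print(f"observed fps gain: {fps_ratio:.2f}x")
```

The fps gain (~1.7x) being smaller than the pixel reduction (~2.25x) suggests the benchmark isn't purely pixel-bound at these settings.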
 
Does the raytracing option work properly? I don't get much difference in FPS from off to max.
Is it because even RT off is using Lumen? You're probably just switching from Lumen to RTXDI; the performance difference going from Lumen to RT is smaller than going from raster to RT.
 
If the benchmark roughly reflects real game performance, I will probably play with 4k DLSS quality (67%), FG off, additional RT effects off and details on high.

That should be enough for a constant 60 fps and deliver good image quality thanks to DLSS "Quality" + Details on High.
I'll probably leave FG out due to the nature of the gameplay.

[benchmark result screenshot]
 
Ok, I'm not a PC gamer (anymore), but there's barely anything happening in the benchmark. No destruction physics, combat, alpha effects, particles, or fast camera cuts… it's a camera floating along a river. Is this your typical PC benchmark? How is this stretching hardware at any level? How would the results be representative of your frame rate during gameplay?
 
Is it because even RT off is using Lumen? You're probably just switching from Lumen to RTXDI; the performance difference going from Lumen to RT is smaller than going from raster to RT.

I'm looking forward to seeing the difference between maxed out Lumen vs Nvidia's per pixel RTGI in the DF video. As it stands now I'm satisfied with just using maxed Lumen for lighting and reflections.
 
Ok, I'm not a PC gamer (anymore), but there's barely anything happening in the benchmark. No destruction physics, combat, alpha effects, particles, or fast camera cuts… it's a camera floating along a river. Is this your typical PC benchmark? How is this stretching hardware at any level? How would the results be representative of your frame rate during gameplay?

No, it's not how PC benchmarks usually work.
This Black Myth: Wukong benchmark tool is just bad at what it's supposed to be.

For example, this is the Lobby benchmark from 3DMark 2001. Based on the Max Payne engine.
As you can see, even 23 years ago, PC benchmarks had tons of action, physics, alpha effects, animations, etc.

 
Seems terrible for native 1440p with no RT if I'm honest.

It is. Really bad. Upscaling helps, but not a lot.
And image quality is bad: it has a grainy, oversharpened look, and there's no option to disable film grain or sharpening.
And with max RT, it drops to 7 fps. Mind you, this is an RDNA2 card.
 

Show one with frame gen off, so we can see the difference between high-end Ampere and Ada (RT is optimized for Ada in this game, according to Nvidia).

Same settings but without frame gen obviously:


[benchmark result screenshot]


Everything on high (textures on cinematic) and RT medium:

[benchmark result screenshot]


Everything on high (textures on cinematic) and RT OFF, finally playable!

[benchmark result screenshot]
 
Doesn't even look too good to be honest; also a weird benchmark, no combat.

4080 laptop, very high + RT very high + DLAA = 68 fps average at 1080p


[benchmark result screenshot]


RT off

[benchmark result screenshot]


DLSS quality, RT very high, settings very high 1080p

[benchmark result screenshot]


DLSS quality, RT off, settings very high 1080p

[benchmark result screenshot]
 
This benchmark only seems to torture the GPU, I think (Lumen + Nanite + additional RT features like translucency, etc.).
The only thing I can think of is that the CPU isn't taxed much in this game anyway. I've not seen anything CPU-heavy in trailers or gameplay, hence the benchmark being just a flythrough.
 

For example, this is the Lobby benchmark from 3DMark 2001. Based on the Max Payne engine.
As you can see, even 23 years ago, PC benchmarks had tons of action, physics, alpha effects, animations, etc.


Oh man this brings back memories. I ran this so many times just because it was the coolest shit.

And that's indeed what I would call a benchmark tool.
 
When a benchmark tool has more users than most games.
Imagine the devs of Concord looking at this :messenger_tears_of_joy:

[Steam concurrent players screenshot]


EDIT: It has surpassed 85K concurrent users.
 