
Black Myth Wukong In-Game Performance Benchmark

I'd rather go down to minimum settings first, and only use upscaling if even that fails. There's not a massive difference between minimum and maximum in most games these days, but upscaling introduces an overall blur that I don't like.
Have you seen how this game looks on low? :eek:

Even when I ran at 720p I actually thought it wasn't that bad; at 1080p it was fine, and the main difference between that and 1440p was that things looked a "bit" sharper. I struggled to see the difference between native 1440p and DLSS at 75%.

If you turn it down to low, you get massive lighting issues and very little foliage, among other things like objects not reacting to wind; it will genuinely look like an older-generation game.

It seems it's not available in-game:
(source: https://www.pcgamingwiki.com/wiki/Black_Myth:_Wukong)
Shame there's no DoF adjustment.
 
This benchmark is close to useless, only testing the Cinematic setting. The screenshots show there is good performance scaling, but reading the bench provides literally no insight into the actual expected performance for gamers.
 
1440p; happy if he can get 60 fps without losing too much eye candy. Thanks
These are my benchmark results on a 5800X3D. With the (almost) maximum graphics presets the CPU is going to be largely irrelevant. The 7800X3D may get slightly better percentile lows.

I started in native 1440p (100% resolution scaling) with everything on Cinematic but dropping Global Illumination and Shadow Quality to High. No ray tracing or frame generation. You could limit the fps to 60 for a very smooth experience without sacrificing too much visual quality:
custom.jpg

Alternatively, you may keep everything maxed out (no RT ofc) and use frame generation:
FG.jpg

Or disable FG and set upscaling to 67% (equivalent to FSR quality mode) on the highest preset:
FSR.jpg

Using either of these will lower the overall image fidelity and could introduce minor visual artifacts. Whether this is something you're likely to notice or not is another thing.
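
For anyone wondering what the 67% option above means in pixels, here's a minimal sketch in Python (purely illustrative) that turns a resolution-scale percentage into the internal render resolution; the 1.5x per-axis factor for FSR/DLSS Quality is my assumption for comparison, not something the game states:

```python
# Rough sketch: internal render resolution for a given output
# resolution and resolution-scale percentage. The 1.5x per-axis
# factor for FSR "Quality" below is an assumption for comparison.

def render_resolution(width: int, height: int, scale_pct: float) -> tuple[int, int]:
    """Internal render resolution for a given output size and scale percentage."""
    factor = scale_pct / 100.0
    return round(width * factor), round(height * factor)

print(render_resolution(2560, 1440, 67))        # -> (1715, 965)
print((round(2560 / 1.5), round(1440 / 1.5)))   # FSR Quality at 1440p: (1707, 960)
```

So 67% scaling at 1440p renders at roughly 1715x965, which is why it lines up with FSR Quality mode.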
 
I don't see why you would only show benchmark results with 66% upscaling; it's kind of useless.
 
This benchmark is close to useless, only testing the Cinematic setting. The screenshots show there is good performance scaling, but reading the bench provides literally no insight into the actual expected performance for gamers.

What makes you think the performance difference between GPUs will change at different quality settings?

For example, the 4080 Super is 108% faster than the 2080 Ti at 1440p with Cinematic settings and 66% scaling. Do you think the 2080 Ti will close that gap at Medium?
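
For anyone reading those percentages, "108% faster" means the frame-rate ratio minus one, not a difference of percentage points. A minimal sketch in Python, with made-up fps numbers purely to illustrate the formula (they are not taken from the review):

```python
# "Percent faster" from two average frame rates.
# The fps values below are hypothetical placeholders, not review data.

def percent_faster(fps_new: float, fps_old: float) -> float:
    """How much faster fps_new is than fps_old, in percent."""
    return (fps_new / fps_old - 1.0) * 100.0

fps_4080_super = 104.0  # hypothetical
fps_2080_ti = 50.0      # hypothetical

print(f"{percent_faster(fps_4080_super, fps_2080_ti):.0f}% faster")  # -> 108% faster
```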
 

The Recommended GPUs get 45?, 57 and 47 fps at 1080p Medium with no upscaling. That seems closer to minimum, but minimum usually targets 30 fps. What do companies usually target for recommended? For 1080p 60 fps at High it would be the 6750 XT, 3070, or 4060 Ti.
 
The Recommended GPUs get 45?, 57 and 47 fps at 1080p Medium with no upscaling. That seems closer to minimum, but minimum usually targets 30 fps. What do companies usually target for recommended? For 1080p 60 fps at High it would be the 6750 XT, 3070, or 4060 Ti.
Sadly, nowadays devs are using upscaling to meet minimal, barely acceptable performance targets; optimisation is pretty much the lowest of the low priorities. Many developers have revealed over the years that in typical game development, as late as just a month or so before release, the game is only tested on high-end, non-consumer hardware builds, and then you get the crunch, the rush to make it playable on the target device.

I remember one company stating their standard test machines were quad-Titan rigs or something.

I expect the future will be even worse, as AMD has now made frame generation much more accessible, so don't be surprised if, in 1-2 years from now, top-end cards need a combination of upscaling and frame generation to get 60 fps at high resolutions.
 
What makes you think the performance difference between GPUs will change at different quality settings?

For example, the 4080 Super is 108% faster than the 2080 Ti at 1440p with Cinematic settings and 66% scaling. Do you think the 2080 Ti will close that gap at Medium?
Well, different settings can affect different GPUs differently, but that's not my point. If I want to compare GPUs against each other, there are already TPU's synthetic benchmarks. This bench is supposed to give insights into how the game runs, but it does not. This kind of benchmark is usually not very useful, but here it's worse because the devs gave access to the Cinematic setting of UE5, which is not really meant for gameplay and has terrible scaling. How does the game run on my GPU with reasonable settings? I have no idea.
 
How does the game run on my GPU with reasonable settings? I have no idea.

The reviews are useful imo. It doesn't make sense to test for niche settings like you're talking about. There is way too much variance. Better to test on the fastest systems money can buy at the most brutal settings possible. Then you can extrapolate that if you are running "more reasonable settings" the results can only be better than what is shown, unless something is wrong with your specific rig.
 
The reviews are useful imo. It doesn't make sense to test for niche settings like you're talking about. There is way too much variance. Better to test on the fastest systems money can buy at the most brutal settings possible. Then you can extrapolate that if you are running "more reasonable settings" the results can only be better than what is shown, unless something is wrong with your specific rig.
But Cinematic IS the niche setting. It is not meant for gameplay, it doesn't let you infer performance for the 99% of gamers who can't enable it, and even those who technically can would be better off with a different tradeoff.
 
But Cinematic IS the niche setting. It is not meant for gameplay, it doesn't let you infer performance for the 99% of gamers who can't enable it, and even those who technically can would be better off with a different tradeoff.
You can extrapolate from the numbers on the "Image Quality Comparison" page. Frame rates for different settings with the same scene are there.
 
You can extrapolate from the numbers on the "Image Quality Comparison" page. Frame rates for different settings with the same scene are there.

Also, for people who want an optimization guide, HUB has one and DF will soon.


This is a GPU benchmark, and as long as all settings were equal on all GPUs, it doesn't matter what settings are used. Technically he could have just maxed out the game and used RT to show the real differences from the best to the worst GPU, but given how far behind AMD is in RT in this game, there would have been too many people butthurt over it.

There is also a free benchmark for anyone who wants to test settings on their own PC, which is much more useful than looking at graphs. The biggest advantage of these charts is if you are on something like a 3060 and want to know how much faster X card would be as an upgrade.
 
But Cinematic IS the niche setting. It is not meant for gameplay, it doesn't let you infer performance for the 99% of gamers who can't enable it, and even those who technically can would be better off with a different tradeoff.

I understand that, but I think that's also the point. It's easier to extrapolate down than up. If he benchmarked a 13400KF with 8 GB DDR4 and a 4050 with a 500 W PSU, those results would mean absolutely nothing to me.

At least if you have a lower-end system, you can make the assumption it will either be better or still a bad time. An A770 at 4K on one of the fastest platforms only managed 17 fps? That's gonna be a bad time overall. 4K 200 fps on the highest settings with the best system? You can assume that if you are running a lower resolution on lesser parts, the experience will only get better. You can't really do that the other way, imo.
 
You can extrapolate from the numbers on the "Image Quality Comparison" page. Frame rates for different settings with the same scene are there.
That's exactly my point: the only useful part of the review is limited to a measurement on a single GPU and is a pain to access.

Edit: I don't see the point of a single-game GPU benchmark. There is already a much better source of information for this: the actual GPU reviews, with a lot of games, relative performance, and so on.
 
I understand that, but I think that's also the point. It's easier to extrapolate down than up. If he benchmarked a 13400KF with 8 GB DDR4 and a 4050 with a 500 W PSU, those results would mean absolutely nothing to me.

At least if you have a lower-end system, you can make the assumption it will either be better or still a bad time. An A770 at 4K on one of the fastest platforms only managed 17 fps? That's gonna be a bad time overall. 4K 200 fps on the highest settings with the best system? You can assume that if you are running a lower resolution on lesser parts, the experience will only get better. You can't really do that the other way, imo.

Honestly, it isn't W1z's job to try and decide what settings a user might use; it's just his job to show us how GPUs stack up against each other.

Everyone has a different PC and prefers different settings. I'll play maxed out with Cinematic settings, but that doesn't mean that's how it should be benchmarked.


To be fair, no matter what W1z chooses to do, people will complain, because they can't separate how they play games from how others might.
 
It works on the PS5 (roughly an RX 6700-class GPU), but on PC it ideally needs a $1000 GPU.
The usual Nvidia-sponsored game to sell video cards, and all it took was hair quality, shadows, and the usual ray tracing that needs the most premium GPU.
 
It works on the PS5 (roughly an RX 6700-class GPU), but on PC it ideally needs a $1000 GPU.
The usual Nvidia-sponsored game to sell video cards, and all it took was hair quality, shadows, and the usual ray tracing that needs the most premium GPU.

Runs like crap with every setting turned down and terrible texture quality; if that's your definition of "works", sure. Also, the FSR implementation on console destroys image quality, although to be fair that's just FSR, not the implementation.


If you turned every setting down to console level, it would run fine on PC, even on a 7600/4060.

The developers offer a much better visual experience on PC than on a 3-year-old $400-500 console; who would have thunk.

But if you want to rock Medium settings with crap FSR at 30-45 fps, go for it.

By your definition the game technically works on a 1060....

 
Have you seen how this game looks on low? :eek:

Even when I ran at 720p I actually thought it wasn't that bad; at 1080p it was fine, and the main difference between that and 1440p was that things looked a "bit" sharper. I struggled to see the difference between native 1440p and DLSS at 75%.

If you turn it down to low, you get massive lighting issues and very little foliage, among other things like objects not reacting to wind; it will genuinely look like an older-generation game.
An older-generation-game look is fine. Blurriness isn't. I don't need the bleeding edge all the time (especially with a 6750 XT), but I prefer my image to be sharp if possible. It's just my opinion, though.
 