
The Witcher 3: Performance Analysis

W1zzard

In this article, we put the GTX Titan X, R9 295X2, GTX 980, R9 290X, GTX 970, R9 290, GTX 960, and R9 285 through The Witcher 3: Wild Hunt. We do so at resolutions of 1600x900, 1920x1080, 2560x1440, and 4K to assess what hardware you need to play this recently released top title.

 
Looking good, but man, it does seem to be quite a needy beast on the high end; guess those new GPUs coming in June have a game to be bought for :P.

Also, totally with you on hating the (motion) blur effects.
It's just baffling that people (a) want it turned on and (b) developers even made it in the first place.

We keep reducing pixel lag and response times on monitors to cut down on blur during motion... and then a motion blur effect gets added right back in.

Just wtf. And apart from that, why would you even want it? Why would you want an image to blur when it's moving? And don't tell me it's realistic, because your eyes already can't keep a moving image perfectly sharp in real life; there's no need for the game to layer a blur effect on top.
 
I want to upgrade to a 1440p screen too, but I can't find any brand that goes bigger than 28".
 
So a 7850 2GB has no chance at 900p? :(

With those image quality settings? No.

But I'm certain some medium-high blend of settings would work well. If AMD could release a driver update, that would be even better, perhaps.
 
Not great for AMD. Will you retest when/if AMD releases a driver for this with optimisations?

Terrible from AMD not to have a driver out in time.
 
I'm a GTX 780 Ti user. I used to think my card could blast anything on my 1440p screen.
 
For some reason, playing in borderless windowed mode made my frametimes jump through the roof and introduced some insane stuttering, even at over 50 FPS.
 
I'm a GTX 780 Ti user. I used to think my card could blast anything on my 1440p screen.
That's GameWorks for you right there, brother: GTX 7** and below, plus AMD cards, are not optimized for this game.
 
Holy crap, under 2GB of VRAM. Bravo, devs. That right there is the benefit of a game having a longer dev time instead of being rushed out. Sadly, EA will learn nothing from this.
 
The game stays under 2GB of VRAM for obvious reasons; have you seen the horrible vegetation?
Yeah, although I can't remember where, I've read that while the devs at some point said the install would be over 50GB, it is now way less.
One can take an educated guess at which parts of the game took a hit from that (and at why they did it in the first place, if you fancy a tinfoil hat).
 
I'm running the game with two 970s @ 4K, and if you want to turn everything up to max (including HairWorks), then I'd recommend capping the game to 30 FPS; you certainly won't be able to run it at 60 FPS.

Even running at a 30 FPS cap, my GPU usage is right at the limit, and during some cutscenes frame rates can drop to about 25 FPS (worst case); during general play, though, I've not seen it dip below 30 FPS (yet).
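If you'd rather set the cap in a config file than through an external limiter, it should be a one-line change in user.settings (under Documents\The Witcher 3); a sketch, assuming the LimitFPS key that the tweak guides mention:
Code:
[Viewport]
LimitFPS=30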
 
Should probably get that second 970 now... :D
 
I'm running the game with two 970s @ 4K, and if you want to turn everything up to max (including HairWorks), then I'd recommend capping the game to 30 FPS; you certainly won't be able to run it at 60 FPS.

Even running at a 30 FPS cap, my GPU usage is right at the limit, and during some cutscenes frame rates can drop to about 25 FPS (worst case); during general play, though, I've not seen it dip below 30 FPS (yet).

Try disabling movie ubersampling (it should be in the file bin/config/base/visuals.ini):
Code:
[Visuals]
Gamma=1
MovieFramerate=30.0
MovieUbersampling=true
Change that last line to MovieUbersampling=false.

One tip for gameplay is lowering the HairWorks AA; by default it uses 8x AA, which is very demanding. For better FPS, you could try 4 or 2, or even disable it completely (0). So open the file bin/config/base/Rendering.ini, find the line HairWorksAALevel=8, and try a lower AA level there.
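For example, after the edit that line would read as below (4 is just an illustrative middle ground; 2 is faster still, and 0 turns HairWorks AA off entirely):
Code:
HairWorksAALevel=4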
 
"You can tweak the settings by editing the HairWorksAALevel in the game's bin\config\base\rendering.ini. " A great find wizard .
 
I was wondering if you were going to do a performance analysis of this game. I'm glad you did; the game looks/plays great.
 
Not great for AMD. Will you retest when/if AMD releases a driver for this with optimisations?

Terrible from AMD not to have a driver out in time.

Drivers won't do much, and it's not worth a re-test. I was playing GTA 5 on old drivers and didn't notice a difference with the latest (15.4); meanwhile, my 970 friends are crashing with ERR_GFX_D3D errors.
 
I want to upgrade to a 1440p screen too, but I can't find any brand that goes bigger than 28".
Try Philips; they have some 3440x1440 34" monitors, I think.
 
Hello,

I am running it at 3440x1440 with a 970, and it's pretty demanding: I'm getting around 40 FPS.
Any tips on settings so I can reach 60 FPS without losing the wonderful graphics?

Cheers!
 
"You can tweak the settings by editing the HairWorksAALevel in the game's bin\config\base\rendering.ini. " A great find wizard .

Uh oh, silly me didn't read the conclusion page. Yes, good review, W1zzard.

Hello,

I am running it at 3440x1440 with a 970, and it's pretty demanding: I'm getting around 40 FPS.
Any tips on settings so I can reach 60 FPS without losing the wonderful graphics?

Cheers!

You should check the tweaking guide on geforce.com (especially the config file tweaks section):
http://www.geforce.com/whats-new/gu...-hunt-graphics-performance-and-tweaking-guide
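If memory serves, one of the config-file tweaks that guide covers is grass density, a single line in user.settings (section and key name assumed from that guide; the value here is just an illustration, with lower being faster):
Code:
[Rendering]
GrassDensity=1200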
 
No 780 Ti in the list? WTF? TPU, are you kidding? Quick, test it and re-add.
 
Very good review, W1zz; thanks for testing and posting. Can we assume Witcher 3 is going to be added to the test suite from here on out?
 
I take it there isn't a CrossFire profile for the game, as the 295X2 didn't scale at all (negative scaling, in fact)...

...and either I missed it (likely) or there was no mention of the lack of scaling on AMD. I see a mention of drivers and performance, but no warning about the current lack of scaling.

Excellent write-up, W1zz... keep 'em coming!
 
I would also be angry if I had a Titan and got those terrible frame rates, but I still find it a little bit funny how the interwebs flame Nvidia because a developer did not optimize their game for "older" Nvidia chips.
 