
The Witcher 3: Performance Analysis

What do you mean "bring back"? They were never there to begin with.
Whatever you saw back in 2013 was obviously a tech-demo or pre-rendered. Neither the GTX980 nor the Radeon 290X existed back then so nobody could accurately predict what would be possible to ship in 2015. One could only guess.
Of course it was, and anyone with a brain knows it would be, but the trailers themselves do say "in-game footage".
 
Of course it was, and anyone with a brain knows it would be, but the trailers themselves do say "in-game footage".

Yes, but as always, you cannot expect it to be the same or better, since it was not a finished product. Now, if it had been released and then got nerfed afterwards, that would be a different matter; however, that's not the case.

Like with most demos, there is a warning that it's not the finished product; you just would not have seen it. If you had been playing the demo at E3 or wherever, it would have said that somewhere.

People need to stop this BS.
 
Of course it was, and anyone with a brain knows it would be, but the trailers themselves do say "in-game footage".

Even when a trailer says "in-game footage", you know the whole world isn't there yet and the AI isn't finished; there are just too many unknowns at that point. Now, if you have a playable beta, that could be a fair indicator. I always take pre-release material as a guideline. If I want to know what a game looks like, I look at screenshots in day-one reviews.

Also keep in mind that because of technical difficulties, video cards are still stuck on 28nm, whereas they were supposed to be on 20 or even 16nm by now. That would probably have allowed CDPR to enable more stuff in their engine. And by the way, settings beyond Ultra are available directly in the config files, so the game can actually look better than Ultra. Next year, when video cards move to 16nm and stacked memory brings much greater bandwidth, we may actually be able to go beyond Ultra (not on mid-range cards, but enthusiasts will most likely benefit).
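For anyone curious about those beyond-Ultra tweaks, here's a minimal sketch of how you could script them against user.settings. The file path and the key names/values (GrassDensity, FoliageDistanceScale) are just the examples passed around in community tweak guides, not anything official, so treat them as assumptions and back the file up first:

Code:
# Minimal sketch: bump a couple of commonly cited "beyond Ultra" values
# in The Witcher 3's user.settings. Key names and values are assumptions
# taken from community tweak guides and may differ between game versions.
from pathlib import Path

settings = Path.home() / "Documents" / "The Witcher 3" / "user.settings"
tweaks = {"GrassDensity": "1200", "FoliageDistanceScale": "2.0"}  # assumed keys

lines = settings.read_text(encoding="utf-8").splitlines()
patched = []
for line in lines:
    key = line.split("=", 1)[0].strip()
    # overwrite the stock value if this line holds one of our target keys
    patched.append(f"{key}={tweaks[key]}" if key in tweaks else line)

settings.write_text("\n".join(patched) + "\n", encoding="utf-8")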

And one last thing: having played the first two installments, I preordered this one without even looking at how it looked. I knew it was going to look great (and bring my video card to its knees) and it does.
 
If you turn on everything, VRAM usage goes right up to ~3.5 GB at 4K. I think I was hitting the "slow" RAM on my 970, as there were some slideshow cutscenes and texture pop-in. It only happened in one situation where the memory usage was around 3.4 GB (in the palace-in-Vizima cutscene, with Geralt having a shave).

It's just barely playable if you have a G-Sync monitor and a single 970, but I am going to be purchasing another, or possibly upgrading to a 980 Ti, depending on how expensive it is.
 
How does the game run with GTX 970 SLI? Does anybody know?
 
All I have to say is I don't like how this game doesn't look like what they showed at E3, in 2013 I believe it was. It's Ubi**** all over again.
 
How does the game run with GTX 970 SLI? Does anybody know?

I have all the graphics settings on Ultra/max except for HairWorks (off) and it gives me a constant 60 fps at 1080p. Running my triple-screen setup (5760x1080), the best I can do is 40 fps on low settings, no overclock. I haven't needed to overclock until The Witcher 3 and GTA V were released. Thinking about selling the GTX 970s and getting SLI GTX 980 Ti's for the additional memory. GTA V at 5760x1080 cripples the GTX 970s.
 
What do you mean "bring back"? They were never there to begin with.
Whatever you saw back in 2013 was obviously a tech-demo or pre-rendered. Neither the GTX980 nor the Radeon 290X existed back then so nobody could accurately predict what would be possible to ship in 2015. One could only guess.
You really didn't get my post?

The argument that the game had to be dropped down to the current quality level strikes me as bogus, but if it makes people happy, that's fine. Keeping VRAM usage around 1-2 GB was a choice, not a requirement.
 
GTA V at 5760x1080 cripples the GTX 970s.

Not trying to be a Team Red fanboy here, but with FXAA and everything maxed at 6088x1080 I can get a steady 60 fps in GTA V with my CFX HD 7950s (OC'd). Try turning grass density down one notch.
 
Not trying to be a Team Red fanboy here, but with FXAA and everything maxed at 6088x1080 I can get a steady 60 fps in GTA V with my CFX HD 7950s (OC'd). Try turning grass density down one notch.

I think it's an SLI or VRAM issue with the SLI 970s on GTA V. I am running medium-high settings with FXAA and getting 85 fps on average. However, there are some weird graphical anomalies once I get above 3-3.5 GB of VRAM usage that make the game unplayable. It only stops happening when I turn the settings, and thus the VRAM usage, way down.
 
I think it's an SLI or VRAM issue with the SLI 970s on GTA V. I am running medium-high settings with FXAA and getting 85 fps on average. However, there are some weird graphical anomalies once I get above 3-3.5 GB of VRAM usage that make the game unplayable. It only stops happening when I turn the settings, and thus the VRAM usage, way down.

It's an architectural fault: it's a 3.5 GB VRAM + 0.5 GB chunk card, and only GTA V and Battlefield 4 are struggling with this.
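Rough back-of-the-envelope numbers on that split, going by NVIDIA's public explanation of the 970's segmented memory (treat the figures as approximate):

Code:
# Approximate peak bandwidth of the GTX 970's two memory segments,
# based on NVIDIA's public explanation of the 3.5 GB + 0.5 GB split.
bus_width_bits = 256      # full memory bus
data_rate_gbps = 7.0      # GDDR5 effective rate per pin

total_bw = bus_width_bits / 8 * data_rate_gbps   # ~224 GB/s combined
fast_bw = total_bw * 7 / 8                       # 3.5 GB segment: ~196 GB/s
slow_bw = total_bw * 1 / 8                       # 0.5 GB segment: ~28 GB/s
print(f"fast pool ~{fast_bw:.0f} GB/s, slow pool ~{slow_bw:.0f} GB/s")

Which is why things tend to get ugly right around the 3.5 GB mark in the posts above.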
 
I want back the video cards where you could add however much RAM you wanted yourself.
I remember back in the day, when I upgraded my S3 Virge GX from 1 MB of VRAM to 4 MB of VRAM, and all of a sudden Duke Nukem 3D and ROTT went from ~30 fps at 400x300 VGA resolution up to ~65 fps at 800x600 SVGA. Oh, what a jump in quality that was! But those were the good times. :D
 
Interesting that I am not able to get SLI working properly on my GTX 970 SLI rig...
I've tried everything, and the second GPU is snoozing at 135 MHz during gameplay.

Anyone else seeing SLI issues?
 
It's an architectural fault: it's a 3.5 GB VRAM + 0.5 GB chunk card, and only GTA V and Battlefield 4 are struggling with this.

I removed GeForce Experience and it seems to have helped: not only with the graphical anomalies, it also raised my fps across both games on the SLI 970s. Will keep testing it.
 
You know what would be awesome in these game performance analyses... putting CPU results in here too. For example, how it runs on an Intel dual-core, dual with HT, quad, quad with HT, and hex with HT... and the same for AMD: quad/hex/octo. Then overclock and underclock them...

Wow, that is a lot of work, LOL!
 
As we all know by now, the AMD Radeon R9 Fury X is here... Time for a new review/thread.
 
As we all know by now, the AMD Radeon R9 Fury X is here... Time for a new review/thread.

It's a little below the 980 Ti. It trades blows in some games, beating it in a select few, but overall it's slightly below. It drains more power, too.
Not unexpected either: even though it uses HBM, the much lower memory clock means the bandwidth isn't that much higher. And the raw power hasn't seen much of a bump either.

Edit: Do not read the above as a criticism of AMD. Because of TSMC, they're stuck on 28nm like everybody else, so they can't physically fit more processing power into current GPUs. Remember, the larger the die, the higher the cost to build one. But next year, when GPU makers will (hopefully) be able to make the jump to 16nm, there will be good things in store for us.
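To put rough numbers on the bandwidth point (figures taken from the published specs, so treat them as approximate):

Code:
# Peak memory bandwidth from the spec sheets: bus width (in bytes) x per-pin data rate.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

fury_x    = bandwidth_gb_s(4096, 1.0)  # HBM1: huge 4096-bit bus, but only 1 Gbps per pin -> 512 GB/s
gtx980_ti = bandwidth_gb_s(384, 7.0)   # GDDR5: 384-bit bus at 7 Gbps per pin -> 336 GB/s
print(fury_x, gtx980_ti)               # 512.0 336.0

So roughly 50% more on paper; the bigger jumps will have to wait for the next node and faster stacked memory.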
 
Any of you guys getting worse frame rates in The Witcher 3 since patch 1.08?
 
Any of you guys getting worse frame rates in The Witcher 3 since patch 1.08?
I've been beating the benchmarks in the original article with an i5-4690K and a GTX 980 at 1440p.

Average 55 fps with everything on Ultra, except foliage distance, which is set to High.
 