
Witcher 3 + HairWorks tanks AMD performance - AMD complains it's NVIDIA again

Do you think NVIDIA's HairWorks deliberately sabotages AMD's performance?


Total voters: 33

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.77/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Here we go again. I don't know whom to believe, but I suspect it's AMD's fault for not working with the game developer to optimize HairWorks on their cards. NVIDIA makes the source code available to those with a licence, so there's nothing stopping AMD from buying a licence and optimizing it with the game developer.

And isn't it AMD whose drivers haven't had a WHQL release for 160+ days now? What does that say about the company?


Check out the non-HairWorks performance of both cards - NVIDIA is slightly faster here too.

"We've been working with CD Projeckt Red from the beginning. We've been giving them detailed feedback all the way through," AMD's chief gaming scientist Richard Huddy told Ars Technica. "Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned. We were running well before that... it's wrecked our performance, almost as if it was put in to achieve that goal."

www.eurogamer.net/articles/digitalfoundry-2015-does-nvidia-hairworks-really-sabotage-amd-performance
 
I read the title and thought 'Qubit'. No offence but these threads and votes simply create hostility.
Why give a torch to flamers?
 
there's nothing stopping AMD from buying a licence and optimizing it with the game developer.
This.
AMD has historically had issues with tessellation, and even the 290X/290 don't have the amount of tessellation hardware that Kepler / Maxwell have. Again they have suggested a "fix" by lowering the tessellation factor in their drivers - which is only an option because the Radeons are a bit lacking in tessellation performance in the first place. Takes me back to 2010 / 2011.

It's a shame they cry foul every time something like this happens, and, as you say, 160+ days of no new drivers... That's not just a fail for new games that only get beta drivers, but a fail for overall optimization. Back in the 5xxx / 6xxx days it wasn't uncommon to see driver after driver with "xx% increase in game X for 5800 / 6900 series".

I don't lean either way when it comes to performance of a specific manufacturer over another, but driver updates are a very important aspect of after sales support in GPU land. If they are going to continue in this vein, I don't see myself getting a 390X over my 970.
 
NVIDIA did it with PhysX. AMD did it with TressFX. NVIDIA is doing it with HairWorks. In my opinion, developers should never touch proprietary, hardware-based middleware. As such, it's CD Projekt RED that did the sabotaging, not AMD nor NVIDIA. If they weren't going to bother implementing TressFX too, they should never have wasted time on HairWorks. The fact that both the Xbox One and PlayStation 4 ports would have benefitted from TressFX further increases the shame on CD Projekt RED. To save face, I think they should pull the plug on HairWorks or rush out a patch for TressFX. They should have given NVIDIA the finger in the first place, but that ship has already sailed.


Here's a link directly from AMD about it: https://community.amd.com/community/gaming/blog/2015/05/12/tressfx-hair-cross-platform-and-v20

As to the accusation of sabotage directly, I can't confirm nor deny. All we have are the facts as presented by AMD which, if they are to be believed, show NVIDIA did not optimize HairWorks for AMD cards. That said, NVIDIA has an established history of not giving any !@$%^ about AMD compatibility with GPU-accelerated PhysX. These numbers do not surprise me. All of the developers that used hardware-accelerated PhysX in their games are to blame for pushing that on AMD gamers. The same applies here with HairWorks. How many times do developers have to get burned before they learn to stay the *%#! away from NVIDIA middleware?
 
AMD has historically had issues with tessellation, and even the 290X/290 don't have the amount of tessellation hardware that Kepler / Maxwell have.

I figured that Nvidia's Kepler-based GPUs had better tessellation than AMD cards too, but I've seen benchmarks where the R9 290X & 290 are beating out the GTX 780 Ti & 780 in this game... The question I have is: did Nvidia cripple their own Kepler-based GPUs with this game just so they can sell more 900 series cards???

Anyone here with mid to high-end Kepler-based GPUs, are you getting good performance in this game? :(
 
Hairworks has TERRIBLE performance issues on nVidia. It is just as bad for nVidia cards as it is for AMD.

Using 16x tessellation and 16x AA is just a joke of an idea and can't work well on any card currently on the market.

In my opinion, they just had to find some exclusive feature for the PC version to support the claim that "the PC version is superior to console", because really the graphics on ultra (with a 2000-euro PC) are only very slightly better than the console version, which costs 400 euros.
 
From what I've been seeing, Maxwell is simply better at tessellation than Kepler. Check out the review here; it shows that to be the case.
I don't think Nvidia did this to make people upgrade to Maxwell. I think there would have been a leak within Nvidia by now if they had, but who knows.

http://www.techpowerup.com/reviews/Performance_Analysis/The_Witcher_3/4.html

"On the NVIDIA side, recent cards based on the Maxwell architecture are doing much better than their Kepler predecessors, which is due the Tessellation improvements in the new architecture."
 
HairWorks sucks anyway; the effect is minimal for the performance hit it causes, and it hurts Nvidia cards as well as AMD, although AMD do seem to take the biggest hit.

If it's such a problem, turn the damn setting off, as you're really not missing much.
 
Dunno what the big issue is, as stated before, hairworks isn't even worth it, I've run the game on high end Maxwell and AMD rigs, as well as mid range Kepler systems, and it doesn't make that much of a visual impact.

Certainly not enough to justify the loss of performance, I leave it on for my two Maxwell based rigs, but on my other systems I just turn it off, and the game runs decently enough that I can even play it on high settings in my laptop with hairworks off.

Trust me when I say this, the feature doesn't even add that much value to the image quality, AMD and Kepler users aren't missing much.

Besides, didn't we go through the same problems with TressFX a few years ago? I remember when Tomb Raider had just come out: even on my Titans the feature killed performance and/or produced tons of artifacts, while my 290Xs ran the feature just fine. I don't remember Nvidia crying foul back then; they basically just told their card users to suck it up and turn off the feature while new drivers for Tomb Raider were developed.

Both teams play the same game, and there's always been plenty of fans from both camps crying foul play, except this time, the cries are coming from high up the managing ladder on the red team, I blame the Illuminati on this conspiracy to break the performance for the red team :slap:
 
I read the title and thought 'Qubit'. No offence but these threads and votes simply create hostility.
Why give a torch to flamers?
You were right to figure out it was me, but not so much about giving a torch to flamers.

Basically, the controversial stuff is inevitably the most interesting and generates the most conversation, which is why I post about it.

I've also noticed that the conversations around these issues are much more sensible and well thought out nowadays, so flamers aren't really much of a problem any more. We shouldn't feel gagged from talking about a subject just because some people may choose to misbehave. That's wrong on so many levels. The problem is with those people, not us, and they'll be dealt with if necessary.

Take my "AMD bust by 2020" thread, for example. It's gone quite well and there have been some really good posts on there with some great insights. :) Why give all that up through fear of potential trolls?
 
Yes, and it's an obvious one. Just like Crysis 2 (or was it 3?) with its random ocean under the map, these games have excess, unneeded tessellation. AMD's drivers allow you to manually cap the tessellation factor, and you see no visible difference but huge performance increases - but average gamer Joe has no idea this setting exists, since it's not an in-game setting.


I also think the 160-days thing about the AMD drivers is total flamebait, because guess what? I haven't needed them! The drivers they released were stable and bug-free, and if it was Nvidia that did it, there'd be threads all over the internet praising them.
Should you have hit one of the few bugs in the stable release (such as the FM2/AM1 APU icon-corruption issue), you could use the betas they've been releasing on average every two months and resolve your issue.
 
I think this is mainly an issue of Witcher 3 / HairWorks using too much tessellation and AA for the hair. There have been comparison shots of Geralt's hair with different tessellation level caps (forced by the option in Catalyst Control Center), and I think there's little visual hit from dropping the tessellation level from 64x to 16x, while the performance impact is reduced dramatically.
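
As a very rough back-of-the-envelope sketch of why the cap helps so much (assuming, as is commonly reported, that HairWorks expands each strand through hardware tessellation so the rendered segment count scales roughly linearly with the factor; the strand count below is a made-up placeholder, not a real figure from the game):

```python
# Rough illustration only: assumes rendered hair segment count scales
# roughly linearly with the tessellation factor, and uses a made-up
# strand count -- neither number comes from the game itself.
GUIDE_STRANDS = 20_000  # hypothetical number of visible hair strands

def hair_segments(tess_factor, strands=GUIDE_STRANDS):
    """Approximate line segments the GPU has to generate and shade."""
    return strands * tess_factor

for factor in (64, 16, 8):
    segs = hair_segments(factor)
    print(f"tess {factor:>2}x: ~{segs:,} segments "
          f"({segs / hair_segments(64):.0%} of the 64x load)")
```

Dropping from 64x to 16x cuts the hair geometry to roughly a quarter, which fits the big FPS gains people report from the Catalyst cap while the comparison shots look nearly identical.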

I don't think AMD not releasing a WHQL-signed driver for half a year is that bad in their current situation. Their current drivers work well, don't have any serious bugs for most users, and all of their current GPUs are old and have already received a lot of optimization over the years - more than their Kepler-based competitors. The GTX 680 beat the HD 7970 back in 2012, but nowadays it's the other way around. So there might not be much left to optimize for the current GCN GPUs. The only real issue is the lack of CrossFire profiles - I'd be annoyed if I had an R9 295X2 and could only utilize one GPU for TW3.

About TW3 itself, I'm running it happily on an overclocked HD 7850. Settings are a mix of medium and high and I'm getting a stable 30 FPS. No stuttering; it runs smoothly with the nice, "cinematic" 30-FPS experience, and it looks awesome.
 
NVIDIA does this bullshit all the time, and it's among the reasons why I don't buy their crap. When AMD implemented TressFX (their hair simulation technology), ALL cards with DirectCompute support (yes, even NVIDIA's) could use it and no vendor was discriminated against. If there was a performance hit, it was roughly the same on all cards. But with NVIDIA and their proprietary junk, it's always some shit that makes the competition look worse. So FU, NVIDIA. One time is a coincidence; after observing this BS for almost two decades, it starts to look intentional...
 
Dunno what the big issue is, as stated before, hairworks isn't even worth it, I've run the game on high end Maxwell and AMD rigs, as well as mid range Kepler systems, and it doesn't make that much of a visual impact.

Certainly not enough to justify the loss of performance, I leave it on for my two Maxwell based rigs, but on my other systems I just turn it off, and the game runs decently enough that I can even play it on high settings in my laptop with hairworks off.

Trust me when I say this, the feature doesn't even add that much value to the image quality, AMD and Kepler users aren't missing much.

Besides, didn't we go through the same problems with TressFX a few years ago? I remember when Tomb Raider had just come out: even on my Titans the feature killed performance and/or produced tons of artifacts, while my 290Xs ran the feature just fine. I don't remember Nvidia crying foul back then; they basically just told their card users to suck it up and turn off the feature while new drivers for Tomb Raider were developed.

Both teams play the same game, and there's always been plenty of fans from both camps crying foul play, except this time, the cries are coming from high up the managing ladder on the red team, I blame the Illuminati on this conspiracy to break the performance for the red team :slap:
Wasn't the reason GeForce cards had bad performance with TressFX at launch simply that AMD was working with the Tomb Raider developer while NVIDIA just sat there? The TressFX code is open; NVIDIA could have optimized their drivers the moment the code was available. It also seems that, these days, TressFX is more efficient on NVIDIA GPUs than HairWorks is. But stuff like GameWorks is just what NVIDIA does. Granted, when they do make something, it's often more mature than the open-source alternative.
But they will never stop trying to make devs prefer CUDA over OpenCL, for example. HairWorks is just the same thing. They would rather make a closed alternative and fragment the market than work with what's available.
But AMD is also too lazy. Tomb Raider and the next Deus Ex are the only big names using TressFX... CD Projekt didn't even bother to see what they could have done with it. AMD doesn't understand that they have to sell/market the damn thing!
 
Wasn't the reason GeForce cards had bad performance with TressFX at launch simply that AMD was working with the Tomb Raider developer while NVIDIA just sat there? The TressFX code is open; NVIDIA could have optimized their drivers the moment the code was available. It also seems that, these days, TressFX is more efficient on NVIDIA GPUs than HairWorks is. But stuff like GameWorks is just what NVIDIA does. Granted, when they do make something, it's often more mature than the open-source alternative.
But they will never stop trying to make devs prefer CUDA over OpenCL, for example. HairWorks is just the same thing. They would rather make a closed alternative and fragment the market than work with what's available.
But AMD is also too lazy. Tomb Raider and the next Deus Ex are the only big names using TressFX... CD Projekt didn't even bother to see what they could have done with it. AMD doesn't understand that they have to sell/market the damn thing!

Wrong: NVIDIA didn't receive any game code featuring TressFX from Crystal Dynamics until a week before the game was released. How quickly people forget things:

We are aware of performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn’t receive final game code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible.

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not play Tomb Raider until all of the above issues have been resolved.

In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.

http://www.geforce.com/whats-new/ar...14-14-beta-drivers-released#comment-820105287

The feature was quickly patched to work properly on Nvidia hardware, but back then fans alleged Nvidia didn't receive the code because of the exclusivity deal for this particular feature between AMD and Crystal Dynamics. Sounds familiar?

Like I said, same old story, both teams pull the same tricks over and over again, only short term memory seems to favor whoever cries foul the last. :nutkick:
 
Just like with TressFX, it's too early to say. Maybe AMD hasn't had time to optimize yet.
 
Wrong: NVIDIA didn't receive any game code featuring TressFX from Crystal Dynamics until a week before the game was released. How quickly people forget things:

We are aware of performance and stability issues with GeForce GPUs running Tomb Raider with maximum settings. Unfortunately, NVIDIA didn’t receive final game code until this past weekend which substantially decreased stability, image quality and performance over a build we were previously provided. We are working closely with Crystal Dynamics to address and resolve all game issues as quickly as possible.

Please be advised that these issues cannot be completely resolved by an NVIDIA driver. The developer will need to make code changes on their end to fix the issues on GeForce GPUs as well. As a result, we recommend you do not play Tomb Raider until all of the above issues have been resolved.

In the meantime, we would like to apologize to GeForce users that are not able to have a great experience playing Tomb Raider, as they have come to expect with all of their favorite PC games.

http://www.geforce.com/whats-new/ar...14-14-beta-drivers-released#comment-820105287

The feature was quickly patched to work properly on Nvidia hardware, but back then fans alleged Nvidia didn't receive the code because of the exclusivity deal for this particular feature between AMD and Crystal Dynamics. Sounds familiar?

Like I said, same old story, both teams pull the same tricks over and over again, only short term memory seems to favor whoever cries foul the last. :nutkick:
So the whole "TressFX is open" thing is just AMD trying to look like the good guy? -_- Sigh... it would be better if that sort of thing wasn't in the hands of the GPU makers and was more of an engine feature instead.
 
NVIDIA did it with PhysX. AMD did it with TressFX. NVIDIA is doing it with HairWorks. In my opinion, developers should never touch proprietary, hardware-based middleware. As such, it's CD Projekt RED that did the sabotaging, not AMD nor NVIDIA. If they weren't going to bother implementing TressFX too, they should never have wasted time on HairWorks. The fact that both the Xbox One and PlayStation 4 ports would have benefitted from TressFX further increases the shame on CD Projekt RED. To save face, I think they should pull the plug on HairWorks or rush out a patch for TressFX. They should have given NVIDIA the finger in the first place, but that ship has already sailed.


Here's a link directly from AMD about it: https://community.amd.com/community/gaming/blog/2015/05/12/tressfx-hair-cross-platform-and-v20

As to the accusation of sabotage directly, I can't confirm nor deny. All we have are the facts as presented by AMD which, if they are to be believed, show NVIDIA did not optimize HairWorks for AMD cards. That said, NVIDIA has an established history of not giving any !@$%^ about AMD compatibility with GPU-accelerated PhysX. These numbers do not surprise me. All of the developers that used hardware-accelerated PhysX in their games are to blame for pushing that on AMD gamers. The same applies here with HairWorks. How many times do developers have to get burned before they learn to stay the *%#! away from NVIDIA middleware?


Could not agree more. They should have added both, but I bet the agreement with NVIDIA did not allow it.
 
The reason this one with The Witcher is stupid is that a simple .ini tweak fixes it - and it helps Nvidia performance at the same time.

An in-game setting would have resolved this drama before it started, and should be added in a patch.
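
For reference, the tweak I've seen passed around (from my own install, so the exact path and default value may differ between game versions) is lowering the hair anti-aliasing line in rendering.ini under the game's bin\config\base folder:

```ini
; <Witcher 3 install>\bin\config\base\rendering.ini
; Default is reportedly 8; lowering it reduces the MSAA applied to the hair only.
HairWorksAALevel=4
```

As far as I can tell it only changes the anti-aliasing on the hair, so AMD users can still stack the Catalyst tessellation cap on top of it.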
 
The question I ask myself is: what will happen in the years to come? Will every game featuring TressFX/HairWorks be another "buy our GPUs" war? Will "real" hair just fade away? Or will devs have had enough of the AMD/NVIDIA petty war and just make in-house hair/fur simulation?
(If stuff like this keeps happening, I'm 100% behind closed, proprietary, engine-exclusive features.)
 
The question I ask myself is: what will happen in the years to come? Will every game featuring TressFX/HairWorks be another "buy our GPUs" war? Will "real" hair just fade away? Or will devs have had enough of the AMD/NVIDIA petty war and just make in-house hair/fur simulation?

They'll just make all game characters bald, and then the glossy dome wars will begin.
 
They are in a business competition. Each will try to screw the other over at every opportunity.

 
I voted no because even with my GTX 980 SLI, the HairWorks part of the game is still not optimised, even though I have already installed the v1.03 update from GOG Galaxy, which improves HairWorks performance, albeit slightly. I do not need AMD hardware to know that the HairWorks code in the game is crippling performance. Also, I do not need fancy hair physics unless I am playing Barbie video games or Disney Princesses. HairWorks, TressFX - both are two sides of the same coin. What good is hair if I cannot open the inventory without crashing or locking up the game?
 
NVIDIA does this bullshit all the time, and it's among the reasons why I don't buy their crap. When AMD implemented TressFX (their hair simulation technology), ALL cards with DirectCompute support (yes, even NVIDIA's) could use it and no vendor was discriminated against. If there was a performance hit, it was roughly the same on all cards. But with NVIDIA and their proprietary junk, it's always some shit that makes the competition look worse. So FU, NVIDIA. One time is a coincidence; after observing this BS for almost two decades, it starts to look intentional...
Too bad HairWorks is also DirectCompute. NVIDIA was hit hard when Tomb Raider released with TressFX on; it wasn't until 1-2 weeks later, when Nvidia released a new driver, that performance was back to where it should be.
 
The way I see it: yet another shiny example of:

"Hey: this looks nice!!!! Who cares if performance gets crippled? As long as it looks nice..."

They should be concentrating on raising the cards' performance in this game, not TANKING it...
 