
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

Well, just look at the history of ATI driver releases. Almost every game that comes out gets a patch a while after the fact, and continuously so. I'd say ATI has a more reactive approach to supporting games than a proactive one.

What doesn't make sense to me is why everyone was so ready to jump down NVIDIA's throat. And seriously, hear me out on this one. There are shit tons of games that are 'TWIMTBP' and have in-game AA for ATI. Why would they cock-block ATI on this game alone? This reeks more of ATI not supporting the game out of the gate, like most games that get emergency patches from them, than it does anything else.

Even the ATI fanboys should have taken this one with a grain of salt.
 
I agree, but it's even worse IMO. From what I read, they discovered all this after the game had launched! That means they had no contact with the developer at all! I mean, if you are a GPU maker, don't you contact developers and try to optimize before launch, or at least start working on optimizing the full game before it ships? Don't you ask for a copy? IMO, if they cared so little about the game that they didn't even contact them, AMD deserves every bit of unoptimized code they get. Especially when it comes from a feature that was never there before and was developed for Nvidia at their request, paid for with their money. The fact that the optimization works on ATI cards as well changes nothing IMO. If I were the developer, I would have done the same.

IMO people should stop caring IMO move on to the next game IMO game is simplistic and easy to beat IMO :spoiler: batman dies :spoiler:
 

No, it requires a lot of GPU power to render all the extra objects created by PhysX; it requires next to no GPU power to actually run PhysX. The rendering of the extra objects all depends on the card's ability to render graphics.
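To make that distinction concrete, here is a minimal C++ sketch (hypothetical function and type names, not the real PhysX or Direct3D API): the simulation step itself is a small, roughly fixed cost, while the draw submissions scale with however many debris chunks the effects have spawned.

[CODE]
#include <cstdio>
#include <vector>

struct Debris { float x, y, z, vy; };

// Cheap part: integrate positions (stand-in for the PhysX solver step).
void stepPhysics(std::vector<Debris>& chunks, float dt) {
    for (auto& c : chunks) { c.vy -= 9.8f * dt; c.y += c.vy * dt; }
}

// Expensive part in aggregate: one draw submission per visible object,
// so the render cost grows with the number of spawned chunks.
void submitDrawCall(const Debris& c) {
    std::printf("draw chunk at (%.1f, %.1f, %.1f)\n", c.x, c.y, c.z);
}

int main() {
    std::vector<Debris> debris(1000); // PhysX effects spawn piles of these
    stepPhysics(debris, 0.016f);      // small, roughly fixed per-frame cost
    for (const auto& c : debris)      // this loop is what eats the GPU
        submitDrawCall(c);
}
[/CODE]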

Well, just look at the history of ATI driver releases. Almost every game that comes out gets a patch a while after the fact, and continuously so. I'd say ATI has a more reactive approach to supporting games than a proactive one.

What doesn't make sense to me is why everyone was so ready to jump down NVIDIA's throat. And seriously, hear me out on this one. There are shit tons of games that are 'TWIMTBP' and have in-game AA for ATI. Why would they cock-block ATI on this game alone? This reeks more of ATI not supporting the game out of the gate, like most games that get emergency patches from them, than it does anything else.

Even the ATI fanboys should have taken this one with a grain of salt.

What is even more interesting, and something I just noticed, is that in the demo's graphics launcher it is even referred to as "nVidia Multi Sample Anti-Aliasing". Kind of makes sense to call it that if nVidia was the one that paid to have it added to the game...
 
Well, just look at the history of ATI driver releases. Almost every game that comes out gets a patch a while after the fact, and continuously so. I'd say ATI has a more reactive approach to supporting games than a proactive one.

What doesn't make sense to me is why everyone was so ready to jump down NVIDIA's throat. And seriously, hear me out on this one. There are shit tons of games that are 'TWIMTBP' and have in-game AA for ATI. Why would they cock-block ATI on this game alone? This reeks more of ATI not supporting the game out of the gate, like most games that get emergency patches from them, than it does anything else.

Even the ATI fanboys should have taken this one with a grain of salt.

Surely you don't think this is the only game where nvidia has been playing dirty? Since the whole TWIMTBP program started, there have been numerous claims about different games performing poorly on ATI cards in ways that don't make sense next to similarly performing nvidia cards, and to be honest, this thread is just repetitive now on both sides.

I really couldn't care less. The more I see "TWIMTBP" in games, the more I will not buy another nvidia card, simple as that. And if it comes to it, I won't buy the games either if this continues, and both nvidia and the game devs can go screw themselves. This is the view of many people, including nvidia users as well!

It's not good practice, and if they carry on with these tactics it will hurt them in the end.
 
I'm pretty sure most of the instances of people complaining about 'TWIMTBP' were fixed later by an ATI driver update or a game patch, and weren't related to NVIDIA tampering.

Start naming the issues with 'TWIMTBP' that you've seen, and we'll research how many turned out to be NVIDIA straight-up tampering versus just a broken driver that was later fixed. I'm not being an ass; I'm curious to know the numbers on this myself.


Edit: If that's the case, Newtekie, then it looks like it truly is an NVIDIA-added feature. I almost expect to see NVIDIA demanding an apology from Ian McNaughton over this.
 
Surely you don't think this is the only game where nvidia has been playing dirty? Since the whole TWIMTBP program started, there have been numerous claims about different games performing poorly on ATI cards in ways that don't make sense next to similarly performing nvidia cards, and to be honest, this thread is just repetitive now on both sides.

I really couldn't care less. The more I see "TWIMTBP" in games, the more I will not buy another nvidia card, simple as that. And if it comes to it, I won't buy the games either if this continues, and both nvidia and the game devs can go screw themselves. This is the view of many people, including nvidia users as well!

It's not good practice, and if they carry on with these tactics it will hurt them in the end.

How is that playing dirty exactly? TWIMTBP's entire purpose is to optimize games to run better on nVidia hardware. Why are you surprised when it actually works? And furthermore, how is it playing dirty? People like to claim that games running worse on ATi hardware than on comparable nVidia hardware is proof that nVidia is somehow hindering ATi's performance, but it is really just proof that the TWIMTBP program is doing its job: optimizing performance on nVidia hardware.

ATi had a similar program, but they dropped it in favor of doing their own optimizations in drivers. It is just two different approaches to optimization.
 
No, it requires a lot of GPU power to render all the extra objects created by PhysX; it requires next to no GPU power to actually run PhysX. The rendering of the extra objects all depends on the card's ability to render graphics.

Well, there must be a lot of shit that comes in with PhysX enabled, because it halved the fps!
(And then High PhysX cost another 20% on top of that halved fps.)
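To put rough, purely illustrative numbers on that claim: assuming a 60 fps baseline, halving it leaves 30 fps, and if the High PhysX setting then costs another 20% of that, you land around 30 x 0.8 = 24 fps. These figures are just an example of the math being described, not measurements.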

Also, CUDA is designed by its nature to be hardware independent. Once the hardware vendor writes the driver to support CUDA, it will work. There really isn't a whole lot nVidia can do to make it perform worse on one than the other, and if they did, it would immediately send up red flags because the difference would be drastic.

Nvidia wants to spread PhysX, right? Then why should ATI write a driver for CUDA when it already has its own API (ATI Stream 1.x, 2.x) and there is a common API called OpenCL (currently at 1.0)? If Nvidia wants to spread PhysX, they should port it to OpenCL so it becomes available to every card.
Nvidia would do this... unless they want to use this PhysX stuff as leverage.
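For what it's worth, the vendor-neutral part of that argument is easy to illustrate. Here is a minimal C++ sketch using the standard OpenCL host API (this is not PhysX code, just a demonstration that the same host program runs against whichever vendor's OpenCL driver is installed):

[CODE]
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    // Ask the generic OpenCL loader how many vendor platforms exist.
    cl_uint count = 0;
    clGetPlatformIDs(0, nullptr, &count);

    std::vector<cl_platform_id> platforms(count);
    clGetPlatformIDs(count, platforms.data(), nullptr);

    // The same binary lists NVIDIA, ATI/AMD, or any other vendor's
    // implementation; nothing here is tied to one brand of card.
    for (cl_platform_id p : platforms) {
        char name[256] = {};
        clGetPlatformInfo(p, CL_PLATFORM_NAME, sizeof name, name, nullptr);
        std::printf("OpenCL platform: %s\n", name);
    }
}
[/CODE]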
 
How is that playing dirty exactly? TWIMTBP's entire purpose is to optimize games to run better on nVidia hardware. Why are you surprised when it actually works? And furthermore, how is it playing dirty?

ATi had a similar program, but they dropped it in favor of doing their own optimizations in drivers. It is just two different approaches to optimization.

Angry fanboys are not able to differentiate between improving performance for one card and making the other run slower. The same people don't know what optimization is, so you know why that happens. Then again, they all expect a 20% improvement from specific driver releases and see that as normal when it happens, but the idea that Nvidia achieved that much optimization and more through months of work? No, that's impossible.
 
but it is really just proof that the TWIMTBP program is doing its job, optimizing performance on nVidia hardware.

...meanwhile excluding ATI completely from development.
 
I think it does happen. What about Assassin's Creed? It came out with DX 10.1 support and it worked perfectly, BUT then all of a sudden it disappeared. I was under the impression that that had something to do with NV not being able to support it.
 
You know what, you are the biggest fanboy in this thread, which is quite obvious from your constant ass-licking and defending of nvidia.

It's bullshit because there is no reason for it other than nvidia paying the devs, not to optimise it for nvidia, but to give worse performance on ATI cards.

But hey, you defend them all you like. I couldn't care less, and like I said, the more this goes on, the more I will not buy an nvidia product, and a lot of other people feel the exact same way.

Uhh, there are a lot of reasons for devs not to optimize to 100% for either. One key one being that time costs money. Another being that the more you screw with code while optimizing it, the more chance you have to break it and create more issues for yourself. The more time spent on something, the more money it costs. What's the best way to cut costs? Here's a scenario.

Well, the game runs at 95% on both systems, but they could get it to 100% with another 20,000 man-hours. To them, 95% is good enough, and they don't need to pay developer wages x 20,000 hours. Why? Because at the end of the day, it's still playable and both camps can run the game. Nobody is losing out.

But wait! Here's NVIDIA saying, 'Hey, we'll give you money, you put TWIMTBP at the front, and you spend that time optimizing it for our hardware to 100%. Leave ATI at 95%, we don't care.' Right there, you have the idea of TWIMTBP. It's not meant to take 100% and make it 112.5%. It's not meant to take ATI's 100% down to 50%.
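To put hypothetical numbers on that scenario (the wage is my own illustrative figure, not from any source): at $30/hour, those 20,000 man-hours are 20,000 x 30 = $600,000 just to close the last 5%, which is exactly the kind of bill a publisher is happy to let NVIDIA's money cover, for one vendor's hardware only.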

I think that, aside from small optimization tweaks, you'll find people's complaints about TWIMTBP eventually turned out to have something else as the cause.
 
Well, there must be a lot of shit that comes in with PhysX enabled, because it halved the fps!
(And then High PhysX cost another 20% on top of that halved fps.)

That has been the case since PhysX was first released on the market, long before nVidia even entered the picture, back when PhysX was still owned by Ageia.

The graphical performance hit caused by PhysX is very large.



Nvidia wants to spread PhysX, right? Then why should ATI write a driver for CUDA when it already has its own API (ATI Stream 1.x, 2.x) and there is a common API called OpenCL (currently at 1.0)? If Nvidia wants to spread PhysX, they should port it to OpenCL so it becomes available to every card.
Nvidia would do this... unless they want to use this PhysX stuff as leverage.

Actually, ATi wasn't even tasked with making the driver; an outside developer was willing to do it, they just needed some support from ATi. And at the time, ATi Stream was, and still is, pretty much unused; I don't believe it even existed when CUDA and PhysX were developed. And nVidia only just released an OpenCL-compliant driver, because OpenCL has also only been around for a short while.

And at this point, it is kind of pointless to port PhysX over, as it is pretty much dead thanks to DX11's compute support, which physics middleware can target instead.
 
...meanwhile excluding ATI completely from development.

Game developers owe nothing to ATI. If ATI doesn't help them, why should they help ATI at all? And it goes far beyond that: not only do they not help them optimize, they don't even care enough to be around when launch is close and show some interest. So, as I said, they owe them nothing.

They don't owe you anything either, nor do they owe me anything. They make a product as well as they can, or as well as they want, and if it's good enough for you, you buy it. If you don't like it, you don't buy it and they lose. That's how it works.
 
I think what is needed here is a little calm, people. There really is no need for anyone here to get heated; it's just a computer game. Yes, nvidia has done something to it, so what? Are you going to die from it? Is it going to make your house burn down and eat your children? Mussels said this thread was getting over the top, so please be careful or this thread will face being closed.
We are all adults here, not children in a schoolyard.
Anyway, both cards have their pluses and minuses. I will probably get both.
Why? Because I can ;)
 
You know what, you are the biggest fanboy in this thread, which is quite obvious from your constant ass-licking and defending of nvidia.

It's bullshit because there is no reason for it other than nvidia paying the devs, not to optimise it for nvidia, but to give worse performance on ATI cards.

But hey, you defend them all you like. I couldn't care less, and like I said, the more this goes on, the more I will not buy an nvidia product, and a lot of other people feel the exact same way.

Says the guy with the big ATi symbol under his name....

And because you are so blinded, you can't even entertain the possibility that nVidia is actually doing what they say they are doing and paying for the game to be optimized on their hardware. You find that way too far-fetched, and instead believe that they are simply paying to have performance held back on ATi hardware... that makes sense... I guess... :laugh:

You know, the simplest explanation is usually the correct one. Which seems simpler to you? That the program is being used like nVidia says, optimizing the game to run on their hardware, OR that there is a huge conspiracy where nVidia uses the program as a front to hinder performance on ATi hardware and screw ATi over?

But now that you have descended to simply flaming instead of making intelligent points to back up your argument, I'll ignore you now, as you have lost.
...meanwhile excluding ATI completely from development.

No, not really.
 
Says the guy with the big ATi symbol under his name....

And because you are so blinded, you can't even entertain the possibility that nVidia is actually doing what they say they are doing and paying for the game to be optimized on their hardware. You find that way too far-fetched, and instead believe that they are simply paying to have performance held back on ATi hardware... that makes sense... I guess... :laugh:

You know, the simplest explanation is usually the correct one. Which seems simpler to you? That the program is being used like nVidia says, optimizing the game to run on their hardware, OR that there is a huge conspiracy where nVidia uses the program as a front to hinder performance on ATi hardware and screw ATi over?

But now that you have descended to simply flaming instead of making intelligent points to back up your argument, I'll ignore you now, as you have lost.


No, not really.

Says the guy who has been fighting this like his life depended on it?

Oh noes, I have lost on the interwebs :eek: haha. Get over it, mate; after all, it is a discussion, not a fight/game ;)

How is it a conspiracy? ATI is nvidia's only real competitor (I'm sorry, I don't see intel/matrox/SiS as competitors), so why is it so far-fetched that so many ATI users who actually own the hardware and have seen the numbers on various "optimised" games are consistently reporting poor performance against similar or even lower-performing NV cards?

Optimised my arse. It's not Xbox 360 vs PS3, it's PC, and a game should play the same on similarly performing cards regardless of whether they are ATI or NV.

It never used to be that way: a game was made and it played the same on both NV and ATI as long as the cards were on par in overall performance, with only slight differences in image quality between the two.
 
What NV is doing here is reducing the overall sales of the game by pulling this stunt, which in turn will affect their own cash flow, because users of ATI hardware won't buy the game. Remember, when anything carrying a logo or slogan such as "TWIMTBP" gets bought, some of that money goes to them as well, just for the logo.
 
Once again, NV has pulled out the dirty tactics.

"I typically save most gaming news for the semi-regular Binge, but I think that this story deserves its own slot. As part of a software update to its popular Assassin's Creed game, Ubisoft seems to have removed DirectX 10.1 functionality from the game entirely. This is interesting for a few reasons; first and foremost, it just doesn't make sense to remove "main attraction" features from a game - especially if the removal of these features results in reduced game performance on systems using a graphics card supporting DX 10.1. Secondly - and most importantly - because this title is part of Nvidia's "The Way it's Meant to be Played" program, the moves smells very much like collusion - seeing as no current Nvidia graphics cards support DX 10.1. This was a terrible decision, and one can only wonder if karma will rear its ugly head...as it should."

DX 10.1 offered a 20% increase in performance when AA was being used, but then they scrapped it. Go figure.
I'm assuming that this stunt is along the same lines.
 
It never used to be that way: a game was made and it played the same on both NV and ATI as long as the cards were on par in overall performance, with only slight differences in image quality between the two.

You seem unwilling to acknowledge my posts directly refuting yours. No matter; the statement above is entirely wrong. Before TWIMTBP there was Digital Vibrance. ATI itself had a TWIMTBP variant, though I can't remember what it was called. And what of Sideport and its expected massive boost over NVIDIA? Eyefinity? CUDA and PhysX? Havok?

Everyone has had many marketing gimmicks. And everyone still does.
 
What nvidia is doing to ati sounds like what intel did to amd, and intel was fined more than 1 billion for that! I'm just curious what will happen if ati finds proof of nvidia's dirty games. Will they be fined the same way intel was? I think that's the only way to get nvidia to stop playing dirty and focus on other things!

I believe there will be a lawsuit if this is not made right.
 
Wow, I think this thread has more replies than the 5870 release thread... Honestly, this doesn't surprise me... Although you really have to wonder how Nvidia is making so much money that they can not only afford to develop competing cards, but also pay (or bribe, as someone said earlier) developers to optimize games solely for their GPUs.

Maybe it has something to do with their $500+ cards? :) ... edit: and huge market share (in b4 fanboyz)
 
Alright, the off-topic discussions stop here.
 
Phew, thanks BTA. I mean, I love reading drama as much as anyone, but this is getting ridiculous.
 