
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

Wow, those are some serious statements, made all the more serious by the judicious use of large lettering.
 

Interesting. So not only does it not actually work, but it also breaks something in the game.

Sounds like one of the reasons I said in the beginning of this thread...

Both of these claims are NOT true. Batman is based on the Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.

Hey, another reason I said in the beginning.

Seems like the simplest solution is most likely to be correct...

You know, a good reporter would put up a retraction correcting his misinformation... of course, real reporters do research to make sure their story is straight before reporting it and wrongfully bashing whoever they believe to be at fault...

But Nvidia should still lower their prices since the 5870 had such a strong launch.

If people want to rage about Nvidia, just complain about the pricing. It won't take lies or misconceptions to do so. It's just plain facts.

I got ripped off buying a 7950GX2 back in the day. It scaled like crap and the drivers took forever to make it scale decently. The 9800GX2 and GTX 295 were another story though (still overpriced). :)

Definitely, but I'm sure they will, it just takes time. We are only a week out from the launch of the HD5870, so I expect a price cut announcement on at least the GTX285 and GTX295 very soon. (The others still fit well in the performance-per-dollar graph, since they have already had competition from ATi.)
 
Oops, sorry guys. I don't mean to bother you all, I'm just angry at the developer.

I will not do that again, I'm really sorry.


NB: if you come to Indonesia, just call me. I will be your guide and I will show you how beautiful Indonesia is. And btw, I'm 20 now.
 
This thread needs to die.


Those that feel offended by NV and the developer's antics know what not to buy, and those who don't can keep supporting the division of gamers.
 
Bullshit, the tests with hacked drivers were showing PhysX running just fine on ATi hardware.

You are seriously overestimating the power required to run PhysX; any current ATi hardware would have been able to absolutely kill it in PhysX performance. Remember, the original hardware the PhysX API ran on was a 128MB PCI card...

It's not running on ATI hardware. It's using the "software mode", which utilizes the CPU for the physics processing, much like Ageia before: you can still get the physics effects on screen, but with a performance hit compared to having the dedicated card.
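
Roughly speaking, the difference looks like this. A purely illustrative C++ sketch with made-up names (this is not the actual Ageia/NVIDIA PhysX SDK, just the fallback idea):

// Purely illustrative sketch -- invented names, NOT the real PhysX SDK API.
// The point: if no supported accelerator (Ageia PPU or GeForce GPU) is found,
// the same effects still run, but in "software mode" on the CPU, with a performance hit.
#include <cstdio>

enum class SimMode { Hardware, Software };

// Hypothetical stand-in for the SDK's accelerator detection.
bool HasSupportedAccelerator()
{
    return false;   // pretend no PPU/GeForce is present (e.g. an ATI-only system)
}

SimMode PickSimulationMode()
{
    // Offload to the PPU/GPU when possible, otherwise fall back to the CPU.
    return HasSupportedAccelerator() ? SimMode::Hardware : SimMode::Software;
}

int main()
{
    SimMode mode = PickSimulationMode();
    std::printf("Physics running in %s mode\n",
                mode == SimMode::Hardware ? "hardware (PPU/GPU)" : "software (CPU)");
    return 0;
}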
 
It's not running on ATI hardware. It's using the "software mode", which utilizes the CPU for the physics processing, much like Ageia before: you can still get the physics effects on screen, but with a performance hit compared to having the dedicated card.

He is talking about the hack that they were preparing at ngohq.com, which allowed PhysX to be accelerated on Ati hardware.

http://www.tomshardware.com/news/nvidia-physx-ati,5764.html

Nvidia even gave him a lot of support.

http://www.tomshardware.com/news/nvidia-ati-physx,5841.html

Quote: "In the end, if Badit could get PhysX to run on Radeon cards, the PhysX reach would be extended dramatically and Nvidia would not be exposed to any fishy business claims - since a third party developer is leading the effort."

In the end AMD didn't allow that to happen, and lied about the reasons behind that decision, because they had a deal with Intel's Havok, which only runs on the CPU. Since Intel didn't want GPU acceleration at all, PhysX could not happen; at least, fully supported PhysX couldn't happen.

EDIT: And yeah, I know they are slowly porting Havok to run on GPUs too, but that came more than a year later, because PhysX gained some support after all despite their efforts to block it, and because by the time they finish porting it Intel will have Larrabee out. The thing about GPU Havok is so fishy that the demo of Havok running on AMD's HD5xxx cards was using AMD's proprietary Stream API, yet the final product is supposed to be OpenCL...
 
Well, you can see where PhysX is now: just about where it was when Ageia appeared on the scene in 2005.
 
Ok, here is my take on this whole thing. ATI "fooled" the game into running AA natively by telling the game it was in fact an Nvidia card. Once they did this, it ran better AA than with a real Nvidia card. So basically the feature was not added to the GeForce game profile, but removed from the game's ATI profile. Yes, the Unreal 3 engine does not in fact support AA, but ATI's Catalyst has supported AA in the Unreal engine since, I believe, 9.2. To me this is proof the TWIMTBP program is paying developers to hamstring ATI.

NOW, if AA was offered no matter what GPU you had but happened to run better on Nvidia, then I would accept that as fair play within the TWIMTBP program. However, Nvidia cheated ATI users out of something their cards are VERY capable of doing natively. After all, we are talking about AA, not PhysX.
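
For anyone wondering what "telling the game it's an Nvidia card" actually means: a D3D9 game can read the vendor ID the driver reports and gate features on it. Something roughly like this is my guess at the kind of check involved (not the actual Batman code):

// Hedged sketch: how a D3D9 game *could* gate its in-game AA option on the reported GPU vendor.
// A guess at the general technique only; not code from Batman: Arkham Asylum.
#include <d3d9.h>
#include <cstdio>

bool IsNvidiaAdapter()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DADAPTER_IDENTIFIER9 id = {};
    bool nvidia = false;
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        nvidia = (id.VendorId == 0x10DE);   // 0x10DE = NVIDIA, 0x1002 = ATI/AMD

    d3d->Release();
    return nvidia;
}

int main()
{
    // If the driver (or a hack) reports an NVIDIA vendor ID, the AA option gets exposed.
    std::printf("In-game AA option %s\n", IsNvidiaAdapter() ? "enabled" : "hidden");
    return 0;
}

Spoof the reported ID and a check like that passes on Ati hardware too, which is exactly what the hack did; whether the AA actually works afterwards is the whole argument.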

Nvidia just had a Tonya Harding moment.
nancy_kerrigan_biography_2.jpg
 
Ok, here is my take on this whole thing. ATI "fooled" the game into running AA natively by telling the game it was in fact an Nvidia card. Once they did this, it ran better AA than with a real Nvidia card. So basically the feature was not added to the GeForce game profile, but removed from the game's ATI profile. Yes, the Unreal 3 engine does not in fact support AA, but ATI's Catalyst has supported AA in the Unreal engine since, I believe, 9.2. To me this is proof the TWIMTBP program is paying developers to hamstring ATI.

NOW, if AA was offered no matter what GPU you had but happened to run better on Nvidia, then I would accept that as fair play within the TWIMTBP program. However, Nvidia cheated ATI users out of something their cards are VERY capable of doing natively. After all, we are talking about AA, not PhysX.

Nvidia just had a Tonya Harding moment.
http://1.bp.blogspot.com/_Wn9gB8wTe...RimJJ5xoY/s400/nancy_kerrigan_biography_2.jpg

Did you read the latest info that has been posted in the last few pages? Not only is the AA not better on Ati cards, they are not doing AA at all, and it breaks the game. :shadedshu
 
TheMailMan78 did not get the memo. :o

Seriously read the other posts and edit if necessary. ;)
 
Did you read the latest info that has been posted in the last few pages? Not only is the AA not better on Ati cards, they are not doing AA at all, and it breaks the game. :shadedshu

TheMailMan78 did not get the memo. :o

Seriously read the other posts and edit if necessary. ;)

There are 11 pages! Give me some links, damn it!
 
Start on page 7, around my first post. I'm still not sure which way this is going as both sides have good evidence against each other. I tend to lean towards NVIDIA though because breaking only one game doesn't make sense.
 
There are 11 pages! Give me some links, damn it!

I'm too lazy to look it up, so here is a summary:

  • The claim was made that nVidia paid to have AA disabled for ATi hardware.
  • The claim was made that AA works.
  • The claim was made that there was no reason to disable the feature for ATi hardware, other than nVidia paying to have it disabled.
  • Some arguing.
  • The claim was made that AA was a feature that nVidia funded the addition of.
  • The claim was also made that perhaps the feature was disabled on ATi hardware due to it breaking the game.
  • Some arguing.
  • The claim was made that AA is a standard feature in the Unreal 3.5 Engine.
  • The claim was made that ATi proved it doesn't break the game, because if it works in the demo, it will work in the entire game.
  • Some arguing.
  • It was revealed that AA is not a standard feature in the Unreal 3.5 Engine, and nVidia did in fact fund its addition to the game. Source
  • It was revealed that changing the device ID to allow AA to be enabled in-game actually breaks the game on ATi hardware. Source
  • It was revealed that, even with the setting enabled, ATi hardware didn't actually do AA because the feature was not designed for ATi hardware. Source

I think that about covers it.

The discussion should be pretty much over with that. There is no wrongdoing on nVidia's part. They paid for the development and inclusion of AA in Batman, so it is only fair that only their hardware gets the benefit. ATi was free to do the same, but they didn't; it is their loss, and more importantly the loss of their customers. And contrary to the original reports by ATi, the feature doesn't actually work on ATi hardware. The setting can be enabled in the demo and the full game, but it doesn't actually do anything, and it breaks the full version of the game.

Perhaps if the two of them would work a little more together, we could see extras like this added to all games, working on both. Though we don't want them working so closely together that we get another price-fixing situation... :laugh:
 
Perhaps if the two of them would work a little more together, we could see extras like this added to all games, working on both. Though we don't want them working so closely together that we get another price-fixing situation... :laugh:

And the feature probably almost works; I mean, it probably requires just some light recoding. What it does need is a lot of testing and QA on Ati hardware, with someone with extensive knowledge of the Ati architecture (i.e. an AMD engineer) helping a bit, and that's pretty much all. It's not a feature of the UE, and it's not a feature present in DX, not in this exact form at least. So it's not something you can take for granted will work properly under all conditions. A game developer can't release a game with a feature that has not been properly tested.
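
To give an idea of why it isn't just a checkbox: since UE3 doesn't expose AA itself, the vendor-specific path has to create and resolve a multisampled render target on its own, roughly along these lines (a generic D3D9 sketch of the technique, not the game's actual code):

// Rough D3D9 sketch of the general approach: draw into an explicitly created
// multisampled target, then resolve it into a plain surface with StretchRect.
// Illustrates the technique only; assumes 'device' and 'backBuffer' already exist.
#include <d3d9.h>

HRESULT RenderWithForcedMsaa(IDirect3DDevice9* device,
                             IDirect3DSurface9* backBuffer,
                             UINT width, UINT height)
{
    IDirect3DSurface9* msaaTarget = nullptr;

    // 1. Create a 4x multisampled color target (a matching depth buffer would also be needed).
    HRESULT hr = device->CreateRenderTarget(width, height, D3DFMT_A8R8G8B8,
                                            D3DMULTISAMPLE_4_SAMPLES, 0,
                                            FALSE, &msaaTarget, nullptr);
    if (FAILED(hr)) return hr;

    // 2. Draw the scene into the multisampled target instead of the back buffer.
    device->SetRenderTarget(0, msaaTarget);
    //    ... normal scene rendering happens here ...

    // 3. Resolve the multisampled target down into the single-sample back buffer.
    hr = device->StretchRect(msaaTarget, nullptr, backBuffer, nullptr, D3DTEXF_NONE);

    msaaTarget->Release();
    return hr;
}

Getting something like that to behave with everything else the engine does per frame, on hardware you haven't tested, is exactly the kind of thing that needs the QA pass described above.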
 
I'm amazed this discussion is still going on. It only shows how ignorant fanboys can be when they care nothing about facts as long as they have found a reason to rant. Human weakness at its finest.

ATI hacked a demo.

The developer did not cripple ATI because Nvidia paid them to do it. Seriously people... this isn't the US Government. Somebody needs to get facts and settle this BS because I've seen nothing but hearsay from ATI.

At the end of the day, I have an Nvidia card :roll:
 
Nvidia wants to establish a new tradition: GPU makers have to pay for (basic or non-basic) features if they want them in-game.
It will be fun to see a game with Nvidia (tm) AA, Nvidia (tm) PhysX, ATI (tm) tessellation, S3 (tm) AF, ATI (tm) HDR, etc...

Pathetic.

Anyway, BioShock and Mass Effect had AA through the control panel (both manufacturers).
 
  • It was revealed that AA is not a standard feature in the Unreal 3.5 Engine, and nVidia did in fact fund its addition to the game. Source
  • It was revealed that changing the device ID to allow AA to be enabled in-game actually breaks the game on ATi hardware. Source
  • It was revealed that, even with the setting enabled, ATi hardware didn't actually do AA because the feature was not designed for ATi hardware. Source

Ok the first link is Nvidia and the rest are some nut job on a forum that has nothing to do with ATI or Nvidia. :laugh:

THIS is what you guys bring to the table as facts?! Nvidia, ok, but a quack from a forum?! Come on guys. I thought you had better rebuttals than that. :shadedshu
 
Ok the first link is Nvidia and the rest are some nut job on a forum that has nothing to do with ATI or Nvidia. :laugh:

THIS is what you guys bring to the table as facts?! Nvidia, ok, but a quack from a forum?! Come on guys. I thought you had better rebuttals than that. :shadedshu

It's because the argument is not even acknowledged by the tech media. A guy forced it to work and it breaks the game. Try it yourself; you just change a device ID. Unless you think it's a conspiracy too. :laugh:
 
It's because the argument is not even acknowledged by the tech media. A guy forced it to work and it breaks the game. Try it yourself; you just change a device ID. Unless you think it's a conspiracy too. :laugh:

I think all of you work for Nvidia and made my dog sterile.
 
No, I don't work for NVIDIA but I did make your dog sterile.
 
No, I don't work for NVIDIA but I did make your dog sterile.

Give him a reach around next time. He likes that.

It's because the argument is not even acknowledged by the tech media. A guy forced it to work and it breaks the game. Try it yourself; you just change a device ID. Unless you think it's a conspiracy too. :laugh:

I'm not downloading the demo again. Making baseless claims against shit I have no idea about is way easier.
 
Ok the first link is Nvidia and the rest are some nut job on a forum that has nothing to do with ATI or Nvidia. :laugh:

THIS is what you guys bring to the table as facts?! Nvidia, ok, but a quack from a forum?! Come on guys. I thought you had better rebuttals than that. :shadedshu

Well, nVidia coming right out and saying what they did is kind of all the proof needed. They are the ones that did it; they know best. It puts all the other baseless accusations to rest.

And you kind of have to read the whole thing from the "nut job". That "nut job" is the one that originally claimed AA was disabled for ATi, and originally claimed it worked in the demo, posting screenshots to prove it.

The other forum members later went on to disprove that AA was even working. And the "nut job" himself confirmed that it broke the game (even if he didn't want to admit it at first).
 
Well, nVidia coming right out and saying what they did is kind of all the proof needed. They are the ones that did it; they know best. It puts all the other baseless accusations to rest.

And you kind of have to read the whole thing from the "nut job". That "nut job" is the one that originally claimed AA was disabled for ATi, and originally claimed it worked in the demo, posting screenshots to prove it.

The other forum members later went on to disprove that AA was even working. And the "nut job" himself confirmed that it broke the game (even if he didn't want to admit it at first).

The game, yes. Not the demo. ATI never said anything about the full game because of SecuROM. Anyway, the accusation came from an ATI blog, not that forum.
 
Slightly off-topic:

S.T.A.L.K.E.R. was originally TWIMTBP, but IIRC the game runs better on Ati hardware... maybe it ran better at launch on nV, but Ati drivers improved from there. (confirmation needed)

But AFAIK they now display the Ati Radeon logo at startup... in CS and COP.
 
Though we don't want them working so closely together that we get another price-fixing situation... :laugh:

Again, yeah, I think not. lol, the $600 standard for a high-end single-GPU card and the $300 standard for a decent midrange one was quite annoying.

Now we can typically pick up a $100-200 card that will give us all the performance we need. I like it better now.

The argument was interesting to watch. As a former ATI fanboy I have to admit I jumped to conclusions, but getting older, I didn't want to post without evidence. I'm glad I didn't, and I'm glad the truth came to light.
 