
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

My god, you either are naive or you're insulting me even further. If, let's say, PhysX were free, for how long? One month? One year? Or until it became a standard and Nvidia could force everyone to pay if they want to use it? Something like that?
I'm starting to think I'm wasting my time here; I have better things to do. Good luck with all this, and keep it nice :).

Same goes for anything proprietary. You think all companies throw in funds evenly to develop all technology?

Owning a technology means that, yes, it's yours and you can charge for it. Are you saying that it shouldn't be allowed to generate a return on investment just because ATI doesn't own it? It's the business world, and that's how it operates, whether you like it or not.

PS. If they get Eyefinity going without the need for a DisplayPort primary or powered adapters, I could see grabbing a 5870 or two. Damn bezel sizes.
 
Take this competitive-marketing Batman AA rumor with a grain of salt. Any jab by a competitor who's stuck in a corner should be looked at with extreme skepticism; this may simply be another marketing ploy (reverse psychology). As an Nvidia user, I know for a fact there are many games where Nvidia's own default forced AA modes are not compatible, and the only option Nvidia provides is to 'enhance application settings', which even then does not always work properly. Depending on how a developer implements AA, it may or may not work properly on specific hardware and specific drivers. When a demo and a finished product are released nearly simultaneously, it's pretty safe to assume that a code branch took place before the product was finalized. This could easily lead to discrepancies like the one taking place here: optimizations may have occurred after the demo split that changed how certain features were implemented, financially motivated or otherwise.

That being said, I'd wait to hear more information before naysaying Nvidia. Let it be known that I was an ATI owner for YEARS before becoming fed up with certain quality issues never being resolved and finally switching to Nvidia about two years ago. As it stands now, Nvidia has far more to offer the consumer in terms of overall features and upgrade paths than ATI does. Who stands to benefit the most from a sucker punch?

Just my opinion but I trust no company and am always skeptical when I see something like this...
 
BatmanAndRobin.jpg


Needs AA enabled ;)
 
I don't feel like reading all 8 pages, and I can be accused of being an ATI fanboy, but honestly, when's the last time you've seen a game with an opening screen saying "ATI!!!!! THE WAY IT'S MEANT TO BE PLAYED!!" or "PHYSX: GET IN THE GAME"? ...*blinks*... And if this is true and not a bug, I'm going to be very sad for Nvidia, seeing they're already living on old technology. The 300 series... really....

We'll see what comes of this. All I know is that when I tried to play Batman on my system (it has an ATI card), it would blue-screen it. NO game blue-screens my computer. I don't even understand it myself; I've reinstalled and everything. But whatever.
 
...but honestly whens the last time you seen a game with a opening screne Saying ATI!!!!! THE WAY ITS MEANT TO BE PLAYED!! OR!! PHYSX GET IN THE GAME...

gitg-logo.jpg


http://en.wikipedia.org/wiki/Get_in_the_game
 
...all i know is when i tried to play batman on my system..has a ati card..it would blue screen it....NO game blue screens my computer...

If all else fails try a legit copy. :laugh:
 
Sad fact of life: Intelligent and un-sensational posts are bad for attracting 13-year-olds. They should be ignored and buried.
 
...the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to use AA on every 3D object in the scene, reducing performance, compared to if the game's in-game AA engine is used.

Wait.. So I get to use superior AA in the game? Sounds good to me. :toast:
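The performance point in the quote is easy to see with a toy model: driver-forced AA multisamples every pass in the frame, while an engine's in-game AA can multisample only the passes that benefit. (A purely illustrative sketch; the pass names and cost numbers are made up and not taken from any real driver or from this game.)

```python
# Toy model of relative fill cost: a multisampled pass pays `samples` times
# the fill cost of a non-multisampled one. (Illustrative only.)

def fill_cost(passes, samples, multisampled):
    """Sum relative fill cost over the frame's render passes."""
    return sum(samples if p in multisampled else 1 for p in passes)

passes = ["shadow_map", "geometry", "post_process", "ui"]

# In-game AA: the engine multisamples only the geometry pass.
in_game = fill_cost(passes, samples=4, multisampled={"geometry"})  # 4 + 1 + 1 + 1 = 7
# Driver-forced AA: every pass gets multisampled indiscriminately.
forced = fill_cost(passes, samples=4, multisampled=set(passes))    # 4 * 4 = 16
```

Under this toy accounting, forcing 4x AA in the control panel more than doubles the fill cost compared to the engine applying the same 4x only where it matters, which is the performance hit the quote is describing.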
 
I've added something ;)

Until Christian Bale signed on to the Batman franchise, the live-action series/movies were so fruity/kiddie.

I mean no offense to any Batman enthusiasts here BTW.
 
I'm so glad I didn't give in and buy this game now.
 
I might hire the game tonight, load it up, and see what happens.
Anyone know what might happen if I try to run it?
 
Wow, almost 200 posts! (OK, I contributed a little...)

So, the conclusion I can draw after reading those 9 pages is that either Nvidia is a big bad company that doesn't like to play fair, or ATI is full of incompetent people. There's another explanation I think is more reasonable: if Nvidia put its money into helping develop better features for the game (in-game AA and PhysX), then why would it share them with ATI? Is there a rule anywhere saying that it should?
 
If all else fails try a legit copy.

I was talking about the demo. Yes, it's unfinished, but it still shouldn't blue-screen a system. So I just bought it for the 360, and it's been playing just fine....
 
This really blows. I hate this stuff; they're just doing what has been going on for years, lol.
 
Bad move; they're ruining their own reputation. There's no need to go this far :/.
 
This is a great opportunity to put up a poll to see whether people will continue to buy their products after reading news like this.
 
I wouldn't stop buying their products because of this; I buy from whoever offers the better performance for my games.
 
This is hilarious. Almost everyone is ignoring the fact that (DX9) UE3 did not have proper AA until now. And this game happens to be a TWIMTBP one. Isn't it obvious that nVidia paid or worked with Rocksteady to make this happen?
 
And we began to rock....Steady!....Steady rockin all night long ::Sings:: Sorry, that brought a flash back!
 
To Put An End To This Argument

Just get a 30" monitor with 2560×1600 resolution, sit back several feet, and FSAA just won't matter much at all.

Anyhow, I think doing just about anything proprietary on an open platform is a bad way of doing business. Whatever happened to Glide and 3dfx? Let Nvidia and their CUDA simply fall off a cliff, since the future will bring a better way of doing things anyhow. I've supported ATI for a long time now and have not bought an Nvidia video card since my 8800 GTX. I remember the first AMD Athlon and what it stood for. It meant that some other people existed out there who could do the same thing as the current king of the hill, at a better value, with creative thinking and a better product. I think ATI/AMD is doing that again right now, especially on the graphics front, and the way games like Crysis and the like seem to want to sleep only with Nvidia hardware makes me angry. I'll probably still buy Arkham Asylum, but there is no way in hell I'm gonna buy an Nvidia GPU to see FSAA done the "right" way. :toast:
 
...Let Nvidia and their Cuda simply fall off a cliff since the future will bring a better way of doing things anyhow.

And nVidia still churns out the best drivers for *nix systems. How evil.
 
...there is no way in hell I'm gonna buy an Nvidia GPU to see FSAA done the "right" way. :toast:

Actually, that's not the right way to do AA; it's the corner-cutting way :p.
 
If you need AA to play, I pity you, and you are not a gamer. But beyond that, you should have learned how to enable it in the CCC a long time ago, because there are a lot of games that don't even give you the option of AA unless you force it. So one would assume you did the same with Batman and enjoyed it.

Unlike Nvidia, ATI does not have profiles. If you enable AA in there, you're taking a large performance hit (due to using unoptimised AA, at least until driver updates emerge), AND you have to go into the CCC and turn it on and off every time you switch between Batman and another game.

No, stop trying to tell people what to say.

No true gamer would require AA. And no true gamer would say a game has to have AA to be enjoyable. It is that simple.

Stop trying to tell gamers how to play their games?

I hate gaming without AA; it's the whole reason I have overkill graphics cards.
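The profiles point above is the crux of the usability complaint: with per-application profiles, a forced-AA override can apply to one game only, while a single global override applies to every game until the user flips it back. A minimal sketch of the difference (illustrative data structures only, with a made-up executable name; this is not either vendor's actual driver code):

```python
# Resolve the effective forced-AA level for a given executable.
# A per-app profile, if present, overrides the global setting.

def effective_forced_aa(global_aa, profiles, exe_name):
    return profiles.get(exe_name, global_aa)

# Driver with profiles: force 4x AA for Batman only.
profiled = {"batman.exe": 4}  # hypothetical executable name
aa_batman = effective_forced_aa(0, profiled, "batman.exe")   # 4
aa_other = effective_forced_aa(0, profiled, "other.exe")     # 0

# Driver without profiles: the only way to get 4x in Batman is a
# global override, so every other game pays the cost too.
aa_everywhere = effective_forced_aa(4, {}, "other.exe")      # 4
```

Without the profile lookup, the user is left toggling the global setting by hand between games, which is exactly the back-and-forth trip to the CCC described above.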


Right, and through all my searching I can't find where it says AA is natively supported in 3 or 3.5. If it's added in by the developers and it's co-developed by NVIDIA, then there is no problem. Does anyone have the spec sheet that says UE3.5 has native AA in the engine?

The engine supports it; it's just that on Nvidia cards, DX10 rendering is normally required to use it.
ATI could always run it, with a few odd glitches when HDR was used at the same time.

http://www.pcbuyersguide.co.za/showthread.php?t=6757
Capture160.jpg



All that's happened here is that Nvidia hardware took tweaking to get it to work and ATI's didn't, but instead of just letting ATI run it, they "hid" the option to make their sponsor look better.
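If the allegation is right, the gate would be something as simple as a vendor-ID check: the engine can render AA on either vendor, but the option is only exposed when the adapter reports NVIDIA's PCI vendor ID. A hypothetical sketch of that kind of check (the vendor IDs are the real PCI-SIG assignments; the function and its use are illustrative, not code from the actual game):

```python
# PCI-SIG vendor IDs as reported by the graphics adapter.
VENDOR_NVIDIA = 0x10DE
VENDOR_ATI = 0x1002  # ATI/AMD

def in_game_aa_exposed(vendor_id):
    """The alleged gate: expose the in-game AA option only on NVIDIA."""
    return vendor_id == VENDOR_NVIDIA
```

Nothing about such a check reflects a hardware limitation; removing the whitelist would let the same code path run on ATI hardware, which is what the post above is claiming.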
 