
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

cauby

New Member
Joined
Sep 28, 2009
Messages
121 (0.02/day)
Location
Brazil
System Name The New One
Processor Intel Core i5 750@2.67 GHz
Motherboard MSI-P55-CD53
Cooling Intel HSF
Memory Kingston 4GB (2x2GB) 1333MHz DDR3 CL9 Value
Video Card(s) Sapphire HD5770 1GB GDDR5 Vapor-X
Storage WD 1TB Green 64MB Cache
Display(s) Samsung SyncMaster 2233 1920x1080 21,5"
Case CoolerMaster CM 690
Power Supply 500W OCZ SXS
Software Windows 7 Ultimate Edition 64-bit
Joined
Apr 26, 2009
Messages
513 (0.09/day)
Location
You are here.
System Name Prometheus
Processor AMD Ryzen 9 5950x
Motherboard ASUS ROG Strix B550-I Gaming
Cooling EKWB EK-240 AIO D-RGB
Memory G.Skill Trident Z Neo 32GB
Video Card(s) MSI RTX 4070Ti Ventus 3X OC 12GB
Storage WD Black SN850 1TB + 1 x Samsung 970 Evo Plus 2TB
Display(s) DELL U4320Q 4K + Wacom Cintiq Pro 16 4K
Case Jonsbo A4 ver1.1 SFF
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Corsair SF750 Platinum SFX
Mouse Logitech Pro Wireless
Keyboard Vortex Race 3 75% MX Brown
Software Windows 11 Pro x64
Unreal Engine 3.5 can only use AA modes when running under Vista/DX10+. Because of the frickin' consoles, the game is mostly a DX9 game. In DX9 compatibility mode, UE cannot use AA because its shadowing algorithm is incompatible with current AA modes.

An important point for ATI DX10.1 cards was an interesting way of producing soft shadows (something nVidia doesn't have in its DX10 implementation). Could this be the problem?
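The DX9 limitation is visible right at the API level. Below is a minimal sketch (hypothetical surfaces and function name; not UE3 code) of the constraint: a multisampled color target in D3D9 must be resolved with StretchRect before any shader can sample it, and a multisampled depth buffer cannot be resolved or sampled at all, which is what breaks depth-reading shadow techniques. D3D10.1 was the first version to expose the individual depth samples.

Code:
#include <d3d9.h>

// Sketch only: resolve a multisampled color target so a later pass can
// sample it. D3D9 has no equivalent resolve for depth-stencil surfaces,
// so any effect that re-reads scene depth (soft shadows, fog volumes)
// loses its input the moment MSAA is enabled.
HRESULT ResolveForLaterPass(IDirect3DDevice9* device,
                            IDirect3DSurface9* msaaColorRT,    // multisampled
                            IDirect3DSurface9* resolvedColor)  // plain surface
{
    return device->StretchRect(msaaColorRT, NULL, resolvedColor,
                               NULL, D3DTEXF_NONE);
}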
 
Joined
Sep 5, 2004
Messages
1,956 (0.27/day)
Location
The Kingdom of Norway
Processor Ryzen 5900X
Motherboard Gigabyte B550I AORUS PRO AX 1.1
Cooling Noctua NB-U12A
Memory 2x 32GB Fury DDR4 3200mhz
Video Card(s) PowerColor Radeon 5700 XT Red Dragon
Storage Kingston FURY Renegade 2TB PCIe 4.0
Display(s) 2x Dell U2412M
Case Phanteks P400A
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850 (from 2012)
Software Windows 10?
Well, tell ATI to invest more in the development and refinement of their drivers.

The biggest problem is the fact that ATI's drivers have always been poor; if they fixed that, they would be level with nvidia or even ahead.

ATI GPUs have tremendous computing power, but they are too lazy to develop drivers able to exploit it.

I would say stop the childish fanboyism, "i hate nvidia" etc. ...
Might want to re-evaluate that one, dude.
Search Google for "NVIDIA Vista Driver FAIL" and you might find a billion pages:
http://www.google.com/search?hl=en&q=nvidia+vista++fail

Given the fact that ATI's Catalyst drivers have been released every single month since January 2004, and they even release hotfixes and everything. Btw, did you know that ATi had both DX10.1 and DX11 hardware long before nvidia?

But everyone does have to agree that Intel's Crapstics drivers suck, don't they? Compared to ATI and NVIDIA :p
Get an ATI card and try again with this bad-driver bitching.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Very conservative there, being so mindful of the corporations when the wealth will never get to you. In the end the consumer loses.

Not really. If it leads to a more playable game for the consumer, then I hardly call that a loss. If a large number of consumers were having stability issues making the game completely unplayable for them, and their issues were fixed with little effect on the other consumers' playability of the game, I don't consider that an overall loss for the consumer.

AC didn't crash for me or any of my friends on DX10.1.
There was no problem; review sites didn't have issues either.

The fact that you support paying game devs to make other cards look bad is just unbearable.

"The Way It's Meant to be Played" would be perfectly fine if the game simply ran as it should, and not with intentionally crippled performance, as has been proven...
ATi does support game devs, gives them video cards so they can check that things work, and supports them with documentation and the like. nVidia's program is bigger, but it seems they also bribe, judging by the results in some TWIMTBP games.

It doesn't matter that your small group of friends didn't have an issue. I didn't have an issue with the game either, but we know for a fact that there were major issues with nVidia hardware at least, and 2 of my machines were running nVidia hardware at the time, one of which was my main machine that I played the majority of the game on. When you get a sample size in the millions with no issues, then your argument will be sound, but until then, you have no clue whether there were widespread issues with DX10.1; the only people who know that are the ones working at Ubisoft.

I support nVidia putting money into better development of games for their cards. Which is exactly what is happening here. Again, there is just as much evidence that nVidia paid entirely to have the feature added to the game for their cards as there is to say that the feature was already there and nVidia just paid to have it removed for ATi cards. Either scenario is just as plausible given what we know so far. The only difference is one makes nVidia out to be the bad guy, and one doesn't, so you really just have to pick whether you want to give nVidia a bad name or not. Personally, I prefer to give everyone the benefit of the doubt and go with the scenario that makes them look best.
 
Joined
Feb 19, 2009
Messages
1,151 (0.21/day)
Location
I live in Norway
Processor R9 5800x3d | R7 3900X | 4800H | 2x Xeon gold 6142
Motherboard Asrock X570M | AB350M Pro 4 | Asus Tuf A15
Cooling Air | Air | duh laptop
Memory 64gb G.skill SniperX @3600 CL16 | 128gb | 32GB | 192gb
Video Card(s) RTX 4080 |Quadro P5000 | RTX2060M
Storage Many drives
Display(s) M32Q,AOC 27" 144hz something.
Case Jonsbo D41
Power Supply Corsair RM850x
Mouse g502 Lightspeed
Keyboard G913 tkl
Software win11, proxmox
Benchmark Scores 33000FS, 16300 TS. Lappy, 7000 TS.
Might want to re-evaluate that one, dude.
Search Google for "NVIDIA Vista Driver FAIL" and you might find a billion pages:
http://www.google.com/search?hl=en&q=nvidia+vista++fail

Given the fact that ATI's Catalyst drivers have been released every single month since January 2004, and they even release hotfixes and everything. Btw, did you know that ATi had both DX10.1 and DX11 hardware long before nvidia?

I totally back the drivers up! Not an issue except 9.2, which could not be upgraded.
That's the only issue since 2007 for me.
I agree about the X850XT PE, though. Niiiiiiiiiiiiightmare.
 
Joined
Feb 23, 2008
Messages
1,064 (0.18/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
Again, does it work on an Intel GPU? A test made in-house by...

This isn't even worth joking about at the moment...

In any case, I have both an ATI and an nVidia rig, and I'll be picking up the game once the price drops a bit and I actually have time to play it. Who knows, by then a patch might actually "fix" the AA issue.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,390 (7.67/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro

newtekie1

Semi-Retired Folder
Might want to re-evaluate that one, dude.
Search Google for "NVIDIA Vista Driver FAIL" and you might find a billion pages:
http://www.google.com/search?hl=en&q=nvidia+vista++fail

Given the fact that ATI's Catalyst drivers have been released every single month since January 2004, and they even release hotfixes and everything. Btw, did you know that ATi had both DX10.1 and DX11 hardware long before nvidia?

But everyone does have to agree that Intel's Crapstics drivers suck, don't they? Compared to ATI and NVIDIA :p
Get an ATI card and try again with this bad-driver bitching.

http://www.googlefight.com/index.php?lang=en_GB&word1=ATi+Crash&word2=nVidia+crash
http://www.googlefight.com/index.php?lang=en_GB&word1=ATi+driver+problem&word2=nVidia+driver+problem
http://www.googlefight.com/index.ph...ver+vista+fail&word2=nVidia+driver+vista+fail

It all depends on what you search for. Both sides have driver issues; neither is perfect. nVidia had an issue early on with Vista, which is likely why there are so many hits when you search for it.

However, currently, both sides put out very good drivers on a consistent basis. So really, the whole "X has better drivers than Y" argument should stop, because in the present it is hard to pick which is better, and if you look at the past, both have had some pretty rocky times.
 
Joined
Feb 19, 2009
I support nVidia putting money into better development of games for their cards..
Notice: not the full quote!

Well, I wonder what the future will be like if both companies pour mouthfuls of money into crippling the other's cards' performance:
Start up a game.
Play, get tired of it.
Want to play a new one.
Shut down, change video card.
Power on, start game.
Play.
....
..

It should be about which card is best made, gives the best performance per buck, or just is the best card on the damn planet, like the MARS, 'cause someone just likes the big e-peeen!
Imagine being at a LAN with some friends: you wanna play a game, and you get a disadvantage because you have nvidia, so you whine, whine, whine; then you guys start playing another game, and they get a disadvantage and you an advantage.
It should be supported on all cards; general support is the best way. I don't care if it's 10% faster on an nvidia card; I care if it's 20% slower and lacking features that really do work without any quirks on ati cards but are intentionally disabled because someone paid for it to be like that.
 
Joined
Sep 16, 2009
Messages
166 (0.03/day)
With Batman, they left an option in the settings menu for "optimized NV AA". This is just MSAA (an efficient method of smoothing jaggies) that both ATI and NVIDIA cards can do. The issue was that if the game detected an ATI card, that option was not available. The result was that the older, standard method of smoothing jaggies (regular forced AA), which is much less efficient, became the method by which ATI cards had to render AA in Batman. That meant that Batman with AA enabled gave ATI cards much worse framerates than NVIDIA cards, because the NV cards were using fast MSAA and the ATI cards were using old-ass slow regular AA. The same thing is going on with NFS Shift.

For PhysX, Eidos and Rocksteady took certain special effects of the game and packaged them to be rendered via CUDA on the NV GPUs. These effects only work on a GeForce 8800 and higher (along with the MSAA). However, these effects (fog, smoke, sparks, particle shimmer, cloth, mattresses, destructible tiles, flying papers, aluminum cans, garbage, spiderwebs, destructible elements, haze, etc.) can also all be rendered using an ATI card. They were just "removed" when CUDA + NV is not detected, since they are part of the CUDA package. If you check Rage3D, Beyond3D or YouTube, you can see people with ATI cards + Core i7s running Batman using MSAA and all the "PhysX" effects (because they edited the PhysX code and tricked the game into thinking it had an NVIDIA card).

Nvidia would love to control the usage of these effects because it makes the game more immersive and appealing to users of their own hardware, while decreasing the "coolness" of the game on ATI hardware. The sad part is that if you know what you're doing, a few lines of code will have your ATI card running perfect MSAA and your Core i7 running all those fancy special effects in about 5 minutes, and probably at better framerates than NVIDIA (if you have a 5800 series). The really sad part is that the more they do this and get away with it, the further apart technological competition drifts. The ATI cards are already very powerful at the hardware level compared to G80/GT200. With their talented engineers and hardware design team, it's a shame that ATI isn't as strong at developer relations, driver programming, and aggressive business practices.
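The vendor check behind all of this is trivial, which is why the device-ID trick works so easily. Here is a minimal sketch (hypothetical function name; not the game's actual code) of how a D3D9 title can gate a menu option on the adapter's PCI vendor ID:

Code:
#include <d3d9.h>

// Hypothetical illustration of vendor-gating a menu option.
bool IsInGameAAOffered()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DADAPTER_IDENTIFIER9 id = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    d3d->Release();

    // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD. Spoofing the ID is
    // exactly what the Rage3D/Beyond3D experiments did to re-enable the
    // MSAA path (and the GPU PhysX effects) on Radeon + Core i7 systems.
    return id.VendorId == 0x10DE;
}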
 
Joined
Feb 19, 2009
If ATI isn't aggressive, I don't know who is.

They're pushing out hardware at very low prices; at the moment demand is controlled by price, and the cards are already over-demanded. I can't find them in stock at all: they were there, and they were gone straight away.
ATI pushing this out is an aggressive move against nvidia. They are aggressive, but in the way it should be.
Just like Intel and AMD used to do.
Just like ATI and nvidia used to do, before the GF8xxx came and everything started to fall apart rapidly!

ATi is aggressive in pointing out flaws and prices and in bringing out new products fast and with big improvements. This is also how nvidia did things in the past, and they rocked! Now they're pushing software like they're Microsoft.
 

newtekie1

Semi-Retired Folder
So much for misinformed arguments. AMD tested the in-game AA, and it worked. So regardless of whether this AA implementation is a standard between NVIDIA and AMD, it works, and yet it was disabled for ATI hardware.

That really isn't a problem. Whether the feature works or not on the given hardware is all that matters, and it does. Stability issues cannot be used as an excuse to completely remove the feature. If stability issues did exist, they should have left the feature available to everyone and worked on fixing them. Besides, the game does not advertise that its AA features don't work on ATI hardware (or that it requires NVIDIA hardware for AA, the way it properly advertises PhysX).

Despite your trying to use rational logic, you, umm, failed.
http://en.wikipedia.org/wiki/Batman_Arkham_Asylum
http://img.techpowerup.org/090929/.jpg

Note the "Unreal Engine"? You see, this game was made from an existing game engine, well known to work on ATI and NVIDIA hardware with antialiasing.

You can also rename the game's .exe to UE3.exe, from what I hear, and then use Catalyst Control Center's AA (even before the patch) and everything works well.

This is purely a dirty trick from nvidia, since NV only adds AA to some things in game, while ATI now has to waste power antialiasing EVERYTHING (taking a performance hit), inconveniencing end users.




Indeed, and a default feature of the engine used.

Indeed. I went RE5 over this, due to this lame issue.

Unfortunately, the lack of AA will never make it into enough news to hinder sales that much.

Yes, Unreal Engine 3's AA is proven to work stably on AMD GPUs. Thanks for cementing my argument.

Why is it so difficult to understand that this isn't just normal AA? Of course AA works in the game; it can be forced via the control panel, and could always be forced via the control panel. In fact, isn't that what we all do when AA isn't an option in a game? And yes, there are still games released without AA as an option.

But why is it so hard to understand that this isn't traditional AA? BTA, you of all people should make sure you understand the concept; you are reporting on it. It lowers your credibility to report such crap and make such statements.

This is optimised AA! Done in a way to limit the performance loss to next to nothing. This is not a standard feature. This is not a feature that exists in the Unreal Engine by default.

Yes, AA works, but not the AA that is used in the game! That is the difference, BTA. The AA forced in CCC is obviously different, as it comes with a performance hit, unlike the in-game AA setting. So while the effect might be very similar, they are two different features.

And you cannot confirm that changing the device ID to trick it into working in-game really does function properly, as it was not tested in the actual game. Again, is it not likely that a part of the game caused an issue with the feature on ATi cards, and the developers simply disabled the feature as a quick fix to get the game shipped? I mean, the game was already delayed a month on the PC, so we know the developers were under a time crunch to get it shipped... so maybe in the end they did start to implement quick fixes. Is that so far-fetched?
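Selective application is actually plumbed into D3D9 itself: while a multisampled target is bound, a game can switch multisampling off for individual draw batches. A hedged sketch (hypothetical draw helpers, not the game's code) of the kind of per-batch control an "optimized" AA path could use:

Code:
#include <d3d9.h>

// Hypothetical helpers standing in for the engine's draw submission.
void DrawHeroAndNearbyGeometry(IDirect3DDevice9* device);
void DrawParticlesAndDistantClutter(IDirect3DDevice9* device);

// D3DRS_MULTISAMPLEANTIALIAS toggles multisampling per batch while a
// multisampled target is bound - one plausible mechanism for an
// "AA only where it matters" path like the one described above.
void DrawSceneWithSelectiveAA(IDirect3DDevice9* device)
{
    device->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, TRUE);
    DrawHeroAndNearbyGeometry(device);       // AA the edges players notice

    device->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, FALSE);
    DrawParticlesAndDistantClutter(device);  // skip the MSAA cost here
}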

Then they don't deserve our money (using reasoning no. 1).

Do you have solid proof that the game crashes halfway through (using reasoning no. 2)?


So I say ATI card owners must boycott this game and rate it low in every online store, so they will suffer a 40% loss from us ATI owners.

Why not? You got a game. The game isn't any less playable.

Does anyone have any solid proof that it won't crash halfway through? Even ATi said they only tested the demo. I do know that I've encountered games, even recently, that would suffer unexplained crashes or have unexpected and unwanted issues caused by visual features enabled in the game, or by driver issues. How many times have we seen "update your drivers" as a response when someone is having a game-crashing issue? Just as an example: Prototype crashes on my GTX285 if I have AA enabled in the game menu, but works fine on my HD4890 or if I force AA using the nVidia control panel. And Prototype just came out a few months ago!
 
Joined
Feb 19, 2009
Why is it so difficult to understand that this isn't just normal AA?
This is optimised AA! Done in a way to limit the performance loss to next to nothing. This is not a standard feature. This is not a feature that exists in the Unreal Engine by default.

They proved it by changing the ID of the video card: that bumped the performance.
It's proven that PhysX runs fine without a GPU.

Btw, Prototype is a quick port; it works just as well as GTA4, which is terrible. No matter the make, blame the devs there. No features were blocked, though.
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.18/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Why is it so difficult to understand that this isn't just normal AA? Of course AA works in the game; it can be forced via the control panel, and could always be forced via the control panel. In fact, isn't that what we all do when AA isn't an option in a game? And yes, there are still games released without AA as an option.

But why is it so hard to understand that this isn't traditional AA? BTA, you of all people should make sure you understand the concept; you are reporting on it. It lowers your credibility to report such crap and make such statements.

This is optimised AA! Done in a way to limit the performance loss to next to nothing. This is not a standard feature. This is not a feature that exists in the Unreal Engine by default.

Yes, AA works, but not the AA that is used in the game! That is the difference, BTA. The AA forced in CCC is obviously different, as it comes with a performance hit, unlike the in-game AA setting. So while the effect might be very similar, they are two different features.

And you cannot confirm that changing the device ID to trick it into working in-game really does function properly, as it was not tested in the actual game.

Our point is simple: the game turns the AA setting off instead of taking the other options a normal developer would:

A: Leave the option in game, and ATI has worse performance (but it can be tweaked via drivers).
B: Work with ATI prior to the game's release, giving them the same advantages as nvidia.
C: Disable the setting, sweep it under the rug, and make the userbase who want things to "just work" run it on nvidia cards.

Most games go with A; the good ones go with B. This game went with C.


Where you're going wrong, newtekie, is that you're thinking the in-game AA is some special super-duper thing they cooked up. It's not. Games have used their own in-game AA for as long as in-game AA options have been, well, in-game options. They can say "oh, only AA stuff close to the camera" or "ignore things on this level, it hurt performance badly with all the action".

The other assumption you appear to be making is that "ATI get AA, nvidia get faster AA" - not the case. ATI didn't get shit until this was made a big issue and the game got patched. NO AA AT ALL.


You're taking the "nvidia can do what they want" approach, but I bet if a game came out and said "no AA for nvidia, only ATI" you'd be telling a different story.

Remember that to even force the AA in the game via CCC, it took hardware ID hacking, public shaming, .exe renaming, and finally a game patch - and that's with an unnecessary performance hit!
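For what it's worth, the .exe-rename trick works because the driver's per-application profiles key off the executable's file name; rename the game to UE3.exe and it falls into a generic Unreal Engine 3 profile where forced AA behaves. A rough sketch of that matching idea (illustrative only; the actual Catalyst profile code is not public):

Code:
#include <windows.h>
#include <string>

// Illustrative only: drivers keep per-game profiles keyed by the exe name,
// which is why renaming the game's executable changes how CCC treats it.
std::wstring CurrentExeName()
{
    wchar_t path[MAX_PATH] = {};
    GetModuleFileNameW(NULL, path, MAX_PATH);  // full path of the running exe
    std::wstring full(path);
    size_t slash = full.find_last_of(L"\\/");
    return (slash == std::wstring::npos) ? full : full.substr(slash + 1);
}

bool UseGenericUE3Profile()
{
    // A hypothetical profile table would map names like L"UE3.exe" to a set
    // of driver overrides (forced AA compatibility bits, etc.).
    return CurrentExeName() == L"UE3.exe";
}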
 

newtekie1

Semi-Retired Folder
They proved it by changing the ID of the video card: that bumped the performance.
It's proven that PhysX runs fine without a GPU.

Btw, Prototype is a quick port; it works just as well as GTA4, which is terrible.

They proved it worked in the demo, which is all of 15 minutes of the actual game. There is a lot more to the game than just what was in the demo, and any part of the game could have been giving them problems.

And what do you think Batman is? What do you think the extra month was for? Porting it to the PC and adding PhysX...

Our point is simple: the game turns the AA setting off instead of taking the other options a normal developer would:

A: Leave the option in game, and ATI has worse performance (but it can be tweaked via drivers).
B: Work with ATI prior to the game's release, giving them the same advantages as nvidia.
C: Disable the setting, sweep it under the rug, and make the userbase who want things to "just work" run it on nvidia cards.

Most games go with A; the good ones go with B. This game went with C.


Where you're going wrong, newtekie, is that you're thinking the in-game AA is some special super-duper thing they cooked up. It's not. Games have used their own in-game AA for as long as in-game AA options have been, well, in-game options. They can say "oh, only AA stuff close to the camera" or "ignore things on this level, it hurt performance badly with all the action".

The other assumption you appear to be making is that "ATI get AA, nvidia get faster AA" - not the case. ATI didn't get shit until this was made a big issue and the game got patched. NO AA AT ALL.


You're taking the "nvidia can do what they want" approach, but I bet if a game came out and said "no AA for nvidia, only ATI" you'd be telling a different story.

Remember that to even force the AA in the game via CCC, it took hardware ID hacking, public shaming, .exe renaming, and finally a game patch - and that's with an unnecessary performance hit!

A: It would have to be a different setting.
B: When ATi starts paying for developer time, this becomes viable; until then, nVidia will get more dev time than ATi.
C: Seems like a good option for a time-crunched game.

And where you and everyone else seem to be going wrong is that you don't understand that the in-game AA used in Batman isn't normal AA. It is optimized to give next to no performance loss. When have you seen that? That is "super-duper" IMO.
 
Joined
Feb 19, 2009
They proved it worked in the demo, which is all of 15 minutes of the actual game. There is a lot more to the game than just what was in the demo, and any part of the game could have been giving them problems.

And what do you think Batman is? What do you think the extra month was for? Porting it to the PC and adding PhysX...

YES!

The engine is already a PC engine; no port was needed except to rework textures, models, and maps, and to add PhysX, for the most part.
 

Mussels

Freshwater Moderator
Staff member
And what do you think Batman is? What do you think the extra month was for? Porting it to the PC and adding PhysX...

You've been asking for evidence of everyone else's unsubstantiated claims; where is yours for this? How do you know this month wasn't spent making the game "better" for their sponsor?

It's hypocritical to say everyone else needs direct evidence (it needs to work in the full game - the demo, with the same engine, does not count) - yet you can make up claims like that.


YES!

The engine is already a PC engine; no port was needed except to rework textures, models, and maps, and to add PhysX, for the most part.

That's what I was aware of too. With compatible engines, they'd only have bug fixes to do (and finding ways to make PhysX look like it's doing something, since the game was designed without it).
 
Joined
Jun 17, 2007
Messages
7,335 (1.19/day)
Location
C:\Program Files (x86)\Aphexdreamer\
System Name Unknown
Processor AMD Bulldozer FX8320 @ 4.4Ghz
Motherboard Asus Crosshair V
Cooling XSPC Raystorm 750 EX240 for CPU
Memory 8 GB CORSAIR Vengeance Red DDR3 RAM 1922mhz (10-11-9-27)
Video Card(s) XFX R9 290
Storage Samsung SSD 254GB and Western Digital Caviar Black 1TB 64MB Cache SATA 6.0Gb/s
Display(s) AOC 23" @ 1920x1080 + Asus 27" 1440p
Case HAF X
Audio Device(s) X Fi Titanium 5.1 Surround Sound
Power Supply 750 Watt PP&C Silencer Black
Software Windows 8.1 Pro 64-bit
Come on guys, this shouldn't even be an argument...

There is no justifying what Nvidia did, and we ATI users should be used to this kind of treatment by now. If Nvidia wants to use cheap methods to trick consumers into thinking their card is better, then fine, I say let them. ATI is doing just fine regardless, and the wiser people out there will always be a little more educated and know the truth.
 
Joined
Mar 28, 2007
Messages
2,490 (0.40/day)
Location
Your house.
System Name Jupiter-2
Processor Intel i3-6100
Motherboard H170I-PLUS D3
Cooling Stock
Memory 8GB Mushkin DDR3L-1600
Video Card(s) EVGA GTX 1050ti
Storage 512GB Corsair SSD
Display(s) BENQ 24in
Case Lian Li PC-Q01B Mini ITX
Audio Device(s) Onboard
Power Supply Corsair 450W
Mouse Logitech Trackball
Keyboard Custom bamboo job
Software Win 10 Pro
Benchmark Scores Finished Super PI on legendary mode in only 13 hours.
YES!

The engine is already a PC engine; no port was needed except to rework textures, models, and maps, and to add PhysX, for the most part.

You've been asking for evidence of everyone else's unsubstantiated claims; where is yours for this? How do you know this month wasn't spent making the game "better" for their sponsor?

It's hypocritical to say everyone else needs direct evidence (it needs to work in the full game - the demo, with the same engine, does not count) - yet you can make up claims like that.

That's what I was aware of too. With compatible engines, they'd only have bug fixes to do (and finding ways to make PhysX look like it's doing something, since the game was designed without it).

Come on guys, this shouldn't even be an argument...

There is no justifying what Nvidia did, and we ATI users should be used to this kind of treatment by now. If Nvidia wants to use cheap methods to trick consumers into thinking their card is better, then fine, I say let them. ATI is doing just fine regardless, and the wiser people out there will always be a little more educated and know the truth.


Nope... you're all crazy! Don't you see what Nvidia was doing here?!? They were being magnanimous -- uh... they were looking out for the poor ATI player!

By doing this, they were, uh, improving the game experience for ATI users! How nice of them!!

Wait, that... totally doesn't make any sense at all. :wtf:
 

cauby

New Member
Boo-hoo to Batman, DC Comics and Nvidia...

I'm playing Spider-Man. Enough of these troubles!
 

the_wolf88

New Member
Joined
Sep 2, 2009
Messages
34 (0.01/day)
Location
Bahrain
Processor Intel Core 2 Duo E8400 @ 4.0GHz
Motherboard Asus P5QL-E
Cooling Coolermaster V8
Memory 4GB
Video Card(s) EVGA Geforce GTX260
Storage 2x Seagate Barracuda 250GB @ RAID 0
Display(s) Samsung SyncMaster 2253BW
Case Coolermaster HAF 932
Audio Device(s) Creative XtremeGamer
Power Supply Coolermaster CM550
Software Windows 7 Professional x64
Comic book games are so... c**p :D I don't miss Batman at all :)

"With no in-game AA available to ATI Radeon users, although the features do technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center."
Problem solved :toast:

No it's not!!

You should read the full line:

With no in-game AA available to ATI Radeon users, although the features do technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to use AA on every 3D object in the scene, reducing performance, compared to if the game's in-game AA engine is used.

Performance drops a lot!!!

Damn you Nvidia :mad:
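The frustrating part is that offering MSAA only where the hardware can actually do it never required a vendor check in the first place. A minimal sketch (assumed 4x level and back-buffer format) of the vendor-neutral capability query D3D9 already provides:

Code:
#include <d3d9.h>

// Vendor-neutral way to decide whether a settings menu should offer 4x MSAA:
// ask the D3D9 runtime, not the PCI vendor ID. Returns the same answer for
// any vendor whose driver exposes the capability.
bool DeviceSupports4xMSAA(IDirect3D9* d3d)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,           // assumed back-buffer format
        FALSE,                     // full-screen
        D3DMULTISAMPLE_4_SAMPLES,
        &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}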
 
Joined
Jun 17, 2007
Nope... you're all crazy! Don't you see what Nvidia was doing here?!? They were being magnanimous -- uh... they were looking out for the poor ATI player!

By doing this, they were, uh, improving the game experience for ATI users! How nice of them!!

Wait, that... totally doesn't make any sense at all. :wtf:

How am I crazy? I think we're on the same side here lol.
 

newtekie1

Semi-Retired Folder
YES!

The engine is already a PC engine; no port was needed except to rework textures, models, and maps, and to add PhysX, for the most part.

I guess you are right here; the extra time was to add features.

You've been asking for evidence of everyone else's unsubstantiated claims; where is yours for this? How do you know this month wasn't spent making the game "better" for their sponsor?

http://www.actiontrip.com/rei/comments_news.phtml?id=080609_9

There you go. An article explaining that the game was delayed to add PhysX.

It's hypocritical to say everyone else needs direct evidence (it needs to work in the full game - the demo, with the same engine, does not count) - yet you can make up claims like that.

That's what I was aware of too. With compatible engines, they'd only have bug fixes to do (and finding ways to make PhysX look like it's doing something, since the game was designed without it).

It's not really hypocritical, as when I'm asked for it, I provide it.

And yes, it is the same engine. I get that, we all do. However, that doesn't mean it will work in every single part of the game. That is my point. Testing it in the demo is one step, but testing it in the real game, playing completely through, is another.
 

PEPE3D

New Member
Joined
Feb 12, 2009
Messages
44 (0.01/day)
Location
NJ, USA
System Name PEPE3D-Custom Built
Processor Intel QX9650 OC @4.02 GHz
Motherboard Asus Maximus II Formula P45
Cooling CoolIT Systems Freezone Elite TEC CPU Cooler with MTEC
Memory Corsair PC-8500 8GB 1066
Video Card(s) Diamond 4870x2 2GB XOC X two CF
Storage C:\Kingston ssdNowV+ series 128 GB(SNVP325-S2B)
Display(s) Samsung 244T 24" 1920x1200
Case Coolermaster HAF-932
Audio Device(s) Sound Blaster X-Fi Titanium Fatal1ty Champion
Power Supply Ultra X 1600 watts
Software Windows 7 Pro 64 bit Edition
Benchmark Scores 3DMARK 06(24,507)3DMARK VANTAGE(20,587)
Batman AA and ATI

I tried to play this game. It plays well, with nice graphics, but it crashes a lot. I am angry that it may be due to the fact that I have two ATI cards in CF (4870x2 2GB GDDR5). Great cards; I can pretty much play any game maxed out at 1920x1200. But this game is different: I have to play it at low resolution or the game won't play at all. I am very annoyed. If I had known this, I would not have spent the money on this game.

Also, I think it is time for developers to start thinking about us. We are the ones who buy the games, regardless of what brand of GPU we have in our PCs. We should look at reviews before a game comes out, and if by any chance the developer suggests that it will play better on NVIDIA or ATI, we, the customers, should boycott the game. We have the power; they need us, they need our money. Therefore, there should be no preference for which GPU you have; it should play great regardless. These are bad business practices, and someone should do something about it. GAMES are not cheap!
 

newtekie1

Semi-Retired Folder
If you plan to boycott every game that plays better on one than the other, don't expect to be playing any games. They all favor one or the other; that is just how it is. However, they all also run far beyond acceptably on both, regardless of who they favor.

You are never going to find a game that is playable on one and completely unplayable on the other. You might have to lower a setting or two on one or the other, but really, all games tend to run very similarly on both when using comparable cards.
 