
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

newtekie1

Semi-Retired Folder
Well, if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting those optimizations to nVidia hardware.

For all we know, if nVidia hadn't put the money in to implement the AA optimizations, we would never have seen them in the game at all, so why should ATi benefit from that?

It might not be a case of nVidia or the game developers removing a feature, but instead a case of nVidia paying to have the feature added in the first place.

These could have been performance optimizations that nVidia entirely paid for (the whole purpose of the TWIMTBP program), so why should they then enable them for ATi?

Or, a completely different explanation:

It could be that having it enabled on ATi cards causes problems in the retail game (remember, they only tested this on the demo). For all we know, something about the way ATi cards handle AA causes the game to crash or become extremely buggy with the optimized AA enabled. Maybe a certain part of the game is completely unplayable on ATi cards with the feature enabled, so the developers (nothing to do with nVidia at all) just gave up trying to fix it and simply disabled the feature on ATi cards as a quick fix to get the game shipped. They now have more time to work on a patch to make it work. It wouldn't be the first time we've seen a game have problems with one manufacturer but not the other, due to certain visual elements conflicting with the current drivers.

Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.

Nvidia pressured Ubisoft to remove DirectX 10.1 support.

How do you know this?
 
Nvidia pressured Ubisoft to remove DirectX 10.1 support.

That's why I don't like nVidia and their anti-competitive behavior and shady morals.
My next purchase will definitely be a 5870. You don't really need more power than that.
 
Well, if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting those optimizations to nVidia hardware. [...] Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.



How do you know this?

This is common knowledge. It's been discussed before.

BTW, AA isn't an added feature... it's a standard.
 
newtekie, I always appreciate your sound, logical posts and thank them on occasion. But trying to bring logic to consumers who run on emotions, personal ethics, etc., is a bit futile :D
 
I now won't buy the game or another Nvidia card for any PC I build, and I will call this out for the shit that it is, just like the image-quality cheats Nvidia tried forcing on users years ago to keep up.

Fuck you Nvidia, and the horse you rode in on.
 
Well, if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting those optimizations to nVidia hardware. [...] Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.

You do realise that this game has been ported from the Xbox 360, which, mind you, runs an R500-based GPU: ATi/AMD's stuff. Now, in this case, what is preventing AA from working in the first place is SecuROM, a questionable use of something that is MEANT to be used for anti-piracy purposes, not for anti-competitive market practices. Yes, I love that phrase, anti-competitive market practices. And I love using it against you, since you always argue against such matters.

The blogger has proven that they're able to get the game to run AA on AMD hardware with a rather simple workaround.

Eidos will probably stay silent on this matter. What's next? Disabling rendering altogether?
 
"Removing a feature," especially when it comes to making the game look better through AA, is pretty much the same as "making it run shittier," since I care about IQ.

I don't believe for a second that there was something about this game that didn't allow it to run AA just FINE on ATI hardware, especially considering (like one poster pointed out) it's an Xbox port. :shadedshu

Its not an xbox port, someone else has pointed this out already. Development was separate to incorporate GPU Physx.

"run sh!ttier" implies lower frame rates, and that is only when you force it in CCC not having the selective AA feature which is the very definition of an optimization. Only difference is you see it in the form of a button now so its worse somehow?

I think its stupid they did it but its a difference in IQ and not framerate unless you change the ATi profile to make it swing the other way by forcing AA in CCC.

They can just have an aftermarket patch to swing it the other way. Its no big deal.
 
Well, if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting those optimizations to nVidia hardware. [...] Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.



How do you know this?

Nvidia fanboy at its best?

I completed AC on 3 computers using different setups, all DX10.1, and it NEVER crashed. Then the patch came, and it crashed. Lol.

Did you read it?
They changed an ID and it WORKED!
There is always something that seems to be bad with ATi cards, as long as it's "the way it's meant to be played/paid".

The Xbox runs AA, so why can't a similar architecture run it? The Xbox's ATi chip sits somewhere between the 2900 XT and the 1950 XTX. It doesn't work with my 2900 XT or my 1950 XTX... how about that?
(Yes, I tested :D)

And BTW, I have a shitvidia card which I could have used, but nVidia doesn't let me. Nvidia, the way it's meant to fail. I'm talking about PhysX with an ATI card doing the rendering. I totally liked nVidia products, until:

Rename.
Rename.
"Meant to be played" issues here and there.
PhysX bullshit pushing.
Bashing AMD for no reason.
Bashing again.
"The way it's meant to be played" starting to become "the way it's meant to bug you".
Till:
"The way it's meant to piss you off big time."

PhysX for me was money well thrown away. I could sell the card, but it's not worth anything now anyway, thanks to the HD 5xxx.
 
Well, nothing more needs to be said by me to elaborate on this BS brought on by Nshitia again. I just want them to pull this shit with the PS3's RSX so I can benefit from it at least one time :laugh:.
 

btarunr

Editor & Senior Moderator
Staff member
ATI GPUs can handle that game's in-game AA; this comes from AMD. The game just disables the feature when it sees an AMD GPU. This is total blasphemy. I'm not going to (and can't) tell you what you should choose with your wallets, but I'll tell you what my wallet says:

"No Batman: Arkham Asylum for GeForce for you, bta."

Evil wallet.
 

newtekie1

Semi-Retired Folder
This is common knowledge. It's been discussed before.

BTW, AA isn't an added feature... it's a standard.

It's been discussed, but no one has ever shown any proof that nVidia was really behind Ubisoft removing it. Not a single shred. Plenty of claims, but claims don't equate to proof.

And AA is a feature that has to be added to a game; it isn't just magically in there, at least not the type of optimized, adaptive AA that is present in Batman.

FSAA is just a driver switch that any developer can enable. However, it always comes with a drop in framerate. The AA used in Batman has been optimized to not only make the game look better, but to do it at no performance loss, by controlling which objects get AA applied and which don't. This is definitely not a standard feature in games.

If ATi users want AA, they can enable it in CCC; there are plenty of other games that don't have built-in AA and require this too. You will get an FPS hit, just like in those other games.

ATI GPUs can handle that game's in-game AA; this comes from AMD. The game just disables the feature when it sees an AMD GPU. This is total blasphemy. I'm not going to (and can't) tell you what you should choose with your wallets, but I'll tell you what my wallet says:

"No Batman: Arkham Asylum for GeForce for you, bta."

Evil wallet.

In the demo, yes, but how do we know it doesn't cause a problem further along in the game, as I've already pointed out? They haven't tested more than 15 minutes of gameplay, and we all assume it works through the entire game.

How many times have we played a game that worked fine through 3-4 hours of gameplay, then suddenly crashed at the exact same spot no matter what we did? I know it's happened to me several times in the many years I've been playing, in games released as recently as a few months ago. It's actually pretty common in newly released games, because the drivers haven't been fixed yet. The solution is often to disable some visual feature (because the drivers don't like it), or to wait for better drivers.

We don't know that this isn't the case here. Instead, some are jumping to the conclusion that because it has an nVidia stamp on it, nVidia disabled the feature for ATi. We don't know that. And frankly, for a news reporter to even suggest it without a shred of proof completely removes whatever credibility that news reporter has.
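For anyone curious how that kind of selective AA works in practice, here is a minimal, hypothetical C++ sketch (the object names and the wantsAA flag are invented for illustration; the actual Batman: AA renderer code isn't public). The idea is simply that the engine splits the frame into an MSAA pass for geometry that benefits and a cheaper plain pass for everything else, instead of paying the AA cost on every draw call the way driver-forced FSAA does:

```cpp
#include <iostream>
#include <string>
#include <vector>

// One queued draw call; wantsAA marks geometry whose edges benefit from MSAA.
struct DrawCall {
    std::string name;
    bool wantsAA;
};

int main() {
    std::vector<DrawCall> frame = {
        {"batman_cape", true}, {"railing", true},
        {"skybox", false},     {"ui_overlay", false},
    };

    // Pass 1: only flagged objects are rendered into the multisampled target.
    for (const DrawCall& d : frame)
        if (d.wantsAA) std::cout << "MSAA pass:  " << d.name << '\n';

    // Pass 2: everything else skips the MSAA cost entirely.
    for (const DrawCall& d : frame)
        if (!d.wantsAA) std::cout << "plain pass: " << d.name << '\n';
}
```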
 
You help to pay for a game's creation and you want something in return? That's crazy; Nvidia must have used magic. :laugh:

There are games optimized to do well with Nvidia hardware and vice versa. It's nothing new; it's why they throw their hat in the ring to help out with development and funding, and it's what they get in return. It's not like it's the best game of the year. It probably sucks. But its sales are supposed to be good, so idk.

No reason to act "butthurt", you guys.

Ignoring the artificial limitation of the extra PhysX effects to the GPU for a second: there is a reason to be annoyed. NVIDIA and the game developer colluded to make the game run WORSE on ATI hardware. There is no hardware reason why Batman: AA can't do MSAA on ATI hardware.

Then there is the PhysX issue, where the developer did a rather shitty job too. Advanced PhysX effects only run on NVIDIA GPUs; there is no in-game option to enable them on the CPU. And some of the effects, like cloth and dynamic fog, were simply removed from the non-PhysX version. Apparently it was too much work to replace them with at least semi-static ones...

Bottom line is that this game, with help from NVIDIA, was intentionally neutered when run on non-NVIDIA hardware.
 
How do you know this?


There are many articles out there hinting at that. But in some articles Nvidia insists they had no hand in the removal of DirectX 10.1 support, which is naturally what they'll say (and Ubisoft likewise says that "implementation is costly"; see the TechReport link).

DirectX 10.1 support was removed because:

1. Nvidia cards don't support it.
2. HD 3000 series cards were 20% better than their respective GeForce 9 series counterparts with DirectX 10.1 (and AA enabled).

DirectX 10.1 gives the shader units access to all anti-aliasing buffers in a single pass, something that developers have been unable to do with DirectX 10.0. "DX10.0 screwed AA [performance]. . . . 10.1 would solve that [issue]," said one developer reportedly close to Ubisoft.

"Of course it removes the render pass! That's what 10.1 does! Why is no one pointing this out? That's the correct way to implement it, and is why we will implement 10.1. The same effects in 10.1 take 1 pass, whereas in 10 they take 2 passes," added another anonymous developer, said to be working on a title that implements DirectX 10.1 support in addition to DirectX 10.

The quoted part makes Ubisoft's reasoning pointless. How can making a process take one less step to finish be "costly"? Which ultimately feeds fuel to the fire that there really is a different reason.


http://techreport.com/discussions.x/14707
http://www.bit-tech.net/news/hardware/2008/05/12/ubisoft-caught-in-assassin-s-creed-marketing-war/1
http://www.tgdaily.com/content/view/37326/98/
http://www.fudzilla.com/index.php?option=com_content&task=view&id=7355&Itemid=1
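To make the render-pass point concrete, here is a minimal, hypothetical C++ sketch (not Ubisoft's code) of how a DX10-era title could probe for a DirectX 10.1 device and fall back to 10.0, then branch its AA path accordingly; on 10.1 hardware the shader can read every AA sample buffer in one pass, while on 10.0 the same effect needs a second pass. It assumes a Windows build linked against d3d10_1.lib, and error handling is trimmed:

```cpp
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

// Try for a 10.1 device first; fall back to plain 10.0 on hardware
// (like NVIDIA's DX10 parts at the time) that doesn't support it.
ID3D10Device1* CreateBestDevice(bool* outHas10_1) {
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,  // single-pass access to the AA sample buffers
        D3D10_FEATURE_LEVEL_10_0,  // same AA effect needs an extra render pass
    };
    for (D3D10_FEATURE_LEVEL1 level : levels) {
        ID3D10Device1* dev = nullptr;
        if (SUCCEEDED(D3D10CreateDevice1(nullptr, D3D10_DRIVER_TYPE_HARDWARE,
                                         nullptr, 0, level,
                                         D3D10_1_SDK_VERSION, &dev))) {
            *outHas10_1 = (level == D3D10_FEATURE_LEVEL_10_1);
            return dev;  // renderer picks its one-pass or two-pass AA path from this
        }
    }
    return nullptr;  // no DX10-class hardware at all
}
```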
 
If we would all get over ourselves: changing device IDs to get software working is not new. How about when Far Cry detected that you were using NV3x hardware and the precision was changed to FP16 instead of FP32? You could disable this by changing the vendor and device ID. (I had the Far Cry ATI demo, which apparently used TruForm and would not work on NVIDIA hardware, but changing the device and vendor ID allowed it to work on NV hardware as well.)

Also, Batman uses Unreal Engine 3, which actually doesn't support AA (at least MSAA) natively, so some tweaking needs to be done to get it to support AA properly. If NVIDIA paid for these optimizations and for getting AA working in this title, then they should benefit from that. (TWIMTBP isn't just a stamp; they actually send people out to sit with developers and optimize the game together. No money is paid to the developer to lower performance on competitor hardware!)

ATI used to have a GITG ("Get in the Game") campaign which vanished into thin air, despite the company having said NVIDIA's campaign is nothing more than a marketing gimmick. :rolleyes:

I'm not sure of the AA implementation of UE3 games on the Xbox, but chances are it's SSAA, which is exactly what we can get on our ATI graphics cards. And be it SSAA or MSAA, on a console you can tweak performance right down to the per-cycle level; don't compare a closed system with a PC.

So before we say there's a conspiracy, let's calm ourselves and think about it a little.
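Since several posts mention changing device and vendor IDs, here is a minimal, hypothetical C++ sketch of the kind of vendor check a title can do through Direct3D 9 (how Batman: AA actually gates the option is not public, so the gating logic here is an assumption); spoofing the reported ID is exactly what defeats a check like this. 0x10DE and 0x1002 are NVIDIA's and ATI/AMD's PCI vendor IDs. Assumes a Windows build linked against d3d9.lib:

```cpp
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Ask the driver who made the default adapter.
    D3DADAPTER_IDENTIFIER9 id = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

    if (id.VendorId == 0x10DE)        // NVIDIA
        std::printf("NVIDIA adapter: expose the in-game AA option\n");
    else if (id.VendorId == 0x1002)   // ATI/AMD
        std::printf("ATI adapter: hide the in-game AA option\n");
    else
        std::printf("Other vendor (0x%04lX)\n", (unsigned long)id.VendorId);

    d3d->Release();
    return 0;
}
```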
 
Well, if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting those optimizations to nVidia hardware. [...] Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.



How do you know this?

Anyone else hear this shit?
 

newtekie1

Semi-Retired Folder
There are many articles out there hinting at that. But in some articles Nvidia insists they had no hand in the removal of DirectX 10.1 support, which is naturally what they'll say (and Ubisoft likewise says that "implementation is costly"; see the TechReport link). [...]

Ah, so a bunch of conspiracies with no evidence at all. That's what I thought.
 
Game just got patched. :p
 
Ah, so a bunch of conspiracies with no evidence at all. That's what I thought.

EVIDENCE 1: A specific API for the game can increase performance.
EVIDENCE 2: That API, however, is only supported by ATi cards.

The API was removed from the game. So the game's developers don't want increased performance in their games? :confused:
 

newtekie1

Semi-Retired Folder
EVIDENCE 1: A specific API for the game can increase performance.
EVIDENCE 2: That API, however, is only supported by ATi cards.

The API was removed from the game. So the game's developers don't want increased performance in their games? :confused:

That's grasping at straws, at best.

And if the increased performance comes at the cost of stability, no, they probably don't. Especially when they have to handle all the calls from people complaining about the game crashing all the time. I bet if they gave all those people your phone number, you'd want DX10.1 removed too.

And since I forgot to add it in the previous post: DX10.1 is "costly" because it requires extra development time to implement in the game's code. It is not costly to render, which is what the quote you posted talks about; but that is not the "costly" we mean when we say it is costly to implement. DX10 has to be implemented either way; DX10.1 only adds to development costs.
 
Bottom line is that this game, with help from NVIDIA, was intentionally neutered when run on non-NVIDIA hardware.

Bottom line is, if a game doesn't support ATi equally, it's an "evil" game and you feel wronged?

If Nvidia pays for development, they could make sure ATi cards can't play it at all. The developer is a company. It's not required to make a game run in any particular way. If it doesn't play to your liking, don't buy it.

There is no "international video game creation bill of rights". A company can make a game play however they want, as long as it doesn't cause harm to the person playing it or his/her property. That's reality. If a game developer doesn't support your hardware to your liking, don't buy the game. :)

Game just got patched. :p

If you're serious, that makes this look no longer intentional on the developer's part. Got a link, or are you trollin'? :)
 
That's grasping at straws, at best. [...] DX10 has to be implemented either way; DX10.1 only adds to development costs.

I wasn't grasping at straws, actually, since I asked a question rather than making a statement.

The removal of DX10.1 support happened THROUGH a patch. So how did it get into the initial version in the first place if it was costly? Why did they include it at all?

Stability issues were almost always on Nvidia cards, though (pre-patch). And a post here about AC on an ATi card said the game ran perfectly before the patch, but crashes after it. Selective stability, then?
 
That's grasping at straws, at best. [...] DX10 has to be implemented either way; DX10.1 only adds to development costs.


Thats true. But I did hear that DX11(not DX10.1) makes it cost less because they made the tools easier to use for the developers somehow with the creation of DX11.
 
The setting now says: "Use ATI Control Panel". Why doesn't it work, if ATI themselves made it work by changing the ID?

http://bildr.no/view/497400
 
Well, tell ATI to invest more in the development and refinement of their drivers.

The biggest problem is that ATI's drivers have always been poor; if they weren't, ATI would be on par with nvidia by now, or even ahead.

ATI GPUs have tremendous computing power, but they are too lazy to develop drivers able to exploit it.

I would say stop the childish fanboyism, the "I hate nvidia" act, etc. ...
 

newtekie1

Semi-Retired Folder
I wasn't grasping at straws, actually, since I asked a question rather than making a statement.

The removal of DX10.1 support happened THROUGH a patch. So how did it get into the initial version in the first place if it was costly? Why did they include it at all?

Stability issues were almost always on Nvidia cards, though (pre-patch). And a post here about AC on an ATi card said the game ran perfectly before the patch, but crashes after it. Selective stability, then?

I was explaining why implementing DX10.1 is costly in the first place, as you seem to believe it comes for free. In the Ubisoft case, they didn't say they removed it because it was costly; their stated reason for removing it was that it made the game unstable. In that case, it had nothing to do with being costly to implement (though it might have been costly to fix the implementation...).

The stability issues on nVidia cards were mostly due to PhysX. I'm sure there were plenty of stability issues on ATi cards too, but they were drastically overshadowed by the PhysX ones. It might have come down to a decision of which features to fix and which to give up on. Sometimes that is what has to be done in the business world.

The patch definitely made the game more stable on both sides, but no game is ever going to be perfect. There will always be crashes on certain configurations.
 