
Criticism of Nvidia's TWIMTBP Program - HardOCP's Just Cause 2 Review

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
You have misunderstood me. I did not state that AA is something that exists in all graphics engines, but rather something that should exist in all graphics engines. Where we differ is that I place the onus on the developer to ensure that this feature is included, irrespective of the engine they decide to use.

Yes, this is my thing as well. If they cannot do it, they shouldn't be developing games in the first place. AA is basic stuff. Next, they won't be able to get AF to work...

I mean really...you can expect very little from those developers, for sure. If it were any character other than Batman, the game would have been passed over by everyone who hyped it as being so good.

Game developers use pre-made engines for a reason. It is cheaper and less time-consuming than creating their own. That certainly does not make them unsuited to develop games. If we are saying that every game developer that uses a pre-made game engine shouldn't be making games, we wouldn't see 90% of the games we have, including some extremely good ones.

It takes huge resources to develop an engine, and it takes huge resources to implement things like in-game AA. Which is exactly why most game developers use another, larger studio's pre-built engine. We are talking about some pretty big developers too: EA, Midway, Sony, Square Enix. Some pretty good games also: X-Men Origins, Unreal Tournament 3, Rainbow Six Vegas and Vegas 2, Stranglehold, Mirror's Edge, and the list goes on.

Hell, even Unreal Tournament 3, the premier game on Unreal Engine 3, lacks in-game AA.

Uh, you owe me an apology.



http://ati.amd.com/products/radeonx1900/index.html



http://ati.amd.com/products/radeonx1800/index.html

Every card since the X800 has supported PS3.0. Lack of support on legacy products is unimportant.

No, just because you have an inability to read and comprehend, that does not mean I owe you an apology.

Dave, he has only 3 #s, meaning X850/X800 or similar cards. At that time the 6800/7 series already had it.

None of my X850 XTPEs has SM3... didn't run the purty dragon test in 3DM06 till I got a 7600.

He said X###, NOT X####. That is X800, X600... Nvidia had PS3.0 on their GF6800 hardware; ATi was like 6 months late with PS3.0 hardware. Games like Oblivion lacked proper PS3.0 for that reason, for example.

See, they don't have a problem with reading and comprehension.
 

HeroPrinny

New Member
Joined
Feb 10, 2010
Messages
73 (0.01/day)
Chill, people, chill. This could be a very interesting topic, but this back-and-forth slagging ruins it.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.61/day)
Hell, even Unreal Tournament 3, the premier game on Unreal Engine 3, lacks in-game AA.

Actually, only in DX9 does it not support AA, due to its use of deferred rendering. It's the use of deferred rendering that makes it THE choice for games with PhysX. There's no reason for AA to be left out of DX10 UE3 games.

Tim Sweeney: Yes, we'll ship Unreal Tournament 3 with full DirectX 10 support. Support for multisampling is the most visible benefit.
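
For readers wondering what concretely changes between the DX9 and DX10 paths here, below is a minimal C++ sketch, not taken from UE3 or from the thread, of why a DX10 deferred G-buffer can cooperate with hardware AA: a multisampled render target can be bound as a shader resource (Texture2DMS in HLSL), so the lighting pass can read every sub-sample. The function name and the assumed `device` pointer are illustrative only.

Code:
#include <d3d10.h>

// Sketch only: create a 4x MSAA G-buffer color target that a deferred
// lighting pass can read per-sample. D3D9 has no equivalent: a
// multisampled surface must be resolved to a single-sample texture
// before it can be sampled, which discards the per-sample data and is
// why the DX9 path of deferred engines ends up without hardware AA.
ID3D10Texture2D* CreateMsaaGBufferTarget(ID3D10Device* device)
{
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width              = 1920;
    desc.Height             = 1080;
    desc.MipLevels          = 1;
    desc.ArraySize          = 1;
    desc.Format             = DXGI_FORMAT_R16G16B16A16_FLOAT;
    desc.SampleDesc.Count   = 4;   // 4x MSAA
    desc.SampleDesc.Quality = 0;
    desc.Usage              = D3D10_USAGE_DEFAULT;
    desc.BindFlags          = D3D10_BIND_RENDER_TARGET | D3D10_BIND_SHADER_RESOURCE;

    ID3D10Texture2D* gbufferColor = nullptr;
    if (FAILED(device->CreateTexture2D(&desc, nullptr, &gbufferColor)))
        return nullptr;
    return gbufferColor;   // readable per-sample as a Texture2DMS in HLSL
}

Reading a multisampled depth buffer in the shader, by contrast, only became possible with D3D10.1, which is relevant to the MSAA debate later in the thread.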
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Uh, actually, it was. Back then it was called WGF and was part of "Longhorn", circa 2001. DX 9.0c was released in 2004, three years after work on DX10 began.

http://en.wikipedia.org/wiki/Development_of_Windows_Vista

http://en.wikipedia.org/wiki/DirectX

Vista is not that Longhorn; in fact, Vista was more of a rushed Win7 than anything else. The original Longhorn was delayed and changed a lot over time, eventually adopting many things from the next OS M$ was working on, now called Win7. Things are not black and white, so M$ released 9.0c because Longhorn was no longer what it was meant to be: an OS for a 2003 release. Did they include things that were supposed to be part of Longhorn? Obviously they could not wait until Vista (aka rushed Win7) was released, so they implemented things that were much needed. Did they cripple DX10 because of that? Nope, and Nvidia certainly had nothing to do with it, which is what my post was about anyway.
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (0.99/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-fi Plantinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
You know what's interesting about that review is that even with those IQ features disabled the 480 still loses. I read some of the posts over there and one person said that the benchmark test clearly put the 480 ahead. I wonder if the drivers were optimized for the benchmark?
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.61/day)
Vista is not that Longhorn; in fact, Vista was more of a rushed Win7 than anything else. The original Longhorn was delayed and changed a lot over time, eventually adopting many things from the next OS M$ was working on, now called Win7. Things are not black and white, so M$ released 9.0c because Longhorn was no longer what it was meant to be: an OS for a 2003 release. Did they include things that were supposed to be part of Longhorn? Obviously they could not wait until Vista (aka rushed Win7) was released, so they implemented things that were much needed. Did they cripple DX10 because of that? Nope, and Nvidia certainly had nothing to do with it, which is what my post was about anyway.

We can start another thread on this all on its own; there's so much to this subject. So I'll leave the O/T chat for now. Your impression of what Longhorn, WGF, and DX10 are is a bit skewed, but if you can put up with me, I'd more than love to talk about it in another thread.
 
Joined
Oct 5, 2008
Messages
1,802 (0.32/day)
Location
ATL, GA
System Name My Rig
Processor AMD 3950X
Motherboard X570 TUFF GAMING PLUS
Cooling EKWB Custom Loop, Lian Li 011 G1 distroplate/DDC 3.1 combo
Memory 4x16GB Corsair DDR4-3466
Video Card(s) MSI Seahawk 2080 Ti EKWB block
Storage 2TB Auros NVMe Drive
Display(s) Asus P27UQ
Case Lian Li 011-Dynamic XL
Audio Device(s) JBL 30X
Power Supply Seasonic Titanium 1000W
Mouse Razer Lancehead
Keyboard Razer Widow Maker Keyboard
Software Window's 10 Pro
The developer is only responsible for delivering a product that the vast majority of their target audience will be willing to purchase, and that thus sustains the company until the next release. If you take issue with the product, vote with your wallet. Just like the graphics card company has to weigh features against cost. In the end, the goal is simply to make a profit.

We are the minority that spends heavily (in comparison to the average); we demand more and drive the industry. If Nvidia wants to ensure their product has a greater feature set, then paying a developer to test code is a valid way to do so, and it is not Nvidia's fault by any means for doing so. Where things get dicey is when a company can pay to put a competitor at a disadvantage intentionally. That, if provable (which conclusively has not been and never will be proven in a forum such as this), goes against anti-trust (mainly anti-competition) laws. Like the ones Apple is about to face, despite being a minority provider.

Now, you're thinking that paying a developer to test their code on your hardware by definition puts an opposing GPU manufacturer at a disadvantage. No it doesn't; ATi had a choice, and it made its decision. Again, vote with your wallet if you're unhappy with your horse.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
There's no reason for AA to be left out of DX10 UE3 games.

I think we can all agree with that, although I'm not sure they did implement it in the end. And there's no reason to leave it out of DX9 either. It can be done, after all, and Batman is the proof. The fact of the matter is that each and every UE3 game released to date is a DX9 game, as is 99% of games released (copying the code and creating a DX10 executable doesn't make it a DX10 game). It's a choice the developers are making, and only they are to blame. Nvidia's intervention (or AMD's, in the case of the few games where they helped) is always a good thing for us consumers and gamers.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.61/day)
Sure, but my point is that in DX10 UE3, AA is natively supported (on DX10 hardware).

Q: So why did nV need to code it?

A: For DX9 cards.

Q: So why does that affect ATi cards in DX10?

A: Because the developer purposely changed the AA implementation throughout the engine, breaking AA on ATi's cards, using nV's code.

So, nV broke AA in Batman, and seemingly, purposefully. This is the real basis for the whole argument in the first place.
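
To make the allegation concrete, here is a purely hypothetical C++ sketch of the kind of vendor gate being described. It is not actual Batman or UE3 code; the function name and behaviour are invented for illustration only.

Code:
#include <dxgi.h>

// Hypothetical illustration of a vendor-ID gate, NOT actual game code.
// The complaint in the thread is that the in-game AA option was keyed
// to the GPU vendor rather than to a capability check, so non-NVIDIA
// cards silently lost the feature.
bool InGameAAAllowed(IDXGIAdapter* adapter)
{
    DXGI_ADAPTER_DESC desc = {};
    if (FAILED(adapter->GetDesc(&desc)))
        return false;

    const UINT kNvidiaPciVendorId = 0x10DE;      // NVIDIA's PCI vendor ID
    return desc.VendorId == kNvidiaPciVendorId;  // anything else: option disabled
}

A capability-based test (for example, querying supported multisample quality levels) would let any DX10-class card through, which is the crux of the complaint.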
 
Last edited:

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Q: So why does that affect ATi cards in DX10?

A: Because the developer purposely changed the AA implementation throughout the engine, breaking AA on ATi's cards, using nV's code.

Once again, no. DX10 supports AA with deferred shading, but UE3 does not. Plain and simple. They said they would implement it in DX10, but I highly doubt they did in the end. And all that is irrelevant, since every game is using the DX9 code anyway (is there even a DX10 code path for UE3?). My point is that it doesn't matter whether Epic implemented AA in DX10 UE3 or not (if it exists at all), because developers are buying and using the DX9 one for multi-platform purposes.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.61/day)
Yes, there is a DX10 codepath, and I know for a fact that UE3 supports AA.

Tim Sweeney, who I quoted above, is the engine's writer, by the way. I know for a fact that since early 2008 AA has worked in DX10...you can even enable it in Vista with the UT3 demo.

And they do not "just buy the DX9 version"...one of the major selling points of UE3 is that you need not worry as the engine will scale features automatically to the platform it's on. Nv's code changes this detection method to use nV's code for AA at all times, rather than defaulting to the normal path for AA.

I mean sure, nV's code is brilliant. It comes at very little performance penalty compared to running the native implementation, but that doesn't change the fact that the code customized the engine, and broke AA.

I can even explain why the native method comes at such a performance hit, but that's irrelevant.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Yes, there is a DX10 codepath, and I know for a fact that UE3 supports AA.

Tim Sweeney, who I quoted above, is the engine's writer, by the way. I know for a fact that since early 2008 AA has worked in DX10...you can even enable it in Vista with the UT3 demo.

And they do not "just buy the DX9 version"...one of the major selling points of UE3 is that you need not worry as the engine will scale features automatically to the platform it's on. Nv's code changes this detection method to use nV's code for AA at all times, rather than defaulting to the normal path for AA.

I mean sure, nV's code is brilliant. It comes at very little performance penalty compared to running the native implementation, but that doesn't change the fact that the code customized the engine, and broke AA.

I can even explain why the native method comes at such a performance hit, but that's irrelevant.

I would like you to show proof of that, because I was never able to use AA in UT3 (nor Mass Effect, nor Mirror's Edge, Borderlands, Brothers in Arms...), except via the Control Panel, something that you can do with Batman too. I'd really appreciate it, because I see many people claim all those things everywhere and I have yet to see a single proof of anything.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
http://ati.amd.com/products/radeonx1800/index.html

Every card since the X800 has supported PS3.0. Lack of support on legacy products is unimportant.
I started the X### thing here.

9### = 9700 and 9800 family of cards (DirectX 9.0, Pixel Shader 2.0)
X### = X800 and X850 family of cards (DirectX 9.0b, Pixel Shader 2.0b)
X1### = X1800, X1900, and X1950 family of cards (DirectX 9.0c, Pixel Shader 3.0)
HD 2### = HD 2900 family of cards (DirectX 10.0, Pixel Shader 4.0)
HD 3### = HD 3800 family of cards (DirectX 10.1, Pixel Shader 4.1)
HD 4### = HD 4800 family of cards (DirectX 10.1, Pixel Shader 4.1)
HD 5### = HD 5800 family of cards (DirectX 11.0, Pixel Shader 5.0)

The X### Pixel Shader 2.0b cards debuted after the GeForce 6 series, which supported Pixel Shader 3.0. X1### didn't debut until late 2005, after the introduction of the GeForce 7 series. ATI was an entire generation behind NVIDIA at that time.
 
Last edited:

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.61/day)
I would like you to show proof of that, because I was never able to use AA in UT3 (nor Mass Effect, nor Mirror's Edge, Borderlands, Brothers in Arms...), except via the Control Panel, something that you can do with Batman too. I'd really appreciate it, because I see many people claim all those things everywhere and I have yet to see a single proof of anything.

Well, the DX10 AA uses SuperSampling AA. As we all know, not all drivers support this properly, so that may be a major reason people aren't getting it working right. It's also why it comes at such a huge performance penalty.


This is why you can't get it working in those games...they've simply turned off the switches in the GUI to enable it, but believe me, it's natively supported by the engine. Current-gen cards definitely have the power to use SSAA, but mid-range and lower cards might not be so capable.

What it boils down to is that deferred rendering is hard to balance properly for most programmers. So they have to make choices as to where they spend their dollars, as many have mentioned, and AA is low on this list, due to the aforementioned performance problems.

However, deferred rendering is the way of the future for all engines, so programmers had better get used to it, and get good at it too...and when things like nV stepping in happen, it removes the need for them to do so. THAT's why I have an issue with nV helping with AA...it's cool that nV users get a benefit, but it should not break the experience for others.
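
To put that "huge performance penalty" into rough numbers, here is a back-of-the-envelope C++ sketch; the resolution and the 2x2 factor are illustrative assumptions, not figures from the thread or from any benchmark.

Code:
#include <cstdio>

int main()
{
    // 2x2 supersampling renders the frame at twice the width and height
    // and filters it down, so every displayed pixel is shaded four times.
    const unsigned long long displayW = 1920, displayH = 1080;
    const unsigned long long factor   = 2;

    const unsigned long long nativePixels = displayW * displayH;
    const unsigned long long ssaaPixels   = nativePixels * factor * factor;

    std::printf("native frame: %llu pixels shaded\n", nativePixels);
    std::printf("2x2 SSAA frame: %llu pixels shaded (~%llux the work)\n",
                ssaaPixels, ssaaPixels / nativePixels);

    // MSAA, by contrast, shades each pixel once and only multiplies the
    // coverage/depth samples, which is why its hit is normally far
    // smaller -- and why losing it in a deferred renderer stings.
    return 0;
}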
 
Last edited:

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Well, the DX10 AA uses SuperSampling AA. As we all know, not all drivers support this properly, so that may be a major reason people aren't getting it working right. It's also why it comes at such a huge performance penalty.


This is why you can't get it working in those games...they've simply turned off the switches in the GUI to enable it, but believe me, it's natively supported by the engine. Current-gen cards definitely have the power to use SSAA, but mid-range and lower cards might not be so capable.

What it boils down to is that deferred rendering is hard to balance properly for most programmers. So they have to make choices as to where they spend their dollars, as many have mentioned, and AA is low on this list, due to the aforementioned performance problems.

Yeah, SSAA has always been an option for UE3 (always through the CP) and it is also available in BM:AA. It's MSAA that it lacks, and that's what this whole thing is about. While not exactly MSAA, Nvidia's implementation is similar, and that's why it doesn't have as large a penalty as SSAA. And that was always the thing: you had to enable SSAA through the CP for ATi cards, but Nvidia had this MSAA + edge-detect thing. SSAA has always been available, and yes, it's something any DX10 card can do because it's not really part of the fragment stage of the renderer, but that's not what has ever been discussed. I think you are a little bit lost here. So much written, and you didn't even understand the beginning of the story. SSAA is there for all cards; "MSAA" is not.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.61/day)
Yes, but the engine has the ability to enable SSAA from the console...there's no need to use the CP to enable it, other than developer's choice (it works with the UDK).

I get what you are saying, but that's not the point. SSAA is broken too, in Batman. It should not be.
 
Joined
Jul 21, 2008
Messages
5,175 (0.90/day)
System Name [Daily Driver]
Processor [Ryzen 7 5800X3D]
Motherboard [Asus TUF GAMING X570-PLUS]
Cooling [be quiet! Dark Rock Slim]
Memory [64GB Corsair Vengeance LPX 3600MHz (16GBx4)]
Video Card(s) [PNY RTX 3070Ti XLR8]
Storage [1TB SN850 NVMe, 4TB 990 Pro NVMe, 2TB 870 EVO SSD, 2TB SA510 SSD]
Display(s) [2x 27" HP X27q at 1440p]
Case [Fractal Meshify-C]
Audio Device(s) [Steelseries Arctis Pro]
Power Supply [CORSAIR RMx 1000]
Mouse [Logitech G Pro Wireless]
Keyboard [Logitech G512 Carbon (GX-Brown)]
Software [Windows 11 64-Bit]
Mods shoulda listened to me in the beginning... it didn't turn into an Nvidia-bashing thread, but an e-peen war over who knows more about AA.
 

sneekypeet

Retired Super Moderator
Joined
Apr 12, 2006
Messages
29,409 (4.46/day)
System Name EVA-01
Processor Intel i7 13700K
Motherboard Asus ROG Maximus Z690 HERO EVA Edition
Cooling ASUS ROG Ryujin III 360 with Noctua Industrial Fans
Memory PAtriot Viper Elite RGB 96GB @ 6000MHz.
Video Card(s) Asus ROG Strix GeForce RTX 3090 24GB OC EVA Edition
Storage Addlink S95 M.2 PCIe GEN 4x4 2TB
Display(s) Asus ROG SWIFT OLED PG42UQ
Case Thermaltake Core P3 TG
Audio Device(s) Realtek on board > Sony Receiver > Cerwin Vegas
Power Supply be quiet DARK POWER PRO 12 1500W
Mouse ROG STRIX Impact Electro Punk
Keyboard ROG STRIX Scope TKL Electro Punk
Software Windows 11
So? It's still civil. Quit poking or you will earn yourself points.

Either post on the topic or move along, thank you.
 

BababooeyHTJ

New Member
Joined
Apr 6, 2009
Messages
907 (0.16/day)
Processor i5 3570k
Motherboard Gigabyte Z77 UD5h
Cooling XSPC Raystrom
Memory 4x4GB DDR3 1600mhz
Video Card(s) 7950 crossfire
Storage a few
Display(s) 27" Catleap
Case Xigmatek Elysium
Audio Device(s) Xonar STX
Power Supply Seasonic X1050
Software Windows 7 X64
Well, the DX10 AA uses SuperSampling AA. As we all know, not all drivers support this properly, so that may be a major reason people aren't getting it working right. It's also why it comes at such a huge performance penalty.

The quote that you posted by Tim Sweeney mentions multisampling, btw.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Yes, but the engine has the ability to enable SSAA from the console...there's no need to use the CP to enable it, other than developer's choice (it works with the UDK).

I get what you are saying, but that's not the point. SSAA is broken too, in Batman. It should not be.

This is the first time I've heard that SSAA is broken in BM:AA. It certainly isn't a widespread issue (0 Google results on this end).

Maybe:

As we all know, not all drivers support this properly, so that may be a major reason people aren't getting it working right. It's also why it comes at such a huge performance penalty.

Many games have issues with SSAA, so I'm not doubting that BM:AA has some. I've had some issues with many games myself.

In any case, the point comes home that Nvidia has nothing to do with the lack of proper AA in BM:AA (in fact, if SSAA is broken, their AA implementation is a blessing). Nor do they with any of the many other things they've been accused of. But I guess it's easier to blame someone, even if it's not the right someone...

Yes. And what is the difference between MSAA and SSAA?

Are you serious?
 

ctrain

New Member
Joined
Jan 12, 2010
Messages
393 (0.08/day)
Your assumption that AA is something that exists in all graphics engines and game titles by default is wrong.

Of course it exists by default in all graphics engines; the question/problem at hand is rather whether any of the rendering techniques they use break hardware AA...

...as is the case with UE3; it's a DX9 limitation.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.61/day)
Are you serious?


No, but if you understand the differences, you know that Tim was referring to SSAA only. DX10.1 is required for MSAA in deferred rendering. ;)

Well, at least it was, and that's what makes nV's code so great. :wtf:

Why that screws with SSAA, I don't know the precise details, but I do have a couple of good possibilities. I need an nV card to find out the exact cause. :eek:

If it is using DX10.1-type commands...wow...there really is something to this...because according to nV, their cards don't support 10.1. So how the heck did they get that working? ;)
 

enaher

New Member
Joined
Feb 21, 2009
Messages
416 (0.08/day)
System Name One does not simply have enough cash for INTEL
Processor Athlon II 450 3.8ghz
Motherboard Gigabyte 990FX-UD3
Cooling Cooler Master 212+
Memory 4x4gb Corsair DDR3
Video Card(s) 2x 5830 Sapphire 950c/1150m CF
Storage 2x Seagate 500GB Raid0 + Samsung 1TB
Display(s) BenQ G2220HDA
Case Xigmatek ASGARD PRO
Power Supply SeasonicX 750w
Software W7 x64 Ultimate
Well, the use of proprietary technology sucks big time, but Nvidia's TWIMTBP is not to blame. Nvidia offers extras and that's fine; I can't see anything wrong with it. Actually, I'd like ATi to work with devs, improve Stream and OpenCL applications, and promote open standards. I'm sure ATi hardware has some untapped potential.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
DX10.1 is required for MSAA in deferred rendering.;)

:confused: It was always possible via a separate Z buffer (even on DX9), although it was more tricky and maybe not as desirable. DX10.1 only made it faster and introduced a clearer interface.
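
As a rough illustration of that "separate Z buffer" workaround, here is a minimal D3D9 C++ sketch: depth is written into an ordinary single-channel float render target during the G-buffer pass so later passes can sample it, since a standard D3D9 depth-stencil surface cannot be read in a shader (vendor-specific formats aside). The function name and the assumed, already-created `device` are illustrative only.

Code:
#include <d3d9.h>

// Sketch only: an R32F color target that the G-buffer pass fills with
// linear depth, so edge-detection / AA / fog passes can sample "depth"
// on DX9 hardware that cannot sample its actual depth-stencil surface.
IDirect3DTexture9* CreateDepthAsColorTarget(IDirect3DDevice9* device,
                                            UINT width, UINT height)
{
    IDirect3DTexture9* depthTex = nullptr;
    HRESULT hr = device->CreateTexture(width, height, 1,
                                       D3DUSAGE_RENDERTARGET,
                                       D3DFMT_R32F,       // one 32-bit float channel
                                       D3DPOOL_DEFAULT,
                                       &depthTex, nullptr);
    return SUCCEEDED(hr) ? depthTex : nullptr;
}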

edit: Yeah, it wouldn't be exactly MSAA, but rather an edge-detect algorithm + SSAA on the edges. So... it wouldn't be Coke, it would be Pepsi, but essentially the same. It's probably exactly what's being done in BM:AA.
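
As a rough sketch of that edge-detect-plus-supersample idea, here is a small CPU-side C++ illustration: pixels whose depth differs sharply from a neighbour are flagged as candidates for extra samples. Real implementations run on the GPU against the G-buffer; the function name and threshold are invented for illustration.

Code:
#include <cmath>
#include <cstddef>
#include <vector>

// Flag pixels that sit on a depth discontinuity; only these would get
// the expensive extra samples, which is the edge-detect + SSAA-on-edges
// approach described above. Purely illustrative.
std::vector<bool> FlagEdgePixels(const std::vector<float>& depth,
                                 std::size_t width, std::size_t height,
                                 float threshold = 0.01f)
{
    std::vector<bool> isEdge(width * height, false);
    for (std::size_t y = 0; y + 1 < height; ++y) {
        for (std::size_t x = 0; x + 1 < width; ++x) {
            const float d  = depth[y * width + x];
            const float dx = std::fabs(d - depth[y * width + (x + 1)]);
            const float dy = std::fabs(d - depth[(y + 1) * width + x]);
            if (dx > threshold || dy > threshold)
                isEdge[y * width + x] = true;   // candidate for extra samples
        }
    }
    return isEdge;
}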
 
Last edited: