
Criticism of Nvidia's TWIMTBP Program - HardOCP's Just Cause 2 Review

Joined
Apr 12, 2010
Messages
1,359 (0.27/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
I am not a fan of Nvidia’s marketing practices, as I firmly believe they damage the PC gaming industry, creating division through the use of proprietary technology where open standards are available that can produce exactly the same effects. Many people criticise ATI for the scant attention it pays to developer relations, citing the tangible advances that Nvidia offers gamers through its TWIMTBP releases and the use of CUDA and PhysX. However, I do not want to see ATI respond in kind, as Nvidia seems intent on creating a situation in which the consumer will eventually be forced to ask whether a given game is an “ATI title” or an “Nvidia title”, with performance essentially crippled on the competitor’s cards.

I did not buy, nor will I buy, Batman: Arkham Asylum, as Nvidia paid the developer to turn off in-game AA on ATI cards. I find that reprehensible: it is one thing to optimise a title for your hardware; it is another to pay the developer to ensure that the performance of a given title is reduced when you use the competitor’s hardware. Again, whilst many people ask why ATI didn’t pay the developer to ensure that this function was enabled for its hardware, I firmly believe that development should be left to the developers and that certain aspects of a game, such as AA, should be available irrespective of the brand of graphics card that the consumer decides to purchase. By all accounts Arkham Asylum is an excellent game; however, my principles will not allow me to support such practices with my money – to each, his own.

I just finished reading the review of Just Cause 2 over on HardOCP. It was very refreshing to find a reviewer who, despite the obvious pressure placed on tech sites, was willing to openly criticise Nvidia’s TWIMTBP program and marketing practices:

http://www.hardocp.com/article/2010/05/04/just_cause_2_gameplay_performance_image_quality

The Way It’s Meant to be Played?
We have no doubt that the Bokeh filter and the GPU Water Simulation options could have been executed successfully on AMD’s Radeon HD 5000 series of GPUs. That the developers chose NVIDIA’s CUDA technology over Microsoft DirectCompute or even OpenCL is probably due to the fact that NVIDIA’s developer relations team worked with Avalanche Studios developers, and of course they like to promote their own products. (We would surely love to see the contract between the two, but that will never happen.) It is certainly their right to do so, just as it is Avalanche’s right to choose whatever API they want to use. We would certainly not presume to tell any independent game developer how to design their own game, but we would suggest that a more open alternative (such as OpenCL or DirectCompute) would have been preferred by us for those gamers without CUDA compatible hardware.
This is an old argument, and is basically analogous to the adoption of PhysX as opposed to a more broadly compatible physics library. NVIDIA wants to increase its side of the GPU business by giving its customers a "tangible" advantage in as many games as possible, while gamers without NVIDIA hardware would prefer that game developers had not forgotten about them.
As it stands for Just Cause 2, gamers without NVIDIA hardware are missing a couple of really nice graphics features, but those features are not critical to the enjoyment of the game. Just Cause 2 still looks just fine and is just as fun without them. But if you want the very best eye candy experience possible, NVIDIA's video cards, especially the GeForce GTX 480 and GTX 470, will give it to you.
When NVIDIA tells us that it will "Do no harm!" when it comes to gaming, that is really a bold-faced lie, and we knew it when it was told to us. It will do no harm to PC gaming only when it fits its agenda. NVIDIA is going to continue to glom onto its proprietary technologies so that it gains a marketing edge, which it very much does through its TWIMTBP program. And we have to assume that marketing edge is worth all the bad press it generates. To say NVIDIA does no harm to PC gaming is delusional at best. You AMD users just got shafted on these cool effects that could have been easily developed for all PC gamers instead of just those that purchase from one company.


This is another title that I refuse to buy. I doubt that this will cause much concern to Avalanche Studios, but if sufficient numbers avoid a developer’s titles because it allows Nvidia access to areas of development that should employ open standards, the developers may begin to take notice. Hopefully, TWIMTBP will become a thing of the past as it creates division and artificially introduced differences that are neither necessary nor desirable from the point of view of the consumer.
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.23/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
reported for flame/troll-tastic remarks

He's stated his opinion and cited an article that supports a similar view. How is that trolling? Let the man speak his mind, but if you disagree simply explain yourself. The issue here is that NV puts money and man-power into finding ways for games to run better on their GPUs. If it's bad marketing let it be known, but if it's just your average trick of the trade then this thread will die in a matter of hours.
 
Joined
Jul 21, 2008
Messages
5,174 (0.90/day)
System Name [Daily Driver]
Processor [Ryzen 7 5800X3D]
Motherboard [Asus TUF GAMING X570-PLUS]
Cooling [be quiet! Dark Rock Slim]
Memory [64GB Corsair Vengeance LPX 3600MHz (16GBx4)]
Video Card(s) [PNY RTX 3070Ti XLR8]
Storage [1TB SN850 NVMe, 4TB 990 Pro NVMe, 2TB 870 EVO SSD, 2TB SA510 SSD]
Display(s) [2x 27" HP X27q at 1440p]
Case [Fractal Meshify-C]
Audio Device(s) [Steelseries Arctis Pro]
Power Supply [CORSAIR RMx 1000]
Mouse [Logitech G Pro Wireless]
Keyboard [Logitech G512 Carbon (GX-Brown)]
Software [Windows 11 64-Bit]
it's an nvidia bashing thread... how is that not trolling and why is it in the games section
 

sneekypeet

Retired Super Moderator
Joined
Apr 12, 2006
Messages
29,409 (4.47/day)
System Name EVA-01
Processor Intel i7 13700K
Motherboard Asus ROG Maximus Z690 HERO EVA Edition
Cooling ASUS ROG Ryujin III 360 with Noctua Industrial Fans
Memory PAtriot Viper Elite RGB 96GB @ 6000MHz.
Video Card(s) Asus ROG Strix GeForce RTX 3090 24GB OC EVA Edition
Storage Addlink S95 M.2 PCIe GEN 4x4 2TB
Display(s) Asus ROG SWIFT OLED PG42UQ
Case Thermaltake Core P3 TG
Audio Device(s) Realtek on board > Sony Receiver > Cerwin Vegas
Power Supply be quiet DARK POWER PRO 12 1500W
Mouse ROG STRIX Impact Electro Punk
Keyboard ROG STRIX Scope TKL Electro Punk
Software Windows 11
I guess when the OS was loaded you were OK with Bill Gates and his "the way a PC is to be run"?

Same practices, and let's not forget Intel's latest fiasco with OEM builders and what they did to AMD.

You are simply forgetting the golden rule... the one with the gold makes the rules!

I understand the frustration, but Nvidia is just following in the footsteps of much larger entities that set precedents before it got here;)

@ shib, the OP is directed at both TWIMTBP and Just Cause 2;) Also, he IS allowed to bash and take sides on an article; it's usually what follows those posts that causes the trouble.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I did not buy, nor will I buy, Batman: Arkham Asylum, as Nvidia paid the developer to turn off in-game AA on ATI cards. I find that reprehensible: it is one thing to optimise a title for your hardware; it is another to pay the developer to ensure that the performance of a given title is reduced when you use the competitor’s hardware.

I believe you have a very incorrect view on this subject. Your claim is entirely untrue, and it has been discussed many times on this forum and others. Batman: Arkham Asylum is based on the Unreal Engine, and by default the Unreal Engine does not have AA capabilities. This means that the developer has to add AA if it wants it built into the game. It is a feature that would likely not have existed in the game at all if nVidia had not had it added.

In this case, nVidia paid to have AA added into the game for their hardware, and they also paid for the testing to ensure it worked on their hardware. ATi, on the other hand, did not, so I fail to see why you think they should benefit from it. nVidia most certainly did NOT pay to have AA disabled on ATi hardware; they paid to have it enabled on their hardware. And in fact, when it is force-enabled on ATi hardware, the game doesn't work. Why? Because it wasn't tested or optimized for ATi hardware. Why wasn't it tested or optimized for ATi hardware? Because nVidia was paying to have it developed, tested, and optimized on their hardware, not their competitor's.

Your assumption that AA is something that exists in all graphics engines and game titles by default is wrong.


This is another title that I refuse to buy. I doubt that this will cause much concern to Avalanche Studios, but if sufficient numbers avoid a developer’s titles because it allows Nvidia access to areas of development that should employ open standards, the developers may begin to take notice. Hopefully, TWIMTBP will become a thing of the past as it creates division and artificially introduced differences that are neither necessary nor desirable from the point of view of the consumer.

Again, if nVidia pays for the development of something, it has every right to not allow it to be used on the competitor's hardware, and that is likely the case here. I see nothing wrong with that, especially if it is something that likely wouldn't have been included in the game at all if not for nVidia paying to have it added (such is the case with AA in Batman). I'm not sure if this is the case with the effects in Just Cause 2 that they are talking about; I don't really see how either is necessary in the game, so I am actually very doubtful that either would be in the game at all if nVidia had not paid for the development. I have no problem with Just Cause 2 on my HD4890. It looks great and runs perfectly.

And oddly enough, you make a big stink about how Just Cause 2 should have used more open standards, and yet they used Havok for the physics engine instead of PhysX. I would think that if nVidia was really in that deep and paying them so well, they would have used nVidia's PhysX instead of Havok.
 

HeroPrinny

New Member
Joined
Feb 10, 2010
Messages
73 (0.01/day)
Again, if nVidia pays for the development of something, it has every right to not allow it to be used on the competitor's hardware, and that is likely the case here. I see nothing wrong with that, especially if it is something that likely wouldn't have been included in the game at all if not for nVidia paying to have it added (such is the case with AA in Batman). I'm not sure if this is the case with the effects in Just Cause 2 that they are talking about; I don't really see how either is necessary in the game, so I am actually very doubtful that either would be in the game at all if nVidia had not paid for the development. I have no problem with Just Cause 2 on my HD4890. It looks great and runs perfectly.
The only issue with doing things like that is that it doesn't bolster fair play. Why should those with an ATI card be hindered while those who have an nVidia card aren't? PC gaming isn't like console gaming. The Batman thing I have no issue with, due to Unreal having pretty much no AA in the first place and nVidia adding it; now if ATi ever decides to add AA, it would be really nice.
 

sneekypeet

Ever think that when you buy that cheaper ATI card (price-wise), development doesn't come into play? It isn't like Nvidia users aren't getting stabbed in the backside on pricing to offset said R&D.
 

HeroPrinny

Ever think that when you buy that cheaper ATI card (price-wise), development doesn't come into play? It isn't like Nvidia users aren't getting stabbed in the backside on pricing to offset said R&D.
That's all pricing-wise, though, and you have a choice to get an ATI or an Nvidia video card; you don't have a choice to have Just Cause 2 look the same as it does on Nvidia if you have an ATi. Then there's the whole Metro 2033 issue, and it seems like it always has PhysX on, as adding an 8600GT and doing the PhysX mod boosted my fps a lot, to the point where it was playable fully cranked on my 5870.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.79/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
I have to agree with Newtekie. Optimizing a title for nVidia is not the same as crippling ATI. The "crippling ATI" argument has no grounds to stand on. Don't you think ATI would have sued nV for anti-trust by now if that was actually what was happening?

All in all, it's up to the devs to use CUDA over OpenCL and DirectCompute. And I'll give you a huge reason why they choose to do so: the developer tools are much easier to use. Not to mention the hands-on support nV gives.

Nothing is stopping ATI from offering the same level of dev help.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
it's an nvidia bashing thread... how is that not trolling and why is it in the games section
NVIDIA doesn't deserve it? :p They've been pretty shady since, oh, about 2004 or 2005: from making deals with Microsoft (which prevented X### cards from having Pixel Shader 3.0 support, and broke DX10 into two parts, 10.0 and 10.1, because NVIDIA couldn't meet the full requirements), to monopolistic behavior after acquiring Ageia PhysX, to numerous repackagings of damn similar products to milk the dying cash cow (which could be filed under false advertising).

This is just another example of NVIDIA pushing their weight around.


Don't you think ATI would have sued nV for anti-trust by now if that was actually what was happening?
There's only so much AMD can do though because they aren't exactly in the best shape to finance a massive lawsuit.

Edit: I'm sure there was a recent lawsuit but I can't find any information on it. The FTC suit against ATI/NVIDIA and FTC suit against Intel are taking all the Google hits. :(
 

newtekie1

The only issue with doing things like that is that it doesn't bolster fair play. Why should those with an ATI card be hindered while those who have an nVidia card aren't? PC gaming isn't like console gaming. The Batman thing I have no issue with, due to Unreal having pretty much no AA in the first place and nVidia adding it; now if ATi ever decides to add AA, it would be really nice.

How are those with ATi cards hindered? They aren't.

If the features were something that would have never existed if nVidia didn't pay to have them implemented, then those with ATi cards are not missing out on anything they would have had if nVidia never stepped in.

I'll give an easy example:

Let's say the developers had planned to include features A, B, C, and D. Then nVidia comes along and says, "We'll pay for you to develop and include feature E, but only for our hardware." Then ATi cards are not being hindered because they can't use feature E; they still get the original A, B, C, and D they would have gotten.

Now, if those features were going to be part of the game for everyone, and nVidia paid to have them disabled on ATi hardware, then yes I agree with you. However, I have yet to see a single piece of evidence pointing to that being the case. Everything shows that nVidia paid to have the features added, and TWIMTBP is a program designed to get developers to optimize their games for nVidia hardware. This means making them run better on nVidia hardware than they would have with no optimization, which is how they run on ATi hardware.

prevented X### cards from having Pixel Shader 3.0 support

Wait. How exactly is nVidia to blame for ATi not including PS 3.0 support in their hardware?
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
The only issue with doing things like that is that it doesn't bolster fair play. Why should those with an ATI card be hindered while those who have an nVidia card aren't? PC gaming isn't like console gaming.

Because of what most people forget: if Nvidia didn't include those features, they would never have been implemented. Nobody would enjoy those extra features if Nvidia didn't push developers. What's best?

a- No one gets any features.
b- Some get the features, on the cards they were tested and developed for.

Because there really isn't any other option.

It's an extension of what most people know is happening in the gaming world, but choose to forget in these Nvidia bashing threads: developers don't really want to develop for PC, and adding PC-centric features is the last thing they would do without external help. Without any help, all we would get is a straight console port. Yeah, I know what you are thinking: you think we already do, but you are wrong. We call them ports, but they could be much, much worse, and it's only thanks to Nvidia and AMD that ports are something close to a decent PC game.
 
cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.62/day)
Your assumption that AA is something that exists in all graphics engines and game titles by default is wrong.

Batman doesn't matter. It was a crap game of endless repeating tasks in a varied environment.

They matched the Batman flavour and ambience, but in actual gameplay, I could not have been more bored.

That said, AA in Unreal3 engines works in many other titles. The simple fact it doesn't work at all on ATi cards says something far greater than nVidia paying for it to NOT work on ATI cards...

Which, whether you want to argue that they paid for the development, so it's their right to restrict its use, or that the GPU architectures are different, so require different code, doesn't matter. The damning fact is that had nVidia NOT helped develop the AA code, the developers would have been forced to either write code for both, or none at all.

I mean, I more than welcome nVidia to get developers to work with Phys-X. But AA is a different story... they simply shouldn't have messed with that part of the game. The act of helping develop the code forced a situation where the developer would naturally expect ATi to provide the same help, rather than relying on their own in-house resources. The complaint has nothing to do with the code itself; it's the actual effect on the companies' mentality when dealing with ATi that is the issue.

nVidia just shouldn't have developed code for something as basic as AA in the Unreal engine. It's really not that hard to do, or Batman would be the only Unreal game with AA... and it certainly is not.

Really, nVidia's CEO claims they're not a hardware company, but a software company that sells hardware. There are mixed interests here... nVidia is using their clout in one market to affect another, and that, my friend, is illegal. M$ paid lots of money for the same basic business practices.

DX10 was delayed because nV did not support the API properly. DX10.1 was barely a whisper, again thanks to a lack of nV support.

Heck, tessellation, nearly a decade old, has finally made it into DirectX... only because of nV's support.

nVidia truly is ruining the industry, and in more ways than one. They are helping the software development side stagnate... these developers getting help with things like AA really should be more than capable of doing them themselves, but due to a lack of basic skills, they had to go elsewhere.

It's really sad that a developer can't even implement AA because they are incapable of hiring the proper staff. It's a sad excuse for nVidia-only code... the entire house's management should be fired, as far as I am concerned, for delivering such a pathetic product.

I mean, really... 99.9% of the AI does exactly the same thing. Sure, it's pretty... but my god, I cannot believe the hype this game got. But then again, I know how all that works, and why it was so critically acclaimed. And I am NOT fooled.

Wait. How exactly is nVidia to blame for ATi not including PS 3.0 support in their hardware?

LoL. ATi doesn't support PS3.0? You sure about that?
 
Joined
Aug 17, 2009
Messages
2,558 (0.48/day)
Location
United States
System Name Aluminum Mallard
Processor Ryzen 1900x
Motherboard AsRock Phantom 6
Cooling AIO
Memory 32GB
Video Card(s) EVGA 3080Ti FTW
Storage SSD
Display(s) Benq Zowie
Case Cosmos 1000
Audio Device(s) On Board
Power Supply Corsair CX750
VR HMD HTV Vive, Valve Index
Software Arch Linux
Benchmark Scores 31 FPS in Dalaran
nVidia would probably make a boatload of money if they released Ageia-style PPU cards with driver support for ATI GPUs.

I'd like to have official support to use an nVidia card as a PPU, but that'll never happen.
 

HeroPrinny

I'll point out Metro again; it's a perfect example. My testing really did take the cake too: an 8600GT running just PhysX boosted my fps from unplayable to playable. That shouldn't happen. Why does Metro force PhysX?
 

FordGT90Concept

Wait. How exactly is nVidia to blame for ATi not including PS 3.0 support in their hardware?
I wish I could remember where I read that... :(
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,451 (2.38/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
I can see this thread getting locked down!!

Any marketing strategy that aims to proactively diminish another company's standing is to be expected. There is (unfortunately) nothing extraordinary about that.
However, when a percentage of users who fall into the opposing camp have their experience diminished by said practice, that is very wrong. To create deals with software developers that aim to 'handicap' the opposition and coerce the consumer into purchasing a certain brand is, under different conditions, illegal practice. At best, it is unfair.
In a capitalist economy, it is fair to undercut, aggressively market and generally hype your product and detract from the opposition. This is all fair.
On the other hand, to use financial incentive to persuade developers to create an uneven playing field is underhanded (regardless of who does it).

To defend what NV does and justify it by the preceding arguments is to wholly miss the OP's point. Just because 'A' did it and then 'B' did it does not make 'B' right. If ATI started the same practice, it would be just as bad.

The same could be said of coding (Metro 2033), but this can be overcome by intelligent and pragmatic software engineering (driver support). However, the act of diminishing standards for the opposition through hardware/architectural manipulation of closed standards is just plain wrong.

Remember, just because it's not technically illegal doesn't make it right. Hamstringing the opposition to coerce consumers into buying another product is never a very savoury thing to do.

Defending this practice is a very bad road to take.
 
Joined
Apr 12, 2010
Messages
1,359 (0.27/day)
it's an nvidia bashing thread... how is that not trolling and why is it in the games section

I am most certainly bashing Nvidia; however, I have attempted to justify and explain my stance. I did not simply state what I felt in order to annoy those who hold an opposing point of view. If you disagree with me, please tell me why; in other words, attack the substance of my argument rather than issuing accusations.
 

Benetanegia

That said, AA in Unreal3 engines works in many other titles. The simple fact it doesn't work at all on ATi cards says something far greater than nVidia paying for it to NOT work on ATI cards...

The damning fact is that had nVidia NOT helped develop the AA code, the developers would have been forced to either write code for both, or none at all.

No game using UE3 has built-in AA, except Batman and a couple more I think, which chose to implement their own AA. Batman implemented it because Nvidia did the work, plain and simple. The fact of the matter is that 90% of games using UE3 have no AA, and Batman would have been just another one. I fail to see how implementing something is bad. And I find it funny that people would rather have it not implemented at all. Pretending that AA for no one is better than AA for half the people is stupid.
 

cadaveca

The choice of how it was implemented is the issue. Every other UE3 game that supports AA supports it on both sides of the graphics pile.

I'm not saying that using it was bad... but they have openly stated that they restricted it to nV cards due to custom code developed by nV.

that's fine...but...

In other words, they couldn't do it themselves. So they suck as programmers. THAT is how it's bad.

I only mention the fact that it wouldn't have been there at all to highlight how incapable they are. Just a simple misunderstanding between us.

I'm not knocking nVidia for helping with that either... other than that they shouldn't be messing with stuff outside their closed APIs. Phys-X, CUDA, whatever... sure, go right ahead. AA code... nope.
 
Joined
Sep 25, 2007
Messages
5,965 (0.99/day)
Location
New York
Processor AMD Ryzen 9 5950x, Ryzen 9 5980HX
Motherboard MSI X570 Tomahawk
Cooling Be Quiet Dark Rock Pro 4(With Noctua Fans)
Memory 32Gb Crucial 3600 Ballistix
Video Card(s) Gigabyte RTX 3080, Asus 6800M
Storage Adata SX8200 1TB NVME/WD Black 1TB NVME
Display(s) Dell 27 Inch 165Hz
Case Phanteks P500A
Audio Device(s) IFI Zen Dac/JDS Labs Atom+/SMSL Amp+Rivers Audio
Power Supply Corsair RM850x
Mouse Logitech G502 SE Hero
Keyboard Corsair K70 RGB Mk.2
VR HMD Samsung Odyssey Plus
Software Windows 10
I say it's kinda like getting mad at someone who passed a test because they studied, while your friend who didn't study failed. Nvidia studied, AMD didn't, and people with AMD cards are mad, but a lot of the blame lies with AMD.
 
Joined
Apr 12, 2010
Messages
1,359 (0.27/day)
I guess when the OS was loaded you were OK with Bill Gates and his "the way a PC is to be run"?

Same practices, and let's not forget Intel's latest fiasco with OEM builders and what they did to AMD.

You are simply forgetting the golden rule... the one with the gold makes the rules!

I understand the frustration, but Nvidia is just following in the footsteps of much larger entities that set precedents before it got here;)

@ shib, the OP is directed at both TWIMTBP and Just Cause 2;) Also, he IS allowed to bash and take sides on an article; it's usually what follows those posts that causes the trouble.

My only argument is that two wrongs don't make a right, but everything you have stated is beyond dispute. How long would Microsoft last if Linux supported DirectX?

My intention is not to start a flame war, so if you feel that this post will bring that about, go ahead and lock it up with my blessing.

Cheers
 

sneekypeet

My point was that, while I understand the frustration, it's nothing new. As long as things continue to stay civil, I have no intention of locking it up.
 

BababooeyHTJ

New Member
Joined
Apr 6, 2009
Messages
907 (0.17/day)
Processor i5 3570k
Motherboard Gigabyte Z77 UD5h
Cooling XSPC Raystrom
Memory 4x4GB DDR3 1600mhz
Video Card(s) 7950 crossfire
Storage a few
Display(s) 27" Catleap
Case Xigmatek Elysium
Audio Device(s) Xonar STX
Power Supply Seasonic X1050
Software Windows 7 X64
Ever think that when you buy that cheaper ATI card (price-wise), development doesn't come into play? It isn't like Nvidia users aren't getting stabbed in the backside on pricing to offset said R&D.

:laugh: What was that about never selling a graphics card for more than $599? :rolleyes:
 