ChillyMyst
LOL, I'm not denying your sources, I'm denying the conclusions you draw and the analysis you do from them. Examples:
Quote:
HDR and Anti-aliasing
Question
Does NVIDIA support HDR and Anti Aliasing at the same time?
Answer
Two key hardware features present in GeForce 6 and GeForce 7 GPUs that accelerate and enable great HDR effects include FP16 texture filtering and FP16 frame-buffer blending. There are many ways to accomplish AA and HDR simultaneously in applications. Some games like Half Life 2 Lost Coast use an integer-based HDR technique, and in such games, the GeForce 6 and GeForce 7 chips can render HDR with MSAA simultaneously using that method. The GeForce 6 and GeForce 7 series GPUs do not support simultaneous multisampled AA (MSAA) and FP16 blending in hardware (the method used in 3dMark 2006).
Nvidia FAQ
that pretty much proves what I was saying, and also proves you're talking out of your ass about FP64
(don't even think there is such a thing yet!!!)
Haha, this is funny. Indeed it's FP16, so you are right there, but it is also (mistakenly) known as FP64 because, as you very well know, the color scheme used by graphics cards is RGBA (Red, Green, Blue, Alpha, just to be sure...) and HDR uses 16 floating-point bits for each channel, thus FP16 x 4 = 64 floating-point bits. What's funny about this is how you go and say he's inventing it, without even thinking about it for a bit, and then come to the same conclusion as I did.
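If it helps, the arithmetic behind the FP16/"FP64" naming is trivial — a throwaway Python sketch (function name is mine, just for illustration):

```python
# Bits per pixel = bits per channel x number of channels.
# FP16 HDR stores 16 floating-point bits in each of the 4 RGBA channels.
def bits_per_pixel(bits_per_channel, channels=4):
    return bits_per_channel * channels

print(bits_per_pixel(16))  # FP16 RGBA -> 64 bits per pixel, hence "FP64"
print(bits_per_pixel(8))   # standard 8-bit RGBA -> 32 bits per pixel
```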
Anyway, I think Half-Life 2 used a custom-made HDR with different bit depths for color and alpha, and a total of 40 or 48 bits, but don't quote me on this since I may be wrong (more likely than not); it's just something I remember, not very clearly TBH, from when the game was released long ago. Newer games from Valve use the standard HDR so far.
Quote:
The Way It's Meant To Be Played (TWIMTBP) is a program that helps game developers to optimize and incorporate exclusive features in their games and applications exclusively for NVIDIA's graphics cards. The deal also adds a splash screen to "the way it's meant to be played" games as well as branding within the game; this is widely considered as a promotion campaign for NVIDIA. This program was launched 2003 by NVIDIA, a graphics card producer. The program aims at providing the best experience possible for users of NVIDIA GeForce graphics cards, and more particularly provides extensive guidelines on game performance optimizations for the GeForce graphics cards.
http://en.wikipedia.org/wiki/The_Way_It's_Meant_to_be_Played
and that program does cover PAYING developers and publishers to make games run better on Nvidia cards, or in some cases even causing artificial performance and feature loss if your video card is not Nvidia based.
The link says nothing of the sort, but you are quick to add your BS paragraph.
Quote:
GeForce 6 series and later
The old Nvidia logo, in use until the release of the Geforce 7 series.
With the GeForce 6 series, Nvidia had clearly moved beyond the DX9 performance problems that plagued the previous generation. The GeForce 6 series not only performed competitively where Direct 3D shaders were concerned, but also supported DirectX Shader Model 3.0, while ATI's competing X800 series chips only supported the previous 2.0 specification. This proved an insignificant advantage, mainly because games of that period did not employ extensions for Shader Model 3.0. However, it demonstrated Nvidia's desire to design and follow through with the newest features and deliver them in a specific timeframe. What became more apparent during this time was that the products of the two firms, ATI and Nvidia, offered equivalent performance. The two firms traded blows in specific titles and specific criteria — resolution, image quality, anisotropic filtering/anti-aliasing — but differences were becoming more abstract, and the reigning concern became price-to-performance. The mid-range offerings of the two firms demonstrated the consumers' appetite for affordable, high-performance graphics cards, and it is now this price segment in which much of the firms' profitability is determined. The GeForce 6 series were released in a very interesting period: the game Doom 3 was just released where ATI's Radeon 9700 struggled at the OpenGL performance. In 2005, the Geforce 6800 performed excellently, while the GeForce 6600GT remained as important to Nvidia as the GeForce2 MX a few years previously. The Geforce 6600GT enabled users of the card to play Doom 3 at very high resolutions and graphical settings, which was thought to be highly unlikely considering its selling price. The Geforce 6 series also introduced SLI (which is similar to what 3dfx was using on the Voodoo 2). A combination of SLI and the performance gain as a result returned Nvidia to market leadership.
The GeForce 7 series represented a heavily beefed-up extension of the reliable 6-series. The industry's introduction of the PCI Express bus standard allowed Nvidia to release "SLI", a solution that employs two similar cards to share the workload in rendering. While these solutions do not equate to double the performance, and require more electricity (two cards vis-à-vis one), they can make a huge difference as higher resolutions and settings are enabled and, more importantly, offer more upgrade flexibility. ATI responded with the X1000 series, and their own dual-rendering solution called "Crossfire".
FUD, Crossfire was available with the X800/X850 cards first, not the X1000 cards.
part of the performance problems with the 9500/9600/9700/9800 cards was that Nvidia PAID id to use shader code that would run poorly on current ATI drivers, but that ran better on Nvidia drivers. This was discovered to be true by some people who changed the shader code from one mode to another (to decimal or something)
Wrong, they say ATI responded with both the X1000 and Crossfire, not that one came before the other. Pff, ffs.
That, my friend, is what optimization means: changing some specific code to make the game run better. But saying Nvidia paid them so they did it that way is BS. Where was ATI then? Where is your proof?
The truth is they did it that way because they thought it was the best way to achieve the effect they wanted, not because they thought it would hurt ATI's performance. And once they knew about the problem they fixed it in the patch, didn't they? Anyway, that alternative method you mention apparently replaces a texture call with a constant variable. They replace a texture with a constant color, ffs!! The end result may be similar, but it's certainly not doing the same thing.
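Just to illustrate the kind of substitution I mean — a made-up Python sketch, not the actual Doom 3 shader; the function names and values are mine:

```python
# Original style: the shader result comes from sampling a lookup texture,
# so it varies non-linearly with the input intensity.
def specular_from_lookup(intensity, lookup_table):
    index = min(int(intensity * (len(lookup_table) - 1)), len(lookup_table) - 1)
    return lookup_table[index]

# "Tweaked" style: the texture fetch is thrown away and replaced by a
# constant color. Cheaper, similar-looking in many cases, but NOT the
# same computation.
def specular_from_constant(intensity, constant=0.5):
    return constant

# A non-linear lookup table, e.g. a baked power curve:
table = [(i / 15) ** 4 for i in range(16)]

print(specular_from_lookup(0.9, table))  # varies with intensity
print(specular_from_constant(0.9))       # always the constant
```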
An example contrary to yours: Call of Juarez. AFAIK this game is/was under TWIMTBP. It uses an anti-aliasing method that severely hurts Nvidia's performance. They got paid by ATI, of course. Or doesn't ATI/AMD do such things? Share your thoughts about this, I'm willing to read your response.
Another example, STALKER: this game ran poorly on every card, be it ATI or Nvidia. Some community-made mods to the shaders (including adding Phong shading instead of Blinn) made the game run almost 100% better depending on the circumstances, while looking better. There were many of those mods or hacks. Oblivion is another example of the same nature.
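For anyone wondering what the Phong vs Blinn change amounts to: it's basically which vector the specular term uses. A toy CPU-side Python sketch (not real shader code; the vectors and exponent are made up):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong_spec(light, view, normal, shininess):
    """Phong: reflect the light about the normal, compare with the view."""
    l, v, n = normalize(light), normalize(view), normalize(normal)
    d = dot(n, l)
    r = tuple(2 * d * nc - lc for nc, lc in zip(n, l))  # reflection vector
    return max(dot(r, v), 0.0) ** shininess

def blinn_spec(light, view, normal, shininess):
    """Blinn-Phong: use the half vector instead of the reflection vector."""
    l, v, n = normalize(light), normalize(view), normalize(normal)
    h = normalize(tuple(lc + vc for lc, vc in zip(l, v)))  # half vector
    return max(dot(n, h), 0.0) ** shininess
```

Same inputs, slightly different highlight shapes and cost — which is why swapping one for the other can change both looks and performance.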
Now, since both ATI and Nvidia ran like crap in the first place and those mods were proof the game could run better, there's no option but to think that Sony paid them, because they wanted to destroy PC gaming. Indeed, why not go and say that Sony has been doing this for years and blame Sony for the decay in optimization that the PC has been suffering as a whole in recent years? Oooh, thank you Chilly, now I see the light!!
About [H], don't worry, I never took that as a reliable source. Nor do I take Wikipedia, since I've seen many edits that smear both companies and people and that lasted weeks before they were corrected. I have seen something like this so many times: "And according to Mr. XXXX, a reliable scientist in the matter... blah blah...
Mr. XXXXX is a moron and a motherfucker", followed by racist garbage. <-- That lasting for weeks is unacceptable to me, so I can't take it as a reliable source, sorry.
And sorry for the language, but how can I convey the thing if I don't use the language they used?