
Why the FX line sucks/sucked so bad.

AshenSugar

New Member
Joined
Sep 20, 2006
Messages
1,998 (0.31/day)
Location
ashentech.com
Processor Athlon64 3500+(2.2gz)@2.94gz(3.03gz)
Motherboard Biostar Tforce550 (RMA) (m2n-sli delux)
Cooling PIB cooler
Memory 2gb ocz 533 +1gb samsung 533 4-4-4-12
Video Card(s) x1900xtx 512mb+zalman vf900 cooler(kicks stock coolers arse)
Storage 80gb,200gb,250gb,160gb
Display(s) 20.1 in dell 2001fp + KDS visual sensations 19"
Case Codegen briza seirse
Audio Device(s) ADI SoundMax HD audio onboard,using Ket's driver pack
Power Supply FSP 400watt SAGA seirse w/noise killer
Software Windows 2003 ent server as workstation(kills xp in perf and stab)
OK, to start: I'm posting this to clear up some misconceptions/FUD I have seen people posting about the FX cards "emulating DX9" and why they sucked.
http://techreport.com/news_reply.x/4782/4/
It's got 2 useful links near the top of the comments.

From Wikipedia: http://en.wikipedia.org/wiki/GeForce_FX
Specifications
NVIDIA's GeForce FX series is the fifth generation in the GeForce line. With the GeForce 3, NVIDIA introduced programmable shader units into its 3D rendering capabilities, in line with Microsoft's release of DirectX 8.0, and the GeForce 4 Ti was an optimized version of the GeForce 3. With real-time 3D graphics technology continually advancing, the release of DirectX 9.0 ushered in a further refinement of programmable pipeline technology with the arrival of Shader Model 2.0. The GeForce FX series brought NVIDIA's first generation of Shader Model 2 hardware support.

The FX features DDR, DDR2 or GDDR-3 memory, a 130 nm fabrication process, and Shader Model 2.0/2.0A compliant vertex and pixel shaders. The FX series is fully compliant and compatible with DirectX 9.0b. The GeForce FX also included an improved VPE (Video Processing Engine), which was first deployed in the GeForce4 MX. Its main upgrade was per pixel video-deinterlacing — a feature first offered in ATI's Radeon, but seeing little use until the maturation of Microsoft's DirectX-VA and VMR (video mixing renderer) APIs. Among other features was an improved anisotropic filtering algorithm which was not angle-dependent (unlike its competitor, the Radeon 9700/9800 series) and offered better quality, but affected performance somewhat. Though NVIDIA reduced the filtering quality in the drivers for a while, the company eventually got the quality up again, and this feature remains one of the highest points of the GeForce FX family to date (However, this method of anisotropic filtering was dropped by NVIDIA with the GeForce 6 series for performance reasons).

Disappointment

Analysis of the hardware

Hardware enthusiasts saw the GeForce FX series as a disappointment as it did not live up to expectations. NVIDIA had aggressively hyped the card up throughout the summer and autumn of 2002, to combat ATI Technologies' autumn release of the powerful Radeon 9700. ATI's very successful Shader Model 2 card had arrived several months earlier than NVIDIA's first NV30 board, the GeForce FX 5800.

When the FX 5800 finally launched, it was discovered after testing and research on the part of hardware analysts that the NV30 was not a match for Radeon 9700's R300 core. This was especially true when pixel shading was involved. Additionally, the 5800 had roughly a 30% memory bandwidth deficit caused by the use of a comparatively narrow 128-bit memory bus (ATI and other companies moved to 256-bit). NVIDIA planned to use the new, state-of-the-art GDDR-2 instead because of its support for much higher clock rates. It couldn't clock high enough to make up for the bandwidth of a 256-bit bus, however.
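
To put rough numbers on that bandwidth gap, here is a minimal sketch; the bus widths come from the text above, but the memory clocks are approximate reference-card values assumed purely for illustration (the exact percentage depends on which boards and clocks you compare).

```python
# Back-of-the-envelope theoretical memory bandwidth: bus width (in bytes)
# times effective transfer rate. Clock figures are approximate reference
# values, assumed here for illustration only.

def bandwidth_gb_s(bus_width_bits: int, effective_mt_s: float) -> float:
    """Bus width in bits, effective transfer rate in MT/s -> GB/s."""
    return (bus_width_bits / 8) * effective_mt_s / 1000

fx5800_ultra   = bandwidth_gb_s(128, 1000)  # 128-bit GDDR-2 at ~500 MHz (1000 MT/s)
radeon_9700pro = bandwidth_gb_s(256, 620)   # 256-bit DDR at ~310 MHz (620 MT/s)

print(f"GeForce FX 5800 Ultra: ~{fx5800_ultra:.1f} GB/s")    # ~16.0
print(f"Radeon 9700 Pro:       ~{radeon_9700pro:.1f} GB/s")  # ~19.8
print(f"NV30 deficit:          ~{(1 - fx5800_ultra / radeon_9700pro) * 100:.0f}%")
```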

While the NV30's direct competitor, the R300 core, was capable of 8 pixels per clock with its 8 pipelines, the NV30 was an 8 pipeline chip that could only render 4 color pixels per clock. This dramatically limited its pixel fillrate in the majority of game titles. However, in games with large use of stencil shadows, such as Doom3, NV30 could perform 8 pixels per clock on the shadow rendering pass. This did help its performance in this relatively rare rendering situation. Fortunately NVIDIA's use of 130 nm manufacturing technology allowed them to clock the GPU rather highly compared to ATI's 150 nm R300. This allowed NVIDIA to close the gap somewhat. Still, the fact that ATI's solution was more architecturally effective across the board caused the FX 5800 to remain well behind the older Radeon 9700 in many situations.
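
A similarly rough fillrate sketch, with approximate reference core clocks assumed for illustration: 4 color pixels per clock at a higher clock still trails 8 per clock at a lower one, while the 8-per-clock z/stencil path accounts for the Doom 3 exception.

```python
# Rough pixel fillrate = pixels per clock * core clock (MHz -> Mpixels/s).
# Core clocks are approximate reference values, assumed for illustration.

def fillrate_mpix(pixels_per_clock: int, core_mhz: float) -> float:
    return pixels_per_clock * core_mhz

nv30_color   = fillrate_mpix(4, 500)  # NV30 (FX 5800 Ultra): 4 color pixels/clock at ~500 MHz
nv30_stencil = fillrate_mpix(8, 500)  # ...but 8 pixels/clock on z/stencil-only passes
r300_color   = fillrate_mpix(8, 325)  # R300 (Radeon 9700 Pro): 8 color pixels/clock at ~325 MHz

print(f"NV30 color fillrate: {nv30_color:.0f} Mpix/s")    # 2000
print(f"NV30 z/stencil-only: {nv30_stencil:.0f} Mpix/s")  # 4000
print(f"R300 color fillrate: {r300_color:.0f} Mpix/s")    # 2600
```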

The initial version of the GeForce FX (the 5800) was one of the first cards to come equipped with a large dual-slot cooling solution. Called "Flow FX", the cooler was strikingly large compared to ATI's small cooler on the 9700 series. It was also very loud and garnered complaints from gamers and developers alike. It was jokingly nicknamed the 'Dustbuster', and loud graphics cards are still often compared to the GeForce FX 5800 for this reason.

With regard to the much-touted Shader Model 2 capabilities of the NV3x series, and the related marketing point of the chip's "cinematic effects" capabilities, the actual performance was shockingly poor. A combination of unfortunate factors hampered how well NV3x could perform these calculations.

Firstly, the chips were designed for use with a mixed precision fragment (pixel) programming methodology, using 48-bit integer ("FX12") precision and also (to a lesser extent) a 64-bit "FP16" for situations where high precision math was unnecessary to maintain image quality, and using the 128-bit "FP32" mode only when absolutely necessary. The R300-based cards from ATI did not benefit from partial precision in any way because these chips were designed purely for Direct3D 9's required minimum of 96-bit FP24 for full precision pixel shaders. For a game title to use FP16, the programmer had to specify which pixel shader instructions used the lower precision by placing "hints" in the shader code. Because ATI didn't benefit from the lower precision and the R300 performed far better on shaders overall, and because it took significant effort to set up pixel shaders to work well with the lower precision calculations, the NV3x hardware was usually crippled to running full precision pixel shaders all the time.
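
(The bit counts above are per four-component shader vector; per component they correspond to 12-bit fixed point, FP16, FP32, and FP24.) As a small illustration of why dropping to FP16 can visibly hurt long calculations, the sketch below compares FP16 and FP32 rounding behaviour; NumPy has no FP24 type, so R300's format, which sits between the two, is not shown.

```python
# Why partial precision can hurt: FP16 has only 10 mantissa bits, so its
# rounding error is orders of magnitude larger than FP32's and accumulates
# over long dependent instruction chains.
import numpy as np

print("FP16 machine epsilon:", np.finfo(np.float16).eps)  # ~9.8e-4
print("FP32 machine epsilon:", np.finfo(np.float32).eps)  # ~1.2e-7

# Accumulate a small per-step term, as a long dependent shader chain might:
step, fp16, fp32 = 0.001, np.float16(0.0), np.float32(0.0)
for _ in range(1000):
    fp16 = np.float16(fp16 + np.float16(step))
    fp32 = np.float32(fp32 + np.float32(step))

print("FP16 result:", float(fp16))  # drifts noticeably away from 1.0
print("FP32 result:", float(fp32))  # very close to 1.0
```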

Additionally, the NV30, NV31, and NV34 were handicapped because they contained a mixture of DirectX 7 fixed-function T&L units, DirectX 8 integer pixel shaders, and DirectX 9 floating point pixel shaders. The R300 chips emulated the older functionality on their pure Shader Model 2 hardware, which let ATI devote far more of the same transistor budget to SM2 performance. For NVIDIA, with its mixture of hardware, this resulted in sub-optimal performance in pure SM2 programming, because only a portion of the chip could execute that math.

The NV3x chips used a processor architecture that relied heavily on the effectiveness of the video card driver's shader compiler. Proper ordering of shader code could dramatically boost the chip's shader computational efficiency. Compiler development is a long and difficult task, and this was a major challenge that NVIDIA tried to overcome during most of NV3x's lifetime. NVIDIA released several guidelines for creating GeForce FX-optimized code and worked with Microsoft to create a special shader model called "Shader Model 2.0A", which generated code better suited to the GeForce FX. NVIDIA would also, controversially, rewrite game shader code in the driver and force the game to use their shader code instead. NVIDIA engineers could tailor the arithmetic structure of the code and adjust its precision so it ran optimally on their hardware. However, such code often came at the cost of lower final image quality.
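
A toy sketch of why instruction ordering mattered so much is below. This is not NVIDIA's actual compiler; it just uses the peak count of live temporaries as a stand-in for the register pressure that NV3x parts were widely reported to be sensitive to, and shows that the same computation can be ordered to need fewer temporaries at once.

```python
# Toy model (not NVIDIA's real shader compiler): the same computation in two
# instruction orders. "Peak live temps" = the most temporaries still needed
# after any instruction; a scheduler that lowers it eases register pressure.

def peak_live_temps(program):
    """program: ordered list of (dest, sources) tuples, each register written once."""
    last_read = {}
    for i, (_, srcs) in enumerate(program):
        for s in srcs:
            last_read[s] = i
    peak, written = 0, set()
    for i, (dest, _) in enumerate(program):
        written.add(dest)
        live_after = {r for r in written if last_read.get(r, -1) > i}
        peak = max(peak, len(live_after))
    return peak

# Fetch four textures, multiply them in pairs, then add the two products.
naive = [("t0", []), ("t1", []), ("t2", []), ("t3", []),
         ("s0", ["t0", "t1"]), ("s1", ["t2", "t3"]), ("out", ["s0", "s1"])]
# Same math, but each pair is consumed as soon as it is available.
reordered = [("t0", []), ("t1", []), ("s0", ["t0", "t1"]),
             ("t2", []), ("t3", []), ("s1", ["t2", "t3"]), ("out", ["s0", "s1"])]

print("peak live temps, naive order:", peak_live_temps(naive))      # 4
print("peak live temps, reordered:  ", peak_live_temps(reordered))  # 3
```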

Still, it was later found that even with the use of partial precision, Shader Model 2.0A, and shader code replacements, the GeForce FX's performance in shader-heavy applications trailed behind the competition. The GeForce FX remained competitive in OpenGL applications, which can be attributed both to the fact that most OpenGL applications use manufacturer-specific extensions and to NVIDIA's OpenGL drivers still being generally superior to the competition's at this point in time.

The FX series was a moderate success, but because of its delayed introduction and its flaws, NVIDIA ceded market leadership to ATI's Radeon 9700. Due to market demand and the FX's shortcomings as a successor, NVIDIA extended the production life of the aging GeForce 4, keeping both the FX and the GeForce 4 series in production for some time.

Valve's presentation

Soon after the introduction of the GeForce FX, synthetic benchmarks (such as 3DMark 2003) revealed potential weak points in its PS 2.0 shader performance. But outside of the developer community and tech-savvy computer gamers, few mainstream users were aware of such issues. Then in late 2003, Valve Corporation (developer of the popular Half-Life PC game series) presented a series of in-house benchmarks pitting the GeForce FX against the Radeon R300. Based on a pre-release build of the highly anticipated Half-Life 2, Valve's game-engine benchmarks placed NVIDIA's FX product line a full generation behind ATI's R300 product line. In Shader Model 2.0-enabled game levels, NVIDIA's top-of-the-line FX 5900 Ultra performed about as fast as ATI's mainstream Radeon 9600, which cost a third as much as the NVIDIA card.

Valve had initially planned to pursue partial floating point precision (FP16) optimizations specifically for the FX family. The optimizations required detailed, case-by-case analysis of each of many visual-effects shader routines, yet offered limited benefit; only the small fraction of gamers with GeForce FX 5700/5900 cards could reasonably expect playable frame rates. ATI's R300 cards did not need and did not benefit at all from the optimizations, and the substantial majority of gamers (with DirectX 8 hardware) could not use any DirectX 9 game effects regardless. Based on this assessment, Valve programmed Half-Life 2 to default to DirectX 8 shaders on all GeForce FX hardware, thereby sidestepping the GeForce FX's poor PS 2.0 performance.

Players could tweak a game configuration file to force Half-Life 2 to run in DirectX 9 mode, but doing so on NV3x cards resulted in a significant loss of performance, with the top-of-the-line models (FX 5900 and FX 5950) performing comparably to ATI's entry-level Radeon 9600. An unofficial fan patch later allowed GeForce FX owners to play the game in DirectX 9 mode with improved visuals, at the cost of some speed, by selectively replacing the original shaders with optimized routines.

haxxxx!!!!

Questionable tactics

NVIDIA's GeForce FX era was one of great controversy for the company. The competition had soundly beaten them on the technological front and the only way to get the FX chips competitive with the Radeon R300 chips was to optimize the drivers to the extreme.

This took several forms. NVIDIA has historically been known for impressive OpenGL driver performance and quality, and the FX series certainly maintained this. With image quality in both Direct3D and OpenGL, however, the company began aggressively applying questionable optimization techniques not seen before. It started with filtering optimizations: changing how trilinear filtering operated on game textures reduced its accuracy, and thus its quality, visibly. Anisotropic filtering also saw dramatic tweaks that limited its use to as few textures as possible to save memory bandwidth and fillrate. Tweaks to these types of texture filtering can often be spotted in games as a shimmering phenomenon on floor textures as the player moves through the environment (often signifying poor transitions between mip-maps). Changing the driver settings to "High Quality" can alleviate this at the cost of performance.
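
As a toy illustration of the trilinear reduction commonly nicknamed "brilinear" (the actual driver heuristics were never published, so the band width and shape below are assumptions): full trilinear blends between two mip levels across the whole LOD range, while the reduced version only blends in a narrow band around each transition, saving texture fetches but producing the mip banding and shimmering described above.

```python
# Toy model of full trilinear vs reduced ("brilinear"-style) mip blending.
# weight = how much of mip level N+1 is mixed with mip level N at a given LOD.
# The 0.15 band and linear ramp are assumptions for illustration only.

def trilinear_weight(lod: float) -> float:
    return lod - int(lod)            # blend continuously between mip N and N+1

def reduced_weight(lod: float, band: float = 0.15) -> float:
    f = lod - int(lod)
    lo, hi = 0.5 - band, 0.5 + band
    if f <= lo:
        return 0.0                   # pure bilinear from mip N
    if f >= hi:
        return 1.0                   # pure bilinear from mip N+1
    return (f - lo) / (hi - lo)      # blend only inside the narrow band

for lod in (2.0, 2.2, 2.35, 2.5, 2.65, 2.8, 3.0):
    print(f"lod={lod:.2f}  trilinear={trilinear_weight(lod):.2f}  "
          f"reduced={reduced_weight(lod):.2f}")
```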

NVIDIA also began to clandestinely replace pixel shader code in software with hand-coded optimized versions with lower accuracy, through detecting what program was being run. These "tweaks" were especially noticed in benchmark software from Futuremark. In 3DMark03 it was found that NVIDIA had gone to extremes to limit the complexity of the scenes through driver shader changeouts and aggressive hacks that prevented parts of the scene from even rendering at all. This artificially boosted the scores the FX series received. Side by side analysis of screenshots in games and 3DMark03 showed vast differences between what a Radeon 9800/9700 displayed and what the FX series was doing. NVIDIA also publicly attacked the usefulness of these programs and the techniques used within them in order to undermine their influence upon consumers.
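
Conceptually, the application detection described above boils down to something like the sketch below. This is purely illustrative, not any vendor's actual driver code, and the executable name, shader text, and table are all made up: key a replacement table off the running program and a fingerprint of the submitted shader, and hand the hardware a cheaper hand-tuned version instead.

```python
# Conceptual sketch of app detection + shader replacement (illustrative only;
# not any vendor's real driver code, and all names here are hypothetical).
import hashlib

def shader_fingerprint(source: str) -> str:
    """Hash of the shader text the application submitted."""
    return hashlib.sha1(source.encode()).hexdigest()[:12]

ORIGINAL_WATER_SHADER = "// expensive full-precision water shader ..."

# (executable name, shader fingerprint) -> hand-tuned, cheaper replacement
REPLACEMENTS = {
    ("3dmark03.exe", shader_fingerprint(ORIGINAL_WATER_SHADER)):
        "// hand-tuned, reduced-precision water shader ...",
}

def compile_shader(exe_name: str, source: str) -> str:
    """Return the shader the 'driver' will actually compile for this process."""
    return REPLACEMENTS.get((exe_name.lower(), shader_fingerprint(source)), source)

print(compile_shader("3DMark03.exe", ORIGINAL_WATER_SHADER))  # swapped out
print(compile_shader("somegame.exe", ORIGINAL_WATER_SHADER))  # left untouched
```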

In essence, NVIDIA programmed their driver to look for specific software and apply aggressive optimizations tailored to the limitations of the NV3x hardware. Upon discovery of these tweaks there was a very vocal uproar from the enthusiast community and from several popular hardware analysis websites. Unfortunately, disabling most of these optimizations showed that NVIDIA's hardware was simply incapable of rendering the scenes at a level of detail similar to what ATI's hardware was displaying. So most of the optimizations stayed, except in 3DMark, where Futuremark began updating its software and screening driver releases for hacks.

Both NVIDIA and ATI have historically been guilty of optimizing drivers like this. However, NVIDIA went to a new extreme with the FX series. Both companies still optimize their drivers for specific applications today (2006), but a tight rein is kept on the results of these optimizations by a now more educated and aware user community.
more haxxxx!!!!!!!


Competitive response

By early 2003, ATI had captured a considerable chunk of the high-end graphics market and their popular Radeon 9600 was dominating the mid-high performance segment as well. In the meantime, NVIDIA introduced the mid-range 5600 and low-end 5200 models to address the mainstream market. With conventional single-slot cooling and a more affordable price-tag, the 5600 had respectable performance but failed to measure up to its direct competitor, Radeon 9600. As a matter of fact, the mid-range GeForce FX parts did not even advance performance over the chips they were designed to replace, the GeForce 4 Ti and MX440. In DirectX 8 applications, the 5600 lost to or matched the Ti 4200. Likewise, the entry-level FX 5200 did not perform as well as the DirectX 7.0 generation GeForce 4 MX440, despite the FX 5200 possessing a far better 'checkbox' feature-set. FX 5200 was easily matched in value by ATI's older R200-based Radeon 9000-9250 series and outperformed by the even older Radeon 8500.

With the launch of the GeForce FX 5900, NVIDIA fixed many of the problems of the 5800. While the 5800 used fast but hot and expensive GDDR-2 and had a 128-bit memory bus, the 5900 reverted to the slower and cheaper DDR, but it more than made up for it with a wider 256-bit memory bus. The 5900 performed somewhat better than the Radeon 9800 in everything not heavily using shaders, and had a quieter cooling system than the 5800, but most cards based on the 5900 still occupied two slots (the Radeon 9700 and 9800 were both single-slot cards). By mid-2003, ATI's top product (Radeon 9800) was outselling NVIDIA's top-line FX 5900, perhaps the first time that ATI had been able to displace NVIDIA's position as market leader.
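
A quick sanity check of the "slower memory, wider bus" trade (memory clocks are approximate reference values, assumed for illustration):

```python
# GB/s = bus width in bytes * effective transfer rate; clocks are approximate.
gb_s = lambda bits, mt_s: bits / 8 * mt_s / 1000

print("FX 5800 Ultra,   128-bit @ ~1000 MT/s GDDR-2:", gb_s(128, 1000), "GB/s")  # 16.0
print("FX 5900 Ultra,   256-bit @  ~850 MT/s DDR:   ", gb_s(256, 850), "GB/s")   # 27.2
print("Radeon 9800 Pro, 256-bit @  ~680 MT/s DDR:   ", gb_s(256, 680), "GB/s")   # 21.76
```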

NVIDIA later attacked ATI's mid-range card, the Radeon 9600, with the GeForce FX 5700 and 5900XT. The 5700 was a new chip sharing the architectural improvements found in the 5900's NV35 core. The FX 5700's use of GDDR-2 memory kept its price high, leading NVIDIA to introduce the FX 5900XT. The 5900XT was identical to the 5900, but was clocked slower and used slower memory.

The final GeForce FX model released was the 5950 Ultra, which was a 5900 Ultra with higher clock speeds. This model did not prove particularly popular, as it was not much faster than the 5900 Ultra yet commanded a considerable price premium over it. The board was fairly competitive with the Radeon 9800XT, again as long as pixel shaders were lightly used.
The 9600 256MB I had stomped the 5800 Ultra I had at EVERY SINGLE THING, and it was around 1/3-1/4 the price!!!!

Windows Vista and GeForce FX PCI cards

Although ATI's competitive cards clearly surpassed the GeForce FX series among many gamers, NVIDIA may regain some market share with the release of Windows Vista, which requires DirectX 9 for its signature Windows Aero interface. Many users whose systems have an integrated graphics processor (IGP) but no AGP or PCIe slots, yet are otherwise powerful enough for Vista, may demand DirectX 9 PCI video cards for Vista upgrades, though the size of this niche market is unknown.

To date, the most common such cards use GeForce FX-series chips; most use the FX 5200, but some use the FX 5500 (a slightly overclocked 5200) or the FX 5700 LE (which has speeds similar to the 5200, but a few more pixel pipelines). For some time, the only other PCI cards that were Aero-capable were two GeForce 6200 PCI cards made by BFG Technologies and its 3D Fuzion division. The XGI Technology Volari V3XT also offered DirectX 9 on PCI, but with XGI's exit from the graphics card business in early 2006, it is apparently not supported in Vista as of RTM. [1]

For a long time, ATI's PCI line-up was limited to the Radeon R200-based Radeon 9000, 9200, and 9250 cards, which are not capable of running Aero because of their DirectX 8.1 lineage. Indeed, ATI may have helped assure NVIDIA's initial dominance of the DirectX 9-on-PCI niche by buying XGI's graphics card assets [2]. However, in June 2006 a Radeon X1300-based PCI card was spotted in Japan [3], so it now appears ATI will try to contest the GeForce FX's dominance of this niche. Nonetheless, ATI's deployment of a later-generation GPU in what is likely to be a low-end, non-gamer niche may still leave NVIDIA with the majority of units sold.
pwned again!!!!

See the thumb for specs of these cards; I will also link the specs of ATI's R300 core:
http://en.wikipedia.org/wiki/Radeon_R300
The thumb is of the R300 range cards' specs, cards that totally stomp the NVIDIA equivalents!!!
 

Attachments

  • fx specs.jpg (153 KB)
  • r300.gif (79.4 KB)


Jarman

New Member
Joined
Jan 28, 2007
Messages
387 (0.06/day)
Location
Wrexham UK
Processor Opty 165 @ 3011 MHz
Motherboard DFI NF4 SLI-D
Cooling Koolance Exos, DD TDX/Maze4 GPU
Memory 2GB G.Skill HZ DDR 500 Kit
Video Card(s) XFX Geforce 7900GT
Storage > 1.5 TB
Display(s) XEROX 19" TFT
Case Coolermaster ATCS 101
Audio Device(s) Creative X-FI
Power Supply OCZ 850W
Software XP serv pack 2
Didn't know if you mentioned it in your copy/paste there, didn't see it anyway. From what I remember, Nvidia went down the route of thinking games were going to use heavy vertex shading, and so integrated far more vertex processors than pixel shader processors. ATI went the other way and integrated more pixel shader processors. Games became shader-heavy, and this helped the R300 cores no end.

The Nvidia core was technically superior to the R300 in many ways: a better manufacturing process, 128-bit DX9 support and partial-precision (64-bit??) support. From what I remember, again, it was the 128-bit support, instead of the lower 96-bit support required by DX9, that caused a lot of wasted clock cycles.

But who really cares that Nvidia released some crap cards 80 years ago :S Hell, I had a 5900XT and it sucked, my older GF4 Ti 4600 beat it in several situations, but it doesn't really bother me that much in 2007 :wtf:
 

AshenSugar

No, NVIDIA's design wasn't better in any way. They used partial-precision shaders: 48-bit as much as possible, 64-bit when needed, and 128-bit only when forced, when there was no other choice. Read the quotes, it's all shown in there.

Read the article:

Firstly, the chips were designed for use with a mixed precision fragment (pixel) programming methodology, using 48-bit integer ("FX12") precision and also (to a lesser extent) a 64-bit "FP16" for situations where high precision math was unnecessary to maintain image quality, and using the 128-bit "FP32" mode only when absolutely necessary. The R300-based cards from ATI did not benefit from partial precision in any way because these chips were designed purely for Direct3D 9's required minimum of 96-bit FP24 for full precision pixel shaders. For a game title to use FP16, the programmer had to specify which pixel shader instructions used the lower precision by placing "hints" in the shader code. Because ATI didn't benefit from the lower precision and the R300 performed far better on shaders overall, and because it took significant effort to set up pixel shaders to work well with the lower precision calculations, the NV3x hardware was usually crippled to running full precision pixel shaders all the time.

Additionally, the NV30, NV31, and NV34 were handicapped because they contained a mixture of DirectX 7 fixed-function T&L units, DirectX 8 integer pixel shaders, and DirectX 9 floating point pixel shaders. The R300 chips emulated the older functionality on their pure Shader Model 2 hardware, which let ATI devote far more of the same transistor budget to SM2 performance. For NVIDIA, with its mixture of hardware, this resulted in sub-optimal performance in pure SM2 programming, because only a portion of the chip could execute that math.

And it matters because they got away with it once, and it may happen again: the G80 currently doesn't have working Vista drivers (no DX10 support), yet that's one of its main selling points.

EDIT: Please don't reply to posts I make without at least reading the article/post; it's very disrespectful.

And hey, look everybody, an NVIDIA fanboi!!!!! ;)
 
Joined
Dec 5, 2006
Messages
7,704 (1.22/day)
System Name Back to Blue
Processor i9 14900k
Motherboard Asrock Z790 Nova
Cooling Corsair H150i Elite
Memory 64GB Corsair Dominator DDR5-6400 @ 6600
Video Card(s) EVGA RTX 3090 Ultra FTW3
Storage 4TB WD 850x NVME, 4TB WD Black, 10TB Seagate Barracuda Pro
Display(s) 1x Samsung Odyssey G7 Neo and 1x Dell u2518d
Case Lian Li o11 DXL w/custom vented front panel
Audio Device(s) Focusrite Saffire PRO 14 -> DBX DriveRack PA+ -> Mackie MR8 and MR10 / Senn PX38X -> SB AE-5 Plus
Power Supply Corsair RM1000i
Mouse Logitech G502x
Keyboard Corsair K95 Platinum
Software Windows 11 x64 Pro
Benchmark Scores 31k multicore Cinebench - CPU limited 125w
Nvidia, Ati power will continue to change hands back and forth, one will upstep the other and it will flop back and forth for a long time.. Same with Intel and AMD... It just keeps going.
 
Joined
Oct 1, 2006
Messages
4,883 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
Nvidia, Ati power will continue to change hands back and forth, one will upstep the other and it will flop back and forth for a long time.. Same with Intel and AMD... It just keeps going.
Hopefully so; if either of these titans falls, it will be a catastrophe :shadedshu
I can't imagine how much a Core 2 Duo would cost if AMD were not there.
 
Joined
Nov 10, 2006
Messages
4,665 (0.73/day)
Location
Washington, US
System Name Rainbow
Processor Intel Core i7 8700k
Motherboard MSI MPG Z390M GAMING EDGE AC
Cooling Corsair H115i, 2x Noctua NF-A14 industrialPPC-3000 PWM
Memory G. Skill TridentZ RGB 4x8GB (F4-3600C16Q-32GTZR)
Video Card(s) ZOTAC GeForce RTX 3090 Trinity
Storage 2x Samsung 950 Pro 256GB | 2xHGST Deskstar 4TB 7.2K
Display(s) Samsung C27HG70
Case Xigmatek Aquila
Power Supply Seasonic 760W SS-760XP
Mouse Razer Deathadder 2013
Keyboard Corsair Vengeance K95
Software Windows 10 Pro
Benchmark Scores 4 trillion points in GmailMark, over 144 FPS 2K Facebook Scrolling (Extreme Quality preset)
Nvidia, Ati power will continue to change hands back and forth, one will upstep the other and it will flop back and forth for a long time.. Same with Intel and AMD... It just keeps going.

True. Nvidia is all like "Hurr! Our 8800GTX stomps the x1950XTX." and now ATI is going to be all like "Hurr! Our R600 (rumored to be aka: X2800) stomps the 8800GTX.", then Nvidia will come out with a 8900, and ATI with a (still rumored) x2900, etc...

Intel and AMD, like you said, are the same. Intel is all like "Hurr! My Pentium 3 is better." then AMD was like "Well, hurr.. My Duron is better." then Intel is all like "Hurr! My Pentium 4 is better." then AMD is like "Hurr! My Athlon XP is better.", then Intel is like "Hurr! My Core 2 Duo is better." and now AMD is going to be all like "Hurr! My K8L is better." And then Intel will stick their memory controllers on the processor and implement their own version of HyperTransport and be all like "Hurr! My whatever is better!"..

It always happens. All you have to do is choose a side and wait for that side to jump ahead before you build a new computer.
 

AshenSugar

Actually, Intel from what I hear plans a dual-northbridge solution to give the FSB more bandwidth (haxxxx).

And the X2900 is about ready for market, pics are showing up now, check the news section. The difference is AMD/ATI didn't rush out their card, NVIDIA did, and the 8900 from all evidence is going to be an attempt to save face against the X2800/2900 cards: it's a G80 (same exact core as the 8800) with driver tweaks and higher clocks. Woo, big upgrade there. Maybe they should actually get some decent drivers out for the 8800, maybe some working Vista drivers that would actually make the 8800 into a DX10 card :p
 

Ketxxx

Heedless Psychic
Joined
Mar 4, 2006
Messages
11,521 (1.75/day)
Location
Kingdom of gods
System Name Ravens Talon
Processor AMD R7 3700X @ 4.4GHz 1.3v
Motherboard MSI X570 Tomahawk
Cooling Modded 240mm Coolermaster Liquidmaster
Memory 2x16GB Klevv BoltX 3600MHz & custom timings
Video Card(s) Powercolor 6800XT Red Devil
Storage 250GB Asgard SSD, 1TB Integral SSD, 2TB Seagate Barracuda
Display(s) 27" BenQ Mobiuz
Case NZXT Phantom 530
Audio Device(s) Asus Xonar DX 7.1 PCI-E
Power Supply 1000w Supernova
Software Windows 10 x64
Benchmark Scores Fast. I don't need epeen.
The driver "optimizations" part is inaccurate. While nVidia did indeed use blatantly obvious aggressive hacks and were indeed cheating, ATi cant, under any circumstances, be accused of cheating as their driver "optimizations" were categorically PROVEN to not drastically effect (if at all) image quality and the scenes were still fully rendered.
 

AshenSugar

ATI admitted they optimized the drivers, and NVIDIA fans did accuse these optimizations of being the same thing NVIDIA was doing (even though they weren't even close to the same things NV did, because they didn't affect quality).
 

Ketxxx

That's the point: ATi's optimizations were genuine, nVidia's were not.
 
Joined
Nov 10, 2006
Messages
4,665 (0.73/day)
Location
Washington, US
The driver "optimizations" part is inaccurate. While nVidia did indeed use blatantly obvious aggressive hacks and were indeed cheating, ATi cant, under any circumstances, be accused of cheating as their driver "optimizations" were categorically PROVEN to not drastically effect (if at all) image quality.

Hah! Another person that still says nVidia (and not Nvidia). Thank you! And he's right.

Actually, Intel from what I hear plans a dual-northbridge solution to give the FSB more bandwidth (haxxxx).
I heard something about a "CSI" bus? Or Q-something? Basically a ripoff competitor of HyperTransport. And now they want to stick a memory controller on the processor?! I've heard of copying, but isn't this a bit... bold?
 

AshenSugar

Intel doesn't want to move the memory controller onto the chip; it would be losing face, since they still claim that chipset-based is more versatile and blah blah blah. Hence using 2 or more northbridges and possibly some kind of quad-data-rate RAM, this to widen the FSB so it isn't saturated with data.
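
For context on why the FSB becomes the bottleneck, a rough sketch with assumed, illustrative figures: a quad-pumped 64-bit FSB moves less data per second than a dual-channel DDR2 memory subsystem can supply, before you even count the coherency and I/O traffic sharing the same bus.

```python
# Rough, illustrative figures (assumed): why a single shared FSB saturates.
# A quad-pumped, 64-bit (8-byte) front-side bus carries all CPU <-> chipset
# traffic, yet dual-channel DDR2 behind the northbridge can supply more.

def fsb_gb_s(bus_clock_mhz: float, pumps: int = 4, width_bytes: int = 8) -> float:
    return bus_clock_mhz * pumps * width_bytes / 1000

def dual_channel_ddr2_gb_s(data_rate_mt_s: float, width_bytes: int = 8) -> float:
    return 2 * data_rate_mt_s * width_bytes / 1000

print(f"FSB1066 (266 MHz, quad-pumped): ~{fsb_gb_s(266.67):.1f} GB/s")             # ~8.5
print(f"Dual-channel DDR2-667:          ~{dual_channel_ddr2_gb_s(667):.1f} GB/s")  # ~10.7
```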
 
Joined
Nov 10, 2006
Messages
4,665 (0.73/day)
Location
Washington, US
Intel doesn't want to move the memory controller onto the chip; it would be losing face, since they still claim that chipset-based is more versatile and blah blah blah. Hence using 2 or more northbridges and possibly some kind of quad-data-rate RAM, this to widen the FSB so it isn't saturated with data.

http://www.theinquirer.net/default.aspx?article=37373

http://www.theinquirer.net/default.aspx?article=37392

And finally...
http://www.theinquirer.net/default.aspx?article=37432

I'll try searching for articles outside of the Inq, but until then...
 

Jarman

AshenSugar, I did read the article, I just didn't take in every last word... any idea how big that thing is?

As for being an Nvidia fanboy, I'd like to think not. Although my main card is a 7900GT (voltmod next wk), I also own an X800XT PE, and the chipset on the RDX200 MB in this machine is also ATI, although the southbridge sucks and I wish I hadn't taken my NF4 board out to put this in, but there we go, I can't be assed changing them over again. I'm also not a fan of the power-hungry cards ATI have released of late. That was one of the main reasons for me choosing the 7900GT when I did; the 45W or so power consumption was exceptional for such a card.
 

AshenSugar

Jarman, the only people I have ever seen say that the FX line was superior in any way to the R300 range are INSANE fanboys.

And yes, the old SB400/450 southbridges sucked ass for USB perf; the ULi southy was a lot better, and now the 600 kicks arse tho.

As to rated watts/volts/amps, my experience personally is that ATI and NV calculate this differently, just as they count transistors differently: ATI tends to be literal with their calculations; NVIDIA, well, nobody's figured out how they calculate such things yet.

I worry more about amp draw, since watts isn't really a very good method to use for rating PC devices IMHO, mainly because there's no set standard for rating PSUs or cards.

They could rate their stuff at conservative numbers and blow past that with overclocking, or they could rate the cards "worst case"; it leaves too much room for screwing around IMHO. Everybody needs to rate their cards/parts WORST POSSIBLE CASE, this way we can be sure what we are really getting.

And yes, the 7900 is slightly lower power than the X1900 range, but then again, the X1900 range slaps the 7 series around like a red-headed stepchild when it comes to raw power (not to mention better drivers).

Look at the 8800, the thing's an aircraft carrier!!!
 
Joined
Sep 26, 2006
Messages
6,959 (1.09/day)
Location
Australia, Sydney
True. Nvidia is all like "Hurr! Our 8800GTX stomps the x1950XTX." and now ATI is going to be all like "Hurr! Our R600 (rumored to be aka: X2800) stomps the 8800GTX.", then Nvidia will come out with a 8900, and ATI with a (still rumored) x2900, etc...

Intel and AMD, like you said, are the same. Intel is all like "Hurr! My Pentium 3 is better." then AMD was like "Well, hurr.. My Duron is better." then Intel is all like "Hurr! My Pentium 4 is better." then AMD is like "Hurr! My Athlon XP is better.", then Intel is like "Hurr! My Core 2 Duo is better." and now AMD is going to be all like "Hurr! My K8L is better." And then Intel will stick their memory controllers on the processor and implement their own version of HyperTransport and be all like "Hurr! My whatever is better!"..

It always happens. All you have to do is choose a side and wait for that side to jump ahead before you build a new computer.

Dude, the G80 is just an ultra-beefed N7x chip.

In truth the G80 is just a compilation of old technology; it emulates DX10 with difficulty.
 
Joined
Sep 9, 2006
Messages
994 (0.16/day)
Location
SoCal
Processor Intel C2D E6420 (3.2 gHz @ 1.365v)
Motherboard Gigabyte 965P-DS3
Cooling Zalman CNPS9500 LED
Memory 4 x 512mb Corsair XMS2 DDR2 667
Video Card(s) Visiontek HD 4870 512mb
Storage 200gb Maxtor SATA, 400gb WD SATA
Audio Device(s) Audigy 2 ZS
Software Windows XP Pro SP2
Dude, the G80 is just an ultra-beefed N7x chip.

In truth the G80 is just a compilation of old technology; it emulates DX10 with difficulty.

Case in point: Crysis.
Even if you factor in that the benchmark was done with unreleased beta drivers, those are some pretty abysmal numbers...
 
Joined
Sep 26, 2006
Messages
6,959 (1.09/day)
Location
Australia, Sydney
lol yeah... the G80 dies in that because it's not a shader-based GPU anyway. It has power at the cost of heat like the R600 does, though the R600 is nearly as crazy as the "AMDTI FX12000".
 

AshenSugar

The R600 tho is, I'm quite sure, shader-heavy, as will its lesser variants be. The G80 is a tweaked and slightly updated G70: they made it so it can do HDR+AA, and then added a crapload of pipes. Wooo, that's..... well IMHO that's lazy, and NV fanbois like to blurt out that the R420 (X800 range) was basically just a beefed-up R300/350 (it was, but they did A LOT of tweaking to the chip, not just adding more pipes).

The G80 is what's known as a refresh product: they updated and tweaked what they already had to make a very powerful card, but they didn't actually make the feature set much more robust. I wouldn't expect anything truly new from NVIDIA till the 9900 range or even the 10900 range.
 
Joined
Oct 1, 2006
Messages
4,883 (0.76/day)
Location
Hong Kong
The R600 tho is, I'm quite sure, shader-heavy, as will its lesser variants be. The G80 is a tweaked and slightly updated G70: they made it so it can do HDR+AA, and then added a crapload of pipes. Wooo, that's..... well IMHO that's lazy, and NV fanbois like to blurt out that the R420 (X800 range) was basically just a beefed-up R300/350 (it was, but they did A LOT of tweaking to the chip, not just adding more pipes).

The G80 is what's known as a refresh product: they updated and tweaked what they already had to make a very powerful card, but they didn't actually make the feature set much more robust. I wouldn't expect anything truly new from NVIDIA till the 9900 range or even the 10900 range.
I wonder how nVidia will name the GF10900 :p
ATi went on to use X = 10 (Roman numerals).
 

AshenSugar

Knowing NVIDIA, they will go for numbers that look more impressive, so 10**0 and up; I would fall over if they called it the 88800 :p
 
Joined
Sep 9, 2006
Messages
994 (0.16/day)
Location
SoCal
Or the GeForce 66666? :D
 
Joined
Sep 26, 2006
Messages
6,959 (1.09/day)
Location
Australia, Sydney
Yes! ATI should patent the X numeral in lettering! Time for ATI/AMD to be a dickhead for the right things (like they always do) >=D, gosh I love ATI/AMD, they had it all planned out, Nvidia is probably going to file a lawsuit for monopolising the GPU industry. Therefore:

There would be no such thing as the:

-Geforce 9600
-Geforce 9800
-Geforce 8500

Nvidia seriously needs to consider a new naming code, or else they are screwed >=D.
I will literally ROFHASWL (rolling on the floor having a seizure while laughing) if they make the 10800; that's seriously stupid looking. I would say if they wanted to be the "best" they should name it the GeForce "TO INFINITY AND BEYOND".

I don't get how the X800 was criticised as a revision of the 9800; that's FUD.
 

AshenSugar

Well, it is in a way: the core's an updated 9800/R350, more advanced, more pipes. The advantage of it being fairly closely related to the R300/350 cores was that driver dev was easier and the cards could more easily share drivers.

ATI updated the PS2.0b to version f, added more pipes, a better memory controller, the list goes on and on, but it is an evolution of a theme/design.
Just as the 6/7/8 are all evolutions of the same core, just a tweak here and there and higher clocks / more or fewer pipes/shaders. The X1K is a new design, modular by design: ATI/AMD could make a new version of the X1900 core that had 32 pipes and 128 shaders if they wanted (imagines that and drools), or 256 shaders, because they can add/remove shaders pretty easily. Just imagine if AMD/ATI are smart: they make a PCI-E 1x card that's not even a video card, using the X1K design, 4-8 pipes and a HUGE number of shader units (32-48-64.....), with its own 128/256MB of RAM, but make the card very small, and because there's no video out they could have the card exhaust its heat out the back, and design it to fit under/next to/above the 1900 series cards :). I hope this happens, it would rock hard IMHO.
 