
Why the FX line sucks/sucked so bad.

Discussion in 'Graphics Cards' started by AshenSugar, Feb 11, 2007.

  1. AshenSugar

    AshenSugar New Member

    Joined:
    Sep 20, 2006
    Messages:
    1,998 (0.70/day)
    Thanks Received:
    0
    Location:
    ashentech.com
    OK, to start: I'm posting this to clear up some misconceptions/FUD I have seen people posting about the FX cards "emulating DX9" and why they sucked.
    http://techreport.com/news_reply.x/4782/4/
    It's got 2 links near the top in the comments that are useful.

    From Wikipedia: http://en.wikipedia.org/wiki/GeForce_FX
    haxxxx!!!!

    more haxxxx!!!!!!!


    The 9600 256MB I had stomped the 5800 Ultra I had at EVERY SINGLE THING, and it was around 1/3-1/4 the price!!!!

    pwned again!!!!

    See the thumb for specs of these cards; I will also link specs of ATI's R300 core:
    http://en.wikipedia.org/wiki/Radeon_R300
    The thumb is of the R300 range cards' specs, cards that totally stomp the Nvidia equivalents!!!

    Attached Files:

  2. AshenSugar

    AshenSugar New Member

    Joined:
    Sep 20, 2006
    Messages:
    1,998 (0.70/day)
    Thanks Received:
    0
    Location:
    ashentech.com
  3. Jarman New Member

    Joined:
    Jan 28, 2007
    Messages:
    388 (0.14/day)
    Thanks Received:
    71
    Location:
    Wrexham UK
    Didn't know if you mentioned it in your copy/paste there, didn't see it anyway. From what I remember, Nvidia went down the route of thinking games were going to use heavy vertex shading and so integrated far more vertex processors than pixel shader processors. ATI went the other way and integrated more pixel shader processors. Games became shader-heavy, and this helped the R300 cores no end.

    The Nvidia core was technically superior to the R300 in many ways: a newer manufacturing process, full 128-bit DX9 precision support and partial (64-bit?) precision support. From what I remember, again, it was running at 128-bit precision instead of the lower 96-bit precision required by DX9 that caused a lot of wasted clock cycles.

    But who really cares that Nvidia released some crap cards 80 years ago :S Hell, I had a 5900XT and it sucked, my older GF4 Ti 4600 beat it in several situations, but it doesn't really bother me that much in 2007 :wtf:
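
    To put Jarman's precision point in numbers: DX9 PS2.0 required at least FP24 precision; R300 ran everything at FP24, while NV30 was quick at FP16 but slow at its full FP32. Here's a minimal C sketch of the trade-off (the mantissa widths are the real published ones for these formats: FP16 = s10e5, FP24 = s16e7, FP32 = s23e8; the test value is just a made-up texture coordinate):

    ```c
    /* Quantize a value to a given number of explicit mantissa bits to show
     * how much accuracy FP16 / FP24 / FP32 each keep. Exponent range is
     * ignored; this only models mantissa rounding. */
    #include <stdio.h>
    #include <math.h>

    static double quantize(double x, int bits)
    {
        if (x == 0.0) return 0.0;
        int e;
        double m = frexp(x, &e);              /* x = m * 2^e, 0.5 <= |m| < 1 */
        double scale = ldexp(1.0, bits + 1);  /* keep bits+1 significant bits */
        return ldexp(round(m * scale) / scale, e);
    }

    int main(void)
    {
        double x = 1024.0 + 1.0 / 3.0;  /* e.g. a texel address on a big surface */
        struct { const char *name; int mant; } fmt[] = {
            { "FP16 (NV30 partial)", 10 },
            { "FP24 (DX9 minimum) ", 16 },
            { "FP32 (NV30 full)   ", 23 },
        };
        for (int i = 0; i < 3; i++) {
            double q = quantize(x, fmt[i].mant);
            printf("%s: %12.6f  (error %.6f)\n", fmt[i].name, q, fabs(q - x));
        }
        return 0;
    }
    ```

    At this magnitude FP16 can't even hold the fraction (it rounds to a whole texel), FP24 is off by well under 1/100th, and FP32 is essentially exact, which is why FP24 was a sensible floor for the spec.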
  4. AshenSugar

    AshenSugar New Member

    Joined:
    Sep 20, 2006
    Messages:
    1,998 (0.70/day)
    Thanks Received:
    0
    Location:
    ashentech.com
    No, Nvidia's design wasn't better in any way. They used partial-precision shaders: 48-bit (FX12) as much as possible, 64-bit (FP16) when needed, and 128-bit (FP32) only when forced, when there was no other choice. Read the quotes, it's all shown in there.

    Read the article.

    And it matters because they got away with it once, and it may happen again. The G80 currently doesn't have working Vista drivers (no DX10 support), yet that's one of its main selling points.

    EDIT: please don't reply to posts I make without at least reading the article/post, it's very disrespectful.

    And hey, look everybody, an Nvidia fanboi!!!!! ;)
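
    For the curious, the FX12 path mentioned above is a 12-bit fixed-point format (48 bits across RGBA) with roughly a [-2, 2) range. A toy C model of its quantization, just to show the clamping and step size (the test values are made up):

    ```c
    /* Toy model of a 12-bit fixed-point shader format with 1/1024 steps
     * and a [-2, 2) range: anything outside clamps, everything else bands. */
    #include <stdio.h>

    static double fx12(double x)
    {
        long q = (long)(x * 1024.0 + (x >= 0 ? 0.5 : -0.5)); /* round */
        if (q >  2047) q =  2047;   /* clamp to the format's range */
        if (q < -2048) q = -2048;
        return q / 1024.0;
    }

    int main(void)
    {
        double tests[] = { 0.5, 1.0 / 3.0, 3.75, -2.5, 0.0001 };
        for (int i = 0; i < 5; i++)
            printf("%9.4f -> %9.4f\n", tests[i], fx12(tests[i]));
        return 0;
    }
    ```

    Fine for blending two colors; useless the moment a DX9 shader wants real floating-point range, which is why forcing work down to this path was such a dubious "optimization".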
  5. niko084

    niko084

    Joined:
    Dec 5, 2006
    Messages:
    7,636 (2.75/day)
    Thanks Received:
    729
    Nvidia and ATI power will continue to change hands back and forth; one will one-up the other, and it will flop back and forth for a long time.. Same with Intel and AMD... It just keeps going.
  6. Zubasa

    Zubasa

    Joined:
    Oct 1, 2006
    Messages:
    3,980 (1.40/day)
    Thanks Received:
    457
    Location:
    Hong Kong
    Hopefully so; if either of these titans falls, it will be a catastrophe :shadedshu
    I can't imagine how much a Core 2 Duo would cost if AMD were not there.
  7. xvi

    xvi

    Joined:
    Nov 10, 2006
    Messages:
    1,863 (0.66/day)
    Thanks Received:
    984
    Location:
    Washington, US
    True. Nvidia is all like "Hurr! Our 8800GTX stomps the X1950XTX." and now ATI is going to be all like "Hurr! Our R600 (rumored to be aka X2800) stomps the 8800GTX.", then Nvidia will come out with an 8900, and ATI with a (still rumored) X2900, etc...

    Intel and AMD, like you said, are the same. Intel is all like "Hurr! My Pentium 3 is better." then AMD was like "Well, hurr.. My Duron is better." then Intel is all like "Hurr! My Pentium 4 is better." then AMD is like "Hurr! My Athlon XP is better.", then Intel is like "Hurr! My Core 2 Duo is better." and now AMD is going to be all like "Hurr! My K8L is better." And then Intel will stick their memory controllers on the processor and implement their own version of HyperTransport and be all like "Hurr! My whatever is better!"..

    It always happens. All you have to do is choose a side and wait for that side to jump ahead before you build a new computer.
    Crunching for Team TPU
  8. AshenSugar

    AshenSugar New Member

    Joined:
    Sep 20, 2006
    Messages:
    1,998 (0.70/day)
    Thanks Received:
    0
    Location:
    ashentech.com
    Actually, Intel from what I hear plans a dual-northbridge solution to give the FSB more bandwidth (haxxxx).

    And the X2900 is about ready for market, pics are showing up now, check the news section. The difference is AMD/ATI didn't rush out their card, Nvidia did, and the 8900, from all evidence, is going to be there to try and save face against the X28/900 cards: it's a G80 (same exact core as the 8800) with driver tweaks and higher clocks. Woo, big upgrade there. Maybe they should actually get some decent drivers out for the 8800, maybe some working Vista drivers that would actually make the 8800 into a DX10 card :p
  9. Ketxxx

    Ketxxx Heedless Psychic

    Joined:
    Mar 4, 2006
    Messages:
    11,510 (3.77/day)
    Thanks Received:
    570
    Location:
    Kingdom of gods
    The driver "optimizations" part is inaccurate. While nVidia did indeed use blatantly obvious aggressive hacks and were indeed cheating, ATi can't, under any circumstances, be accused of cheating, as their driver "optimizations" were categorically PROVEN not to drastically affect (if at all) image quality, and the scenes were still fully rendered.
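
    As a rough illustration of the difference being drawn here: the accusation against nVidia (documented in the 3DMark03 case) was app-specific detection, not general optimization. A hedged C sketch of the shape of that trick; every name here is hypothetical, not real driver code:

    ```c
    /* Sketch: a driver that swaps in a hand-tuned, lower-precision shader
     * only when it recognizes a specific benchmark executable. A legitimate
     * optimizer would rewrite any shader while preserving its output. */
    #include <stdio.h>
    #include <string.h>

    /* hypothetical: stand-in for asking the OS which app is running */
    static const char *current_app(void) { return "3dmark03.exe"; }

    static const char *pick_shader(const char *app_shader)
    {
        if (strcmp(current_app(), "3dmark03.exe") == 0)
            /* cheat: the replacement only looks right from the benchmark's
             * fixed camera path, so scores rise while games don't */
            return "handwritten_fp12_replacement";
        return app_shader;  /* honest path: run what the app submitted */
    }

    int main(void)
    {
        printf("driver will run: %s\n", pick_shader("app_shader_fp32"));
        return 0;
    }
    ```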
  10. AshenSugar

    AshenSugar New Member

    Joined:
    Sep 20, 2006
    Messages:
    1,998 (0.70/day)
    Thanks Received:
    0
    Location:
    ashentech.com
    ATI admitted they optimized the drivers, and Nvidia fans did accuse those optimizations of being the same thing Nvidia was doing (even though they weren't even close to the same things NV did, because they didn't affect quality).
  11. Ketxxx

    Ketxxx Heedless Psychic

    Joined:
    Mar 4, 2006
    Messages:
    11,510 (3.77/day)
    Thanks Received:
    570
    Location:
    Kingdom of gods
    That's the point: ATI's optimizations were genuine, nVidia's were not.
  12. xvi

    xvi

    Joined:
    Nov 10, 2006
    Messages:
    1,863 (0.66/day)
    Thanks Received:
    984
    Location:
    Washington, US
    Hah! Another person that still says nVidia (and not Nvidia). Thank you! And he's right.

    I heard something about a "CSI" bus? Or Q-something? Basically a ripoff competitor of HyperTransport. And now they want to stick a memory controller on the processor?! I've heard of copying, but isn't this a bit... bold?
    Crunching for Team TPU
  13. AshenSugar

    AshenSugar New Member

    Joined:
    Sep 20, 2006
    Messages:
    1,998 (0.70/day)
    Thanks Received:
    0
    Location:
    ashentech.com
    Intel doesn't want to move the memory controller onto the chip; it would be losing face, since they still claim that chipset-based is more versatile and blah blah blah. Hence using 2 or more northbridges and possibly some kind of quad-data-rate RAM, this to widen the FSB so it isn't saturated with data.
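
    The saturation argument is easy to see with peak-bandwidth arithmetic, using the publicly known figures of the day (FSB 1066, dual-channel DDR400, a 1 GHz 16-bit HyperTransport link); all numbers are theoretical peaks:

    ```c
    /* Peak-bandwidth comparison: a shared front-side bus vs AMD K8's
     * on-die memory controller plus a separate I/O link. */
    #include <stdio.h>

    int main(void)
    {
        /* Intel FSB: 64-bit wide, quad-pumped 266 MHz = 1066 MT/s, and it
         * carries memory, I/O and coherency traffic all together. */
        double fsb = 1066e6 * 8 / 1e9;

        /* AMD K8: dual-channel DDR400 on die (2 x 64-bit x 400 MT/s),
         * plus HyperTransport (16-bit, 2000 MT/s) just for I/O. */
        double k8_mem = 2 * 400e6 * 8 / 1e9;
        double ht     = 2000e6 * 2 / 1e9;  /* per direction */

        printf("FSB 1066 (everything shares it): %.1f GB/s\n", fsb);
        printf("K8 memory (dedicated):           %.1f GB/s\n", k8_mem);
        printf("K8 HT I/O link (each way):       %.1f GB/s\n", ht);
        return 0;
    }
    ```

    So a second northbridge (or faster RAM on a wider FSB) is Intel's way of buying headroom without conceding that the on-die controller is the cleaner fix.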
  14. xvi

    xvi

    Joined:
    Nov 10, 2006
    Messages:
    1,863 (0.66/day)
    Thanks Received:
    984
    Location:
    Washington, US
    http://www.theinquirer.net/default.aspx?article=37373

    http://www.theinquirer.net/default.aspx?article=37392

    And finally...
    http://www.theinquirer.net/default.aspx?article=37432

    I'll try searching for articles outside of the Inq, but until then...
    Crunching for Team TPU
  15. Jarman New Member

    Joined:
    Jan 28, 2007
    Messages:
    388 (0.14/day)
    Thanks Received:
    71
    Location:
    Wrexham UK
    AshenSugar, I did read the article, I just didn't take in every last word... any idea how big that thing is?

    As for being an Nvidia fanboy, I'd like to think not. Although my main card is a 7900GT (voltmod next wk), I also own an X800XT PE, and the chipset on the RDX200 MB in this machine is also ATI, although the southbridge sucks and I wish I hadn't taken my NF4 board out to put this in, but there we go, I can't be assed changing them over again. I'm also not a fan of the power-hungry cards ATI have released of late. That was one of the main reasons for me choosing the 7900GT when I did. The 45W or so power consumption was exceptional for such a card.
  16. AshenSugar

    AshenSugar New Member

    Joined:
    Sep 20, 2006
    Messages:
    1,998 (0.70/day)
    Thanks Received:
    0
    Location:
    ashentech.com
    Jarman, the only people I have ever seen say that the FX line was superior in any way to the R300 range are INSANE fanboys.

    And yes, the old SB400/450 southbridges sucked ass for USB perf, the ULi southy was a lot better, though the SB600 kicks arse now.

    As to rated watts/volts/amps, my experience personally is that ATI and NV calculate this differently, just as they count transistors differently; ATI tends to be literal with their calculations, Nvidia, well, nobody's figured out how they calculate such things yet.

    I worry more about amp draw, since watts isn't really a very good method to use in rating PC devices IMHO, mainly because there are no set standards for rating PSUs or cards.

    They could rate their stuff at conservative numbers and blow past that with overclocking, or they could rate the cards "worst case"; it leaves too much room for screwing around IMHO. Everybody needs to rate their cards/parts WORST POSSIBLE CASE, this way we can be sure what we are really getting.

    And yes, the 7900 is slightly lower power than the X1900 range, but then again, the X1900 range slaps the 7 series around like a red-headed stepchild when it comes to raw power (not to mention better drivers).

    Look at the 8800, thing's an aircraft carrier!!!
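
    To put the watts-vs-amps point in numbers: P = V x I, so a card's 12 V rail draw is just watts divided by 12. A quick C sketch (the 45 W figure is Jarman's 7900GT number from above; the 75 W "worst case" is a made-up example):

    ```c
    /* Convert rated wattage into 12 V rail current, the number that
     * actually decides whether a PSU rail sags. */
    #include <stdio.h>

    int main(void)
    {
        const double rail_v = 12.0;
        struct { const char *label; double watts; } card[] = {
            { "vendor 'typical' rating", 45.0 },
            { "worst case + overclock ", 75.0 },
        };
        for (int i = 0; i < 2; i++)
            printf("%s: %4.0f W -> %5.2f A on the 12 V rail\n",
                   card[i].label, card[i].watts, card[i].watts / rail_v);
        return 0;
    }
    ```

    Which is exactly why a single worst-case rating convention would help: 3.75 A versus 6.25 A is the difference between a rail coping and a rail sagging, and right now you can't tell which number a vendor printed.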
  17. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.44/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    Dude, the G80 is just an ultra-beefed-up G7x chip.

    In truth the G80 is just a compilation of old technology; it emulates DX10 with difficulty.
  18. anticlutch

    anticlutch New Member

    Joined:
    Sep 9, 2006
    Messages:
    995 (0.35/day)
    Thanks Received:
    32
    Location:
    SoCal
    Case in point: Crysis.
    Even if you factor in that the benchmark was done with unreleased beta drivers, those are some pretty abysmal numbers...
  19. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.44/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    lol yeah... the G80 dies in that because it's not a shader-based GPU anyway. It has power at the cost of heat like the R600 does, though the R600 is nearly as crazy as the "AMDTI FX12000"
  20. AshenSugar

    AshenSugar New Member

    Joined:
    Sep 20, 2006
    Messages:
    1,998 (0.70/day)
    Thanks Received:
    0
    Location:
    ashentech.com
    The R600, though, I'm quite sure is shader-heavy, as will its lesser variants be. The G80 is a tweaked and slightly updated G70; they made it so it can do HDR+AA, and then added a crapload of pipes. Wooo, that's... well, IMHO that's lazy. And NV fanbois like to blurt out that the R420 (X800 range) was basically just a beefed-up R300/350 (it was, but they did A LOT of tweaking to the chip, not just adding more pipes).

    The G80 is what's known as a refresh product: they updated and tweaked what they already had to make a very powerful card, but they didn't actually make the feature set much more robust. I wouldn't expect anything truly new from Nvidia till the 9900 range or even the 10900 range.
  21. Zubasa

    Zubasa

    Joined:
    Oct 1, 2006
    Messages:
    3,980 (1.40/day)
    Thanks Received:
    457
    Location:
    Hong Kong
    I wonder how nVidia will name the GF10900 :p
    ATi went on to use X = 10 (Roman numerals)
  22. AshenSugar

    AshenSugar New Member

    Joined:
    Sep 20, 2006
    Messages:
    1,998 (0.70/day)
    Thanks Received:
    0
    Location:
    ashentech.com
    Knowing Nvidia, they will go for numbers that look more impressive, so 10**0 and up. I would fall over if they called it the 88800 :p
  23. anticlutch

    anticlutch New Member

    Joined:
    Sep 9, 2006
    Messages:
    995 (0.35/day)
    Thanks Received:
    32
    Location:
    SoCal
    Or the GeForce 66666? :D
  24. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.44/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    Yes! ATI should patent the X numeral in lettering! Time for ATI/AMD to be a dickhead for the right things (like they always do) >=D. Gosh, I love ATI/AMD, they had it all planned out; Nvidia is probably going to file a lawsuit for monopolising the GPU industry. Therefore:

    There would be no such thing as the:

    -Geforce 9600
    -Geforce 9800
    -Geforce 8500

    Nvidia seriously needs to consider a new naming code, or else they are screwed >=D.
    I will literally ROFHASWL (rolling on the floor having a seizure while laughing) if they make the 10800; that's seriously stupid-looking. I would say if they wanted to be the "best" they should name it Geforce "TO INFINITY AND BEYOND".

    I don't get how the X800 was criticised as a revision of the 9800, that's FUD.
  25. AshenSugar

    AshenSugar New Member

    Joined:
    Sep 20, 2006
    Messages:
    1,998 (0.70/day)
    Thanks Received:
    0
    Location:
    ashentech.com
    Well, it is in a way; the core's an updated 9800/R350: more advanced, more pipes. The advantage of it being fairly closely related to the R300/350 cores was that driver dev was easier and the cards could more easily share drivers.

    ATI updated the pixel shader support from PS2.0 to PS2.0b, added more pipes, a better memory controller, the list goes on and on, but it is an evolution of a theme/design.
    Just as the 6/7/8 are all evolutions of the same core, just a tweak here and there and higher clocks / more or fewer pipes/shaders. The X1K is a new design, modular by design; ATI/AMD could make a new version of the X1900 core that had 32 pipes and 128 shaders if they wanted (imagines that and drools), or 256 shaders, because they can add/remove shaders pretty easily. Just imagine if AMD/ATI are smart: they make a PCI-E 1x card that's not even a video card, using the X1K design, 4-8 pipes and a HUGE number of shader units (32-48-64...) with its own 128/256MB of RAM, but make the card very small, and because there's no video out they could have the card exhaust the heat out the back, and design it to fit under/next to/above the X1900 series cards :). I hope this happens, it would rock hard IMHO.
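
    If the shader count really is that modular, peak shader throughput just scales linearly with units times clock. A back-of-envelope C sketch of the variants imagined above (the 650 MHz clock matches the X1900XTX; the ops-per-unit figure is a placeholder assumption, not a real R580 number):

    ```c
    /* Model peak shader throughput for hypothetical X1K-style parts:
     * GFLOPS ~= shader units x clock (GHz) x ops per unit per clock. */
    #include <stdio.h>

    static double gflops(int units, double clock_ghz, int ops_per_unit)
    {
        return units * clock_ghz * ops_per_unit;
    }

    int main(void)
    {
        double clk = 0.65;  /* 650 MHz, X1900XTX core clock */
        int ops = 8;        /* assumed ops/unit/clock, made up for scale */
        int configs[] = { 48, 64, 128, 256 };  /* 48 = real X1900; the rest
                                                  are the imagined variants */
        for (int i = 0; i < 4; i++)
            printf("%3d shader units -> ~%4.0f GFLOPS (peak, assumed)\n",
                   configs[i], gflops(configs[i], clk, ops));
        return 0;
    }
    ```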
