
NVIDIA GeForce 4XX Series Discussion

Discussion in 'NVIDIA' started by qubit, Sep 25, 2009.

Thread Status:
Not open for further replies.
  1. bobzilla2009 New Member

    Joined:
    Oct 7, 2009
    Messages:
    455 (0.19/day)
    Thanks Received:
    39
    Indeed, HD 5870s came out in a trickle, and I have no doubt they could probably have hit 4 million or so by now, but the recession will have had an impact on sales too. Isn't there another foundry that AMD may be moving to for 32nm, though? That would help card production immensely for both companies.
     
  2. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.10/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    I'm talking about shader efficiency. I've said that plenty of times. AMD has gone for silicon usage efficiency, which is good for getting better specs, and they have! But after that they have a relatively hard time actually using those specs.

    Nvidia, on the contrary, goes with a scalar design, which is problematic to actually get into silicon, and guess what? Yes, they've had problems. BUT once the silicon is out and the specs are defined, it's not difficult to scale performance.

    Two paths, similar results. Which is better? We don't know, and probably neither do they, not 100% for sure. For example, if we compare the 9600 GT to the HD 3870, the 9600 GT wins hands down: 504 million transistors versus 667 million, and it's faster. And don't compare it to the HD 3850! That shows things are just not as black and white as many people like to claim based solely on GT200, which was a bad chip to begin with and had about 10% of its die dedicated to CUDA apps.

    EDIT: lol, now that I think about it, GT200 vs. Evergreen is actually another example: 1.4 billion transistors versus 2.1 billion. That's 50% more transistors, and Evergreen isn't that much faster except in a few games. Just want to show that it's not black and white.
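    If anyone wants to play with those numbers, here's a rough perf-per-transistor sketch in Python using the transistor counts quoted above; the relative performance indices are just illustrative placeholders picked for the example, not benchmark results.

    Code:
    # Back-of-the-envelope "performance per transistor" comparison.
    # Transistor counts are the ones quoted in the post; the perf index
    # values are illustrative placeholders only, NOT measured results.
    cards = {
        # name: (transistors in millions, assumed relative perf index)
        "9600 GT": (504, 100),
        "HD 3870": (667, 95),
        "GTX 280": (1400, 100),
        "HD 5870": (2100, 130),
    }

    for name, (transistors, perf) in cards.items():
        per_billion = perf / transistors * 1000
        print(f"{name:>8}: {per_billion:6.1f} perf per billion transistors")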
     
    Last edited: Jan 11, 2010
  3. bobzilla2009 New Member

    Joined:
    Oct 7, 2009
    Messages:
    455 (0.19/day)
    Thanks Received:
    39
    Isn't quite a chunk of the die dedicated to CUDA in Fermi too?
     
  4. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.10/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    No.

    Well, it depends on how you look at it. The answer could be yes, of course: more cache, the circuitry to make it semi-coherent, ECC... But not as much as on GT200, where they had 30 64-bit SPs that were completely unused in games. Imagine the difference if GT200 had had 270 SPs instead (probably more, since the 64-bit units do take some space)...
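    For what it's worth, that "270 SP" thought experiment is easy to put in numbers; the one-extra-SP-per-FP64-unit swap below is only an assumption for illustration.

    Code:
    # GT200 shipped 240 general-purpose SPs plus 30 dedicated 64-bit units
    # (one per SM) that sat idle in games. The swap below assumes each idle
    # FP64 unit could have been one ordinary SP instead, purely to illustrate.
    sp_actual = 240
    sp_hypothetical = 240 + 30
    gain = sp_hypothetical / sp_actual - 1
    print(f"~{gain:.0%} more raw shader throughput at the same clocks")
    # The FP64 units are larger than a normal SP, so the real trade could
    # have been even better, which is the point being made above.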
     
  5. ShiBDiB

    ShiBDiB

    Joined:
    Jul 21, 2008
    Messages:
    4,400 (1.53/day)
    Thanks Received:
    1,040
    Location:
    Clifton Park, NY
    This thread makes my head hurt..
     
  6. bobzilla2009 New Member

    Joined:
    Oct 7, 2009
    Messages:
    455 (0.19/day)
    Thanks Received:
    39
    Haha, anyway, as long as the HD 5870 runs BFBC2 and AvP great, I'll be happy :) And if Fermi comes out and is the mother of all GPUs, then I guess I'll buy it when Crysis 2 comes out and kicks all the others squarely in the crotch.
     
  7. wolf

    wolf Performance Enthusiast

    Joined:
    May 7, 2007
    Messages:
    5,547 (1.68/day)
    Thanks Received:
    847
    I have to agree that Nvidia's cards have generally scaled very well in terms of performance gained relative to the rise in ROPs and shader units, while a few times now ATI's cards have thrown me with somewhat erratic advances.

    What are the givens at this stage? As I understand it, we're dealing with...

    512 SPs
    48 ROPs
    384-bit GDDR5
    and a stab in the dark at clocks of around 700 MHz core, 1500-1600 MHz shader, and 4 GHz memory.

    IMO that could quite easily double the GTX 280, especially given the massive overhaul in how the chip actually functions; these aren't G80/G92/GT200 SPs anymore, they're something seemingly much more powerful and efficient.
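    For anyone who wants the paper math, here's a quick Python sketch comparing those rumored numbers against the known GTX 280 specs (240 SPs at 1296 MHz, 512-bit GDDR3 at 2214 MT/s effective). Real performance obviously depends on far more than these two ratios.

    Code:
    # Paper-spec comparison of the rumored GF100 numbers vs the GTX 280.
    def mem_bw_gbs(bus_bits, effective_mts):
        """Peak memory bandwidth in GB/s from bus width and effective data rate."""
        return bus_bits / 8 * effective_mts / 1000

    gtx280_shader = 240 * 1296   # known GTX 280: 240 SPs at 1296 MHz shader clock
    gf100_shader = 512 * 1500    # rumored SP count, low end of the clock guess

    print(f"raw shader rate: {gf100_shader / gtx280_shader:.2f}x the GTX 280")
    print(f"bandwidth: GTX 280 {mem_bw_gbs(512, 2214):.0f} GB/s "
          f"vs rumored GF100 {mem_bw_gbs(384, 4000):.0f} GB/s")
    # Roughly 2.5x the shader rate on paper but only about 1.35x the bandwidth,
    # so "double the GTX 280" leans heavily on the architectural overhaul.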
     
    driver66 and qubit say thanks.
  8. zwawy New Member

    Joined:
    Sep 28, 2006
    Messages:
    40 (0.01/day)
    Thanks Received:
    4
    Location:
    Egypt
  9. bobzilla2009 New Member

    Joined:
    Oct 7, 2009
    Messages:
    455 (0.19/day)
    Thanks Received:
    39
    We can hope, but I'm not holding my breath. Maybe someone will win the competition for one by having the GF100 dangled in front of their nose, but whenever they try to grab it, it gets tugged away, over and over and over and over...

    On the bright side, there's at least an 80% chance of a March release.
     
    zwawy says thanks.
  10. driver66

    driver66

    Joined:
    Jun 4, 2007
    Messages:
    1,053 (0.32/day)
    Thanks Received:
    115
    Location:
    indiana
    The GF100 will be Nvidia's C2D ;)
     
  11. bobzilla2009 New Member

    Joined:
    Oct 7, 2009
    Messages:
    455 (0.19/day)
    Thanks Received:
    39
    I don't see it being that dominant, lol; it'll be way too expensive for that. Still, maybe they'll actually MAKE a mid-range this time (oh wait, they have to this time, they need DX11 support! OMG, the first REAL cards in the Nvidia mid-range lineup since the 9600 GT!!!!!!!), or maybe they'll just say "DX11 is so overrated, mid-range cards don't need it" and rehash the damned 8 series again XD (and be promptly overwhelmed by the AMD mid-range).

    Source for me thinking GF100 will be expensive: every high-end Nvidia release for years.
     
  12. erocker

    erocker Super Moderator Staff Member

    Joined:
    Jul 19, 2006
    Messages:
    41,203 (11.45/day)
    Thanks Received:
    16,390
    We have no word yet on Nvidia's next mainstream GPU, which is the bread and butter of any GPU manufacturer. Even if GF100 is a great card, we won't be seeing OEMs scrambling to put it in a mid-range computer. Nvidia needs low to mid-range models; the GT 210/220/240 aren't cutting it. When will we be seeing them?
     
  13. bobzilla2009 New Member

    Joined:
    Oct 7, 2009
    Messages:
    455 (0.19/day)
    Thanks Received:
    39
    Probably not even this year. Nvidia haven't said anything at all about them and have obviously been struggling with Fermi anyway (source: the six-month delay). There's no way they're going to cut it down to compete with the HD 5670/5650 soon, and mid-range cards that don't require PCI-E power are where it's at for the masses. They may get a competitor to the HD 57xx series out before they completely lose all market share, though.

    Also, the GeForce 310 doesn't support DX11, so I wouldn't be surprised if most of the GeForce 3xx mid-range doesn't either, reserving DX11 for the high-end cards. Nvidia will be screwed big time if that's the case; with all the Windows 7 hype, not even Joe Idiot at PC World will think DX10 is better than DX11.

    I get the feeling they don't really care about the normal customer any more. Rebranding the 8 series twice is what made me think again about them, being somewhat of an Nvidia fan at the time (my very first gaming computer had an Nvidia card). If they fall, it's of their own doing.

    In the end, I'm happy to remain optimistic about GTX 380 performance, but I know their mid-range will either lack DX11, be refreshes, or be too late to the game. ATI have already won this generation in my opinion, because releasing the mid-range cards months and months ahead of Nvidia gives them time to permeate the market, improve production efficiency and reduce prices. It's up to Nvidia to cut its losses when Fermi arrives by getting the mid-range out before August at the latest, because I think the HD 6xxx will hit around mid-2012, meaning that even if Nvidia can beat the HD 5xxx by a decent margin, ATI will come straight back at them within a couple of months.
     
    Last edited: Jan 12, 2010
  14. a_ump

    a_ump

    Joined:
    Nov 21, 2007
    Messages:
    3,676 (1.18/day)
    Thanks Received:
    398
    Location:
    Smithfield, WV
    Really? You think it's going to take almost three years for ATI to get out their next gen? That'd probably be the longest time between two generation releases if so. I expect a Q4 2010 release for the HD 6xxx series personally. lol, I hope you meant 2011... 2012 is three years, a long-ass time for a gen-to-gen release.
     
    Last edited: Jan 12, 2010
  15. bobzilla2009 New Member

    Joined:
    Oct 7, 2009
    Messages:
    455 (0.19/day)
    Thanks Received:
    39
    I meant 2011, lol >< mega fail. Early to mid 2011 would be around 16 months or so, which would be bang on target; however, I wonder how Nvidia will respond if that's the case.
     
  16. DaedalusHelios

    DaedalusHelios

    Joined:
    Feb 21, 2008
    Messages:
    4,980 (1.65/day)
    Thanks Received:
    828
    Location:
    Greensboro, NC, USA
    I would predict Nvidia's mid-range offerings in about four months. 2011 would only happen if the world enters WW3. :laugh:
     
  17. subhendu

    subhendu New Member

    Joined:
    Jan 26, 2009
    Messages:
    488 (0.18/day)
    Thanks Received:
    33
  18. a_ump

    a_ump

    Joined:
    Nov 21, 2007
    Messages:
    3,676 (1.18/day)
    Thanks Received:
    398
    Location:
    Smithfield, WV
    You know, there's one good thing I could see coming out of waiting for the mid-range: Nvidia's mid-range will probably outperform their previous fastest single-GPU card, the GTX 285, whereas ATI's mid-range card, the HD 5770, isn't even as fast as the HD 4870. That was one of the biggest disappointments of the HD 57xx series for me, being slower than the HD 4870/4890. But with Nvidia's high end having 512 shaders, that could leave room for their mid-range (a GTS 350?) at, say, 320 or 384 shaders (rough paper math below). I'm very interested to see Nvidia's cards this gen, as it'll be the first time in almost four years that Nvidia has actually remade damn near their entire lineup... hopefully they will, anyway, for DX11 support in all market segments.

    Dual Fermi? Damn, the board is probably going to be longer than the HD 5970; either that or they're going to have to make the PCB like 12 layers thick, lol.
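    Same kind of paper math for that mid-range guess; the 320/384 shader counts and the ~1500 MHz shader clock are pure speculation from this thread, not announced parts.

    Code:
    # Raw shader-rate comparison of hypothetical cut-down Fermi parts vs the
    # GTX 285. Shader counts and the 1500 MHz clock are guesses from the thread.
    gtx285 = 240 * 1476   # known GTX 285: 240 SPs at 1476 MHz shader clock
    for sps in (320, 384):
        ratio = sps * 1500 / gtx285
        print(f"hypothetical {sps}-SP part: {ratio:.2f}x the GTX 285 shader rate")
    # On raw shader rate alone such a part clears the GTX 285, which is the
    # point being made, assuming the new SPs are at least as effective per clock.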
     
  19. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,981 (2.48/day)
    Thanks Received:
    1,754
    Location:
    PA, USA
    I have to disagree. The GTX 295, both the first and second gen, was shorter than the 4870 X2. NV has a way with design.
     
  20. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    9,208 (2.39/day)
    Thanks Received:
    1,738
    Coming up on the fifth month of this thread, with the promises for this monstrosity long past due: what do the green users have to say about this card, looking back at all the broken promises Nvidia has made?

    When will your beloved ATI killer be here? It's looking sort of dismal for your future of bashing ATI, and the arguments Nvidia has given for not going red have started to become just as empty as its promises: no DX11 games, no need for tessellation, we'll have a hotter and better product, it will be released in October, November, December, January, February, maybe March, but only on paper.
     
    bobzilla2009, zwawy, Mussels and 3 others say thanks.
  21. phanbuey

    phanbuey

    Joined:
    Nov 13, 2007
    Messages:
    5,257 (1.69/day)
    Thanks Received:
    1,008
    Location:
    Miami
    They will come up with the COOLEST name EVER.
     
  22. jaggerwild

    jaggerwild

    Joined:
    Oct 30, 2008
    Messages:
    579 (0.21/day)
    Thanks Received:
    211
    Location:
    Reality
    They're gonna run another self-made test so as to give no real-world results again. This thread starts off with a post that says "I hope they're here by Christmas," lol. Maybe this Christmas?
     
  23. mastrdrver

    mastrdrver

    Joined:
    Feb 24, 2009
    Messages:
    3,359 (1.27/day)
    Thanks Received:
    656
    Some on B3D are saying there are slides floating around with a 1/13 NDA on them. Supposed to give a lot more info, but still no clocks. We'll see...
     
  24. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,981 (2.48/day)
    Thanks Received:
    1,754
    Location:
    PA, USA
    You make it sound like ATI made something so good that NV's massively superior software support doesn't already own the souls of a number of gamers/posters on TPU.

    Patience.
     
    Last edited: Jan 13, 2010
    driver66 and HammerON say thanks.
  25. Bo_Fox

    Bo_Fox New Member

    Joined:
    May 29, 2009
    Messages:
    480 (0.19/day)
    Thanks Received:
    57
    Location:
    Barack Hussein Obama-Biden's Nation
    Yep, patience.. patience.. or else, this thread will start driving everybody nuts!!! It's the hidden tune of the rhyme that sends all the posters to asylums.
     
