
Next-gen NVIDIA GeForce Specifications Unveiled

Discussion in 'News' started by malware, May 22, 2008.

  1. malware New Member

    Joined:
    Nov 7, 2004
    Messages:
    5,476 (1.52/day)
    Thanks Received:
    956
    Location:
    Bulgaria
    After we already know what AMD/ATI are planning in their camp, it's NVIDIA's turn to show us what we should be prepared for. Verified by DailyTech, NVIDIA plans to refresh its GPU line-up on June 18th with two new video cards that feature its next-generation CUDA-enabled graphics core, codenamed D10U. Two models are expected to launch simultaneously: the flagship GeForce GTX 280 (D10U-30) and the GeForce GTX 260 (D10U-20). The first chip will utilize a 512-bit memory bus, 240 stream processors (128 on the 9800 GTX) and support for up to 1GB of memory. The GTX 260 will be a trimmed-down version with 192 stream processors, a 448-bit bus and up to 896MB of graphics memory. Both cards will use the PCI-Express 2.0 interface and will support NVIDIA's 3-way SLI technology. NVIDIA also promises that the unified shaders of both cards will perform 50% faster than those of the previous generation. Compared to the upcoming AMD Radeon 4000 series, the D10U GPU lacks DirectX 10.1 support and is also limited to GDDR3 memory. NVIDIA's documentation does not list an estimated street price for the new cards.

    Source: DailyTech
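
    For context, the quoted bus widths, chip counts and memory sizes hang together arithmetically. A minimal back-of-the-envelope sketch in C++, assuming 32-bit-wide GDDR3 chips of 64MB each and a hypothetical ~2.2Gbps effective data rate (the article lists no memory clocks, so that number is an assumption, not a spec):

        // Back-of-the-envelope numbers for the D10U specs quoted above.
        // Assumptions (not from the article): 32-bit-wide GDDR3 chips of
        // 64MB each, and a hypothetical ~2.2Gbps effective data rate.
        #include <cstdio>

        int main() {
            const char*  names[2]    = {"GTX 280", "GTX 260"};
            const int    bus_bits[2] = {512, 448};
            const double gbps        = 2.2;  // assumed effective GDDR3 data rate

            for (int i = 0; i < 2; ++i) {
                int chips  = bus_bits[i] / 32;          // one 32-bit chip per 32 bus bits
                int mem_mb = chips * 64;                // 16 x 64MB = 1024MB, 14 x 64MB = 896MB
                double gbs = bus_bits[i] / 8.0 * gbps;  // bytes/transfer x transfers/s
                std::printf("%s: %d chips, %d MB, ~%.0f GB/s\n",
                            names[i], chips, mem_mb, gbs);
            }
            return 0;
        }

    Under those assumptions the arithmetic reproduces the leaked figures: 16 chips and 1GB with roughly 141GB/s for the GTX 280, and 14 chips and 896MB with roughly 123GB/s for the GTX 260.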
     
    candle_86 says thanks.
  2. HaZe303

    HaZe303 New Member

    Joined:
    Feb 12, 2006
    Messages:
    305 (0.10/day)
    Thanks Received:
    3
    Location:
    Sweden (08-Stockholm)
    Sounds like I'm getting an ATI card this time; to me the new ATI R700 cards sound much better on paper than the GT200. I might be wrong, but it sounds like the nV card is just a revision of G92 and not a new GPU. I mean, still GDDR3? They could at least move to GDDR4, preferably to GDDR5 like ATI. And still no DX10.1? No, I'm getting a 4870 this summer; sounds like I'll be getting it for cheap as well. Maybe I can finally afford a CrossFire system? :)
     
  3. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.39/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    malware, why did you ignore the links I sent you half a month ago with similar information (including the release date)?
     
  4. spud107

    spud107

    Joined:
    Feb 12, 2007
    Messages:
    1,194 (0.43/day)
    Thanks Received:
    131
    Location:
    scotland
    I would have thought 10.1 would have been implemented; it's a bit like having DX9.0b instead of DX9.0c?
     
  5. largon New Member

    Joined:
    May 6, 2005
    Messages:
    2,778 (0.81/day)
    Thanks Received:
    432
    Location:
    Tre, Suomi Finland
    As if RV770 was a "new GPU"... Whatever that means. The architecture of G80/G92 is superior to anything out there. Why on earth would you think nV should dump it?
    And DX10.1 is hardly worth mentioning; how many DX10.1 titles are there again?
     
  6. malware New Member

    Joined:
    Nov 7, 2004
    Messages:
    5,476 (1.52/day)
    Thanks Received:
    956
    Location:
    Bulgaria
    The only recent PM I have from you is the one with the GIGABYTE Extreme motherboard?
     
  7. Edito

    Edito

    Joined:
    Mar 13, 2007
    Messages:
    346 (0.13/day)
    Thanks Received:
    13
    Location:
    Maputo-Mozambique
    Maybe they just don't see any performance improvement from GDDR4 over GDDR3 either. Look at the 8800 GTS G92: it has spectacular performance but still uses GDDR3. When the time comes, I believe they will make good use of it. I think ATI is using GDDR4, but not using it well, because we just can't see any performance improvement... Don't get me wrong, it's just what I think...
     
  8. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,546 (11.25/day)
    Thanks Received:
    13,644
    Location:
    Hyderabad, India
    That's always the case. We're drenched in amazing numbers such as "512-bit", "1 GB GDDR4", "320 SPs".

    No, I don't think the HD4870 can beat the GTX 280, in raw performance at least; maybe in price, power and other factors.
     
  9. spud107

    spud107

    Joined:
    Feb 12, 2007
    Messages:
    1,194 (0.43/day)
    Thanks Received:
    131
    Location:
    scotland
    There would probably be more if nV were using 10.1.


     
  10. largon New Member

    Joined:
    May 6, 2005
    Messages:
    2,778 (0.81/day)
    Thanks Received:
    432
    Location:
    Tre, Suomi Finland
    GDDR4 could have been a smart move, as it is much more power efficient than GDDR3. Those 16/14 chips of GDDR3 on the GTX 280/GTX 260 are going to suck stupid amounts of power, something like a freaking 60-80W for the GDDR3 alone...

    65nm, instead of 55nm, is another problem and causes more unnecessary power consumption.

    And yet again, nV fails at creating a practical PCB layout. The board used for the GTX 280/260 is pure horror.
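
    For what it's worth, that 60-80W figure is consistent with simple per-chip arithmetic. A quick sketch, where the 4-5W per GDDR3 chip under load is an assumption rather than a measured number:

        #include <cstdio>

        int main() {
            // Chip counts follow from the bus widths quoted in the news post.
            const int chips_gtx280 = 16;  // 512-bit bus / 32-bit chips
            const int chips_gtx260 = 14;  // 448-bit bus / 32-bit chips
            // 4-5W per GDDR3 chip under load is an assumption, not a datasheet figure.
            std::printf("GTX 280 memory: %d-%dW\n", chips_gtx280 * 4, chips_gtx280 * 5);  // 64-80W
            std::printf("GTX 260 memory: %d-%dW\n", chips_gtx260 * 4, chips_gtx260 * 5);  // 56-70W
            return 0;
        }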
     
  11. kylew

    kylew New Member

    Joined:
    Nov 19, 2007
    Messages:
    604 (0.24/day)
    Thanks Received:
    70
    Location:
    Liverpool
    Well, you KNOW why there's very little DX10.1 implementation; look at Assassin's Creed: NV moaned, stamped their feet, and so on to get it removed. DX10.1 is "insignificant" because NV wants it to be. In reality, NV can't implement it, whereas DX10.1 on the 3800s shows massive performance gains when enabled.
     
  12. Animalpak

    Animalpak

    Joined:
    Feb 8, 2008
    Messages:
    2,096 (0.87/day)
    Thanks Received:
    566



    I agree :toast:

    GT200 rocks! :rockout:
     
    candle_86 says thanks.
  13. JAKra New Member

    Joined:
    May 7, 2005
    Messages:
    10 (0.00/day)
    Thanks Received:
    0
    Location:
    Budapest, Hungary
    DX10.1 Upgrade?

    Hi!

    I have one question. If DX10.1 can be removed by a patch, does it mean that it works the other way around? Like upgrading Crysis to DX10.1? Or any other DX10 title.
    That would be nice, and I presume not too hard to accomplish (technically).
     
  14. Animalpak

    Animalpak

    Joined:
    Feb 8, 2008
    Messages:
    2,096 (0.87/day)
    Thanks Received:
    566
    Completely wrong.

    GT200 is a FULLY new GPU, and GDDR3 actually works better than GDDR5. In the end you get the same results, but GDDR3 is more exploitable.

    The differences between DX10 and DX10.1 are minimal! Games have only just begun to use DX10, and there are few of them!!
     
    candle_86 says thanks.
  15. Valdez

    Joined:
    Sep 2, 2005
    Messages:
    294 (0.09/day)
    Thanks Received:
    25
    Location:
    Szekszárd, Hungary
    NVIDIA owns Crytek now, so there will be no DX10.1 support (Crytek officially confirmed this).
     
  16. FilipM

    FilipM New Member

    Joined:
    Dec 30, 2007
    Messages:
    802 (0.33/day)
    Thanks Received:
    94
    Location:
    Bitola, Macedonia
    It looks great on paper, but how will your wallet look when you buy one of these?

    Does anyone know the price, or have a hint?
     
  17. largon New Member

    Joined:
    May 6, 2005
    Messages:
    2,778 (0.81/day)
    Thanks Received:
    432
    Location:
    Tre, Suomi Finland
    JAKra,
    Unlikely, but really, anything's possible. Of course, it's way easier to cut something than to add it.

    DX10.1 in AC allows a performance boost when AA is used. Sure.
    But then again, it also causes incompatibility with nV GPUs that only support DX10.

    Choose now: which would you fix?
    Link, please.
     
  18. Valdez

    Joined:
    Sep 2, 2005
    Messages:
    294 (0.09/day)
    Thanks Received:
    25
    Location:
    Szekszárd, Hungary
    The GT200 is not new; it's just an improved G80. The memory controller in the G80 is not flexible, so they have to use GDDR3 in the GT200 too.
     
  19. Valdez

    Joined:
    Sep 2, 2005
    Messages:
    294 (0.09/day)
    Thanks Received:
    25
    Location:
    Szekszárd, Hungary
    It's not incompatible; it's just that with Vista SP1 installed, nV GPUs don't use the DX10.1 features. There is no incompatibility.
     
  20. Animalpak

    Animalpak

    Joined:
    Feb 8, 2008
    Messages:
    2,096 (0.87/day)
    Thanks Received:
    566

    Sure? Then when will a new GPU come out? Everyone had confirmed that it was new!!

    DAMN :shadedshu:mad::banghead:
     
  21. Valdez

    Joined:
    Sep 2, 2005
    Messages:
    294 (0.09/day)
    Thanks Received:
    25
    Location:
    Szekszárd, Hungary
    Don't be sad; the GT200 will be the fastest GPU ever released. The 9900 GTX will be a brutal card, much faster than the 8800 Ultra/9800 GTX :)
     
  22. Exavier

    Exavier New Member

    Joined:
    Dec 12, 2007
    Messages:
    982 (0.40/day)
    Thanks Received:
    81
    Location:
    Bath, UK
    I very much doubt it's a new G80, as the most recent cards are G92...

    I would also discourage the fanboy attitudes already emerging in this thread... get whichever is best; they're both unreleased as yet...

    Also, this comes out on my birthday.
    Mega lol
     
  23. largon New Member

    Joined:
    May 6, 2005
    Messages:
    2,778 (0.81/day)
    Thanks Received:
    432
    Location:
    Tre, Suomi Finland
    Exavier,
    G200 is an evolved G92, which is an evolved G80. So it's more like a "new G80", as it's targeted at the ultra-high-end rather than the performance sector like G92.
    Well, obviously nV chips are incompatible with Ubisoft's DX10.1 implementation, as removing it removes the problems with nV GPUs.
     
  24. Valdez

    Joined:
    Sep 2, 2005
    Messages:
    294 (0.09/day)
    Thanks Received:
    25
    Location:
    Szekszárd, Hungary
    G92 is just a revised G80, as RV670 is a revised R600, and RV770 is an improved RV670.
     
  25. Valdez

    Joined:
    Sep 2, 2005
    Messages:
    294 (0.09/day)
    Thanks Received:
    25
    Location:
    Szekszárd, Hungary
    http://hardocp.com/article.html?art=MTQ5MywxLCxoZW50aHVzaWFzdA==

    With SP1 installed, NVIDIA cards perform the same as without SP1. There is no incompatibility; they run fine with DX10.1 installed, but don't use its features. So they run in DX10 mode even if DX10.1 is installed.
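
    To illustrate what "runs in DX10 mode" means in practice, here is a minimal, hypothetical sketch of how an application can ask Direct3D 10.1 for feature level 10_1 and quietly fall back to 10_0 on DX10-only hardware. The API calls are the real D3D10.1 ones, but the fallback policy is just an illustration of the behaviour described above, not Ubisoft's actual code:

        #include <d3d10_1.h>

        // Try feature level 10_1 first, then fall back to 10_0. On a DX10-only
        // GPU the second attempt succeeds, so the game still runs, just without
        // the 10.1-only features.
        ID3D10Device1* CreateBestDevice() {
            ID3D10Device1* dev = nullptr;
            const D3D10_FEATURE_LEVEL1 levels[] = {
                D3D10_FEATURE_LEVEL_10_1,  // DX10.1 parts (e.g. HD 3800/4800)
                D3D10_FEATURE_LEVEL_10_0,  // DX10-only parts (e.g. G80/G92/GT200)
            };
            for (D3D10_FEATURE_LEVEL1 lvl : levels) {
                if (SUCCEEDED(D3D10CreateDevice1(nullptr, D3D10_DRIVER_TYPE_HARDWARE,
                                                 nullptr, 0, lvl, D3D10_1_SDK_VERSION,
                                                 &dev)))
                    return dev;  // created at the highest level the hardware supports
            }
            return nullptr;  // no D3D10-class hardware/driver available
        }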
     
