Next-gen NVIDIA GeForce Specifications Unveiled

Discussion in 'News' started by malware, May 22, 2008.

  1. largon

    Joined:
    May 6, 2005
    Messages:
    2,782 (0.80/day)
    Thanks Received:
    433
    Location:
    Tre, Suomi Finland
    ...with ~4x the performance.
     
  2. Valdez

    Joined:
    Sep 2, 2005
    Messages:
    294 (0.09/day)
    Thanks Received:
    25
    Location:
    Szekszárd, Hungary
    Maybe you're right. But Samsung and Hynix are going to produce GDDR5 too, not only Qimonda. I would bet we will not see an NV card with GDDR5 later :) I don't think GDDR5 is just a marketing strategy (GDDR4 was), but we will see when there are benchmarks on the net :)

    Meanwhile I edited my previous post :)
     
    Last edited: May 22, 2008
  3. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.66/day)
    Thanks Received:
    184
    But availability NOW is low, and they have to release NOW. In the near future, I don't know. They have already said there's going to be a 55 nm refresh soon, and they could add GDDR5 support then, once availability is better.
    I know that's going to piss off some people, as if a 55 nm version coming out later would make their cards worse, but it's going to happen. People will buy >> people will enjoy >> a new 55 nm card will launch >> people will complain "why didn't they release 55 nm in the first place? ****ng nvidiots", even though they already knew before launch that it would happen...
     
  4. Darkrealms

    Joined:
    Feb 26, 2007
    Messages:
    852 (0.30/day)
    Thanks Received:
    23
    Location:
    USA
    Nvidia sounds like they are getting comfortable where they are as far as designs go. I don't know about the availability of GDDR5, but I do remember the performance increase of GDDR4 wasn't that much better than GDDR3, so Nvidia may not even see it as worth it until GDDR5 is standard/common.

    Anyone ever think that Nvidia may never go to DX10.1? There are a lot of companies these days that don't like/want to work with MS. My 2¢, but I think some of the industry is trying to get away from MS-controlled graphics.
     
  5. Assimilator

    Joined:
    Feb 18, 2005
    Messages:
    623 (0.17/day)
    Thanks Received:
    105
    Location:
    South Africa
    IMO, by the time we see games that require DirectX 10.1, the D10 GPU will be ancient history.
     
  6. Urbklr

    Joined:
    May 16, 2007
    Messages:
    2,346 (0.85/day)
    Thanks Received:
    247
    Location:
    Nova Scotia, Canada
    So, I don't get why people are saying the RV770 is just a beefed-up RV670...

    The R7xx was in development before the R600 was even released; AMD said they were putting all their focus on the R7xx. The RV770 is all new... AMD has confirmed the above.

    And the GT200 is also all-new; the cards both look amazing on paper, just like the G80 and R600 did. Remember how many people thought the R600 was gonna lay down the law :p when they saw the specs? This is no different: the specs look much better, just as the R600 looked better on paper... that doesn't always transfer to real-world performance. All we can do is wait and see.

    PS: "RV770/GT200 rulezzz!!" ... is just 97% fanboy crap...
     
    Last edited: May 22, 2008
  7. Haytch

    Haytch New Member

    Joined:
    Apr 7, 2008
    Messages:
    510 (0.21/day)
    Thanks Received:
    28
    Location:
    Australia
    Latency vs. bandwidth? It's a wait-and-see story.
    I have to purchase 2 x 4870X2's because I decided that a single 3870X2 would do the job in the ATi system I have. That won't stop me from upgrading my NV system. I wouldn't mind playing with CUDA on the EN8800GTX before I throw the card away.

    I look forward to GDDR5 bandwidth being utilized efficiently by AMD/ATi because it's the way of the future! And I suspect the reasons Nvidia haven't moved onto GDDR5 are:
    * Cheaper RAM modules for a well-aged technology with better latency, hoping to keep the price competitive with AMD/ATi's cards.
    * To allow themselves to make as much money as possible off GDDR3 technology now that they have CUDA working, before the public designs crazy C-based software for the rest of us, which might give them a greater sales advantage in the next round of releases.

    Either way, I'm looking at big bucks; we all are...
     
  8. MrMilli

    Joined:
    Mar 1, 2008
    Messages:
    216 (0.09/day)
    Thanks Received:
    35
    Location:
    Antwerp, Belgium
    Are the GT200 and R700 new GPUs or not?
    Well, the basic designs aren't. The actual GPUs are new, of course.

    History:
    R300 => R420 => R520
    ATI used the basic R300 design from August 2002 until R600 was released (May 2007, but it should have been on the market 6 months earlier without the delay).

    NV40 => G70
    nVidia used NV40 technology from April 2004 until November 2006.

    So it's quite common to use a certain technology for a couple of generations. This will be even more pronounced with the current generation of GPUs because of the increased complexity of unified shaders.
    It takes 2 to 3 years to design a GPU like the R300, NV40, R600 or G80. After that you get the usual updates. Even a process shrink, let's say 65 nm to 45 nm, takes almost a year without even touching the design. These companies manage to hide this time because they have multiple design teams working in parallel.
    The same thing happens with CPUs. Look at K8 and K10. Look at Conroe and Penryn.
    Expect really new architectures from ATI and nVidia somewhere in 2009, maybe even later, and they will be DX11.
     
    newconroer says thanks.
  9. warhammer New Member

    Joined:
    Jan 10, 2008
    Messages:
    204 (0.08/day)
    Thanks Received:
    25
    Whichever new card, ATI or Nvidia, can crack 100+ FPS in Crysis maxed out will be the winner.
    The price is going to be the KILLER. :cry:
     
  10. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    Actually, Conroe was based on Core, which was based on the Pentium M, which was based on the P3; if you check some CPU ID apps, the Core 2 chips come up as Pentium 3 multi-core chips. Imagine if they had stuck with the P3 instead of moving to the Pentium 4...
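    (If I remember the CPUID numbers right, that's because those ID tools mostly key off the CPUID family field: Conroe reports itself as Family 6, the same P6 family as the Pentium Pro/II/III and Pentium M, just with a higher model number, so it gets grouped with the P3 line.)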
     
  11. sethk New Member

    Joined:
    Apr 14, 2007
    Messages:
    63 (0.02/day)
    Thanks Received:
    5
    The combination of GDDR5 and 512-bit would have been too much for the consumer to bear, cost-wise. There's plenty of GDDR3, and with twice the bandwidth there's no need to clock the memory particularly high. Think about it: who's going to be supply-constrained?

    Once (if?) GDDR5 is plentiful, Nvidia will come out with a lower-cost redesign that's GDDR5 and 256-bit, or some odd bit depth like 320 or 384. Just like G92 was able to take the much more expensive G80 design and get equivalent performance at 256-bit, we will see something similar for the GT200. In the meantime, make no mistake: this is the true successor to the G80, going for the high-end crown.
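    A rough back-of-the-envelope, using illustrative clocks rather than confirmed specs: bandwidth ≈ (bus width in bytes) x (effective data rate). GDDR3 at ~2.2 GT/s on a 512-bit (64-byte) bus gives about 64 x 2.2 ≈ 141 GB/s, while GDDR5 at ~3.6 GT/s on a 256-bit (32-byte) bus gives about 32 x 3.6 ≈ 115 GB/s. So the wide GDDR3 bus can match or beat a narrower GDDR5 one without exotic memory clocks; the cost is a bigger memory controller and a more complex PCB.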

    I'm sure we'll also see the GX2 version of this before year's end.
     
    DarkMatter says thanks.
  12. bill_d New Member

    Joined:
    Mar 9, 2008
    Messages:
    35 (0.01/day)
    Thanks Received:
    0
    How are you going to get 500+ watts to a 280 GX2?
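    For reference, going by the PCI Express power limits rather than any specific card: a PCIe x16 slot supplies up to 75 W, a 6-pin connector another 75 W, and an 8-pin connector 150 W, so even slot + 8-pin + 6-pin only gets you 75 + 150 + 75 = 300 W within spec, nowhere near 500+.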
     
  13. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.79/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe
    Its own built-in PSU, maybe?
     
  14. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,860 (11.08/day)
    Thanks Received:
    13,715
    Location:
    Hyderabad, India
    NVidia never made a dual-GPU card using G80, and ATI never made one using R600 either. I think there will be a toned-down GPU derived from GT200 that will make it into the next dual-GPU card from NV. By 'toned-down' I'm referring to what G92 and RV670 were to G80 and R600.
     
  15. bill_d New Member

    Joined:
    Mar 9, 2008
    Messages:
    35 (0.01/day)
    Thanks Received:
    0
    Maybe, but I think until they move to 55 nm, and if they get power savings like ATI did from the 2900 to the 3870, you won't see a new GX2.
     
  16. lemonadesoda

    Joined:
    Aug 30, 2006
    Messages:
    6,267 (2.08/day)
    Thanks Received:
    968
  17. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    Damn you lemonadesoda, you stole my bit; I've been using that 2nd link for weeks/months now, well, versions of it... :)
     
  18. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.66/day)
    Thanks Received:
    184
    And damn you Rebo, I felt the urge to click on that second link after your post, even though I didn't want to. :D

    It's been removed, BTW.

    Exactly what I was saying. For Nvidia, supply is a very important thing. The 8800 GT was an exception in a long history of delivering plenty of cards at launch. Paper launches are Ati's business, not Nvidia's; don't forget this, guys.
     
  19. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
    What's been removed? Both links work perfectly for me.
     
  20. Rebo&Zooty

    Rebo&Zooty New Member

    Joined:
    May 17, 2008
    Messages:
    490 (0.21/day)
    Thanks Received:
    30
  21. laszlo

    Joined:
    Jan 11, 2005
    Messages:
    891 (0.25/day)
    Thanks Received:
    105
    Location:
    66 feet from the ground
    It's normal that AMD/ATI will use GDDR5 and Nvidia will not, mainly because ATI has promoted and invested in the research and production, so all RAM manufacturers will serve ATI first with the new tech. And this two-step distance will remain between them, I think.
     
  22. InnocentCriminal

    InnocentCriminal Resident Grammar Amender

    Joined:
    Feb 21, 2005
    Messages:
    6,484 (1.82/day)
    Thanks Received:
    847
    It's interesting to see that the 280 will be using a 512-bit memory bus; that alone should help performance. ATi should have implemented it in the 4870(X2).
     
  23. MrMilli

    Joined:
    Mar 1, 2008
    Messages:
    216 (0.09/day)
    Thanks Received:
    35
    Location:
    Antwerp, Belgium
    Well, I didn't want to go into the details of CPUs. I was just making a reference.
     
  24. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.66/day)
    Thanks Received:
    184
    That's something that has been debated a lot. IMO the P4 was a good decision at that time, but it stayed too long. The P3 had reached a wall, and the P4 was the only way they saw to overcome it. It's like a jam on the highway: sometimes your lane does not move forward and you see that the next one does, so you change lanes. A bit later your lane stops and you see that your previous lane is moving faster, but you can't change back right away. In the end you always come back, but the question remains whether you would have advanced more if you had stayed put in the first place. Usually, if you're smart and lucky enough, you advance more by changing lanes. It didn't work for Intel, or maybe it did; actually there is no way of knowing.
     
  25. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,860 (11.08/day)
    Thanks Received:
    13,715
    Location:
    Hyderabad, India
    Right, it's back to discussing two companies that are pregnant and whose baby is better. Like with babies, things look better after birth; scans and graphs are always blurry.
     
