
More GeForce GTX 295 Details Trickle-in

Discussion in 'News' started by btarunr, Dec 13, 2008.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,953 (11.02/day)
    Thanks Received:
    13,757
    Location:
    Hyderabad, India
    Slated for CES '09, the GeForce GTX 295 would spearhead NVIDIA's quest for performance supremacy. The dual-GPU card consists of two G200b graphics processors working in an internal multi-GPU mode. VR-Zone collected a few more details about this card.

    To begin with, each of the two GPUs will offer all 240 of its stream processors, contrary to what earlier reports suggested. The memory subsystem of this card, on the other hand, is peculiar. The card features a total of 1792 MB of memory (896 MB x 2), indicating that the memory configuration of each core resembles that of the GeForce GTX 260, while the shader domains resemble those of the GTX 280 (240 SPs). The card is powered by one 8-pin and one 6-pin power connector. The construction resembles that of the GeForce 9800 GX2 in many respects: a monolithic cooler is sandwiched between two PCBs, each holding a GPU system. The total power draw of the card is rated at 289 W. The card has a single SLI bridge finger, indicating that it supports Quad SLI in the same way the GeForce 9800 GX2 did (a maximum of two cards can be used in tandem).
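
    For the curious, here is a back-of-the-envelope sketch of how those numbers line up. It assumes 32-bit-wide, 64 MB GDDR3 memory chips, as used on the GeForce GTX 260; VR-Zone did not confirm the chip layout, so treat this as a sketch rather than a spec:

        # Rough check of the reported memory configuration, assuming
        # 32-bit-wide, 64 MB GDDR3 chips (as on the GTX 260).
        CHIP_WIDTH_BITS = 32
        CHIP_SIZE_MB = 64

        def memory_for_bus(bus_width_bits):
            """Chip count and capacity implied by a given memory bus width."""
            chips = bus_width_bits // CHIP_WIDTH_BITS
            return chips, chips * CHIP_SIZE_MB

        chips, mb = memory_for_bus(448)   # GTX 260-style bus, per GPU
        print(chips, mb)                  # 14 chips -> 896 MB per GPU
        print(2 * mb)                     # 1792 MB in total across both GPUs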


    Source: VR-Zone
     
  2. Lionheart

    Lionheart

    Joined:
    Apr 30, 2008
    Messages:
    4,077 (1.68/day)
    Thanks Received:
    837
    Location:
    Milky Way Galaxy
    Hmmmm, that is a bit odd: 1792 MB but a total of 480 shaders. They probably couldn't fit any more memory modules on, or wanted to save costs to compete with the HD 4870 X2, but it's gonna be one kick-arse buggy beast of a card! :slap:
     
  3. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Notice in the first pic that there are no mem modules on a portion of the top PCB (outer side). The ROPs are decoupled from the shaders. What indicates it's going to be buggy?
     
  4. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,216 (6.10/day)
    Thanks Received:
    6,257
    You know, I'm starting to wonder if nVidia didn't cut down the memory bus in the G200b to just 448-bit. First, there is the new G200b PCB that nVidia released, with no room for the extra memory required to use the 512-bit bus; then there is this card, which also uses only the 448-bit bus and, from the pictures, doesn't look like it has any room for the extra memory either.

    Is it possible that nVidia permanently reduced the memory bus on the G200b to just 448-bit to help reduce die size and production costs? I mean, it isn't like the 512-bit bus really helped these cards all that much.
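
    The extra memory here comes down to chip count: widening the bus means more 32-bit memory chips, and therefore more PCB area and cost. A quick sketch, using the same assumed 32-bit-wide, 64 MB chips as in the news post:

        # Chip count and capacity implied by each candidate bus width.
        for bus_bits in (448, 512):
            chips = bus_bits // 32             # one 32-bit chip per channel
            print(bus_bits, chips, chips * 64)
        # 448-bit -> 14 chips, 896 MB;  512-bit -> 16 chips, 1024 MB,
        # which is where the extra board room would have to come from.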

    Edit: Nevermind, I just saw the news on the GTX285 that will have the 512-bit bus.
     
    Last edited: Dec 13, 2008
  5. J-Man

    J-Man New Member

    Joined:
    May 14, 2007
    Messages:
    2,248 (0.81/day)
    Thanks Received:
    81
    Location:
    Oakham, UK
    This is already stated on the front page (homepage).
     
  6. CDdude55

    CDdude55 Crazy 4 TPU!!!

    Joined:
    Jul 12, 2007
    Messages:
    8,179 (3.01/day)
    Thanks Received:
    1,277
    Location:
    Virginia
    289W power draw at full load!

    But these in Quad SLI will kill!
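
    That 289 W squares with the connectors mentioned in the news post. A quick sanity check against the PCI Express power budget (75 W from the x16 slot, 75 W per 6-pin plug, 150 W per 8-pin plug):

        # Power available to a card with one 6-pin and one 8-pin connector.
        SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

        budget_w = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
        print(budget_w)           # 300 W available in total
        print(289 <= budget_w)    # True -- the rated draw just fits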
     
  7. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Haha. This thread is the forum part of that frontpage item. Click on "Discuss" on that frontpage item, and you'll be directed to this thread.
     
  8. a_ump

    a_ump

    Joined:
    Nov 21, 2007
    Messages:
    3,620 (1.40/day)
    Thanks Received:
    376
    Location:
    Smithfield, WV
    Interesting, though I hope it fails :p I want ATI to live and get more market share before NVIDIA tramples them like they did with the 88xx series. Can't wait to see benchmarks. I think it'll be as good as, or a little worse than, the HD 4870 X2; ATI has had a lot of time to perfect or improve upon their dual-chip design, whereas NVIDIA has been doing monolithic designs and then making the dual-chip/PCB card just to try and take the performance crown. ATI has, and I think will continue to have, better CrossFire scaling than SLI. Plus, they have a better CrossFire connection on their board than before, and I don't know if NVIDIA has improved upon their design; it looks the same as the 9800 GX2, only with a different PCB/GPU/etc.
     
  9. blastboy

    blastboy New Member

    Joined:
    Nov 30, 2007
    Messages:
    45 (0.02/day)
    Thanks Received:
    6
    Would be nice if they would've released the card already... won't be able to step up... $#@%!
     
  10. PVTCaboose1337

    PVTCaboose1337 Graphical Hacker

    Joined:
    Feb 1, 2006
    Messages:
    9,512 (2.93/day)
    Thanks Received:
    1,143
    Location:
    San Antonio, Texas
    Am I to assume the 2 pin slot near the 6 pin power connector is for a fan?
     
  11. newtekie1

    newtekie1 Semi-Retired Folder

    Nope, it is to connect an S/PDIF cable from the sound card to pass sound through for HDMI. It has been on nVidia cards since the G92-based cards.
     
  12. Nick89

    Nick89

    Joined:
    Jun 1, 2006
    Messages:
    1,742 (0.56/day)
    Thanks Received:
    176
    Location:
    The Nevada Wasteland
    896 + 896 = 1792; a GTX 260 has 896 MB of memory.
     
  13. Exavier

    Exavier New Member

    Joined:
    Dec 12, 2007
    Messages:
    982 (0.38/day)
    Thanks Received:
    81
    Location:
    Bath, UK
    If I could buy a gfx card all over again, I'd still go with an ATI solution, purely because of the onboard HDMI audio solution... it's the little touches in high-£££ things that set it apart.
     
  14. Binge

    Binge Overclocking Surrealism

    Joined:
    Sep 15, 2008
    Messages:
    6,982 (3.06/day)
    Thanks Received:
    1,752
    Location:
    PA, USA
    I'm glad they've got such good power management! Peak performance never looked so good :) Now let's hear news on the heat: two dies of 240 shaders each, cooled by one fan with two dinky-looking apertures. That seems to me like a recipe for disaster, given my current 280's heat.
     
  15. btarunr

    btarunr Editor & Senior Moderator Staff Member

    With 240 shaders per core for the GTX 295, that looks like a very tough ask for the HD 4870 X2.

    They would've, if they could've. ATI won't be able to bring in much with the same RV770 cores on a dual-GPU board, at least not pit it against what is virtually two GTX 280 cards in SLI. We need to see what the immediate successors to RV770 have in store.
     
  16. newtekie1

    newtekie1 Semi-Retired Folder

    That makes little to no sense.

    1.) nVidia has been doing dual-GPU cards for longer than ATi has. nVidia started it with the 7900GX2, which came out more than a year before ATi's first dual-GPU card. And even then, that wasn't actually an ATi card; it was a Sapphire exclusive, designed by Sapphire.

    2.) nVidia has taken their original idea and continued to refine it. The dual-PCB design of the 7900GX2 has evolved into the GTX295. ATi has done the same.

    3.) Yes, the GTX295 is similar in design to the 9800GX2, but how different is the HD4870x2 from the HD3870x2? Look at this picture, and tell me which is the HD4870x2 and which is the HD3870x2.

    4.) ATi has been the one that has needed to create dual-GPU cards to take the performance crown. For the past two generations, this has been the only way ATi has been able to outperform nVidia's single-GPU cards.
     
    a_ump says thanks.
    Crunching for Team TPU 50 Million points folded for TPU
  17. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Should have masked that AMD logo as well :)
    The Fury Maxx is such an artifact.
     
  18. Binge

    Binge Overclocking Surrealism

    Bottom one is most definitely the 4870x2, because of the placement of the RAM and voltage regulators.
     
  19. Solaris17

    Solaris17 Creator Solaris Utility DVD

    Joined:
    Aug 16, 2005
    Messages:
    17,413 (5.10/day)
    Thanks Received:
    3,714
    Location:
    Florida
    bottom is 4870x2
     
  20. AsRock

    AsRock TPU addict

    Joined:
    Jun 23, 2007
    Messages:
    11,217 (4.10/day)
    Thanks Received:
    1,796
    Location:
    US
    Didn't ATI do dual GPU quite some time ago?
    http://www.xbitlabs.com/images/video/crossfire/radeon_maxx.jpg

    Anyway, I wish both companies would hurry up and bring in the next lot of cards, like the 5870 and the 380(?).
     
  21. a_ump

    a_ump

    :p A lot of good points that my brain never thought of at the time; I was merely speculating that NVIDIA hasn't improved their SLI bridge/connection (to our knowledge), whereas AMD has, and their original CrossFire chip was superior to how the 9800 GX2 was connected anyway. It's going to be interesting, though, since this will probably come out around the same time as ATI's HD 5870, or a little before it. I wonder what kind of performance improvement that card will have over the HD 4870, how it'll compare to the current HD 4870 X2, and whether it'll be as powerful as, or more powerful than, the GTX 280.
     
  22. Haytch

    Haytch New Member

    Joined:
    Apr 7, 2008
    Messages:
    510 (0.21/day)
    Thanks Received:
    28
    Location:
    Australia
    When I changed from a 7600GT to a 7800GTX, I thought I was the best! I soon came to realize that the size of the card is a joke. Two slots for a gfx card is stupid and unacceptable. That didn't stop me from purchasing a pair of 7900GTXs, which were majorly flawed and got replaced with two 7950s.

    The 7950s were OK, but nowhere near as good as the single 8800GTX I picked up. Ever since then, two-slot gfx cards have become a standard, and with the new X58 series motherboards we come to realize that the loss of that extra slot is a waste.

    Yeah, I'm happy for the GTX 295, and I suppose I'll be picking up a pair of them too, but the fact remains that it will suck, like the 4870x2 sucks. We are many, many, many years away from decent gfx cards. Please keep in mind that my perception of decent may vary from your own.
     
  23. CDdude55

    CDdude55 Crazy 4 TPU!!!

    Why are you buying them if they suck?
     
  24. Castiel

    Castiel

    Joined:
    May 5, 2008
    Messages:
    3,319 (1.37/day)
    Thanks Received:
    310
    Isn't this going to be shown at CES next year?


    Will there be some TPU guys at CES next year?
     
  25. newconroer

    newconroer

    Joined:
    Jun 20, 2007
    Messages:
    3,253 (1.19/day)
    Thanks Received:
    351
    As much as I'd like to see Nvidia offer up something worthwhile with this... there are two things that bother me.

    1. Video RAM/RAMDAC is still shared. Yes, having a total texture pool of nearly 2 GB is helpful, but more so in theory than in practice. If it were independent, thus being a true 2 GB, that would be another story. I'm wondering when dual-GPU cards are going to break that trend.

    2. The most recent dual-GPU solution, the 4870 X2 (yes, the 4850 is more 'recent', sue me...), is nothing to shake a stick at, but I've said it before and will always continue to say it: for the amount of horsepower under its hood, I feel it almost falls completely on its face. It should perform twice as well as it does; but like a vehicle motor, slapping on a supercharger can only take you so far, while the rest of the factory parts drag you down or limit your potential and real-world performance.

    I don't think this Nvidia product is going to break either of those trends. It might be fast, in fact I'm fairly certain it will be, but if it doesn't perform at least 1.5x as fast as a normal 280, then... bleh.
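
    To put numbers on both gripes, here is a toy model assuming AFR-style rendering, where each GPU mirrors the full working set (the mirroring assumption is mine, not something from the article):

        PER_GPU_MB = 896

        def usable_texture_pool(num_gpus, mirrored=True):
            """Mirrored AFR duplicates data; an independent pool would add up."""
            return PER_GPU_MB if mirrored else num_gpus * PER_GPU_MB

        print(usable_texture_pool(2))                  # 896 MB usable in practice
        print(usable_texture_pool(2, mirrored=False))  # 1792 MB only if independent

        # The "at least 1.5x a normal 280" bar, phrased as scaling efficiency:
        def dual_gpu_speedup(efficiency):
            return 1 + efficiency       # efficiency of 1.0 would be perfect 2x

        print(dual_gpu_speedup(0.5))    # 1.5x -- the minimum acceptable here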
     
