Saturday, December 13th 2008

More GeForce GTX 295 Details Trickle-in

Slated for a CES '09 launch, the GeForce GTX 295 will spearhead NVIDIA's quest for performance supremacy. The dual-GPU card pairs two G200b graphics processors working in an internal multi-GPU mode. VR-Zone has collected a few more details about the card.

To begin with, each GPU will offer all of its 240 stream processors, unlike what earlier reports suggested. The memory subsystem, on the other hand, is peculiar: the card carries a total of 1792 MB of memory (896 MB per GPU), indicating that each core uses a GTX 260-style memory configuration (896 MB on a 448-bit bus), while the shader domain matches that of the GTX 280 (240 SPs).

The card is powered by one 8-pin and one 6-pin PCI-Express power connector, and its total power draw is rated at 289 W. The construction resembles that of the GeForce 9800 GX2 in many respects, with a single cooler sandwiched between two PCBs, each holding one GPU system. The card has a single SLI bridge finger, indicating that it supports Quad SLI the same way the GeForce 9800 GX2 did: a maximum of two cards can be used in tandem.
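For perspective, here is a minimal sketch of the memory arithmetic behind those figures. It assumes the GTX 260's known layout of one 64 MB GDDR3 chip per 32-bit channel; that chip density is an assumption carried over from the GTX 260, not a confirmed GTX 295 specification.

```python
# Hypothetical back-of-the-envelope check of the reported memory figures.
# Assumption: one 64 MB GDDR3 chip per 32-bit memory channel, as on the GTX 260.

CHANNEL_WIDTH_BITS = 32
CHIP_CAPACITY_MB = 64

def memory_per_gpu(bus_width_bits: int) -> int:
    """Memory per GPU for a given bus width under the assumption above."""
    chips = bus_width_bits // CHANNEL_WIDTH_BITS
    return chips * CHIP_CAPACITY_MB

gtx260_style = memory_per_gpu(448)  # 14 chips x 64 MB = 896 MB
gtx280_style = memory_per_gpu(512)  # 16 chips x 64 MB = 1024 MB

print(gtx260_style * 2)  # 1792 MB -- matches the reported total for the dual-GPU card
print(gtx280_style * 2)  # 2048 MB -- what two full GTX 280-style memory systems would give
```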
Source: VR-Zone

51 Comments on More GeForce GTX 295 Details Trickle-in

#1
Lionheart
hmmmmm that is a bit odd, 1792mb but a total of 480 shaders, probably couldn't fit any more memory modules on, or wanted to save costs to compete with the hd4870x2, but it's gonna be one kick arse buggy beast of a card! :slap:
#2
btarunr
Editor & Senior Moderator
Notice in the first pic that there are no mem modules on a portion of the top PCB (outer side). The ROPs are decoupled from the shaders. What indicates it's going to be buggy?
#3
newtekie1
Semi-Retired Folder
You know, I'm starting to wonder if nVidia didn't cut down the memory bus in the G200b to just 448-bit. First, there is the new G200b PCB that nVidia released, with no room for the extra memory required to use the 512-bit bus. Then there is this card, which also uses only the 448-bit bus and, from the pictures, doesn't look like it has any room for the extra memory either.

Is it possible that nVidia permanently reduced the memory bus on the G200b to just 448-bit to help reduce die size and production costs? I mean, it isn't like the 512-bit bus really helped these cards all that much.

Edit: Nevermind, I just saw the news on the GTX285 that will have the 512-bit bus.
#4
J-Man
This is already stated on the front page (homepage).
#5
CDdude55
Crazy 4 TPU!!!
289W power draw at full load!

But these in Quad SLI will kill!
#6
btarunr
Editor & Senior Moderator
J-Man: This is already stated on the front page (homepage).
Haha. This thread is the forum part of that frontpage item. Click on "Discuss" on that frontpage item, and you'll be directed to this thread.
#7
a_ump
interesting, though i hope it fails :p i want ATI to live and get more market share before nvidia tramples them like they did with the 88XX series. can't wait to see benchmarks; i think it'll be as good as or a little worse than the HD 4870x2. ATI has had a lot of time to perfect or improve upon their dual-chip design for a while now, whereas nvidia has been doing monolithic designs and then making the dual chip/PCB just to try and take the performance crown. ATI has, and i think will keep, better xfire scaling than SLI, plus they have a better xfire connection on their board than before, and idk if nvidia will have improved upon their design; it looks the same as the 9800GX2, only a different PCB/GPU/etc.
#8
blastboy
would be nice if they woulda released the card already.. won't be able to step up.. $#@%!
#9
PVTCaboose1337
Graphical Hacker
Am I to assume the 2 pin slot near the 6 pin power connector is for a fan?
#10
newtekie1
Semi-Retired Folder
PVTCaboose1337: Am I to assume the 2 pin slot near the 6 pin power connector is for a fan?
Nope, it is to connect an SPDIF cable from the sound card to pass the sound through for HDMI. It has been on nVidia cards since the G92 cards.
#11
Nick89
CHAOS_KILLA: hmmmmm that is a bit odd, 1792mb but a total of 480 shaders, probably couldnt fit anymore memory modules on or too save costs to compete with the hd4870x2, but its gonna be one kick arse buggy beast of a card! :slap:
896 MB + 896 MB = 1792 MB; a GTX 260 has 896 MB of memory.
#12
Exavier
if I could buy a gfx card all over again, I'd still go with an ATI card purely because of the onboard HDMI audio solution... it's the little touches in high-£££ things that set it apart.
#13
Binge
Overclocking Surrealism
I'm glad they've got such good power management! Peak performance never looked so good :) Now let's hear news on the heat. With 240 shaders in each of two dies being cooled by one fan with two dinky-looking apertures, that seems to me like a recipe for disaster given my current 280's heat.
#14
btarunr
Editor & Senior Moderator
a_ump: i thk it'll be as good or a little worse than the HD 4870x2,
With 240 shaders per core for the GTX 295, that looks like a very tough ask for the HD 4870 X2.
a_ump: ATI has had a lot of time to perfect or improve upon their dual chip design for a while now whereas nvidia has been doing monolithic design and then making teh dual chip/PCB just to try and take the performance crown.
They would've, if they could've. ATI won't be able to bring in much with the same RV770 cores on a dual-GPU board, at least not enough to pit it against what is virtually two GTX 280 cards in SLI. We need to see what the immediate successors to RV770 have in store.
#15
newtekie1
Semi-Retired Folder
a_ump: interesting, though i hope it fails :p i want ATI to live and get more market share before nvidia tramples them like they did with the 88XX series. can't wait to see benchmarks i thk it'll be as good or a little worse than the HD 4870x2, ATI has had a lot of time to perfect or improve upon their dual chip design for a while now whereas nvidia has been doing monolithic design and then making teh dual chip/PCB just to try and take the performance crown. ATI has and i think will have better xfire scaling than SLI, plus they have a better xfire connection on their board than before and idk if nvidia will have improved upon their design looks like the same as 9800GX2 only different PCB/GPU/ etc.
That makes little to no sense.

1.) nVidia has been doing the dual GPU cards for longer than ATi has. nVidia started it with the 7900GX2 which came out more than a year before ATi's first dual GPU card. And even then, it wasn't actually an ATi card, it was just a Sapphire exclusive designed by Sapphire.

2.) nVidia has taken their original idea, and continued to refine it. The dual PCB design of the 7900GX2 has evolved into GTX295. ATi has done the same.

3.) Yes, the GTX295 is similar in design to the 9800GX2, but how different is the HD4870x2 from the HD3870x2? Look at this picture, and tell me which is the HD4870x2 and which is the HD3870x2.

4.) ATi has been the one that has needed to create dual GPU cards to take the performance crown. For the past 2 generations, this has been the only way ATi has been able to outperform nVidia's single GPU cards.
#16
btarunr
Editor & Senior Moderator
newtekie1: Look at this picture, and tell me which is the HD4870x2 and which is the HD3870x2.
Should have masked that AMD logo as well :)
The Fury Maxx is such an artifact.
#17
Binge
Overclocking Surrealism
bottom one is most definitely the 4870x2 because of the placement of the ram and vregs
#18
Solaris17
Super Dainty Moderator
newtekie1: That makes little to no sense.

1.) nVidia has been doing the dual GPU cards for longer than ATi has. nVidia started it with the 7900GX2 which came out more than a year before ATi's first dual GPU card. And even then, it wasn't actually an ATi card, it was just a Sapphire exclusive designed by Sapphire.

2.) nVidia has taken their original idea, and continued to refine it. The dual PCB design of the 7900GX2 has evolved into GTX295. ATi has done the same.

3.) Yes, the GTX295 is similar in design to the 9800GX2, but how different is the HD4870x2 from the HD3870x2? Look at this picture, and tell me which is the HD4870x2 and which is the HD3870x2.

4.) ATi has been the one that has needed to create dual GPU cards to take the performance crown. For the bast 2 generations, this has been the only way ATi has been able to outperform nVidia's single GPU cards.
bottom is 4870x2
#19
AsRock
TPU addict
newtekie1: That makes little to no sense.

1.) nVidia has been doing the dual GPU cards for longer than ATi has. nVidia started it with the 7900GX2 which came out more than a year before ATi's first dual GPU card. And even then, it wasn't actually an ATi card, it was just a Sapphire exclusive designed by Sapphire.

2.) nVidia has taken their original idea, and continued to refine it. The dual PCB design of the 7900GX2 has evolved into GTX295. ATi has done the same.

3.) Yes, the GTX295 is similar in design to the 9800GX2, but how different is the HD4870x2 from the HD3870x2? Look at this picture, and tell me which is the HD4870x2 and which is the HD3870x2.

4.) ATi has been the one that has needed to create dual GPU cards to take the performance crown. For the bast 2 generations, this has been the only way ATi has been able to outperform nVidia's single GPU cards.
Didn't ATI do dual GPU quite some time ago?
www.xbitlabs.com/images/video/crossfire/radeon_maxx.jpg

Anyways, wish both companies would hurry up and bring in the next lot of cards, like the 5870 and 380.
#20
a_ump
newtekie1: That makes little to no sense.

1.) nVidia has been doing the dual GPU cards for longer than ATi has. nVidia started it with the 7900GX2 which came out more than a year before ATi's first dual GPU card. And even then, it wasn't actually an ATi card, it was just a Sapphire exclusive designed by Sapphire.

2.) nVidia has taken their original idea, and continued to refine it. The dual PCB design of the 7900GX2 has evolved into GTX295. ATi has done the same.

3.) Yes, the GTX295 is similar in design to the 9800GX2, but how different is the HD4870x2 from the HD3870x2? Look at this picture, and tell me which is the HD4870x2 and which is the HD3870x2.

4.) ATi has been the one that has needed to create dual GPU cards to take the performance crown. For the bast 2 generations, this has been the only way ATi has been able to outperform nVidia's single GPU cards.
:p a lot of good points that my brain never thought of at the time. i was merely speculating that nvidia hasn't improved their SLI bridge/connection (to our knowledge) whereas AMD has, and their original xfire chip was superior to how the 9800GX2 was connected anyway. though it's going to be interesting, since this will probably come out around the same time as ATI's HD 5870 or a little before it is released; i wonder what kind of performance improvements that will have over the HD 4870, how it'll compare to the current HD 4870x2, and if it'll be as or more powerful than the GTX 280
#21
Haytch
When i changed from a 7600GT to a 7800GTX i thought i was the best! I soon came to realize that the size of the card is a joke. 2 slots for a gfx card is stupid and unacceptable. That didn't stop me from purchasing a pair of 7900GTX's, which were majorly flawed and replaced with 2 7950's.

The 7950's were ok, but nowhere near as good as the single 8800GTX i picked up. Ever since then, 2-slot gfx cards have become a standard, and with the new X58+ series motherboards we come to realize that the loss of that extra slot is a waste.

Ye, ye, im happy for the 295GTX, and i suppose ill be picking up a pair of them too, but the fact remains that it will suck, like the 4870x2 sucks. We are many many many years away from decent gfx cards. Please keep in mind that my perception of decent may vary from your own.
#22
CDdude55
Crazy 4 TPU!!!
Haytch: When i changed from a 7600GT to a 7800GTX i thought i was the best! I soon came to realize that the size of the card is a joke. 2 slots for a gfx card is stupid and unacceptable. That didnt stop me from purchasing a pair 7900GTX's which were majorly flawed and replaced for 2 7950's.

The 7950's were ok, but nowhere near as good as the single 8800GTX i picked up. Ever since then the 2slot gfx cards have become a standard and with the new X58+ series motherboards we come to realize that the loss of that extra slot is a waste.

Ye, ye, im happy for the 295GTX, and i suppose ill be picking up a pair of them 2, but the fact remains that it will suck, like the 4870x2 sucks. We are many many many years away from decent gfx cards. Please keep inmind that my perception of decent may vary from your own.
Why are you buying them if they suck?
#23
Castiel
Isn't this going to be shown at CES next year?

Will there be some TPU guys at CES next year?
#24
newconroer
As much as I'd like to see Nvidia offer up something worthwhile with this... there are two things that bother me.

1. Video RAM/RAMDAC is still shared. Yes, having a total texture pool of near 2gb is helpful, but more so in theory than in practice. If it were independent, thus being a true 2gb, that would be another story. I'm wondering when dual-GPU cards are going to break that trend (see the sketch after this post).

2. The most recent dual-GPU solution, the 4870 X2 (yes, the 4850 is more recent, sue me...), is nothing to shake a stick at, but I've said it before and will always continue to say it: for the amount of horsepower under its hood, I feel it almost falls completely on its face. It should perform twice as well as it does; but like a vehicle motor, slapping on a supercharger can only take you so far, while the rest of the factory parts drag you down or limit your potential and real-world performance.

I don't think this Nvidia product is going to break either of those trends. It might be fast, in fact I'm fairly certain it will be, but if it doesn't perform at least 1.5 times as well as a normal 280, then... bleh.
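As a rough illustration of the shared-memory point above: with alternate-frame rendering, each GPU keeps its own copy of textures and buffers, so the usable pool is closer to the per-GPU amount than to the advertised total. The snippet below is a simplified model under that assumption, not a measured figure for any card.

```python
# Simplified model of usable memory on a dual-GPU card.
# Assumption: alternate-frame rendering mirrors assets in each GPU's local memory.

def effective_memory_mb(per_gpu_mb: int, gpu_count: int, mirrored: bool = True) -> int:
    """Usable memory for game assets; mirrored=False models a hypothetical independent pool."""
    return per_gpu_mb if mirrored else per_gpu_mb * gpu_count

print(effective_memory_mb(896, 2))                  # ~896 MB usable despite the 1792 MB on the box
print(effective_memory_mb(896, 2, mirrored=False))  # 1792 MB, the "true 2 GB" case the comment wishes for
```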
#25
newtekie1
Semi-Retired Folder
btarunr: Should have masked that AMD logo as well :)
The Fury Maxx is such an artifact.
I had forgotten about that, but what I meant was dual GPU cards as they are implemented today using Crossfire and SLI.
Solaris17: bottom is 4870x2
You sure?
AsRock: Did not ATI do dual GPU quite some time ago?
www.xbitlabs.com/images/video/crossfire/radeon_maxx.jpg

Anyways wish both companys would hurry up and bring the next lot of cards in like the 5870 and 380?.
I forgot about those cards, but I meant in modern iterations using Crossfire and SLI. Too bad those cards were not supported on anything beyond Windows ME.
a_ump: :p a lot of good points that my brain never thought of at the time, i was mearly speculating that nvidia hasn't improved their SLI bridge/connection w/e(to our knowledge) where as AMD has and their original xfire chip was superior to how the 9800GX2 was connected anyway, though it's going to be interesting since this will probly come out around the same time as ATI's HD 5870 or a little before it is released, i wonder what kind of performance improvements that will have over the HD 4870 and how it'll compare to the current HD 4870x2, and if it'll be as or more powerful than the GTX 280
NVidia didn't need to improve their SLi bridge connection used in the 9800GX2, it worked perfectly fine. AMD had to improve their bridge chip because it didn't support PCI-E 2.0. There really isn't anything superior about the way ATi does the internal crossfire connector vs. nVidia's SLi bridge cable.
Haytch: When i changed from a 7600GT to a 7800GTX i thought i was the best! I soon came to realize that the size of the card is a joke. 2 slots for a gfx card is stupid and unacceptable. That didnt stop me from purchasing a pair 7900GTX's which were majorly flawed and replaced for 2 7950's.

The 7950's were ok, but nowhere near as good as the single 8800GTX i picked up. Ever since then the 2slot gfx cards have become a standard and with the new X58+ series motherboards we come to realize that the loss of that extra slot is a waste.

Ye, ye, im happy for the 295GTX, and i suppose ill be picking up a pair of them 2, but the fact remains that it will suck, like the 4870x2 sucks. We are many many many years away from decent gfx cards. Please keep inmind that my perception of decent may vary from your own.
Dual-slot graphics cards are wonderful, and I will always buy a dual-slot card over the single-slot card when available. There really isn't any benefit to single-slot cards anymore; the lost PCI slots aren't worth anything.

Of course 7950's are not going to be as good as a single 8800GTX; the 8800GTX is a generation newer. Why would you expect the 7950s to even hold a candle to the 8800GTX?