Wednesday, March 25th 2009

Single-PCB GeForce GTX 295 in the Works

Traditionally, NVIDIA has designed its dual-GPU accelerators with two PCBs, each holding one GPU system. With the GeForce GTX 295 and its competitive pricing, NVIDIA finds itself in a difficult position, facing direct competition from ATI's now competitively priced Radeon HD 4870 X2. On one hand, escalating manufacturing costs, driven by intense competition in the sub-$300 graphics card market, make it difficult for NVIDIA to keep up GTX 295 stocks; on the other, the repercussions of failing to meet demand, including bad press and lost sales, have pushed NVIDIA to rethink how it makes the GeForce GTX 295.

Enter innovation. The company is reportedly redesigning the GeForce GTX 295 around a single PCB, the approach ATI has been using for its dual-GPU accelerators. Both GPU systems of the GTX 295 will sit on one PCB, which is expected to bring manufacturing costs down significantly, allowing the company to keep up with demand while maintaining competitive pricing. Expreview sourced drawings of one of the prototypes, which show a long single-PCB card with a central fan. A back-plate is also in place, indicating that a number of memory chips will be populated on the back, with both GPU systems on the front. It will be an engineering challenge to populate five major heat-producing components (two G200b GPUs, two NVIO2 processors, and one BR-03 bridge chip), 28 GDDR3 memory chips, and the VRM circuitry to power it all. The redesigned card may surface internally in April, and may enter production by May.
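As a side note, the 28-chip count follows directly from the card's published memory configuration: each G200b GPU drives a 448-bit GDDR3 interface, and each GDDR3 chip supplies a 32-bit channel. A quick sketch of that arithmetic (the bus and chip widths are the GTX 295's spec-sheet figures):

```python
# Sanity-check the GTX 295's memory-chip count from its published specs.
BUS_WIDTH_BITS = 448   # GDDR3 memory interface per G200b GPU
CHIP_WIDTH_BITS = 32   # data width of a single GDDR3 chip
NUM_GPUS = 2           # two GPU systems on the card

chips_per_gpu = BUS_WIDTH_BITS // CHIP_WIDTH_BITS  # 448 / 32 = 14
total_chips = chips_per_gpu * NUM_GPUS             # 14 * 2 = 28

print(chips_per_gpu, total_chips)  # prints: 14 28
```

Fitting all 28 chips on one board is exactly why the prototype drawings show memory populated on both sides of the PCB.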
Source: Expreview

74 Comments on Single-PCB GeForce GTX 295 in the Works

#27
BazookaJoe
I like the idea of ram / bridge / voltage chips on the back and GPU (s) on the front...

the very very LARGE majority of the market only use 1 gfx card with a tiny portion of the total gaming market actually using SLI /Crossfire.

Why not have coolers on the back and front? - yeah the card won't be SLI /Crossfire compatible but so what - make that for single card users, who ARE the large majority of the market.

And in fact it would still be able to SLIX2 on a SLIX3 supporting board and for ppl who really need 2 gfx cards , and ppl who NEED 3 gfx cards can actually just go sit outside, because they simply DO NOT represent any significant share of the common gaming market, (No-matter how much they try and flame after this post) and they can use other cards if they really need to.

I'm coming across wrong here - What I'm trying to say is that Single card gamers are the biggest share of the market - there should be cards designed for that market (IE : Using space on the top and the bottom)
Posted on Reply
#28
newtekie1
Semi-Retired Folder
lemonadesodaI looked again at the "artists impression" of the new card. Look carefully at the top of the card and the fan position, and then the backplate. The location of the fan and the location of 2 sets of holes to mount a GPU heatsink indicate that there is no room on the face side for any VRM. So the VRM must be on the rear, or on a daughter VRM card... OR... the artist impression is just wrong.

Note also I could not spot the power connectors for the card in the picture... so perhaps we are all reading too much into this. LOL.
There are two power connectors in the second/right-hand image of the card, right where you would expect them to be.

And there is plenty of room for VRMs in the middle of the card, it would definitely be a tight squeeze, but I think it could all fit.
Posted on Reply
#29
DrPepper
The Doctor is in the house
BazookaJoeI like the idea of ram / bridge / voltage chips on the back and GPU (s) on the front...

the very very LARGE majority of the market only use 1 gfx card with a tiny portion of the total gaming market actually using SLI /Crossfire.

Why not have coolers on the back and front? - yeah the card won't be SLI /Crossfire compatible but so what - make that for single card users, who ARE the large majority of the market.

And in fact it would still be able to SLIX2 on a SLIX3 supporting board and for ppl who really need 2 gfx cards , and ppl who NEED 3 gfx cards can actually just go sit outside, because they simply DO NOT represent any significant share of the common gaming market, (No-matter how much they try and flame after this post) and they can use other cards if they really need to.

I'm coming across wrong here - What I'm trying to say is that Single card gamers are the biggest share of the market - there should be cards designed for that market (IE : Using space on the top and the bottom)
Problem with that is in reverse ATX designs, where the card's fan faces upwards, there are clearance issues with heatsinks and the processor. It can't be done without breaking standards, which would lead to issues and be a complete mess. All motherboards would need to be changed to accommodate two-sided heatsinks. Also, some cases won't be compatible with that, because the first PCI-E slot may be the first of the seven available expansion slots.
Posted on Reply
#30
Unregistered
Oh how well the original dual-PCB design did. It's a forgotten relic now of bad design. A single PCB is better; it's easier to design aftermarket cooling for, for starters. I know all you nv fanboys like the dual-PCB design, but imo it's cack, and more costly to design and make. How many aftermarket coolers do you think there will be for a single-card design? Lots, I reckon, because cooler manufacturers are much more likely to make a cooler for a single-PCB card than the dual-PCB monstrosity. Also, for the people with water cooling, it is much easier for manufacturers to make a waterblock for a single card than for the dual-PCB card.
Posted on Edit | Reply
#31
DarkMatter
DrPepperProblem with that is in reverse ATX designs, where the card's fan faces upwards, there are clearance issues with heatsinks and the processor. It can't be done without breaking standards, which would lead to issues and be a complete mess. All motherboards would need to be changed to accommodate two-sided heatsinks. Also, some cases won't be compatible with that, because the first PCI-E slot may be the first of the seven available expansion slots.
And that's the same reason that VRMs are probably on the front side. It would be logical to put the associated capacitors on the same side as the VRM chips, and capacitors are tall enough (even solid-state ones), plus the backplate, to exceed the standards, as you said. IMO they are either in the center as Newtekie said (the something blue in the fan intake??) or simply on the back of the card. I think there is room regardless of where the holes are. Anyway, the holes can be placed like that just to meet standards and make life easier for aftermarket coolers; they don't necessarily mark the exact center of where the GPU is, IMO.
Posted on Reply
#32
X-TeNDeR
Your local AMD fanboi here,
Once again, ATi's original concept of 2 gpu's on the same PCB has proven to be the better solution. Sandwich cards are for hungry people lulz. now let NVidia struggle with heat dissipation on this one. :)
Posted on Reply
#33
BazookaJoe
DrPepperProblem with that is in reverse ATX designs, where the card's fan faces upwards, there are clearance issues with heatsinks and the processor. It can't be done without breaking standards, which would lead to issues and be a complete mess. All motherboards would need to be changed to accommodate two-sided heatsinks. Also, some cases won't be compatible with that, because the first PCI-E slot may be the first of the seven available expansion slots.
You are dead right - but Fluff it I say! :) - make the card, and stick a sticker on it to the effect of "This may not fit in EVERY case on earth - Sorry for you!" - Because we all know it will fit in most cases.

I'm just getting sick of this backwards theory that plagues the IT industry in general : "Oh it might not work perfectly in every situation - so lets rather not even attempt to innovate whatsoever"

I dunno - Maybe I'm just pissed because my lotto numbers never come in...
Posted on Reply
#34
DrPepper
The Doctor is in the house
BazookaJoeYou are dead right - but Fluff it I say! :) - make the card, and stick a sticker on it to the effect of "This may not fit in EVERY case on earth - Sorry for you!" - Because we all know it will fit in most cases.

I'm just getting sick of this backwards theory that plagues the IT industry in general : "Oh it might not work perfectly in every situation - so lets rather not even attempt to innovate whatsoever"

I dunno - Maybe I'm just pissed because my lotto numbers never come in...
Think of the consequences though ... billions if not trillions of people complaining they ignored the sticker and it didn't fit :laugh:
Posted on Reply
#35
eidairaman1
The Exiled Airman
hmm odd, i made a post and it disappeared, hmm probably the work of bta here, and all i said is this move is to cut overall production costs, instead of 2 PCBs needed for 1 card they can have like double the cards produced.
Posted on Reply
#36
BazookaJoe
DrPepperThink of the consequences though ... billions if not trillions of people complaining they ignored the sticker and it didn't fit :laugh:
And probably trying to lodge legal proceedings -.-
Posted on Reply
#38
Tatty_Two
Gone Fishing
X-TeNDeRYour local AMD fanboi here,
Once again, ATi's original concept of 2 gpu's on the same PCB has proven to be the better solution. Sandwich cards are for hungry people lulz. now let NVidia struggle with heat dissipation on this one. :)
That defies logic unless someone is a bit more specific with the term "better" ........... yes, maybe better as in cheaper, but recent history (3870x2 > 9800GX2..... HD4870x2 > GTX295) shows that the 2 PCB designs don't actually give off any more heat (or perhaps it's a case of NVidia just manufacturing better coolers, IDK) and the NVidia offerings seem to be a little more potent so.......I, as you readily admit, again call................ (see attached pic)

You know what they say "if the cap fits....." I dont wear it myself on this occasion as I currently own ATi so cant really be an Nvidia fanboi

:p
Posted on Reply
#39
X-TeNDeR
The current NVidia offerings are more potent, that's true, but I was referring to the now apparent fact that the complicated sandwich designs made by NVidia are too expensive and big. Fanboyism aside, I wouldn't mind having the GTX295, performance-wise, but these cards are too expensive and complicated. NVidia sees this, and this is where the new design comes in.

And that shows that the general single-PCB approach is better in terms of manufacturing costs and availability, especially in the long term. ATi must have done the math and stuck to this design.

I'm glad NVidia is doing this, as this will lower the costs for everyone, and keep the market competitive.
Posted on Reply
#41
Tatty_Two
Gone Fishing
h3llb3nd4then they'll kill ati :D
Cap goes to you too! :laugh:
Posted on Reply
#42
h3llb3nd4
yep! with nvidia & intel on it :D
Posted on Reply
#43
OnionMan
Ya, again with memory on the back.. How fun to watercool..

The big question for me is how much more does it cost to produce it the way it is now compared to setting up a design team to redesign it and then cutting the cost of it.. Only to have a new model to replace it in just a few months.. Surely money and time spent on a redesign won't drive this product down more than $100..

Now if the price goes down to ATI's x2 prices, it would be a great deal, but ATI won't let them get that close.. At least without releasing something just under it for less or reducing its already discounted line..

But in the end, I'd put money on this thing not being widely available or only being available for a very short period of time.. But then again, what do I know..;)
Posted on Reply
#44
KainXS
so after the 7950gx2, the 9800gx2 and the gtx295 nvidia now figures out, oh, this is not a good idea, lmao:roll:
Posted on Reply
#45
X-TeNDeR
KainXSso after the 7950gx2, the 9800gx2 and the gtx295 nvidia now figures out, oh, this is not a good idea, lmao:roll:
Exactly!

I don't hate NVidia, i hate the SANDWICH design.

OnionMan, the hope is for them to stop making these sandwiches permanently, and stick to one pcb designs for good. no matter how long this new 295 lasts on the market, the important thing is the change in design approach imo. :toast:
Posted on Reply
#46
DrPepper
The Doctor is in the house
KainXSso after the 7950gx2, the 9800gx2 and the gtx295 nvidia now figures out, oh, this is not a good idea, lmao:roll:
Can't have been that bad if they all stole the performance crown from ati :p
Posted on Reply
#47
Silverel
Two of these in Quad-SLI Folding, on water cooling... sheeyit. I could stop paying my gas bill and run it through the hot water pipes :D
Posted on Reply
#48
DarkMatter
KainXSso after the 7950gx2, the 9800gx2 and the gtx295 nvidia now figures out, oh, this is not a good idea, lmao:roll:
I think that's not the case. The sandwich design has proven to be the better cooling solution for dual cards. Although it's more expensive, it's better, and now they are redesigning it because the first one was a little bit overkill. That's simply how Nvidia works. They do the same with the VRMs and many other things: first make it work like a charm, then find a way to make it cheaper without sacrifices. They could perform tons of tests in labs, but nothing will tell them how cards are cooled in actual consumers' PCs better than seeing them in actual PCs. Hence they went overkill, so that it works on every computer, while at the same time getting a glance at how many watts the average consumer PC case can dissipate in the real world, and acting accordingly with the new design.
Posted on Reply
#49
D4S4
This could probably be one of the most expensive PCBs EVER
Posted on Reply
#50
Assimilator
KainXSso after the 7950gx2, the 9800gx2 and the gtx295 nvidia now figures out, oh, this is not a good idea, lmao
If the dual-PCB design is "not a good idea", how come all of the cards you listed were/are the performance kings of their respective generations?

Also, nVidia aren't investing all this time and money in a redesign for the good of the consumer - since a single-PCB card will be cheaper to produce, they can put a higher margin on it, while still offering it to the consumer at a lower price than the current GTX 295. So everyone wins - except the overclockers (and AMD).
Posted on Reply