Wednesday, December 10th 2008

NVIDIA GeForce GTX 295 Spotted

VR-Zone has scored the first photo of the upcoming GeForce GTX 295, a card reported to make its first appearance at next year's Consumer Electronics Show (CES). Contrary to previous reports, the card features a sandwich design, like most dual-GPU cards NVIDIA has released. Two 55 nm GT200 GPUs are incorporated in this card. The picture also shows two DVI ports and one DisplayPort. The source further reports that the card uses an 8-pin + 6-pin connector combo for external power. Pricing is yet to be disclosed, but card makers speculate that NVIDIA will price it competitively against AMD's Radeon HD 4870 X2.
Source: VR-Zone

96 Comments on NVIDIA GeForce GTX 295 Spotted

#76
lemonadesoda
DarkMatter: Someone has to fight the increasing load of BS in the forums. Like everything in your post. BS.
Try to be less rude and less arrogant. You'll get along a lot better with people on this forum if you try harder.

PS. You wasted two years on your engineering degree if you think it is EASIER to cool 200W sandwiched within a 1 inch space (200W per inch) than it is to cool 100W over 1 inch then another 100W over another inch (100W per inch). Yes, you can do it. But it's not going to be as easy and it will be noisier.
Posted on Reply
#77
Solaris17
Super Dainty Moderator
lemonadesoda: Try to be less rude and less arrogant. You'll get along a lot better with people on this forum if you try harder.

PS. You wasted two years on your engineering degree if you think it is EASIER to cool 200W sandwiched within a 1 inch space (200W per inch) than it is to cool 100W over 1 inch then another 100W over another inch (100W per inch). Yes, you can do it. But it's not going to be as easy and it will be noisier.
55 nm or not, I'll bet you that the 260 cores will run hotter... they are bigger and have more internals, so they will get ridiculously hot under load... and if someone gets two GTX 295s, I have some bad news for them about how hot it will run. However, regardless of fact or speculation, this thread needs to calm down, and I'm absolutely serious, in case no one bothered to heed it the first time.
Posted on Reply
#78
btarunr
Editor & Senior Moderator
Two G200b GPUs + dozens of memory chips + NVIO 2 + BR-03 (nForce 200) chip, all powered by a 6-pin + 8-pin combo... mighty impressive!
Posted on Reply
#79
DarkMatter
lemonadesoda: Try to be less rude and less arrogant. You'll get along a lot better with people on this forum if you try harder.

PS. You wasted two years on your engineering degree if you think it is EASIER to cool 200W sandwiched within a 1 inch space (200W per inch) than it is to cool 100W over 1 inch then another 100W over another inch (100W per inch). Yes, you can do it. But it's not going to be as easy and it will be noisier.
Wrong. Look at the reviews: both cores on the GX2 run cooler than the second GPU on any X2. That is fact, no need for speculation; the reason I gave is why it occurs. I know the basics of thermodynamics, thanks, and by the way, it's all about the VOLUME of air that makes contact with the chip/fins and the SURFACES (of the chips and cooler fins, in this case), not about the space between the PCBs, 1 inch or whatever. As long as you move enough air, and the GX2 does, it doesn't matter how much space you have.

The amount of air moved into a case is much higher, there's more free space, and case fans move a hell of a lot more air than GPU or CPU coolers, yet those coolers cool much better. Tell me, by your theory, why... I'll tell you: the overall VOLUME of air is bigger, but the VOLUME of air that makes contact with the SURFACES of the cooler is the "same" (relative to speed) in both cases, and case fans can't supply the chips with cool, clean air. Just as important is HOW that volume of air gets to the chip or the fins: clean, straight airflow cools a lot better. On the X2 the second GPU does not get clean, straight air. Next time check your facts before calling me out.

EDIT: I forgot to mention the most important factor: the temperature of the air. It doesn't work exactly like the example I'm going to give, but similarly, and you'll get the idea. If, with no air circulation, a chip sits at 100 °C and our air is at 20 °C, then over time, with perfect heat transfer, both will eventually settle at 60 °C; if instead our air is already at 60 °C, the chip will only get down to 80 °C. As I said, it doesn't really work that way, so linearly and without taking volumes, densities, thermal properties, etc. into account, but the same effect does occur. With moving air it's the same, just quicker.

PS. Now, when you SLI two of them, on most motherboards the upper one is almost unable to get fresh air because the other card is in the way, and the lower one usually sits below the path the air takes through most cases. So the very first requirement I mentioned, fresh clean air delivery, is destroyed, and the cards can get very hot too.

EDIT 2: There's yet another factor to take into account: the contact surface of the chip also matters. Trying to dissipate 200 W through a 256 mm² contact surface is much harder than through a 480 mm² one. And that's one of the reasons the GTX 295 WILL be cooler (speculating, but I'm going to enjoy saying "I told you").
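For what it's worth, the two bits of arithmetic in the post above (the pre-heated intake air effect and the contact-surface argument) can be sketched in a few lines of Python. The naive 50/50 "meet halfway" mixing rule and all the wattages and die areas are the post's own illustrative figures, not measurements:

```python
# Back-of-envelope numbers from the thread above (illustrative, not measured).

def equilibrium_temp(chip_c, air_c):
    """Idealized 50/50 mixing from the post: chip and air meet halfway."""
    return (chip_c + air_c) / 2

# Fresh 20 C intake air vs. air pre-heated to 60 C by an upstream GPU:
print(equilibrium_temp(100, 20))  # 60.0
print(equilibrium_temp(100, 60))  # 80.0

def heat_flux(watts, area_mm2):
    """Heat flux density through the die's contact surface, in W/mm^2."""
    return watts / area_mm2

# 200 W through a small die vs. a large one (areas from the post):
print(round(heat_flux(200, 256), 3))  # 0.781
print(round(heat_flux(200, 480), 3))  # 0.417
```

Pre-heated air roughly halves the available cooling headroom, and the larger die nearly halves the flux density, which is the gist of both arguments.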
Posted on Reply
#81
Solaris17
Super Dainty Moderator
hayder.master: I hope it is not another mistake like the 9800 GX2.
How was the GX2 a mistake?
Posted on Reply
#82
TRIPTEX_CAN
hayder.master: I hope it is not another mistake like the 9800 GX2.
I hope you're prepared for a 1000 word essay reply to that post :roll:
Posted on Reply
#83
Solaris17
Super Dainty Moderator
TRIPTEX_MTL: I hope you're prepared for a 1000 word essay reply to that post :roll:
introduction

5 paragraphs

conclusion
Posted on Reply
#84
TRIPTEX_CAN
Solaris17: introduction

5 paragraphs

conclusion
Nice of you to provide the layout for whoever writes the piece. I wasn't referring to you, but I can see a massive backlash to his claim.
Posted on Reply
#85
Solaris17
Super Dainty Moderator
TRIPTEX_MTL: Nice of you to provide the layout for whoever writes the piece. I wasn't referring to you, but I can see a massive backlash to his claim.
I'm running dual GX2s, so I'm quite interested in this claim. That was more of a layout for myself.
Posted on Reply
#86
TRIPTEX_CAN
Solaris17: I'm running dual GX2s, so I'm quite interested in this claim. That was more of a layout for myself.
I do remember hearing the 9800 GX2 had some scaling issues at release, mostly with minimum FPS and micro-stuttering in some games (probably fixed by now), but from what I understand the GTX 200s have much better SLI support, so I don't see that being an issue.

You'd better get started on that essay; we're not accepting submissions after 11:00 AM EST. ;)
Posted on Reply
#87
Solaris17
Super Dainty Moderator
TRIPTEX_MTL: I do remember hearing the 9800 GX2 had some scaling issues at release, mostly with minimum FPS and micro-stuttering in some games (probably fixed by now), but from what I understand the GTX 200s have much better SLI support, so I don't see that being an issue.

You'd better get started on that essay; we're not accepting submissions after 11:00 AM EST. ;)
I need to work at 12 and I won't be home till 8. Right now I'm cleaning up a f#%ing mess because my GF's mom put all the presents in the room in the basement, and the basement floods every year. Guess what: a foot of water this morning, and she didn't think to put anything up, so the 22" Acer, the $850 D-SLR, the 4 GB memory kit, the $80 perfumes, and the stuff I got my GF are totally fucked... yay... So I don't know if I want to get started on the GX2. These things work fine; I don't know WTF people were doing to get micro-stuttering, but that definitely isn't an issue. If people think it still is, they have more on their rig to look at than the graphics card.

-sol out
Posted on Reply
#88
Fitseries3
Eleet Hardware Junkie
According to an article I just read, this card is not the GTX 295. The 295 is a 55 nm die shrink of a 280, but overclocked more for better performance.

So WTF is this card called? Are they just gonna call it the GTX 260 GX2? Or, since it has the new core, would it be the GTX 270 GX2?
Posted on Reply
#89
TRIPTEX_CAN
Solaris17: I need to work at 12 and I won't be home till 8. Right now I'm cleaning up a f#%ing mess because my GF's mom put all the presents in the room in the basement, and the basement floods every year. Guess what: a foot of water this morning, and she didn't think to put anything up, so the 22" Acer, the $850 D-SLR, the 4 GB memory kit, the $80 perfumes, and the stuff I got my GF are totally fucked... yay... So I don't know if I want to get started on the GX2. These things work fine; I don't know WTF people were doing to get micro-stuttering, but that definitely isn't an issue. If people think it still is, they have more on their rig to look at than the graphics card.

-sol out
Ouch, man, that would piss me off so much. I hope to hell you have insurance for water damage. :shadedshu

Good luck with the cleanup... :ohwell:
Posted on Reply
#90
Solaris17
Super Dainty Moderator
fitseries3: According to an article I just read, this card is not the GTX 295. The 295 is a 55 nm die shrink of a 280, but overclocked more for better performance.

So WTF is this card called? Are they just gonna call it the GTX 260 GX2? Or, since it has the new core, would it be the GTX 270 GX2?
Techpowerup.com: "the company is also planning a dual GPU card named the GeForce GTX 295. Its single GPU flagship offering will be called GeForce GTX 285."
TRIPTEX_MTL: Ouch, man, that would piss me off so much. I hope to hell you have insurance for water damage. :shadedshu

Good luck with the cleanup... :ohwell:
Oh, the only thing going through my head is death... I want to destroy things.
Posted on Reply
#91
Fitseries3
Eleet Hardware Junkie
dir..

My dyslexia translated the 285 into a 295.
Posted on Reply
#92
Solaris17
Super Dainty Moderator
fitseries3: dir..

My dyslexia translated the 285 into a 295.
hah :)
Posted on Reply
#93
Hayder_Master
Solaris17: How was the GX2 a mistake?
Hello, my friend. You know, I expected you to reply to this, and I'm ready to answer you, sure.
1 - I'm talking about the 9800 GX2 at release: an extremely high price for normal performance. Sure, you only won because you got your two 9800 GX2s at half the release price of one. It stayed stuck at a high price for 2-3 months, and after that NVIDIA killed it when the GTX 200 series released. If you remember, this is the same story as the 7950 GX2, so I, and most people, see this as a big mistake. So if this new one releases at, let's say, $600, and two months later NVIDIA releases new-generation cards with double the performance at the same price, anyone who bought this card will see they made a big mistake.
There's no mistake if you take a 9800 GX2 at $200 or an 8800 Ultra at $160; in that case it's called a smart choice, and you made a smart choice. Sure, if I had your mobo, I'd do the same thing.
Posted on Reply
#94
Pixelated
DarkMatter: I don't know what makes you think it will draw more power than the 9800 GX2; they both have the same 6-pin + 8-pin connectors.

I don't know why it would be really hot either. The single-PCB design of the HD 3870 X2 didn't help it too much; the GX2 was cooler and quieter, and indeed the temps were about the same as on single 9800s. Everything else was memes created long before the card even launched, which kept going thanks to internet pop culture.

Remember it's a 55 nm GT200 being used; we don't know anything about it yet. I say it's going to draw less power than the HD 4870 X2 and have lower temps. Speculating like anyone else, of course. :)
It's simple really. Which card draws more power: a 55 nm 9800 GTX+ or a 55 nm GTX 260? I would say the GTX 260 draws more. So double the power requirement and you get close to 300 W of draw at peak.
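As a sanity check on the "close to 300 W" figure, the PCI Express spec's power limits give the same ceiling for a card fed by one 8-pin and one 6-pin connector. A quick sketch (the per-connector wattages below are the PCIe spec limits; the conclusion is the post's own estimate):

```python
# Maximum board power available to a card with one 8-pin and one 6-pin
# PCIe auxiliary connector, per the PCI Express spec limits.
SLOT_W = 75        # power deliverable through the PCIe x16 slot itself
SIX_PIN_W = 75     # 6-pin PCIe auxiliary connector
EIGHT_PIN_W = 150  # 8-pin PCIe auxiliary connector

budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(budget)  # 300 -- consistent with "close to 300 W of draw at peak"
```

So whatever the real figure turns out to be, the connector configuration itself caps the card at about 300 W.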
Posted on Reply
#95
DarkMatter
DarkMatter: ...I'm going to enjoy saying "I told you".
I told you.

Sorry, but I couldn't let this go. :laugh: 100% effectiveness in my assumptions. As I speculated (based on facts and precedents):

1 - The GTX 295 is much faster than the X2 and also significantly faster than GTX 260 SLI.

2 - It consumes much less power than both the X2 and 260 SLI.

3 - Temps are significantly better on the GTX 295 than on the second GPU of the X2, even in Quad SLI configs, which doesn't seem to increase its temps much.

There are tons of reviews on the front page of TPU, January 9th, so don't bother me about providing a link; do your homework.
Posted on Reply
#96
wolf
Performance Enthusiast
It's now all fact.

1x GT200 beats 1x RV770.

2x GT200 beats 2x RV770.

Naturally there are a FEW circumstances where the opposite is possible, but if you've read half the reviews on the home page, you'll see it's incontrovertible.

As for the cooling-solution discussion, I think it's obvious the 295's cooler is better: both cores at the same, low temperature, not one core 6-10 degrees hotter. Not to mention the overclocking potential it has over a 4870 X2, just in case you want to trounce one extra hard.

I really love and cherish RV770; it came at just the right time and was just what the graphics market needed.

But NVIDIA, again, holds the crown for the fastest single GPU and the fastest single card on the planet.

It's like that, and that's the way it is.
Posted on Reply