
More GeForce GTX 295 Details Trickle-in

Should have masked that AMD logo as well :)
The Fury Maxx is such an artifact.

I had forgotten about that, but what I meant was dual GPU cards as they are implemented today, using Crossfire and SLi.

bottom is 4870x2

You sure?

Didn't ATI do dual GPU quite some time ago?
http://www.xbitlabs.com/images/video/crossfire/radeon_maxx.jpg

Anyway, I wish both companies would hurry up and bring out the next lot of cards, like the 5870 and 380.

I forgot about those cards, but I meant modern iterations using Crossfire and SLi. Too bad those cards were not supported on anything beyond Windows ME.

:p A lot of good points that my brain never thought of at the time. I was merely speculating that nVidia hasn't improved their SLI bridge/connection (to our knowledge), whereas AMD has, and their original Crossfire chip was superior to how the 9800GX2 was connected anyway. Though it's going to be interesting, since this will probably come out around the same time as ATI's HD 5870, or a little before it is released. I wonder what kind of performance improvements that will have over the HD 4870, how it'll compare to the current HD 4870X2, and whether it'll be as powerful as, or more powerful than, the GTX 280.

NVidia didn't need to improve the SLi bridge connection used in the 9800GX2; it worked perfectly fine. AMD had to improve their bridge chip because it didn't support PCI-E 2.0. There really isn't anything superior about the way ATi does the internal Crossfire connector vs. nVidia's SLi bridge cable.

When I changed from a 7600GT to a 7800GTX I thought I was the best! I soon came to realize that the size of the card is a joke. Two slots for a graphics card is stupid and unacceptable. That didn't stop me from purchasing a pair of 7900GTX's, which were majorly flawed and replaced with two 7950's.

The 7950's were OK, but nowhere near as good as the single 8800GTX I picked up. Ever since then, two-slot graphics cards have become a standard, and with the new X58 series motherboards we come to realize that the loss of that extra slot is a waste.

Yeah, yeah, I'm happy for the GTX 295, and I suppose I'll be picking up a pair of them too, but the fact remains that it will suck, like the 4870X2 sucks. We are many, many years away from decent graphics cards. Please keep in mind that my perception of decent may vary from your own.

Dual slot graphics cards are wonderful, and I will always buy a dual slot card when available over the single slot card. There really isn't any benefit to single slot cards anymore; the lost PCI slots aren't worth anything.

Of course the 7950's are not going to be as good as a single 8800GTX; the 8800GTX is a generation newer. Why would you expect the 7950's to even hold a candle to it?
 
Yes, I'm sure... they are very much similar. The only reason I could tell the difference is that the 4870X2 has AMD silkscreened above the PCI-E teeth.
 
Well, I suppose it will perform close to, or a little worse than, GTX 260's in SLI; maybe better, since it'll have 240 SPs per GPU, though I wonder if it'll really matter. It's all going to depend on the performance of the HD 5870 and whether it sells well; I look for it to do as well as or slightly better than the GTX 280, and if it's priced competitively, say $320-399 at launch, I think it'll make good profits for ATI, as the HD 48xx series has. Though I'm guessing this GTX 295 will retail for $699 or more.
 
This is no joke about nVidia wanting the performance crown; they are desperate, so it seems. But I still don't compare this to the 4870X2. They might as well sell two GTX 270's and call it the GTX 295.
 
I'm going to Tri-SLI three of these 295's. :) (If it were possible. :()

They're going to be like $300 tops. lol
 
NVidia didn't need to improve the SLi bridge connection used in the 9800GX2; it worked perfectly fine. AMD had to improve their bridge chip because it didn't support PCI-E 2.0. There really isn't anything superior about the way ATi does the internal Crossfire connector vs. nVidia's SLi bridge cable.

You say that, but I wonder: is it the Crossfire chip or the drivers that make the HD 4870X2 perform better than Crossfire HD 4870's? The 9800GX2 wasn't better than 8800GTS's in SLI in benchmarks, so their connection interfaces may be equal, but it seems ATI puts more work into their drivers. Though nVidia may be different this time around with the drivers for their dual PCB/chip design.
 
Well, I suppose it will perform close to, or a little worse than, GTX 260's in SLI; maybe better, since it'll have 240 SPs per GPU, though I wonder if it'll really matter. It's all going to depend on the performance of the HD 5870 and whether it sells well; I look for it to do as well as or slightly better than the GTX 280, and if it's priced competitively, say $320-399 at launch, I think it'll make good profits for ATI, as the HD 48xx series has. Though I'm guessing this GTX 295 will retail for $699 or more.

Have we even seen any details on the HD 5870? I've only seen a few forum posts of speculation, but nothing even close to being final, or even official. Will we even see the HD 5870 in the next year? I doubt it, and if we do, it won't be until near the end of the year. So I really don't think we need to be considering the HD 5870 in any serious way at this point.

You say that, but I wonder: is it the Crossfire chip or the drivers that make the HD 4870X2 perform better than Crossfire HD 4870's? The 9800GX2 wasn't better than 8800GTS's in SLI in benchmarks, so their connection interfaces may be equal, but it seems ATI puts more work into their drivers. Though nVidia may be different this time around with the drivers for their dual PCB/chip design.

Several factors go into the HD4870x2 outperforming the HD4870 in Crossfire, none of which are the chip used in the card to connect the cores. The first reason is that the HD4870x2 was released with 1GB of RAM per core. The regular HD4870s did not have 1GB per core at the time, so all the reviews of the HD4870x2 pitted it against 512MB HD4870s, which obviously perform worse than cards with 1GB, especially at extreme resolutions, which is where dual-GPU setups shine. Later reviews, using 1GB HD4870's, show that there is very little performance difference between the two. Besides that, a lot of the reviews done with Crossfire HD4870's use Intel boards that only have x8 PCI-E slots, which also limits the dual card solution, as each card only gets half the bandwidth, while the single HD4870x2 gets the full bandwidth of the PCI-E x16 slot. TweakTown actually did a test on this: when the HD4870x2 was put in a P45 board, it was about 10% faster than two HD4870s in the same board. However, when both were tested on an X48 board, the setups performed almost identically.
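To put rough numbers on that slot-bandwidth point, here's a minimal back-of-the-envelope sketch (assuming PCI-E 2.0's nominal 500 MB/s per lane, per direction, after encoding overhead; theoretical peaks, not measurements):

# PCI-E 2.0: ~500 MB/s per lane per direction after 8b/10b encoding.
MB_PER_LANE = 500

def slot_bandwidth_gbs(lanes: int) -> float:
    """Theoretical one-way slot bandwidth in GB/s."""
    return lanes * MB_PER_LANE / 1000

# Two HD4870s on a P45 board split the link into x8 + x8 ...
print(f"HD4870 at x8:    {slot_bandwidth_gbs(8):.1f} GB/s per card")   # 4.0
# ... while a single HD4870x2 keeps the full x16 link to itself.
print(f"HD4870x2 at x16: {slot_bandwidth_gbs(16):.1f} GB/s for both GPUs")  # 8.0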

Now, as for the 9800GX2 not outperforming the 8800GTS's in SLi, the main reason is that the 9800GX2 is actually clocked lower than the 8800GTS's. It lacks 50MHz on the core clock but, more importantly, 125MHz on the shaders, and this makes a huge impact on performance. When the two are clocked equally, they tend to perform equally.
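For what it's worth, the shader clock deficit is easy to put into theoretical numbers. A quick sketch, assuming the commonly quoted specs (8800GTS 512 at a 1625 MHz shader clock, each 9800GX2 GPU at 1500 MHz, 128 SPs apiece, and the usual 3 FLOPs per SP per clock marketing figure for the dual-issue MADD+MUL):

# Theoretical shader throughput of a single G92 GPU.
SPS = 128            # stream processors per G92
FLOPS_PER_CLOCK = 3  # MADD + MUL dual issue (marketing figure)

def gflops(shader_mhz: float) -> float:
    return SPS * FLOPS_PER_CLOCK * shader_mhz / 1000

gts = gflops(1625)  # 8800GTS 512
gx2 = gflops(1500)  # one GPU of a 9800GX2
print(f"8800GTS 512:       {gts:.0f} GFLOPS")               # ~624
print(f"9800GX2 (per GPU): {gx2:.0f} GFLOPS")               # ~576
print(f"shader deficit:    {(1 - gx2 / gts) * 100:.1f}%")   # ~7.7%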
 
The fact that they tested on the P45 at x8 per card never crossed my mind. I love TPU :laugh: I get corrected and learn so much :laugh:
 
I'm still waiting for nVidia to utilize GDDR5. ATI can't keep that to themselves forever. I was originally gonna go with a 4870 X2, but I got bashed on pretty hard, so I'm reconsidering. I don't wanna use dual slot cards; I'm gonna be using most of my slots...
 
best single slot card is HD 4850 :p
 
Or a water cooled GTX280 :rolleyes:
 
Or a water cooled GTX280 :rolleyes:

Water cooled 4870X2 :laugh:

Anyway, the GTX295 is basically a 9800GX2 of the current generation: take a hot but fast GPU, cut it down, follow with a die shrink for less power usage, lower the clock speeds, and fabricate two PCBs with an internal SLI bridge.

Anyway, 300W power draw is a bit high... how much does the 4870X2 draw again?
 
I buy the cards to stay in the game, CDdude55. It's an expensive hobby, but one I enjoy. I don't complain about the money, or the advance in technology, just the waste of slots when I need the space for something simple like a sound card.

newteckie1, I don't think they even make single slot graphics cards with remotely enough power for the enthusiast. That being said, I think we all have to get the dual slot cards. I don't think anyone expected the 8 series to be as superior at the time as it was. I can't explain why I assumed the 7950 would have been better than it was, but the 8800GTX did indeed shit all over it.

If graphics card technology doesn't go back to single slot with equal or better performance, then we will continue to lose functionality. AMD/ATi and nVidia are constantly shrinking their technology, but the cards are getting bigger.
 
mmmm, we expect that from nvidia , as you see my friend solaris this is another 9800gx2 but more than double price im sure
 
So who wants to get me one? :laugh:
 
I buy the cards to stay in the game, CDdude55. It's an expensive hobby, but one I enjoy. I don't complain about the money, or the advance in technology, just the waste of slots when I need the space for something simple like a sound card.

newteckie1, I don't think they even make single slot graphics cards with remotely enough power for the enthusiast. That being said, I think we all have to get the dual slot cards. I don't think anyone expected the 8 series to be as superior at the time as it was. I can't explain why I assumed the 7950 would have been better than it was, but the 8800GTX did indeed shit all over it.

If graphics card technology doesn't go back to single slot with equal or better performance, then we will continue to lose functionality. AMD/ATi and nVidia are constantly shrinking their technology, but the cards are getting bigger.

The only single slot card in the current generation that would probably fit the bill is the HD4850. In any case, most enthusiasts have come to actually want two slot cards; I know I have. As the GPUs get more powerful, they just keep putting out more heat, and most do not want that heat trapped in their case. Just as an example, when I put dual slot coolers on my 7900GT's, not only did the GPU temperatures drop 5°C, but my CPU temperatures dropped as well.
 
First off, a really interesting thread to read.

Second, I'm really looking forward to this card, given how much a single GTX 260 Core 216 rocks.

Honestly, I'd say if a single GTX 260 were released with the full 240 SPs, it really wouldn't need much overclocking at all to reach GTX 280 speeds.

I don't think they will need to clock it slower; if they do, it's only from a heat perspective. Given the GTX 285 is clocked faster, to the tune of 10% more performance, all whilst chewing 22.5% less power, a pair of 55nm GTX 260's should do well for themselves.

Let's just weigh up how beastly this card will be (assuming the SAME clocks as a stock 260):

56 ROPs - 32 gigapixel/s fillrate
480 SPs - 80 gigatexel/s fillrate
1792MB of memory on an 896-bit bus (naturally, halved per GPU)
theoretical 223.8 GB/s memory bandwidth
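(For what it's worth, the combined figures do fall out of the per-GPU GTX 260 numbers. A quick sketch, assuming the usual quoted specs of a 576 MHz core, 28 ROPs, and a 448-bit bus with 1998 MT/s effective GDDR3 per GPU:)

# GTX 295 back-of-the-envelope math from per-GPU GTX 260 specs.
CORE_MHZ = 576
ROPS_PER_GPU = 28
BUS_BITS_PER_GPU = 448
MEM_MTS = 1998  # effective GDDR3 transfer rate

pixel_fill = 2 * ROPS_PER_GPU * CORE_MHZ / 1000          # gigapixels/s
bandwidth = 2 * (BUS_BITS_PER_GPU / 8) * MEM_MTS / 1000  # GB/s

print(f"pixel fillrate:   {pixel_fill:.1f} GP/s")   # ~32.3 GP/s
print(f"memory bandwidth: {bandwidth:.1f} GB/s")    # ~223.8 GB/s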

wowza. me wantie. right meow.

All in all, the 55nm iterations of GT200, plus the RV770 revamp(?), should kick some tail until the new cards hit hard late next year, i.e. the GT300 and RV870.
 
That makes little to no sense.

1.) nVidia has been doing dual GPU cards for longer than ATi has. nVidia started it with the 7900GX2, which came out more than a year before ATi's first dual GPU card. And even then, it wasn't actually an ATi card; it was just a Sapphire exclusive, designed by Sapphire.

2.) nVidia has taken their original idea, and continued to refine it. The dual PCB design of the 7900GX2 has evolved into GTX295. ATi has done the same.

3.) Yes, the GTX295 is similar in design to the 9800GX2, but how different is the HD4870x2 from the HD3870x2? Look at this picture, and tell me which is the HD4870x2 and which is the HD3870x2.

4.) ATi has been the one that has needed to create dual GPU cards to take the performance crown. For the past 2 generations, this has been the only way ATi has been able to outperform nVidia's single GPU cards.
For #3) The top card is the HD3870X2 and the bottom is the HD4870X2 :p

For #4, honestly, what's the big deal about the card being dual GPU or not?
As long as it performs well and is priced properly, I don't give a damn :slap:

Not saying the guy you are quoting is right, but that 79X0 GX2 had the worst drivers ever. Who cares that it's nVidia that made the first crap?
(Well, 3dfx did the first multi-GPU crap, but nVidia owns them anyway.)
 
That makes little to no sense.

1.) nVidia has been doing dual GPU cards for longer than ATi has. nVidia started it with the 7900GX2, which came out more than a year before ATi's first dual GPU card. And even then, it wasn't actually an ATi card; it was just a Sapphire exclusive, designed by Sapphire.

2.) nVidia has taken their original idea, and continued to refine it. The dual PCB design of the 7900GX2 has evolved into GTX295. ATi has done the same.

3.) Yes, the GTX295 is similar in design to the 9800GX2, but how different is the HD4870x2 from the HD3870x2? Look at this picture, and tell me which is the HD4870x2 and which is the HD3870x2.

4.) ATi has been the one that has needed to create dual GPU cards to take the performance crown. For the past 2 generations, this has been the only way ATi has been able to outperform nVidia's single GPU cards.

1. Radeon Maxx?

2. The dual PCB design of the 7900GX2 is really nothing new; other companies have already used that idea so many times it's not funny.

3. The GTX295's design, from what I can see, is almost identical to the 9800GX2's, probably with beefed-up power phases. The 4870X2 and 3870X2 both share a similar PCB too, but with more changes: to the memory bus (please note that GDDR5 and GDDR3 have different layouts), completely beefed-up phases, etc. The bottom is the 4870X2, for sure >_>. Again, is there any problem with recycling designs? "Oh, let's make a whole new PCB design to be original so that consumers complain less" is the logic you'd have to operate on. Doing that would just jack up the retail price, as extra redundant R&D would be required. There's no need, in short.

4. Does it matter? Why do people bitch about how they attain the result? It's not like it's immoral or anything. AMD could very easily fabricate two RV770 cores in one package, but they won't, for several good reasons.
 
For #3) The top card is the HD3870X2 and the bottom is the HD4870X2 :p

For #4, honestly, what's the big deal about the card being dual GPU or not?
As long as it performs well and is priced properly, I don't give a damn :slap:

Not saying the guy you are quoting is right, but that 79X0 GX2 had the worst drivers ever. Who cares that it's nVidia that made the first crap?
(Well, 3dfx did the first multi-GPU crap, but nVidia owns them anyway.)

It isn't a big deal where the power comes from, as long as it is there. I was just responding to his point that nVidia has needed to create these dual GPU cards to top ATi. While that is true, it isn't in the context that he put it in, as ATi has been the one needing dual GPU cards to top nVidia's single GPU cards; nVidia just responds with a dual-GPU card of their own. That's the cycle when nVidia has the stronger GPU core. When the roles are reversed, and ATi has the stronger GPU core again, nVidia will probably be the first to pump out dual-GPU cards to top ATi's single GPU cards.

Personally, I would prefer single GPU solutions, simply because of all the problems SLi and Crossfire solutions add to the mix. You have games not supporting the technology, with users having to wait for patches from both the graphics card manufacturers and the game developers. You have situations like GTA IV, where SLi isn't supported, so everyone that bought 9800GX2's is stuck with the performance of a single GPU. Crysis, for the longest time, didn't support Crossfire properly, so users of the HD3870x2 and HD4870x2 were stuck with the performance of a single GPU.

1. Radeon Maxx?

2. The dual PCB design of the 7900GX2 is really nothing new; other companies have already used that idea so many times it's not funny.

3. The GTX295's design, from what I can see, is almost identical to the 9800GX2's, probably with beefed-up power phases. The 4870X2 and 3870X2 both share a similar PCB too, but with more changes: to the memory bus (please note that GDDR5 and GDDR3 have different layouts), completely beefed-up phases, etc. The bottom is the 4870X2, for sure >_>. Again, is there any problem with recycling designs? "Oh, let's make a whole new PCB design to be original so that consumers complain less" is the logic you'd have to operate on. Doing that would just jack up the retail price, as extra redundant R&D would be required. There's no need, in short.

4. Does it matter? Why do people bitch about how they attain the result? It's not like it's immoral or anything. AMD could very easily fabricate two RV770 cores in one package, but they won't, for several good reasons.

1.) Yes, we've gone over that. Read the thread. I mean modern implementations using Crossfire and SLi. If you go back far enough, you will find plenty of dual GPU implementations.

2.) Yes, and the single PCB design of ATi's dual GPU cards is nothing new either; it has probably been used just as much. Your point?

3.) I have no problem with recycling designs. I say pick a design and continue to refine it. But what I want to know is how you have jumped to the conclusion that the GTX295 is almost identical to the 9800GX2 from a few off-angle pictures, and no real picture of the PCBs. How can you claim that there are more changes from the HD3870x2 to the HD4870x2 without any good information on the GTX295's PCBs? The G200 is a completely different beast from the G92; there are likely huge changes to the PCB design. Funny how you see a card from ATi with essentially the same layout/form factor as the previous generation and say there are huge changes, but on the nVidia side you see the same thing and say there are no changes at all.

4.) See above.
 
Well, like you said, we have no proof that there have been improvements, so we can only speculate and assume off nVidia's previous actions, and they aren't big on starting something new; ATI has been the one to take steps forward in the past 3 years. We're just speculating. Though when the HD 4870X2 came about, there were said improvements, such as the Crossfire chip that was mentioned. It wasn't just an upgrade to PCI-E 2.0; there was a lot less micro-stutter between the 2 chips than before, and they also have a sideport that allows direct communication between the chips, which could eliminate micro-stutter or lessen it greatly. It's not activated yet, but if it does get activated it could improve performance. Will nVidia also improve their micro-stuttering? I don't know. Maybe nVidia has done one hell of a job and refined and improved the PCB design greatly, which is a possibility since it's a much larger chip with a different bus size; or it could be damn near the same as the 9800GX2's. Time will tell. Will it matter? I don't know.
 
As much as I'd like to see nVidia offer up something worthwhile with this... there are two things that bother me.

1. Video RAM is still shared, with each GPU mirroring the same data. Yes, having a total texture pool of near 2GB is helpful, but more so in theory than in practice. If it were independent, thus being a true 2GB, that would be another story. I'm wondering when dual-GPU cards are going to break that trend.
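(To make the mirroring point concrete: under alternate-frame rendering, every texture is duplicated into each GPU's local memory, so the effective pool is the per-GPU amount, not the sum. A toy illustration only, not tied to any real driver:)

# Effective VRAM on a dual-GPU card: AFR mirrors data into each GPU.
def effective_vram_mb(per_gpu_mb: int, gpus: int, pooled: bool) -> int:
    return per_gpu_mb * gpus if pooled else per_gpu_mb

print(effective_vram_mb(896, 2, pooled=False))  # AFR mirroring        -> 896
print(effective_vram_mb(896, 2, pooled=True))   # hypothetical pooling -> 1792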

2. The most recent dual-GPU solution, the 4870 X2 (yes, the 4850 X2 is more 'previous', sue me...), is nothing to shake a stick at, but I've said it before and will always continue to say it: for the amount of horsepower under its hood, I feel it almost falls completely on its face. It should perform twice as well as it does; but like a vehicle motor, slapping on a supercharger can only take you so far, while the rest of the factory parts drag you down and limit your potential and real-world performance.

I don't think this nVidia product is going to break either of those trends. It might be fast, in fact I'm fairly certain it will be, but if it doesn't perform at least 1.5x a normal 280, then... bleh.

I never thought I'd actually hear someone slam the 4870X2... since its inception, and up until now, all I hear is how wonderful it is and how heavenly it is to own one. Truthfully, for almost 600 dollars it's no less a disappointment than when the GTX 280 was that much and you could get a 4870 or 4850 for less than half that, with 80%-plus of the performance in most games. To me, this seems much the same scenario. I love the idea of dual GPUs on one card, and yes, this (GTX 295) will absolutely destroy anything out there currently (possibly in the next GPU round as well), but depending on pricing, it will be just as uneconomical as the 4870X2, which, particularly at high resolutions, is actually BEATEN by the 4850X2, which is 200 bucks cheaper and uses much less electricity! How ridiculous is that? Anyhow, this card will be much the same, I believe, and obviously it's the single-card champ in all likelihood, but it'll be just as bad a value as the 4870X2 is, particularly once it reaches end of life. THEN it might be cool to pick one up, if the price bombs; which, amazingly, of all the products of both companies, they ALL have dropped, even multiple times, except for the 4870X2. So even that might be a wish and a prayer for quite a while!
 
Well, like you said, we have no proof that there have been improvements, so we can only speculate and assume off nVidia's previous actions, and they aren't big on starting something new; ATI has been the one to take steps forward in the past 3 years.

What previous actions do we really have to go on? The 7900GX2, 7950GX2, and 9800GX2. Not enough to really make a trend, IMO. However, if we do look at it, the 7950GX2 was a huge improvement over the 7900GX2 in terms of PCB design; in fact, the whole purpose of the 7950GX2 was to improve the PCB design. Then comes the 9800GX2, which is extremely different from the 7950GX2, obviously.

What steps forward have ATi made in the past 3 years that have been something new?
 