Friday, November 19th 2010

NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership

NVIDIA stunned the computing world with the speedy launch of the GeForce GTX 580. The GPU extended NVIDIA's single-GPU performance leadership, and also ironed out some serious power-draw and thermal issues of the previous-generation GeForce GTX 480. A dual-GPU implementation of the GF110 graphics processor, on which the GTX 580 is based, now looks inevitable. NVIDIA seems to be ready with a prototype of such a dual-GPU accelerator, which the Chinese media is referring to as the "GTX 595".

The reference-design PCB of the dual-GF110 accelerator (which still needs some components fitted) reveals quite a lot about the card taking shape. First, it's a single-PCB card: both GPU systems are located on the same board. Second, there are slots for three DVI output connectors, indicating that the card will be 3D Vision Surround-ready on its own. You just have to get one of these, plug in three displays over standard DVI, and you're ready with a large display head spanning three physical displays.
Third, it could feature a total of 3 GB of video memory (or 1.5 GB per GPU system). Each GPU system has six memory chips on the obverse side of the PCB. At this point we can't comment on the memory bus width of each GPU, and the core configuration of the GPUs is also unknown. Fourth, power is drawn in through two 8-pin PCI-E power connectors. The card is 2-way SLI capable with another of its kind.
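If the six chips visible per GPU are mirrored by six more on the reverse side of the PCB, as they are on the GTX 580, the arithmetic with standard 1 Gbit (128 MB), 32-bit GDDR5 chips would point to the same memory configuration as the GTX 580; treat this as speculation on our part until the card is confirmed:

\[ 12 \times 128\,\mathrm{MB} = 1.5\,\mathrm{GB\ per\ GPU}, \qquad 12 \times 32\,\mathrm{bit} = 384\,\mathrm{bit\ bus\ per\ GPU} \]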
Source: enet.com.cn

153 Comments on NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership

#51
motasim
I would really love to see a dual-GF110 card, but given the power requirements and thermal footprint, unfortunately it won't be possible. What is more realistic, and most probably the case here, is a card with dual fully-enabled GF104 chips (aka GF114). As usual, time will tell :D
#53
LAN_deRf_HA
What's the amorphous gray blob on the back there?
#54
Jiraiya
LAN_deRf_HA: What's the amorphous gray blob on the back there?
This is the original: [image: the original, uncleaned PCB photo]
The first image was taken after cleaning.
#55
the54thvoid
Intoxicated Moderator
LAN_deRf_HA: What's the amorphous gray blob on the back there?
Oh that's proof of what it is. Fuzzy pictures ftl.

I love all the fanaticism coming out of the woodwork again. People saying the 6990 is dead and Nvidia has won. Jeez.

Won't people ever learn?

Given that the GTX 580 is a vapor-chamber-cooled part that can still reach 90+ degrees and, without power throttling, guzzle 300+ watts (links below), do you really think they can get two full GF110 cores on one card? Just like the GTX 295 wasn't 2x 285s (it was 2x 275s), and just like the 5970 is 2x 5870s downclocked to 5850 speeds, this would very likely not be two fully operational 580 cores.
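A rough back-of-envelope, taking that 300+ W single-card peak at face value, against the 375 W in-spec ceiling for a card fed by two 8-pin connectors plus the slot:

\[ 2 \times 300\,\mathrm{W} = 600\,\mathrm{W} \gg 2 \times 150\,\mathrm{W} + 75\,\mathrm{W} = 375\,\mathrm{W} \]

So cut-down or downclocked cores look unavoidable.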
Likewise, the 6970 rumours suggest a high-power guzzler, as it is a huge die (which is why it should come close to the 580). Given the leaks pointing to a Cayman XT-based Antilles card, the 6990 will probably be a monster as well.
Both cards will surely be made in very limited numbers, as they'll both be ultra-high-end parts.

Despite how good they might be, I wouldn't buy one simply because it's dual-GPU; having had a GTX 295 and 2x 5850s, I want a single card now. A GTX 580 Super Overclock from Gigabyte would suffice :)

GTX 580 power draw:
www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/25.html
GTX 580 temp:
www.hexus.net/content/item.php?item=27307&page=15
#56
Ferrum Master
LAN_deRf_HA: Wonder if nvidia asked asus for advice after seeing their dual-480 board. Looks very similar, even the same off-center mounting holes. Actually, are we sure this isn't a new Mars card? I mean, has nvidia said anything about it?

www.techpowerup.com/img/10-11-19/147a.jpg
www.techpowerup.com/img/10-07-16/mars_ii_1.jpg
Actually, they look completely different: power supply solution, mounting holes, power connectors (power draw), etc. It is a completely different device.
#57
Unregistered
I don't want to sound like an a$$, but if a GTX 580 consumes 10% more than a 5970, imagine how much this beast will eat for breakfast!!! :eek::eek::eek:
#58
SabreWulf69
I hope it consumes 5000W/s. Power to the people. lol
#59
LAN_deRf_HA
Ferrum Master: Actually, they look completely different: power supply solution, mounting holes, power connectors (power draw), etc. It is a completely different device.
It doesn't look different enough, given that the asus pic was a one-off proof-of-concept prototype with different cores. I wouldn't rule out it being a new Mars card, as it may simply be a refinement and evolution of that concept.
#60
Unregistered
The power problem would be solved, theoretically, if they reduced the manufacturing process down to 22 nm or 18 nm, or even lower. With current technology it is not physically possible to increase the performance of a GPU without increasing its power consumption. The smaller the transistors, the less power they eat, but the hotter they get... I know, it's a vicious circle...
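The textbook dynamic-power relation behind this (a simplification, since leakage grows at smaller nodes):

\[ P_{\mathrm{dynamic}} \approx \alpha\, C\, V^{2} f \]

where \(\alpha\) is the activity factor, \(C\) the switched capacitance, \(V\) the supply voltage and \(f\) the clock frequency; a process shrink cuts \(C\) and permits a lower \(V\), which is why it is the main lever for performance per watt.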
#61
[I.R.A]_FBi
SabreWulf69: I hope it consumes 5000W/s. Power to the people. lol
actually ... power from teh ppl
#62
micropage7
It reminds me of CPU development: it started with a single core, then they pushed the clock, and when the maximum clock was reached they switched to multi-core. It looks like GPUs are going the same way: first single, then dual, then multiple cores on one package.
Let's see..
#63
Unregistered
1nf3rn0x: Wow, that seems great. This will be war of the worlds. 5990 vs. GTX 595
You mean 6990 vs. GTX 595... :rolleyes:
#64
Ferrum Master
LAN_deRf_HA: It doesn't look different enough, given that the asus pic was a one-off proof-of-concept prototype with different cores. I wouldn't rule out it being a new Mars card, as it may simply be a refinement and evolution of that concept.
Then we could match the 9800 GX2 with the GTX 295... The power circuit uses different ICs with a lower phase count. Also, nvidia's board has only DVI port connections... It isn't an evolution; it is just a cheaper design using different components. It could be that the Mars card has more layers, too.
The layout rules are the same for every card concerning wire length for the memory bus, power lines, etc., so the components are placed in more or less the same spots.

The power consumption cannot be more than what we can attach to it: 2x 150 W from the 8-pin connectors plus 75 W from the PCIe slot, i.e. 375 W.
#65
Over_Lord
News Editor
CHAOS_KILLA: It bother's me :wtf: "plants 5,000 tree's" aahh I feel better, now I can play Crysis 3 with my 2000W GTX 795 :toast::laugh:
Thank you; I thought everyone here (barring some) was an insolent and insensitive fool.
#66
SabreWulf69
Meh, meh and meh. This ain't about carbon-neutral foolery, it's about an awesome graphics card. We pay our power bill, and we WILL use it the way we see fit, and what a hell of a way to use a load of it. What would be awesome is a dual-GPU card with both GPUs on the same die, like CPUs (pardon me if they exist already).
#67
H82LUZ73
So, in theory, this would be about 5 fps faster than a GTX 580 SLI setup, right?
#68
a_ump
I'll believe this when it's released, though I'm definitely getting the feeling it's going to have a weaker core config than the GTX 580, perhaps two 570 chips, which would be cool because it'd show us what the 570's hardware specs are. I feel the 570 is going to be a hell of a great deal, kinda like the 5850 was.

As for Nvidia taking back both the single- and dual-GPU crowns, I highly doubt it. I'm not going to be surprised if AMD purposely helped along the leaked rumors saying their 6970 has 1500-some shaders. Sorta giving them the benefit of the doubt, or at least crediting that they learned from Nvidia's mistake of dilly-dallying.

Watch it release with 1920 shaders or something and a 900 MHz core clock. That'd be tight, and of course ROPs and whatnot would get an increase. It's been said that the 6970/50 chips are a little off course from AMD's usual small-but-extremely-efficient-per-mm2 approach, so yeah, with the bumps in specs they spoke of, this chip would still be smaller than the 580.
#69
mdsx1950
Finally, an nVidia card worth being in my rig! :rockout:
#70
Eva01Master
I hope, if this monster is ever produced, they call it GTX 580 X2 / GTX 570 X2 / GTX 560 X2, so it's very clear to the average gamer with deep pockets which kind of GPU core is powering it.
#71
Unregistered
H82LUZ73: So, in theory, this would be about 5 fps faster than a GTX 580 SLI setup, right?
NOT A CHANCE!!!!! :rockout::laugh::shadedshu
Maybe 10% slower is my guess....
And if you ask me, there will be two 580 GPUs at 460 (560??) frequencies.
Eva01Master: I hope, if this monster is ever produced, they call it GTX 580 X2 / GTX 570 X2 / GTX 560 X2, so it's very clear to the average gamer with deep pockets which kind of GPU core is powering it.
X2 is (was...) an ATI trademark, bro!!! :shadedshu
#73
xtremesv
The sleeping lion has woken up.

AMD, run!!!
#74
lism
micropage7: It reminds me of CPU development: it started with a single core, then they pushed the clock, and when the maximum clock was reached they switched to multi-core. It looks like GPUs are going the same way: first single, then dual, then multiple cores on one package. Let's see..
But that's not true for GPUs.

AMD/Ati has chosen to develop really tight GPUs with lower power consumption and lower development costs. Nvidia makes GPUs that are larger and tougher to manufacture, and it keeps having problems getting good wafers of chips out of TSMC lol.

I'd prefer a chip with less power consumption that's as efficient as possible over a chip that generates more than 400 watts of heat in games.

Anyway, this is more of an answer to AMD's upcoming 6990.... Nvidia knows it's going to get its ass kicked.
#75
TheMailMan78
Big Member
I love reading all the comments in a thread like this....

"Oh this is going to destroy the 5970!" and "AMD better run" :laugh:

I sure as hell hope a duel GPU offering thats a YEAR newer can beat the 5970. Also if the 580 is any indicator of where Nvidia is going then you better get ready to have your bubble burst.

Personally I am waitting this gen. out unless a 6970 is faster then two 5870s Ill call fail. The 580 was let down enough.