Friday, November 19th 2010

NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership

NVIDIA stunned the computing world with a speedy launch of the GeForce GTX 580. The GPU extended NVIDIA's single-GPU performance leadership, and also ironed out some serious issues with the power-draw and thermal characteristics of the previous-generation GeForce GTX 480. A dual-GPU implementation of the GF110 graphics processor, on which the GTX 580 is based, now looks inevitable. NVIDIA seems to be ready with a prototype of such a dual-GPU accelerator, which the Chinese media is referring to as the "GTX 595".

The reference-design PCB of the dual-GF110 accelerator (which still needs some components fitted) reveals quite a lot about the card taking shape. First, it's a single-PCB card: both GPU systems are located on the same PCB. Second, there are slots for three DVI output connectors, indicating that the card will be 3D Vision Surround ready on its own. You just have to get one of these, plug in three displays over standard DVI, and you're ready with a large display head spanning three physical displays.

Third, it could feature a total of 3 GB of video memory (1.5 GB per GPU system). Each GPU system has six memory chips on the obverse side of the PCB; at this point we can't comment on the memory bus width of each GPU. The core configuration of the GPUs is also unknown. Fourth, power is drawn in from two 8-pin PCI-E power connectors. The card is 2-way SLI capable with another of its kind.

Source: enet.com.cn
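The two 8-pin connectors put a hard ceiling on in-spec board power. A quick sketch of the arithmetic (assuming the usual PCI-E spec limits of 75 W from the slot, 75 W per 6-pin, and 150 W per 8-pin connector):

```python
SLOT_W = 75        # PCI-E x16 slot limit per spec
SIX_PIN_W = 75     # 6-pin PEG connector
EIGHT_PIN_W = 150  # 8-pin PEG connector

def board_power_budget(n_eight_pin: int, n_six_pin: int = 0) -> int:
    """Maximum in-spec board power in watts for a given connector loadout."""
    return SLOT_W + n_eight_pin * EIGHT_PIN_W + n_six_pin * SIX_PIN_W

# Two 8-pin connectors, as on the pictured PCB:
print(board_power_budget(2))  # 375
```

That 375 W figure is why several commenters below expect the GPUs to be downclocked relative to a single GTX 580.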

153 Comments on NVIDIA Readying Dual-GF110 Graphics Accelerator, Eying Total Performance Leadership

#1
Fourstaff
That PCB looks a bit crowded to me. Any chance it's going to be GF104 rather than GF114/GF110?
Posted on Reply
#2
newtekie1
Semi-Retired Folder
Wow, looks beastly.

Waits for someone to ask if it comes with its own nuclear power plant to power it.
Posted on Reply
#3
Batou1986
Good news, I've almost finished converting my computer to nuclear power :shadedshu

Happy now newtekie
Posted on Reply
#4
mechtech
Looks really expensive lol
Posted on Reply
#5
yogurt_21
Looks sexy, but it's a low-res image so I can't tell if any editing has been done.

If they are truly launching a dual 580 it will be epic, expensive, and power-hungry.

edit: also, there must be 12 more RAM chips on the back if this is truly supposed to be a dual 580.
Posted on Reply
#6
(FIH) The Don
and now people are gonna whine about power usage

WHY!!!!!!!!!!!!!!!!!!

YOU DO NOT BUY HIGHEND CARDS TO SAVE POWER YOU FREAKIN IDIOT !!! :laugh::roll::wtf::shadedshu
Posted on Reply
#7
yogurt_21
by: (FIH) The Don
and now people are gonna whine about power usage

WHY!!!!!!!!!!!!!!!!!!

YOU DO NOT BUY HIGHEND CARDS TO SAVE POWER YOU FREAKIN IDIOT !!! :laugh::roll::wtf::shadedshu
well if a 1kw psu doesn't run it I'm going to whine. lol
Posted on Reply
#8
KainXS
So these HAVE to have the power limiter chips, and I wonder what's gonna happen if you turn it off

can't wait for it.

now when they make a new power guzzler nobody will be able to point the finger at the HD4870X2 lol
Posted on Reply
#9
JrRacinFan
Served 5k and counting ...
Time for a 1.1Kw power supply and factory watercooling block. ;)
Posted on Reply
#10
the54thvoid
For the memory - it must have 3 GB. 1 GB per GPU (2 GB total) would suffer the same problems at 2560 that the AMD cards have with high AA and quality settings - too much texture data for 1 GB per GPU to handle. So, surely, following 1.5 GB for a GTX 580 (and 480), it would need 3 GB; otherwise I wouldn't touch it with a bargepole.

Yeah, the power issue is irrelevant. The 6990 will be just as bad.

I don't think either card will be very good thermally or acoustically. But you never know...
Posted on Reply
#11
KainXS
by: the54thvoid
For the memory - it must have 3 GB. 1 GB per GPU (2 GB total) would suffer the same problems at 2560 that the AMD cards have with high AA and quality settings - too much texture data for 1 GB per GPU to handle. So, surely, following 1.5 GB for a GTX 580 (and 480), it would need 3 GB; otherwise I wouldn't touch it with a bargepole.

Yeah, the power issue is irrelevant. The 6990 will be just as bad.

I don't think either card will be very good thermally or acoustically. But you never know...
When the 6970 comes out we'll have an idea of how bad the 6990 will be, but right now nobody knows, so you can't say the 6990 will be just as bad.

I think what they can do is set up the power circuitry in a loop, so when one GPU downclocks the other one upclocks, and that might help.

Ima guess 370 watts with the limiter, at least 400 without.
Posted on Reply
#12
newtekie1
Semi-Retired Folder
by: Batou1986
Good news, I've almost finished converting my computer to nuclear power :shadedshu

Happy now newtekie
Yes I am.:D
Posted on Reply
#15
Lionheart
I hope it is 2 GTX 580s on one board, jizz time, but I'm still gonna wait for AMD's shiny red beasts :)
Posted on Reply
#16
Sasqui
That is SICK! I'm surprised (and I've been saying this all along) that they didn't go with a sandwich design, but it appears they managed to get two of those beastly pieces of silicon onto a single PCB.

The cooling is going to have to be very creative, so is the price. :wtf:
Posted on Reply
#17
ToTTenTranz
I kinda doubt a dual GF110 card will ever see the light of day, at least not outside some kind of very limited edition.

We're talking about a ~550W card here. Even if they figure out how to fit it in regular cases, it'll need a huge cooler (3-slot?).

But a dual GF104 card, with all 384 ALUs enabled in each GPU, would be much more believable.
Posted on Reply
#18
hv43082
So let's see which camp will have their top dog out first.
Posted on Reply
#19
SabreWulf69
Pure PWNAGE, w00t, I'm all up for one if they are indeed 2x 580's :-D
Posted on Reply
#20
Mistral
GF110x2? I'm baffled... Is the third 8-pin power connector on the other side of the PCB? :rolleyes:
Posted on Reply
#21
Tatty_One
Senior Moderator
by: Mistral
GF110x2? I'm baffled... Is the third 8-pin power connector on the other side of the PCB? :rolleyes:
Don't be baffled, it will be downclocked therefore less voltage/draw and 375W should be enuff (hopefully).
Posted on Reply
#22
Kreij
Senior Monkey Moderator
by: KainXS
now when they make a new power guzzler nobody will be able to point the finger at the HD4870X2 lol
If this thing uses more power than my 4870x2 I will be really saddened.
Having the largest carbon footprint is the only claim to fame my poor card has left. :cry:

Anyway ... I hope this thing rocks. The GPU war never gets old in my book.
Posted on Reply
#23
qamulek
power limit?

A 580 is currently power limited because it can spike above the total available power of 150+75+75 = 300 W max. In TPU's article here, the 580 power-limits itself after spiking up, then settles to a max of 200 W (question: why 200 and not something more like 250 or 290? Probably due to the limits of the electronics used on the board...). One question I didn't see answered: did performance increase proportionately to the increase in power used once the power limit was taken away?

In any case, the point is a 580 really needs 8-pin + 8-pin power to keep itself from being power limited by the cables (let alone the electronics used on the board, as well as possible problems with load balancing), so how is a dual GPU going to fare better if a single GPU is already power limited? I guess I will just have to wait till someone who actually knows what they're talking about gives it a go, or just wait till the card comes out and reviews are posted.

ah! My guess: the GTX 580 is so powerful that it has to be power limited when used to its fullest, but most applications will be limited by the weakest link in the GPU before using the full power of the GTX 580 (example: cut the ROPs in half and suddenly the card is limited by the ROP count). The dual-GPU card will still need to be power limited, but it won't matter, as most normal applications will hit the weakest link in the GPU before reaching the power limit.
Posted on Reply
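The throttling behavior qamulek describes — full clocks until draw exceeds a cap, then a forced slowdown — can be sketched as a toy control rule. The 300 W cap and the linear scaling below are illustrative assumptions, not NVIDIA's actual limiter logic; 772 MHz is the GTX 580's stock core clock:

```python
def limited_clock(draw_w: float, limit_w: float, base_mhz: float) -> float:
    """Toy power limiter: leave the clock alone while draw is under the cap,
    scale it down proportionally once draw exceeds the cap (hypothetical rule)."""
    if draw_w <= limit_w:
        return base_mhz
    return base_mhz * (limit_w / draw_w)

print(limited_clock(250.0, 300.0, 772.0))  # under the cap: full 772 MHz
print(limited_clock(350.0, 300.0, 772.0))  # over the cap: clock drops
```

This matches qamulek's point: if typical workloads never push draw past the cap, the limiter sits idle and the dual-GPU card behaves as if unlimited.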
#24
yogurt_21
by: Tatty_One
Don't be baffled, it will be downclocked therefore less voltage/draw and 375W should be enuff (hopefully).
Shoot, at 375 W my PSU could handle 2 of 'em with a decent i7... wouldn't be able to clock it at all though.

Even downclocked, a dual GF110 would be amazing. Now I'm starting to wonder if the 6990 is going to dominate the way the 5970 has; the 5970 didn't have any dual-card competition.
Posted on Reply
#25
LAN_deRf_HA
Wonder if NVIDIA asked ASUS for advice after seeing their dual-480 board. Looks very similar, even the same off-center mounting holes. Actually, are we sure this isn't a new Mars card? I mean, has NVIDIA said anything about it?


Posted on Reply