Friday, August 23rd 2013

NVIDIA Working on GK110-based Dual-GPU Graphics Card?

GeForce GTX 295 showed that it's possible to place two GPUs with ludicrously high pin-counts next to each other on a single PCB and, if you get a handle on their thermals, even deploy a 2-slot cooling solution. NVIDIA might be motivated to create such a dual-GPU graphics card based on its top-end GK110 chip to counter AMD's upcoming "Volcanic Islands" GPU family, or so claims a VideoCardz report, citing sources.

The chips on the card needn't be configured, or even clocked, like a GTX Titan; the GTX 780, for example, features just 2,304 of the chip's 2,880 CUDA cores. Speaking of 2,880 CUDA cores, the prospect of NVIDIA developing a single-GPU GeForce product with all streaming multiprocessors on the GK110 enabled, the so-called "Titan Ultra," isn't dead. NVIDIA could turn its attention to such a card if it finds AMD's R9 2xxx within its grasp.
Source: VideoCardz

43 Comments on NVIDIA Working on GK110-based Dual-GPU Graphics Card?

#1
BarbaricSoul
by: Prima.Vera
I want one for $300. :D
me too :toast:
#2
james888
by: Prima.Vera
I want one for $300. :D
Now now, you have to be reasonable. It should be no more than $550. :p
#3
radrok
by: newtekie1
Because it was never designed, and there really isn't a need, to go over 1.2v. That is certainly not "stock" you're talking about here, so the statement that it is barely enough for stock is absurd. In fact, it will handle the maximum voltage nVidia makes available to you by default without a problem; the PWM only becomes an issue when you push beyond 1.2v. And even at 1.2v the Titans I had both did 1,040MHz, and that is good enough in my books. A Titan at 1,040MHz is insane as it is; the tiny amount more you'd get by going higher is pointless.
My point is that basically the only people who are going to drop $1k on a GPU are those who tweak their cards.

I kinda imagine the % of people buying the Titan and leaving it at stock is tiny because, let's face it, you gotta be really into it to buy one or two, like me.

That said, it wouldn't have cost Nvidia that much to deliver a proper VRM.

So yes, I'm ranting cause Nvidia always cheaps out on PCB components ;)
#4
Easy Rhino
Linux Advocate
great card to play your console ports!
#5
Am*
by: radrok
Do not want.

Would probably be retardedly crippled by a "barely enough" PWM section, like my Titans.

There's no fun in having a powerhouse if you can't tweak it, especially because these cards are rarely purchased by people who do not tweak.
THIS.

If they can release a Titan Ultra that doesn't throttle with all 2,880 cores and clock it at a full gigahertz, it might just tide Nvidia over until whenever Maxwell is due. This, however, means the VRMs on the board must support at least 300-350 W of power to not throttle the chip. Nvidia themselves proved that their cards were throttled when they released their GTX 770s, which consume more power than the 680s they originally replaced.
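For a rough sense of what that 300-350 W figure would mean for the VRM, here's a back-of-the-envelope sketch. The core voltage and phase count below are illustrative assumptions, not confirmed specs (reference Titan boards are commonly described as a 6+2 phase design):

```python
# Rough per-phase load estimate for the "300-350 W" claim above.
board_power_w = 350   # upper end of the figure quoted in the thread
core_voltage = 1.2    # assumed GK110 load voltage, not a confirmed spec
phases = 6            # assumed core VRM phase count, not a confirmed spec

total_current = board_power_w / core_voltage
per_phase = total_current / phases
print(f"{total_current:.0f} A total, ~{per_phase:.0f} A per phase")
# prints: 292 A total, ~49 A per phase
```

Nearly 50 A per phase is at the upper end of what typical power stages of that era handled, which is why the comment argues a beefier VRM would be needed to hold full clocks.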

I honestly hope they don't waste their time with this dual-GPU card bullshit, as it would be ironic for them to release it, especially after they spent months "educating" everyone about runt frames and how much having 2x CrossFire/SLI GPUs increases latency, ever since they released the Titan. I don't even think anyone here would give two shits about a dual Titan card for any reason, not even to drool over, since the Titans already throttle on their own without sharing a single PCB. Even if I had $5K burning a hole in my pocket, I wouldn't touch any dual-GPU card with a barge pole after seeing how it fucks up frame timings in most games.
#6
MxPhenom 216
Corsair Fanboy
by: radrok
So how come there's like 0.04v-0.06v of vdroop when playing with voltages over 1.25v?

I've seen it drop as low as 1.261v when setting it to 1.32v.

That's not a good VRM. Good for me was the reference Volterra on my two 6990s.
Dude, vdroop is going to happen regardless of what VRMs you have. Maybe some more than others, but I know erocker's 7970s have quite a bit of vdroop as well, and AMD is known to typically use a better power delivery system.
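As a quick sanity check, the droop radrok describes can be worked out from his own numbers (a minimal sketch; the figures are the ones quoted in the thread, not independent measurements):

```python
# Sanity-check the vdroop figures quoted above.
set_v = 1.32      # voltage requested in software
loaded_v = 1.261  # lowest voltage observed under load

droop = set_v - loaded_v
print(f"droop = {droop:.3f} V ({droop / set_v * 100:.1f}% of the set point)")
# prints: droop = 0.059 V (4.5% of the set point)
```

So the reported reading sits right at the top of the 0.04-0.06 V range he mentions, a few percent of the set point, which is why opinions differ on whether it indicates a weak VRM or just normal load-line behavior.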
#7
radrok
by: MxPhenom 216
Dude, vdroop is going to happen regardless of what VRMs you have. Maybe some more than others, but I know erocker's 7970s have quite a bit of vdroop as well, and AMD is known to typically use a better power delivery system.
The GTX 580 Lightning I had didn't have noticeable vdroop as far as I can remember...

If you are talking about reference, well, the Volterras on my 6990s didn't have noticeable vdroop either.

I get what you are saying though.
#8
newtekie1
Semi-Retired Folder
by: radrok
My point is that basically the only people who are going to drop $1k on a GPU are those who tweak their cards.

I kinda imagine the % of people buying the Titan and leaving it at stock is tiny because, let's face it, you gotta be really into it to buy one or two, like me.

That said, it wouldn't have cost Nvidia that much to deliver a proper VRM.

So yes, I'm ranting cause Nvidia always cheaps out on PCB components ;)
Actually, I would argue that the people stupid enough to drop $1k+ on a GPU are also stupid enough not to bother overclocking it, or to figure out how to overclock it.

But my point is that even the ones that do want to tweak it can do so without a problem; the PWM is good enough for decent overclocks.
#9
radrok
by: newtekie1
Actually, I would argue that the people stupid enough to drop $1k+ on a GPU are also stupid enough not to bother overclocking it, or to figure out how to overclock it.

But my point is that even the ones that do want to tweak it can do so without a problem; the PWM is good enough for decent overclocks.
Watch out, you basically insulted a good part of this community ;)

Anyway, to answer in your tone: you'll eventually realize someday that money isn't directly proportional to stupidity.

That goes well with the fact that the people who make good money have actually used their brains in the right way. :toast:

I'm done, since you had to bolster your point with an insult :cool:
#10
newtekie1
Semi-Retired Folder
by: radrok
Watch out, you basically insulted a good part of this community ;)

Anyway, to answer in your tone: you'll eventually realize someday that money isn't directly proportional to stupidity.

That goes well with the fact that the people who make good money have actually used their brains in the right way. :toast:

I'm done, since you had to bolster your point with an insult :cool:
A good part of this community would likely agree with me, actually.

No insult intended at all; the plain fact is that Titans are completely overpriced and only an idiot would actually pay the outrageous premium for them. W1z has even said so himself.

And there are plenty of people out there with more money than brains. Just because you have money doesn't mean you're smart. ;)
#11
adulaamin
by: newtekie1
A good part of this community would likely agree with me, actually.

No insult intended at all; the plain fact is that Titans are completely overpriced and only an idiot would actually pay the outrageous premium for them. W1z has even said so himself.

And there are plenty of people out there with more money than brains. Just because you have money doesn't mean you're smart. ;)
Titans are overpriced, but I wouldn't go as far as calling Titan owners idiots. They have the extra cash. If I did, I would've bought one too. :)
#12
MxPhenom 216
Corsair Fanboy
by: radrok
Watch out, you basically insulted a good part of this community ;)

Anyway to answer in your tone, you'll eventually realize someday that money isn't directly proportional to stupidity.

Goes well with the fact that those people who make good money actually have used their brain in the right way. :toast:

I'm done since you had to empower your point with an insult :cool:
Here, do this to fix your vdroop:

http://www.overclock.net/t/1421221/gtx780-titan-any-ncp4206-card-vdroop-fix-solid-1-325v
#17
MxPhenom 216
Corsair Fanboy
by: erocker
It looks like a piece of solder drops directly onto that LED and turns it on! Neat gif!
It doesn't turn off, the .gif just restarts. The machine isn't on at the beginning of the gif, then it gets turned on, and the card goes up in smoke.
#18
BiggieShady
by: MxPhenom 216
It doesn't turn off, the .gif just restarts. The machine isn't on at the beginning of the gif, then it gets turned on, and the card goes up in smoke.
Not quite ... but close enough

[YT]sRo-1VFMcbc[/YT]