
NVIDIA Working on GK110-based Dual-GPU Graphics Card?

btarunr

Editor & Senior Moderator
GeForce GTX 295 showed that it's possible to place two GPUs with ludicrously high pin-counts next to each other on a single PCB and, if you get a handle on their thermals, even deploy a 2-slot cooling solution. NVIDIA might be motivated to create such a dual-GPU graphics card based on its top-end GK110 chip to counter AMD's upcoming "Volcanic Islands" GPU family, or so claims a VideoCardz report, citing sources.

The chips on the card needn't be configured, or even clocked, like a GTX Titan. The GTX 780 features just 2,304 of the chip's 2,880 CUDA cores, for example. Speaking of 2,880 CUDA cores, the prospect of NVIDIA developing a single-GPU GeForce product with all streaming multiprocessors on the GK110 enabled, the so-called "Titan Ultra," isn't dead. NVIDIA could turn its attention to such a card if it finds AMD's R9 2xxx within its grasp.
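Those core counts follow directly from how GK110 is cut down: the full chip has 15 SMX units of 192 CUDA cores each, and the various products simply leave different numbers of SMX units enabled. A minimal sketch of that arithmetic (the product list is illustrative, and the "Titan Ultra" entry is the rumoured full-fat part, not a confirmed SKU):

```python
# GK110: 15 SMX units x 192 CUDA cores = 2,880 cores on the full die.
# Retail parts differ mainly in how many SMX units are left enabled.
CORES_PER_SMX = 192
TOTAL_SMX = 15

configs = {
    "GTX 780": 12,                   # 12 SMX -> 2,304 cores
    "GTX Titan": 14,                 # 14 SMX -> 2,688 cores
    "'Titan Ultra' (rumoured)": 15,  # full GK110 -> 2,880 cores
}

for name, smx in configs.items():
    cores = smx * CORES_PER_SMX
    print(f"{name}: {smx}/{TOTAL_SMX} SMX enabled, {cores:,} CUDA cores")
```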



 
If they make a Titan Ultra to counter any AMD flagship single-GPU card coming later this year, wouldn't that need to be near the same price point as the AMD card? Meaning that would drop the original Titan and the 780 prices quite a bit. This is assuming the AMD card is very much competitive with the current 780 and Titan, which I'm predicting will perform right in between them.

Now a dual-GPU GK110 card would be crazy! Price would be right around ~$1,300, I would think.
 
If they make a Titan Ultra to counter any AMD flagship single-GPU card coming later this year, wouldn't that need to be near the same price point as the AMD card? Meaning that would drop the original Titan and the 780 prices quite a bit. This is assuming the AMD card is very much competitive with the current 780 and Titan, which I'm predicting will perform right in between them.

Now a dual-GPU GK110 card would be crazy! Price would be right around ~$1,300, I would think.

Indeed, I'm waiting for AMD to release their next-gen cards before I upgrade. $1,300 would be pretty good, I think - $1,300 over here would buy you a Titan from a lower-end company, haha. TITAN x2, or whatever it is going to be called, will probably be around $2k here at least. Damn... Very intense performance potential, though.
 
If they make a Titan Ultra to counter any AMD flagship single-GPU card coming later this year, wouldn't that need to be near the same price point as the AMD card? Meaning that would drop the original Titan and the 780 prices quite a bit. This is assuming the AMD card is very much competitive with the current 780 and Titan, which I'm predicting will perform right in between them.
A lot would depend on how AMD's Hawaii shapes up and where it falls in performance against the GK110 - which itself has a lot of untapped potential, as demonstrated by the overclocking headroom afforded by the GTX 780 Classified's higher voltage limit (for example). It's also not beyond the realms of possibility that Nvidia would marry the GK110 with the 7 GHz memory found on the 770/760 for a 780 Ti, or whatever naming convention is currently in vogue.
The Article said:
GeForce GTX 290 showed that it's possible to place two GPUs with ludicrously high pin-counts next to each other on a single PCB
That would be the GTX 295.
 
One of these days they'll finally move on from Kepler and develop something actually "new". :rolleyes:

Not that AMD is doing any different. Year after year of rehashes for both. Still have very minimal reason to upgrade from my 460s.
 
One of these days they'll finally move on from Kepler and develop something actually "new". :rolleyes:

Not that AMD is doing any different. Year after year of rehashes for both. Still have very minimal reason to upgrade from my 460s.

Yeah, cause GCN isn't new or anything :rolleyes:
 
One of these days they'll finally move on from Kepler and develop something actually "new". :rolleyes:

Not that AMD is doing any different. Year after year of rehashes for both. Still have very minimal reason to upgrade from my 460s.


While I wouldn't go as far as saying the 460 has no valid upgrade (I'm pretty sure the 760 at $250 or the 7950 at $200 are valid choices there), I think it's sad that AMD sat on the 7000 series all year and that nVidia did a simple rename of the old lines. I can still remember the (good) old days when nVidia would pop out a new card at the drop of a hat: when the GeForce 2 came out months after the original GeForce, or the GeForce FX 5900 came out months after the FX 5800 series, or the ATI 9800 Pro soon after the 9700 Pro.

Those were great days in some ways. Of course, the driver situation was abhorrent, especially on the ATI side, and getting games to work properly was a lot more hit-and-miss. Plus, SLI (where it existed) was incredibly iffy.

Still, those were fun times. Now we have cards sitting on the same tech with little more than small updates for years. Years. It's not hard to imagine a future where nVidia doesn't even bother to release a rename/rebranding in a given year, and I doubt AMD would even feel compelled to release something new at the end of the year like they do right now.

I suspect AMD is going to try and ride the wave of gaming interest from new consoles with their new hardware. If that's the case, then expect cards to hit the $299, $399 and/or $499 price points as "counterpoints" to the upcoming next gen consoles. They'll make a big, big push to argue they're in all the consoles and they're in all the important ports (except Watch Dogs or AC4) from next gen's launch. It's a fair argument. I'm sure they'll have a great bundle of games for their coupon system they've constructed by November, too.

That's assuming the rumors are accurate and not just smoke 'n' mirrors to unnerve nVidia and customers who might go for the 7xx series. I find it hard to imagine AMD releasing a $550 card when a $400 console is being released and called "next gen." It seems like that'd be the price point they'd target aggressively. "Why spend that $400 on a PS4 when you can buy a Radeon with 6GB and Titan-like performance? Plus, you get access to three new games we've just lined up in our bundle. Including Battlefield 4."

That'd be a compelling argument and it would light a fire under nVidia's butt.
 
All of you guys are missing something very big; it's almost unbelievable how shortsighted people can be.

Die shrinks in the past were mostly painless, whereas nowadays not only do you have to develop new architectures, you also have to take into account all the intricacies of new nodes - I hope you haven't forgotten how painful it has recently been for both NVIDIA and AMD to move to new nodes.

Then maybe you'll recall that modern GPUs are extremely complex beasts containing up to 6 billion transistors. How many of you could design such things? How many of you even have working knowledge of the high-level APIs used to access them? Direct3D, OpenCL, CUDA, OpenGL, etc., etc.

Modern GPUs have almost surpassed the x86/x86-64 architecture in terms of complexity, and have definitely surpassed it in terms of transistor count.

Then what about TDP? Do you really think NVIDIA and AMD want to create beasts consuming over 500 W of electricity? Yeah, there'll be a market for them - they'll probably move 10K units and go bankrupt.

Still, you'll be screaming, "give us moar!" (sic)
 
NVIDIA could turn its attention to such a card if it finds AMD's R9 2xxx within its grasp.

That would really be sad. This is an architecture that is over a year old at this point; it was so competitive that they just sat on the higher-end GPU for six months before releasing it to the public. If GK110 is competitive with AMD's next generation, that means nVidia might stick with GK110 (but hopefully at lower prices) for another six-plus months. They've had a year and a half to just sit and perfect their next-gen GPU... that thing had better be a beast, or nVidia is doing something wrong.
 
Do not want.

Would probably be retardedly crippled by a "barely enough" PWM section, like my Titans.

There's no fun in having a powerhouse if you can't tweak it, especially because these cards are rarely purchased by people who do not tweak.
 
Nothing will be like the GTX 690; all dual-GPUs from nvidia fail.
 
Do not want.

Would probably be retardedly crippled by a "barely enough" PWM section, like my Titans.

There's no fun in having a powerhouse if you can't tweak it, especially because these cards are rarely purchased by people who do not tweak.

Barely enough PWM? The Titan has a pretty beefy 6-phase PWM setup. The Titan consumes about 50 W less than a GTX 570, and those made do with a 4-phase PWM setup. There is plenty of room to tweak in the Titan design, but nVidia is limiting things in other ways.
 
Barely enough PWM? The Titan has a pretty beefy 6-phase PWM setup. The Titan consumes about 50 W less than a GTX 570, and those made do with a 4-phase PWM setup. There is plenty of room to tweak in the Titan design, but nVidia is limiting things in other ways.

It's crappy and it has a lot of vdroop and it definitely is not up to the challenge of overclocking.

So it's barely enough to get the card through stock ;)
 
The original article also mentioned that Nvidia is planning to release a whole new series of mid- and low-end cards under the GTX 700/Ti moniker. It also listed Maxwell as coming in early 2014. Both tidbits are every bit as interesting as the GTX 790 and Titan Ultra rumors.
http://videocardz.com/45403/nvidia-to-launch-more-cards-this-year-maxwell-in-q1-2014

I think it's interesting that it is taking two generations for AMD to catch up to the performance of the GK110 chip. As most are aware, this chip was supposed to be the GTX 680, but GK104 was cheap and competitive enough to keep GK110 on the shelf. All the same, the chips in the current crop of GTX 780s have a manufacturing date of summer 2012, so they could have come out at any time over the past year.
 
Do want GM110 :Q____

Imagine the leap over GK110...
 
It's crappy and it has a lot of vdroop and it definitely is not up to the challenge of overclocking.

So it's barely enough to get the card through stock ;)

The two Titans I've held in my hands both did over a 1 GHz clock; the PWM seems to be more than enough for overclocking, and nowhere near "barely enough for stock".
 
The two Titans I've held in my hands both did over a 1 GHz clock; the PWM seems to be more than enough for overclocking, and nowhere near "barely enough for stock".

Yeah, well enough for stock clocks. I think Nvidia knows well enough what type of power delivery system they need on their cards to operate out of the box. If you want more out of the card than reference, get a vendor's non-reference card.
 
The Titan's VRM looks like a joke compared to what a reference 7970 comes with. The Titan has a stock voltage of 1.125 V, the caps are flat, and the coils are of questionable origin, probably rated at 40 A, whereas the 7970 uses 6 or 7 60 A Coiltronics coils, allowing a power draw of 360+ A and over 500 W total power output. The Titan has 240 A and a maximum of 288 W power output.
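For what it's worth, here is the arithmetic behind those totals (a minimal sketch; the per-phase ratings are the poster's own estimates, not confirmed specs, and the core voltages below are simply the values that reproduce the quoted figures):

```python
def vrm_capacity(phases: int, amps_per_phase: float, vcore: float):
    """Rough VRM headroom: total current is phases * per-phase rating;
    deliverable power is that current times the core voltage."""
    total_amps = phases * amps_per_phase
    return total_amps, total_amps * vcore

# Figures quoted in the post above (estimates, not official specs).
titan_amps, titan_watts = vrm_capacity(phases=6, amps_per_phase=40, vcore=1.2)
hd7970_amps, hd7970_watts = vrm_capacity(phases=6, amps_per_phase=60, vcore=1.4)

print(f"Titan:   {titan_amps:.0f} A, ~{titan_watts:.0f} W")    # 240 A, ~288 W
print(f"HD 7970: {hd7970_amps:.0f} A, ~{hd7970_watts:.0f} W")  # 360 A, ~504 W
```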
 
To those hoping to see Maxwell in Q1:

Absolutely no way in hell will that happen unless it's in minute volumes / paper launch. They have major problems with it. I'd be amazed if it hits the channel in competitive volume before Q3.

That's if it even makes it ... they've cancelled a hell of a lot of stuff in the last 18 months.

P.S. If that Videocardz article is right about first Maxwell cards being 28nm, it'll be hugely watered down. Maxwell was always intended to be 20nm. I simply can't imagine this is true. Everything on the grapevine says yields will be horrific for Maxwell ... but if 28nm is true it probably means it's totally unviable in its 20nm form - rejigging it for 28nm would probably mean a complete redesign, which would be really expensive and unlikely to be very competitive.
 
nV are smart enough to have left performance on the table with the GK110. Sure, it is a year old already, but as the K6000 has shown, the 28nm process has matured enough to produce a fully fledged GK110 at good clock speeds with perfectly reasonable power consumption.

It should come as no surprise to anyone they have options up their sleeve to rain on AMD's impending parade. Equally it's unlikely they have been sitting on their hands the whole time counting their money and waiting for AMD to respond.

Things are finally going to get interesting again rather than yet another discount and an ever expanding bundle of games. :rolleyes:
 
nV are smart enough to have left performance on the table with the GK110. Sure, it is a year old already, but as the K6000 has shown, the 28nm process has matured enough to produce a fully fledged GK110 at good clock speeds with perfectly reasonable power consumption.

It should come as no surprise to anyone they have options up their sleeve to rain on AMD's impending parade. Equally it's unlikely they have been sitting on their hands the whole time counting their money and waiting for AMD to respond.

Things are finally going to get interesting again rather than yet another discount and an ever expanding bundle of games. :rolleyes:

They don't have any options, though. GK110 is way too low-yield to put in cards that aren't expensive. If they could produce it cheaply and at high yields, it would already be in cheaper cards. The new AMD cards are replacing everything from the mid-range up.

Maxwell isn't coming for a long time, and is reportedly suffering even worse yields than ever before ... and I don't believe the VC article about 28nm.
 
To those hoping to see Maxwell in Q1:Absolutely no way in hell will that happen unless it's in minute volumes / paper launch. They have major problems with it. I'd be amazed if it hits the channel in competitive volume before Q3.
That's if it even makes it ... they've cancelled a hell of a lot of stuff in the last 18 months....Everything on the grapevine says yields will be horrific for Maxwell ... but if 28nm is true it probably means it's totally unviable in its 20nm form - rejigging it for 28nm would probably mean a complete redesign, which would be really expensive and unlikely to be very competitive.
Wow. That's a whole lot of supposition with no supporting evidence. Hope you got the information from a more reliable source than your pro-AMD leaker:
From what I hear, the first showing [of Bulldozer] was delayed due to final bugs being ironed out of the chipset / mobos.
Nope. The 900 series chipsets arrived a full five months before Bulldozer.
If anything, I'd call it a good omen. If they're confident enough to not show it at all this close to launch, they must think performance is pretty good. It's still meant to be launching mid-late April.
You were only six months out...so maybe not that good an omen

As for Maxwell, I've yet to see any definitive information regarding yields, performance, or cancellations - with the notable exception of Charlie D, who has a habit of declaring new GPUs (for page views) followed by the imaginary GPUs then being cancelled (also for page views). Old ploy. Most people with any sense don't buy into it. Feel free to share with the group any actual information.

As for yields in general, if AMD are supposedly delivering "Titan-beating performance" then they are almost certainly going to have to do it with the largest die yet used on an ATI/AMD GPU product - it certainly should eclipse the 420 mm² of the R600 - so I wouldn't bank on any miracles in pricing. AMD's initial pricing of the HD 7990 (and HD 7970 before it) and the FX-9590 should be seen as an indicator of what to expect vis-à-vis the given performance.
 
The two Titans I've held in my hands both did over a 1 GHz clock; the PWM seems to be more than enough for overclocking, and nowhere near "barely enough for stock".

So how come there's like 0.04-0.06 V of vdroop when playing with voltages over 1.25 V?

I've seen it drop to as low as 1.261 V when setting it to 1.32 V.

That's not a good VRM. A good one, for me, was the reference Volterra setup on my two 6990s.
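The droop being described works out as follows (a minimal sketch using only the two voltages quoted in the post above):

```python
# Vdroop: the gap between the voltage you set and what the card
# actually holds under load, using the figures quoted above.
v_set = 1.32        # volts requested
v_observed = 1.261  # volts measured under load

droop = v_set - v_observed
print(f"Vdroop: {droop:.3f} V ({droop / v_set:.1%} of the set voltage)")
# -> Vdroop: 0.059 V (4.5% of the set voltage)
```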
 
roadmap

People seem to forget that GPU designs are developed YEARS before release. Both Nvidia and AMD have products in varying stages that won't come out until 2016. Along the way processes get refined and advancements occur, but the bottom line is that these products take a long time to go from drawing board to consumer. It's funny when I read comments that suggest the poster thinks, "Oh, a product came out from company A, now company B will immediately do this to respond." Not how it works.
As far as a dual GK110 goes, I think it would be an incredible performer.
 
So how come there's like 0.04-0.06 V of vdroop when playing with voltages over 1.25 V?

I've seen it drop to as low as 1.261 V when setting it to 1.32 V.

That's not a good VRM. A good one, for me, was the reference Volterra setup on my two 6990s.

Because it was never designed, and there really isn't a need, to go over 1.2 V. That is certainly not "stock" you're talking about here, so the statement that it is barely enough for stock is absurd. In fact, it will handle the maximum voltage nVidia makes available to you by default without a problem; the PWM only becomes an issue when you push beyond 1.2 V. And even at 1.2 V, the Titans I had both did 1,040 MHz, and that is good enough in my books. A Titan at 1,040 MHz is insane as it is; the tiny amount more you'd get by going higher is pointless.
 