Friday, January 2nd 2015

Possible NVIDIA GM200 Specs Surface

Somebody sent our GPU-Z validation database a curious-looking entry. Labeled "NVIDIA Quadro M6000" (not to be confused with AMD's FirePro M6000), with a device ID of 10DE - 17F0, this card is running on existing Forceware 347.09 drivers, and features a BIOS string that's unlike anything we've seen. Could this be the fabled GM200/GM210 silicon?

The specs certainly look plausible - 3,072 CUDA cores, 50 percent more than the GM204's; a staggering 96 ROPs; and a 384-bit wide GDDR5 memory interface holding 12 GB of memory. The memory is clocked at 6.60 GHz (GDDR5-effective), belting out 317 GB/s of bandwidth. The usable bandwidth is higher than that, thanks to NVIDIA's new lossless texture compression algorithms. The core runs at a gigahertz-scraping 988 MHz. The process node and die size are values we program into GPU-Z manually, since the drivers don't report them. NVIDIA is planning to hold a presser on the 8th of January, on the sidelines of the 2015 International CES. We're expecting a big announcement (pun intended).
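The 317 GB/s figure follows directly from the bus width and effective memory clock; a quick back-of-the-envelope check (standard GDDR5 arithmetic, not anything reported by GPU-Z):

```python
# Raw GDDR5 bandwidth = bus width (bits) x effective data rate, converted to bytes.
bus_width_bits = 384
effective_rate_gtps = 6.6  # GT/s per pin (the GDDR5-effective clock)

bandwidth_gbs = bus_width_bits * effective_rate_gtps / 8
print(f"{bandwidth_gbs:.1f} GB/s")  # 316.8 GB/s, which rounds to the quoted 317 GB/s
```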

80 Comments on Possible NVIDIA GM200 Specs Surface

#1
ZoneDymo
Sooo can we now finally truly move on to 4k?
#2
HumanSmoke
Assuming it's correct and indicative of a shipping product, the clocks seem fairly high given that this appears to be a workstation card, especially the memory clock.
If so, clocks-wise, it augurs well for GeForce-branded cards... now, I guess, we wait until Nvidia is pressured into releasing it as such.

If I'm reading this right, the ROP and core counts are both 50% greater than GM204's, but the 252.9 GTex/s fillrate implies the texture address units have increased by 100%, from 128 to 256 (988 MHz × 256 = 252.928 GTexels/s) - a similar step to that seen in the Kepler arch. Assuming 128 cores per module (as per GM107 and GM204), though, the number should be 192 (3072 cores / 128 per module = 24 SMMs × 8 TMUs per SMM = 192).
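The arithmetic above is easy to check; a quick sketch (the per-SMM figures are assumptions carried over from GM107/GM204, as the post notes):

```python
# Check the fillrate arithmetic: what TMU count does 252.9 GTex/s imply at 988 MHz?
core_clock_mhz = 988
reported_fillrate_gtex = 252.9

implied_tmus = reported_fillrate_gtex * 1000 / core_clock_mhz
print(round(implied_tmus))  # 256 - double GM204's 128 TMUs

# What the GM107/GM204 layout (128 cores per SMM, 8 TMUs per SMM) would predict:
cuda_cores = 3072
predicted_tmus = (cuda_cores // 128) * 8
print(predicted_tmus)  # 192 - the discrepancy pointed out above
```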

@btarunr
Should run a poll on the thread: "What percentage of posts will howl about pricing?"
A. 90-95%
B. 96-97%
C. 98%+
#3
Prima.Vera
ZoneDymo: Sooo can we now finally truly move on to 4k?
No. You can barely run at 1440p, never mind 4K...
#4
the54thvoid
Intoxicated Moderator
If that's the Quadro part, we could expect 6 GB of memory on the desktop part. Assuming the info is correct.

And yes, I think the flame trolls will be inbound with haste on the cost front.
#5
HumanSmoke
the54thvoid: If that's the quadro part we could expect 6gb memory on the desktop part. Assuming info is correct.
Aye, and it will be interesting to see what the clock envelope is for the gaming/prosumer parts. 12 GB running at 6600 MHz effective must chew through at least a third of the power budget for a workstation card (assuming ~250 W board power).
#6
Chitz
Holy cupcakes, just comparing my 680's specs to this puts my entire rig to shame.
#7
GhostRyder
Interesting spec sheet. If this really is the GM200 we've been anticipating, it's got a lot of power hidden inside, especially when you factor in how the 980 performs with significantly fewer CUDA cores than comparable Kepler parts. I'll be most interested in the core and memory clocks on the desktop counterpart, and in the final memory configuration - though based on the preliminary guesses it's probably going to be 6 GB.
ZoneDymo: Sooo can we now finally truly move on to 4k?
Well, one 290X or 980 can do an OK job right now at high settings, so if this has a ~30% performance advantage (depending on clocks), I'd say two of these could drive almost all games at Ultra at 60 FPS. But that is just a guess...
the54thvoid: If that's the quadro part we could expect 6gb memory on the desktop part. Assuming info is correct.
That's what I think as well, but I suspect that will be the GTX 1080 amount (yet to be named, but I would love that name!!!), as I just don't see the Titan II having 6 GB, based on where they (sorta) aim that product.
#8
alwayssts
HumanSmoke: Aye, and it will be interesting to see what the clock envelope is for the gaming/prosumer parts. 12gb running at 6600M effective must chew through at least a third of the power budget for a workstation card (assuming ~250W board power)
According to my math, that sounds about right. :toast:

So (if this is true), figure a core clock roughly 20-25% higher (real/'boost'/load), which more or less meshes with the earlier rumors.

I have to imagine the default clock (for consumer parts) is fluid, depending on where Fiji (or whatever AMD ships) lands comparably, but I don't doubt that the earlier-reported 1100 (nominal)/1390 (boost) figures (overclocked?) are within reason for 300 W.

I've always figured this arch was engineered with 20nm in mind (which Nvidia obviously backed out of at some point, probably not long after that hissy-fit presentation about TSMC's 20nm costs), and the clocks versus older archs reflect that. IOW, 1100 is probably the new ~900 MHz and 1390 the new 11xx (think 1.163-1.175 V for both companies' earlier products). It meshes with what one of Nvidia's chief scientists said: 20nm gave a 20-25% boost.

Extrapolate as we may, we probably still have a ways to go before any hard numbers truly indicate where it (or the competition) will end up.

The only thing one can safely say is that the 980 is meant to preempt a faster iteration of a 290X-like product, and a higher segment is typically around 15-20% faster.

From there, each company *should* have two more products on new chips that climb two more roughly similar steps, while each lower chip generally overclocks to the level of the next tier's stock.

If you take that to its logical conclusion, assuming 21- and 24-SMM parts that (over)clock slightly lower (~90%?) than their GM204 counterparts (due to the extra RAM etc.), I think it gels... but that's pure speculation.

(And I didn't even mention cost... even though I think the 290/290X clearly show a trend for AMD's future price structure, which will lead to inevitable 980 price drops and higher-end chips taking the 980's place.)
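The 20-25% clock uplift guessed above can be put against the leaked 988 MHz Quadro clock; a tiny sketch (pure speculation, like the post itself):

```python
# Scale the leaked 988 MHz workstation clock by the guessed 20-25% consumer uplift.
quadro_clock_mhz = 988
low_estimate = quadro_clock_mhz * 1.20
high_estimate = quadro_clock_mhz * 1.25
# Lands between the rumored 1100 MHz base and 1390 MHz boost figures.
print(f"{low_estimate:.0f}-{high_estimate:.0f} MHz")
```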
#9
Steevo
Dear god, 50% more than current production, and presumably a slight efficiency gain on top of that? AMD is going to get utterly steamrolled.

Even if this is the pro-grade part, the slightly cut-down version will still be a monster. But assuming it's on the same process node, how are they cooling this beast?
#10
ZhuI
I'm laughing at all the anxiousness of the pro-NV crowd who don't want a debate on price.

Let's be honest: if this is indeed the Titan II, then we'll see the $1,000 price tag return. The original Titan was an epic failure - those who got a 780 basically got a card only 10% behind the Titan, at half the price. And let's not even talk about Titan Z vs. the R9 295X2.

Nvidia needs to destroy the Titan line. But they won't, because lots of NV fanboys will do almost anything for Nvidia; even as the company pisses in their mouths, they only beg for more.

And btw, if AMD goes down in the GPU space - which is absolutely a possibility - consider the x80 flagships gone, replaced with $1,000 cards instead. But I'm sure you people will defend that, too :D
#11
CrAsHnBuRnXp
ZhuI: I'm laughing at all the anxiousness of the pro-NV crowd who don't want a debate on price.

Let's be honest, if this is indeed the Titan II then we'll see the 1000 dollar pricetag return. The Titan I was an epic failure. Those who got a 780 basically got a card which was only 10% away from the Titan but at half the price. And let's not even talk about Titan Z vs R295X2.

Nvidia needs to destroy the Titan line. But they won't, because lots of NV fanboys will do almost anything for Nvidia even as the company pisses in their mouths, the fanboys only beg for more.

And btw, if AMD goes down in the GPU space - which is absolutely a possibility - consider the X80 flagships gone and replaced with the 1000 dollar GPU cards instead. But I'm sure you people will defend that, too :D
Because this post doesn't scream AMD zealot at all.
#12
Hilux SSRG
If this is the full chip and not a cut-down, then it's going to cost $1,000-$1,500.
#13
the54thvoid
Intoxicated Moderator
CrAsHnBuRnXp: Because this post doesn't scream AMD zealot at all.
So true.

I'm one of those evil NVidiots. I drive my Nissan Skyline with decals of a winged JSH on the hood/bonnet. I buy NV stock and smoke rolled up AMD shares.

Or I buy what suits my needs, as long as my budget allows.
#14
ZeDestructor
ZhuI: I'm laughing at all the anxiousness of the pro-NV crowd who don't want a debate on price.

Let's be honest, if this is indeed the Titan II then we'll see the 1000 dollar pricetag return. The Titan I was an epic failure. Those who got a 780 basically got a card which was only 10% away from the Titan but at half the price. And let's not even talk about Titan Z vs R295X2.

Nvidia needs to destroy the Titan line. But they won't, because lots of NV fanboys will do almost anything for Nvidia even as the company pisses in their mouths, the fanboys only beg for more.

And btw, if AMD goes down in the GPU space - which is absolutely a possibility - consider the X80 flagships gone and replaced with the 1000 dollar GPU cards instead. But I'm sure you people will defend that, too :D
Please. The Titan 1 is the poor man's Quadro K6000, not your rich man's GTX 785. Its sole purpose is to provide FP64 processing and ECC at $1,000 instead of $6,000, and for that, it works just fine.

Titan Z is just a show-off piece, not a card that sells. Mind you, it sold even more poorly than Nvidia expected, so there's a sentiment of overpricing there as well.
#15
MxPhenom 216
ASIC Engineer
ZoneDymo: Sooo can we now finally truly move on to 4k?
More like 5-8k.
#16
Xzibit
ZhuI: I'm laughing at all the anxiousness of the pro-NV crowd who don't want a debate on price.

Let's be honest, if this is indeed the Titan II then we'll see the 1000 dollar pricetag return. The Titan I was an epic failure. Those who got a 780 basically got a card which was only 10% away from the Titan but at half the price. And let's not even talk about Titan Z vs R295X2.

Nvidia needs to destroy the Titan line. But they won't, because lots of NV fanboys will do almost anything for Nvidia even as the company pisses in their mouths, the fanboys only beg for more.

And btw, if AMD goes down in the GPU space - which is absolutely a possibility - consider the X80 flagships gone and replaced with the 1000 dollar GPU cards instead. But I'm sure you people will defend that, too :D
If this turns out to be true.

NVIDIA Planning To Ditch Maxwell GPUs For HPC Purposes Due To Lack of DP Hardware – Will Update Tesla Line With Pascal in 2016, Volta Arriving in 2017

There might not be Titans this time around, but high prices are another thing.
#18
Fluffmeister
Xzibit: If this turns out to be true.

NVIDIA Planning To Ditch Maxwell GPUs For HPC Purposes Due To Lack of DP Hardware – Will Update Tesla Line With Pascal in 2016, Volta Arriving in 2017

There might not be Titans this time around, but high prices are another thing.
DP is irrelevant for the average Joe consumer anyway, so no great loss there.

And it presumably makes sense with 28nm sticking around longer than expected; the recent launch of the K80 no doubt helps fill the gap.

Speaking of Titan, that big fat government contract for two new supercomputers, Summit and Sierra, should keep them busy.

NVIDIA Volta, IBM POWER9 Land Contracts For New US Government Supercomputers
#19
Xzibit
It also gives a little more credibility to the SiSoft leaks, remember those? Might also mean the ChipHell leaks were not far off either.

Nvidia GM200


AMD Fiji
#20
Assimilator
Blimey. 50% more CUDA cores, 50% more ROPs, and a staggering 100% more TMUs. If these specs are correct, this card will be a beast of note. And if nVidia's already got GM200 ready to go, only slightly after AMD releases Fiji... it's not gonna be good for AMD. Even if Fiji outperforms GM204 significantly, nVidia can just drop GM200 into the retail channel and erode that advantage.
#21
techy1
If AMD's Fiji is weak, this will cost $2,000 at least... if AMD's Fiji is a beast at a low price, then this will be priced at $700 max... so let us all hope and cheer for AMD, so we can all get this Nvidia card cheap and really handle that 4K.
#22
Xzibit
techy1: if AMDs Fiji will be weak - this will cost 2000$ at least... if AMDs Fiji will be beast+ low price, then this will be priced 700$ max.... so lets us all hope and cheer for AMD - so we can all get this Nvidia cheap and really handle hat 4K
If one puts any real value into the Nvidia/AMD leaks, this one is interesting, because we could very well be headed for a nice performance/price war if these products come out close to one another.

#23
HumanSmoke
Xzibit: If one puts any real value into the Nvidia/AMD leaks. This one is interesting because we could very well be headed for a nice performance/price war if these products come out close to one another.
Those "results" were debunked as bogus almost as soon as they emerged.
Ask yourself: what source would be in possession of BOTH Nvidia's and AMD's next top cards, as well as AMD's second-tier offering and Nvidia's GM200 salvage part? One source having access to four unreleased top-tier parts across both vendors :rolleyes:

Not only would they have access to both vendors' next offerings - not a single other source has benchmarked even one of those four.
#24
Xzibit
HumanSmoke: Those "results" were debunked as bogus almost as soon as they emerged.
Ask yourself, what source would be in possession of BOTH Nvidia's and AMD's next top cards as well as AMD's second tier offering and Nvidia's GM 200 salvage part. One source having access to four unreleased top tier parts across both vendors :rolleyes:
A similar source to the one that put up the SiSoft scores? Which, months later, now match the TPU validation...

Can you source the debunking? How were they debunked, by the way? Was it a consensus of people who didn't like the outcome?
#25
the54thvoid
Intoxicated Moderator
Xzibit: A similar source that put up the Sisoft scores? Which months later now match the TPU validation..

Can you source the debunking? How were they debunked by the way? Was it a consensus from people who didn't like the outcome?
It IS hugely unlikely to get both vendors' top cards. Besides, the scores are all over the shop.

On the other hand, a 290X runs on par with the GTX 980 at 4K, so it's not out of the question to expect AMD and NV to be close this time around. The loser will be whoever releases first (IMO). I think the vendor that releases second will tinker with their product to pull out a performance edge, or use aggressive pricing.

Good for both camps.