Monday, March 16th 2015

NVIDIA GeForce GTX TITAN-X Specs Revealed

NVIDIA's GeForce GTX TITAN-X, unveiled last week at GDC 2015, is shaping up to be a beast, on paper. According to an architecture block-diagram leaked to the web, the GTX TITAN-X appears to max out all available components of the 28 nm GM200 silicon on which it is based. While maintaining the same essential component hierarchy as the GM204, the GM200 (and hence the GTX TITAN-X) features six graphics processing clusters holding a total of 3,072 CUDA cores, based on the "Maxwell" architecture.

With "Maxwell" GPUs, TMU count is derived as CUDA core count / 16, giving us a count of 192 TMUs. Other specs include 96 ROPs, and a 384-bit wide GDDR5 memory interface, holding 12 GB of memory, using 24x 4 Gb memory chips. The core is reportedly clocked at 1002 MHz, with a GPU Boost frequency of 1089 MHz. The memory is clocked at 7012 MHz (GDDR5-effective), yielding a memory bandwidth of 336 GB/s. NVIDIA will use a lossless texture-compression technology to improve bandwidth utilization. The chip's TDP is rated at 250W. The card draws power from a combination of 6-pin and 8-pin PCIe power connectors, display outputs include three DisplayPort 1.2, one HDMI 2.0, and one dual-link DVI.
Source: VideoCardz

55 Comments on NVIDIA GeForce GTX TITAN-X Specs Revealed

#26
64K
the54thvoid: Initial reports seem to point to fully fledged.
I think so too. We may get some higher clocks on the gaming version of GM200 with 6 GB VRAM but probably not more cores. We'll see.
Posted on Reply
#27
rtwjunkie
PC Gaming Enthusiast
64K: I think so too. We may get some higher clocks on the gaming version of GM200 with 6 GB VRAM but probably not more cores. We'll see.
We can hope. I'm looking forward to comparing that gaming version (whatever they want to call it) to the 390X and making a decision! Good times ahead!
Posted on Reply
#28
GhostRyder
the54thvoid: Initial reports seem to point to fully fledged.
That is what I hope, so I hope the reports are true!
64K: I think so too. We may get some higher clocks on the gaming version of GM200 with 6 GB VRAM but probably not more cores. We'll see.
Yep, if they do that I would see it as a great move for consumers more than anything, so no one feels like the middle finger has been shot at them. Though I will say this: they are clearly marketing this card (Titan X) as a gamer card in their initial demos. I am sure we will see a 6 GB GTX 1080 card in the future, but I am almost wondering if they will cut a few cores on it.
rtwjunkie: We can hope. I'm looking forward to comparing that gaming version (whatever they want to call it) to the 390X and making a decision! Good times ahead!
I'll drink to that :toast:
Posted on Reply
#29
64K
GhostRyder: Yep, if they do that I would see it as a great move for consumers more than anything, so no one feels like the middle finger has been shot at them. Though I will say this: they are clearly marketing this card (Titan X) as a gamer card in their initial demos. I am sure we will see a 6 GB GTX 1080 card in the future, but I am almost wondering if they will cut a few cores on it.
I find it hard to understand Nvidia sometimes. They also marketed the Titan Z as a "gaming monster". Not quite equal to two Titan Blacks for the price of three. :rolleyes:


GeForce GTX TITAN Z
GeForce® GTX™ TITAN Z is a gaming monster, the fastest graphics card we’ve built to power the most extreme PC gaming rigs on the planet. Stacked with 5760 cores and 12 GB of memory, this dual GPU gives you the power to drive even the most insane multi-monitor displays and 4K hyper PC machines.

www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan-z
Posted on Reply
#30
alwayssts
rtwjunkie: We can hope. I'm looking forward to comparing that gaming version (whatever they want to call it) to the 390X and making a decision! Good times ahead!
Before, with an expected 1 GHz / 640 GB/s 4 GB model, I would have said a lot (as small a detail as it is) comes down to whether they kept 64 ROPs or sprang for more in Fiji, because many times you'd be comparing, in practicality, say the clocks of '3360 units' in a 21-SMM GM200 to the (when not used for compute effects) ROP-limited count of somewhere around 3755 if 64 (or the full 4096 if more)... then offset by the performance given by extra bandwidth (which in AMD's case would have been around 5-10% depending on clock and situation)... but now...

8 GB, 375 W TDP, 1050 MHz clock, 1.28 TB/s of bandwidth... there really isn't going to be a contest unless it's a ton more expensive (Titan-like) and/or that extra TDP (or AMD cards in general) really bothers someone. I could see stock performance being damn close to the typical overclocked performance of such a cut-down GM200 in some cases. IOW, it's going to out-Titan Titan in that form. While it may or may not clock as high, it has a more efficient (in both size and type) memory buffer. The extra TDP beyond whatever engine clocks will be supplemented by bandwidth performance. Going from 4 to 8 GB (or more importantly from 640 GB/s to 1.28 TB/s) should add around 16% just by itself.

Crazy pants. I really never thought we'd see a true 'battle of the titans'. I hope they give it a different name than '390x 8GB.'


edit: Just realized the slide says '1TB is up ahead' and the '2x1GB' diagram probably means the dual-link likely won't increase bandwidth... probably still 'only' 640 GB/s. It should still compete well with both 'GM200 Gaming' and Titan... I think they'll end up being really close (assuming such a GM200 does around 1400 and Fiji can do 1200+ like many 28nm chips [partially perhaps thanks to Carrizo-style voltage/clock voodoo vs something like Hawaii]), plus or minus how clocks turn out (for said GM200 part as well as the 300 W and 375 W Fiji flavors). 8 GB is still nice for CrossFire/4K and the extra power pretty much guarantees its ability to stack up well with Titan.

(I think it's time to get some sleep so my reading comprehension improves beyond skimming the actual important information of said slides and relying on reporters' boldface spec tables).
Posted on Reply
#32
GhostRyder
64K: I find it hard to understand Nvidia sometimes. They also marketed the Titan Z as a "gaming monster". Not quite equal to two Titan Blacks for the price of three. :rolleyes:


GeForce GTX TITAN Z
GeForce® GTX™ TITAN Z is a gaming monster, the fastest graphics card we’ve built to power the most extreme PC gaming rigs on the planet. Stacked with 5760 cores and 12 GB of memory, this dual GPU gives you the power to drive even the most insane multi-monitor displays and 4K hyper PC machines.

www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan-z
Yep, the cards are marketed more towards gamers with the names and the way they are presented, whether or not they are more intended for the professional crowd. But personally I have always viewed the Titans as doing what the GTX 580s and below used to do, at a higher price. Just my opinion of course...

I still personally want to see this out in public. We are going to have a war on our hands this round, and the question is which choice is going to lead to victory?
15th Warlock

Here, you can share mine. I am sure the real show will begin soon enough!
Posted on Reply
#33
TheGuruStud
DeNeDe: Why not DP 1.3?
This has an easy answer, imo. DP 1.2a and above has adaptive sync which allows AMD's Freesync to work.

Nvidia doesn't want someone hacking the drivers and enabling it (at least a proof of concept) on their cards to prove that Gsync is a scam. That's why even the 900 series do not have 1.2a.

I don't know how long they can keep the charade going, though.
Posted on Reply
#34
SirEpicWin
390X + 2k Freesync
There is nothing more I could ask for.
Posted on Reply
#35
15th Warlock
GhostRyder: Yep, the cards are marketed more towards gamers with the names and the way they are presented, whether or not they are more intended for the professional crowd. But personally I have always viewed the Titans as doing what the GTX 580s and below used to do, at a higher price. Just my opinion of course...

I still personally want to see this out in public. We are going to have a war on our hands this round, and the question is which choice is going to lead to victory?



Here, you can share mine. I am sure the real show will begin soon enough!
Oh it has begun already, the usual trolls and haters are here ;)



Oh well, I'm just waiting for W1zz's review of this card, it goes live at midnight, I assume? Pretty sure he'll have an SLI review as well.

Can't wait to try these cards myself :toast:
Posted on Reply
#36
Maban
15th Warlock: Oh well, I'm just waiting for W1zz's review of this card, it goes live at midnight, I assume? Pretty sure he'll have an SLI review as well.
I would assume the NDA expires the same time as the press event which is scheduled for tomorrow at 9AM PST.
Posted on Reply
#37
Sony Xperia S
15th Warlock: Oh it has begun already, the usual trolls and haters are here
You haven't seen anything yet. :D

Posted on Reply
#38
the54thvoid
Intoxicated Moderator
TheGuruStud: This has an easy answer, imo. DP 1.2a and above has adaptive sync which allows AMD's Freesync to work.

Nvidia doesn't want someone hacking the drivers and enabling it (at least a proof of concept) on their cards to prove that Gsync is a scam. That's why even the 900 series do not have 1.2a.

I don't know how long they can keep the charade going, though.
Not really a charade. 27" 4k G-Sync monitor at £560. Not too expensive tbh. A couple of freesyncs at £450 ish for 1440p. There's a decent price comparison between the two. I don't know much about Gsync/freesync except you need an appropriate monitor?
If the prices are quite similar, is there a problem?
Posted on Reply
#39
64K
Maban: I would assume the NDA expires the same time as the press event which is scheduled for tomorrow at 9AM PST.
Well then I know where I will be at noon tomorrow EST: here, clicking my refresh button every 5 minutes until I see the Titan X review pop up. :D
Posted on Reply
#40
15th Warlock
Sony Xperia S: You haven't seen anything yet. :D

Well, that description easily fits fanboys from both camps :p

Look, I'm not here to start an argument. I've been blessed enough to be able to afford the best cards from both camps, but I understand most ppl only have one choice, and just like when you buy a car, you feel the need to justify your investment in a particular brand name.

What I don't get is how ppl can be so passionate, blindly defending certain brands when obviously all big corporations are out to grab your hard earned money...

Oh well, at least it's entertaining :toast:
Posted on Reply
#41
TheGuruStud
the54thvoid: Not really a charade. 27" 4k G-Sync monitor at £560. Not too expensive tbh. A couple of freesyncs at £450 ish for 1440p. There's a decent price comparison between the two. I don't know much about Gsync/freesync except you need an appropriate monitor?
If the prices are quite similar, is there a problem?
Gsync tacks on 200ish USD to the cost. 1080p TNs are 400 USD with g sync. Outrageous.

Over 800 for a 4k 27" ? Ain't gonna happen lol
32" for that price is more reasonable. And I assume they're all TN panels. Definitely forget it.
Posted on Reply
#42
the54thvoid
Intoxicated Moderator
TheGuruStud: Gsync tacks on 200ish USD to the cost. 1080p TNs are 400 USD with g sync. Outrageous.

Over 800 for a 4k 27" ? Ain't gonna happen lol
32" for that price is more reasonable. And I assume they're all TN panels. Definitely forget it.
UK prices tend to differ from US so it's harder to judge here. That being said, the only comparison I have directly is for 1440p and it's a £500 BenQ Free-Sync versus a £620 Asus ROG G-Sync. In comparison, the cheapest BenQ at 1440p without Free-Sync is £320 and the equivalent cheapest Asus is £420. So for BenQ you pay an extra £180 (or 56% more) and for Asus you pay an extra £200 (48% more). In terms of what you pay more for Free-Sync or G-Sync, you'll probably find the prices aren't that far off (all TN panels). Though I won't argue the point that Nvidia got there first so slapped a premium on it. If they drop that premium as more Free-Sync appears then they simply did what most tech does - charge more initially because nobody else has it.
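For reference, those premiums work out as follows (a quick sketch in Python; the prices are the retailer listings quoted above and will vary by shop and over time):

```python
# Premium math from the prices quoted above (GBP, March 2015 listings).
monitors = {
    "BenQ 1440p (Free-Sync)": {"plain": 320, "sync": 500},
    "Asus 1440p (G-Sync)":    {"plain": 420, "sync": 620},
}

for name, price in monitors.items():
    premium = price["sync"] - price["plain"]
    pct = premium / price["plain"] * 100
    print(f"{name}: +£{premium} ({pct:.0f}% over the non-sync model)")

# BenQ 1440p (Free-Sync): +£180 (56% over the non-sync model)
# Asus 1440p (G-Sync):    +£200 (48% over the non-sync model)
```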

The cheapest 1080p G-Sync on this retailer is £330. Unfortunately no 1080p Free-Sync to compare. Another retailer has a 1440p 28" Acer G-Sync at £510 (£80 more than their equiv Free-Sync).

Either way, I have little interest in G/Free sync for now as I may go AMD or Nvidia next round so I don't want a redundant monitor choice.


EDIT: NPU has news - looks like 390X is $700-1000. Sony Xperia won't be happy.
Reports say that the R9 390X price will be in the $700 to $1000 range, depending on whether you go for the 4GB model, or the 8GB model.
www.nextpowerup.com/news/19232/amd-r9-390x-vs-r9-290x-benchmark-scores-and-peformance-leaked-r9-390x-is-60-faster-than-r9-290x.html
Posted on Reply
#44
the54thvoid
Intoxicated Moderator
TheGuruStud: You guys get shafted on component pricing it seems. Not like the Aussies, though lol
20% tax on items. As for the Aussies, yes - they are absolutely shafted. Criminally so. :(


Well the new leaks on 390X are cool indeed. If 60% faster than 290X and Titan is 50% faster than 980, tally it against this:



And you'll get a heat. If an 8GB 390X is $1000 and a 12GB Titan is $1000 and both pull 250Watts, this will be a fucking awesome round of cards :rockout:
Posted on Reply
#45
arbiter
alwayssts: Again, not trying to be 'that guy', but I am about infinitely less pumped about this product now that it has been 'leaked' AMD will have an 8GB variant of the 390x. One could have rightfully assumed they might, but I was under the impression 2nd gen HBM was more of an end of the year thing (granted those road maps are old and 'subject to change without notice'), meaning Arctic Islands/Pascal being the first to use them in 2016 or at best an end-of-year refresh for 390x. If it comes earlier, that changes things dramatically.

Sure, it might not come first, or even for a while, but the fact they are (seemingly) putting a rush on it really puts a clamp on Titan's main function (scaling above the limitations of 4GB/1440p; 6GB and its conceivable inadequacy at 4k). If the mainstream GM200/Fiji parts can overclock well, which by their memory setup (both less and in AMD's case more efficient use of power) they should... it really takes the wind out of Titan's sails.
Wonder what kinda price premium the 8GB version will have over the 4GB version; current 290 cards have a $120 gap. That is with current memory, so I would guess it could be as much as $200 or something because it's a new type of memory.
alwayssts: When I saw the 'leaked benchmarks' of overclocks on Titan (1222mhz), first thing I did was subtract the power usage from 6GB of ram. What you end up with is a part that *should* overclock to ~1400mhz*. Doing the math, one can assume 980ti (with conceivably 3 less units) ends up a similar part in the end...pretty much separated by stock performance, core power efficiency, and memory buffer. It also just so happens this should be enough core performance (Titan at stock/980ti overclocked) to max out settings at 1440p or overclocked (in Titan's case because of buffer) strong-enough in an sli config to run even demanding games at 4k.
Just because the spec of 6-pin + 8-pin + PCIe slot is 300 watts max doesn't mean they can't do much more than that. Lest you forget, AMD pulls up to 600 watts from 2x 8-pin + the PCIe bus, so that is about 250-300 watts per 8-pin, almost twice the spec.
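For context on that 300 W figure, the nominal budget is just the sum of the official per-connector spec limits; a rough sketch (spec values only - real boards can and do exceed them, as noted above):

```python
# Nominal PCIe power budget from the official spec values.
PCIE_SLOT_W = 75    # delivered through the x16 slot
SIX_PIN_W = 75      # per 6-pin connector
EIGHT_PIN_W = 150   # per 8-pin connector

def spec_budget(six_pin=0, eight_pin=0):
    """Spec power limit for a card with the given connector loadout."""
    return PCIE_SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(spec_budget(six_pin=1, eight_pin=1))  # 300 W - TITAN X loadout
print(spec_budget(eight_pin=2))             # 375 W on paper for a dual 8-pin card,
                                            # like the AMD example referenced above
```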
TheGuruStud: This has an easy answer, imo. DP 1.2a and above has adaptive sync which allows AMD's Freesync to work.

Nvidia doesn't want someone hacking the drivers and enabling it (at least a proof of concept) on their cards to prove that Gsync is a scam. That's why even the 900 series do not have 1.2a.

I don't know how long they can keep the charade going, though.
Adaptive sync is an optional part of the spec; it's not required.
the54thvoid: And you'll get a heat. If an 8GB 390X is $1000 and a 12GB Titan is $1000 and both pull 250Watts, this will be a fucking awesome round of cards :rockout:
No way in heck the 390X will pull 250 watts; 350-400 watts looks more likely as the TDP for it.
Posted on Reply
#46
xorbe
390X has a 4096-bit VRAM bus? How does that even work? Can they fit that many I/O pins on a consumer-bound chip? There must be an intermediate "consolidator" or something.
Posted on Reply
#47
Maban
xorbe: 390X has a 4096-bit VRAM bus? How does that even work? Can they fit that many I/O pins on a consumer-bound chip? There must be an intermediate "consolidator" or something.
As far as I can tell, each chip will actually have 1024 data paths. Remember that these are being put on the actual package, not on the PCB, allowing for smaller traces. /guess
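To put rough numbers on that guess (a sketch assuming the commonly rumoured layout of four stacks with a 1024-bit interface each and a 1 Gbps-per-pin data rate; none of this is confirmed by the leak):

```python
# Rough HBM interface arithmetic. Assumptions, not confirmed figures:
# four stacks on-package, 1024-bit interface per stack, 1 Gbps per pin.
stacks = 4
bits_per_stack = 1024
gbps_per_pin = 1.0

total_bus_bits = stacks * bits_per_stack            # 4096-bit aggregate bus
bandwidth_gbs = total_bus_bits * gbps_per_pin / 8   # bits -> bytes

print(total_bus_bits)   # 4096
print(bandwidth_gbs)    # 512.0 GB/s at the assumed data rate
```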
Posted on Reply
#48
TheGuruStud
arbiter: Adaptive sync is an optional part of the spec; it's not required.
Plausible deniability, or so they think. If 1.2a or 1.3 were used and adaptive sync were mysteriously missing, then Nvidia would absolutely look crooked and have to formulate some cockamamie lie.

No one has called them out on DP specs, though.
Posted on Reply
#49
Xzibit
Looks like the original leaks for both are panning out so far..

Nvidia

AMD
Posted on Reply
#50
GhostRyder
the54thvoid: 20% tax on items. As for the Aussies, yes - they are absolutely shafted. Criminally so. :(


Well the new leaks on 390X are cool indeed. If 60% faster than 290X and Titan is 50% faster than 980, tally it against this:



And you'll get a heat. If an 8GB 390X is $1000 and a 12GB Titan is $1000 and both pull 250Watts, this will be a fucking awesome round of cards :rockout:
If that happens (the 250 watt part) I think we would all be blown away. I just hope for around 300 W, just so it will not be too difficult to run 3 of them for the crazy people like me (albeit I am probably sitting this round out). Though if it is I am probably going to blow a gasket.
the54thvoid: UK prices tend to differ from US so it's harder to judge here. That being said, the only comparison I have directly is for 1440p and it's a £500 BenQ Free-Sync versus a £620 Asus ROG G-Sync. In comparison, the cheapest BenQ at 1440p without Free-Sync is £320 and the equivalent cheapest Asus is £420. So for BenQ you pay an extra £180 (or 56% more) and for Asus you pay an extra £200 (48% more). In terms of what you pay more for Free-Sync or G-Sync, you'll probably find the prices aren't that far off (all TN panels). Though I won't argue the point that Nvidia got there first so slapped a premium on it. If they drop that premium as more Free-Sync appears then they simply did what most tech does - charge more initially because nobody else has it.

The cheapest 1080p G-Sync on this retailer is £330. Unfortunately no 1080p Free-Sync to compare. Another retailer has a 1440p 28" Acer G-Sync at £510 (£80 more than their equiv Free-Sync).

Either way, I have little interest in G/Free sync for now as I may go AMD or Nvidia next round so I don't want a redundant monitor choice.


EDIT: NPU has news - looks like 390X is $700-1000. Sony Xperia won't be happy.



www.nextpowerup.com/news/19232/amd-r9-390x-vs-r9-290x-benchmark-scores-and-peformance-leaked-r9-390x-is-60-faster-than-r9-290x.html
So there are 8GB variants already in the works; I missed that honestly, as I thought it was just a bunch of hope/rumors. I would think some 8GB variants of the card would be pretty nice in a pair for 4K gamers.
Posted on Reply