Tuesday, February 19th 2013

NVIDIA GeForce GTX Titan Final Specifications, Internal Benchmarks Revealed

Specifications of NVIDIA's upcoming high-end graphics card, the GeForce GTX Titan, reported in the press over the past couple of weeks, are bang on target, according to a specs sheet leaked by 3DCenter.org that is allegedly part of the card's press deck. According to the sheet, the GTX Titan indeed features 2,688 of the 2,880 CUDA cores present on the GK110 silicon, carries 6 GB of GDDR5 memory across a 384-bit wide memory interface, and draws power from a combination of one 6-pin and one 8-pin PCIe power connector.

The GeForce GTX Titan core is clocked at 837 MHz, with a GPU Boost frequency of 876 MHz, and its memory runs at 6.00 GHz (effective), churning out 288 GB/s of memory bandwidth. The chip is rated at 4.5 TFLOP/s of single-precision floating-point performance and 1.3 TFLOP/s double-precision. Despite hefty specs that include a 7.1 billion-transistor ASIC and 24 GDDR5 memory chips, NVIDIA rates the card's TDP at just 250 W.
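For readers who want to check the leaked numbers, the headline figures fall straight out of the spec sheet. A quick sketch using the standard GDDR5 bandwidth and FMA throughput formulas; the inputs are the leaked values above:

```python
# Sanity-check the leaked GTX Titan figures from the listed clocks and core count.
bus_width_bits = 384
mem_data_rate_ghz = 6.0                       # effective GDDR5 data rate
bandwidth_gbs = bus_width_bits / 8 * mem_data_rate_ghz
print(bandwidth_gbs)                          # -> 288.0 GB/s, matching the leak

cuda_cores = 2688
base_clock_ghz = 0.837
# 2 FLOPs per core per clock (one fused multiply-add)
sp_tflops = cuda_cores * 2 * base_clock_ghz / 1000
print(round(sp_tflops, 1))                    # -> 4.5 TFLOP/s, matching the leak
```

Note that the 4.5 TFLOP/s figure is quoted at the 837 MHz base clock, not the boost clock.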

More slides and benchmark figures follow.

The next slide leaked by the source reveals key features of the reference-design cooling solution, which uses a large lateral blower with RPM- and voltage-based speed control on the software side, a vapor-chamber plate that draws heat from the GPU, memory, and VRM, and an extended aluminum fin stack that increases the surface area of dissipation.

Next up, we have performance numbers from NVIDIA. The first slide pits the GTX Titan against the GTX 680 in Crysis 3. The GTX Titan is shown to deliver about 29 percent higher frame rates while being slightly quieter than the GTX 680.

In the second slide, we see three GeForce GTX Titans (3-way SLI) pitted against a pair of GeForce GTX 690 dual-GPU cards (quad-SLI). In every test, the Titan trio is shown to be faster than GTX 690 quad-SLI: about 75 percent faster in Crysis 3, 100 percent faster in Max Payne 3, 40 percent faster in TESV: Skyrim, and 95 percent faster in Far Cry 3. This comparison matters for NVIDIA because, if the Titan does end up a $1,000 product, a trio of them has to offer something significantly better than GTX 690 quad-SLI to justify the price.

Source: 3DCenter.org

132 Comments on NVIDIA GeForce GTX Titan Final Specifications, Internal Benchmarks Revealed

#1
Phusius
by: james888
Bragging in this thread also?
just pointing out that AMD is cheaper :)
#2
the54thvoid
I can't believe nobody has leaked the actual NDA.
#3
xenocide
by: Phusius
just pointing out that AMD is cheaper :)
AMD has been cheaper for quite some time. I'll admit that when I was looking at my 670 I was tempted by the HD 7970, but then I remembered how bad the drivers already were for my HD 5850, and how bad they were for my HD 4870, and decided it was worth the premium :P
#4
Phusius
by: xenocide
AMD has been cheaper for quite some time. I'll admit that when I was looking at my 670 I was tempted by the HD 7970, but then I remembered how bad the drivers already were for my HD 5850, and how bad they were for my HD 4870, and decided it was worth the premium :P
Never once had a problem with drivers, including the beta ones... but I'm new to PCs. This is my first true gaming PC. /shrug
#5
the54thvoid
I'm not a mod, but can we keep AMD drivers out of the thread please? It's about Titan and its leaked benchmarks; by all means compare a 7970 to those, but an isolated fart in the wind about AMD drivers is just trolling.
#6
Rahmat Sofyan
So far, a recap from XS member Oj101:
Crysis 2
Radeon HD 7970 GHz Edition: 68 %
GeForce GTX 680: 65 %
GeForce GTX Titan: 100 %
—————-
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/68)*100 = 147 % = 47 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/65)*100 = 154 % = 54 % faster

3DMark 2013 X Firestrike
Radeon HD 7970 GHz Edition: 77 %
GeForce GTX 680: 67 %
GeForce GTX Titan: 100 %
—————-
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/77)*100 = 130 % = 30 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/67)*100 = 149 % = 49 % faster

3DMark Vantage GPU
Radeon HD 7970 GHz Edition: 76 %
GeForce GTX 680: 81 %
GeForce GTX Titan: 100 %
—————-
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/76)*100 = 132 % = 32 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/81)*100 = 124 % = 24 % faster

Battlefield 3
Radeon HD 7970 GHz Edition: 74 %
GeForce GTX 680: 65 %
GeForce GTX Titan: 100 %
—————-
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/74)*100 = 135 % = 35 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/65)*100 = 154 % = 54 % faster

Far Cry 3
Radeon HD 7970 GHz Edition: 70 %
GeForce GTX 680: 73 %
GeForce GTX Titan: 100 %
—————-
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/70)*100 = 143 % = 43 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/73)*100 = 137 % = 37 % faster

Hitman
Radeon HD 7970 GHz Edition: 81 %
GeForce GTX 680: 73 %
GeForce GTX Titan: 100 %
—————-
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/81)*100 = 124 % = 24 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/73)*100 = 137 % = 37 % faster

Conclusion
GeForce GTX Titan average increase over Radeon HD 7970 GHz Edition: (47 + 30 + 32 + 35 + 43 + 24) / 6 = 35 %
GeForce GTX Titan average increase over GeForce GTX 680: (54 + 49 + 24 + 54 + 37 + 37) / 6 = 42.5 %
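The averaging above can be replayed in a few lines; a quick sketch that hard-codes the per-game percentages quoted by the commenter:

```python
# Titan is indexed at 100%; a card scoring pct% makes the Titan
# (100 / pct * 100 - 100)% faster.
def speedup(pct):
    return 100 / pct * 100 - 100

print(round(speedup(68)))  # Crysis 2 vs. HD 7970 GHz Edition -> 47 (% faster)

# Per-game "Titan is X% faster" figures, Crysis 2 .. Hitman, as quoted above:
vs_hd7970_ghz = [47, 30, 32, 35, 43, 24]
vs_gtx680     = [54, 49, 24, 54, 37, 37]
print(round(sum(vs_hd7970_ghz) / 6, 1))  # -> 35.2
print(round(sum(vs_gtx680) / 6, 1))      # -> 42.5
```

The 35% figure in the conclusion is this 35.2 average rounded to the nearest whole percent.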

All benchmarks done with drivers so premature they aren’t even launch day drivers… Either 3DMark is way off the ball, or drivers are going to show some massive improvements as the scores I’ve seen in 3DMark paint a much more favourable picture. As I said, I haven’t got game performance info, so I can’t comment on the accuracy of the above.
So, when will the official review come? NDA?
#7
xenocide
by: the54thvoid
I'm not a mod but can we keep AMD drivers out of the thread please? It's about Titan and it's leaked benchmarks, by all accounts compare a 7970 to those but an isolated fart in the wind about AMD drivers is just trolling.
By that rationale AMD shouldn't even be mentioned in articles about Nvidia products, yet somehow they always are. :rolleyes:

Honestly, I am more intrigued to see how this does since GK100 never really showed up. Should make for an interesting card regardless, and damn does it look good (an improvement on even the 690's design). Essentially, unless it drops below $650 it's not exactly cost effective, but it does offer nice bragging rights, I suppose.

by: Rahmat Sofyan
So, when the official review will come?NDA?
Rumor says the NDA drops later today and reviews go out on Thursday.
#8
Fourstaff
by: claylomax
Thanks. So that means around £800 here in the UK. Wonderful. I can get two 7970 GHz for that or less than that.
But if you get three of these Titans, how are you going to replicate that performance with other cards? :p
#9
Vinska
by: Fourstaff
But if you get three of this Titans, how are you going to replicate this performance with other cards? :p
In case of getting three Titans, one could go cry in a corner over the fact that the same money could get Ya a sweet-ass ride, some swag and other sh*t, and thus, the ability to ride around town picking up chicks. And if done well, Ya probably would still have some cash left.

TL;DR - crack is cheaper.
#10
Fourstaff
by: Vinska
In case of getting three Titans, one could go cry in a corner over the fact that the same money could get Ya a sweet-ass ride, some swag and other sh*t, and thus, the ability to ride around town picking up chicks. And if done well, Ya probably would still have some cash left.

TL;DR - crack is cheaper.
If the rumoured $1000 a card is true, then 3 of them will not even pay for half a year's rent in Soviet Britain, let alone a sweet-ass ride.
#12
claylomax
by: Fourstaff
If the rumoured $1000 a card is true, then 3 of them will not even pay for half a year's rent in Soviet Britain, let alone a sweet-ass ride.
With the price of three Titans I can pay my rent for five months (bills included). ;)
#13
Hayder_Master
If it's still $850, I will prefer 2x GTX 680 4 GB in SLI.
#14
Kaynar
So it's set... I'm getting a second 7970 for xfire...
#15
Vinska
by: Fourstaff
If the rumoured $1000 a card is true, then 3 of them will not even pay for half a year's rent in Soviet Britain, let alone a sweet-ass ride.
by: claylomax
With the price of three Titans I can pay my rent for five months (bills included). ;)
MEANWHILE, 3x Titans would be very close to my yearly income.
So I am just hoping for a little price drop on 670 and/or 680 && get two of those some time soon.
#16
Prima.Vera
Most likely the price will be between 680 and 690.
#17
Kaynar
by: Vinska
MEANWHILE, 3x Titans would be very close to my yearly income.
So I am just hoping for a little price drop on 670 and/or 680 && get two of those some time soon.
Well, you don't HAVE to buy 3 Titans; it's just that this was the only way to prove that their card is worth that price... compare a $3,000 setup to a nearly $2,500 setup, show it's 75% faster for just $500 more, win.

This card reminds me of what several people on this website have posted about how, over the last few GPU generations, the performance of the new top card improves while its price also increases dramatically...
#18
Dangi
So you need to spend $3,000 on a 3-way SLI to beat a $2,000 quad-SLI??

That's clever... nice one, Nvidia
#19
Vinska
MEANWHILE
I wonder how this 3x Titan SLI would do against 4x 7970 CrossFire, which is both cheaper and faster than 2x 690 quad-SLI.
#20
jihadjoe
Get 3 cards? But aren't there going to be only 10,000 Titans?
Jen-Hsun must be doing this just to see fights break out in a line full of rich people.
#21
Fluffmeister
by: Vinska
MEANWHILE
wonder how this would do against 4x 7970 Crossfire, which is potentially both cheaper and faster than 2x 690 quad-sli
Sounds like a recipe for disaster to me. :laugh:
#22
jaggerwild
by: xenocide
By that rationale AMD shouldn't even be mentioned in articles about Nvidia products, yet somehow they always are. :rolleyes:

Honestly, I am more intrigued to see how this does since GK100 never really showed up. Should make for an interesting card regardless, and damn does it look good (an improvement on even the 690's design). Essentially, unless it drops below $650 it's not exactly cost effective, but it does offer nice bragging rights, I suppose.



Rumor says NDA drops later today and reviews go out on thursday.
Yes!
Until then it's all speculation, but it looks great!! :D
#23
Ikaruga
Hey guys, I'm afraid some people in this thread are missing the point. We all know that Nvidia is much more expensive compared to AMD; one of the two has to have the better price/performance ratio, and (sadly for us) it hasn't been Nvidia for quite a long time now.

But ask yourself:
- is it the fastest card on the planet? The answer is yes.
- does it scale brutally well as the resolution increases, especially in SLI configurations? That's probably a yes again.
- as a high-end card, is it more power efficient than the competition? Yep, looks like it is.
- etc, etc.

It's like asking: can you buy a car which costs only a few hundred thousand dollars and performs nearly as well as cars costing a million? Yep, but they still sell million-dollar cars because there is a market for them.

There is a demand for the best, and some people just don't care about the money. New innovations and top-performance products were never cheap. If you can forget the price and look at what the card can do, this card currently just owns everything out there, period.
#24
the54thvoid
by: Ikaruga
Hey guys, I'm afraid some people in this thread are missing the point. We all know that Nvidia is much more expensive compared to AMD; one of the two has to have the better price/performance ratio, and (sadly for us) it hasn't been Nvidia for quite a long time now.

But ask yourself:
- is it the fastest card on the planet? The answer is yes.
- does it scale brutally well as the resolution increases, especially in SLI configurations? That's probably a yes again.
- as a high-end card, is it more power efficient than the competition? Yep, looks like it is.
- etc, etc.

It's like asking: can you buy a car which costs only a few hundred thousand dollars and performs nearly as well as cars costing a million? Yep, but they still sell million-dollar cars because there is a market for them.

There is a demand for the best, and some people just don't care about the money. New innovations and top-performance products were never cheap. If you can forget the price and look at what the card can do, this card currently just owns everything out there, period.
This is essentially the reason why the GTX 690 is the telltale card here for gauging its 'value'. I mentioned elsewhere that the 690 scales very well indeed for a dual-chip card. It's also quieter than the 680 and consumes a ridiculously low amount of power.
I'd even go so far as to argue that the engineering on the 690 is more impressive than the big-ass, old-school Nvidia approach of huge die sizes (a la Titan).

Given the SLI capabilities of the 690 (it works in pretty much every game of note), if the Titan card is not as good but costs more, then the Titan card is a bum deal. I know folk always argue that dual cards are not as good as single cards, but I've never seen that proven with the 690. On my 7970s, yes, CrossFire is not so proficient (still good though), but 690s have proven their position.

Titan ought to be cheaper than the 690 - then it works. It should cost way more than a 680 - it deserves to but it needs to hit the right spot.

Here's hoping.
#25
Zubasa
by: Fluffmeister
Sounds like a recipe for disaster to me. :laugh:
Quad-GPU scaling in general is a freaking disaster ;)