Monday, February 18th 2013

NVIDIA GeForce GTX Titan Graphics Card Pictured in Full

Here it is, folks: the first pictures of NVIDIA's newest pixel-crunching dreadnought, the GeForce GTX Titan. Pictures leaked by various sources east of the Greenwich Meridian reveal a reference board design that's similar in many ways to that of the GeForce GTX 690, thanks to the magnesium-alloy cooler shroud, a clear acrylic window letting you peep into the aluminum fin stack, and a large lateral blower. The card features a glowing "GeForce GTX" logo much like the GTX 690, draws power from a combination of 6-pin and 8-pin PCIe power connectors, and features two SLI bridge fingers, letting you pair four of them to run 3DMark Fire Strike as if it were a console port from last decade.

The GeForce GTX Titan PCB reveals that NVIDIA isn't using a full-coverage IHS on the GK110 ASIC, but rather just a support brace. This allows enthusiasts to apply TIM directly to the chip's die. The GPU is wired to a total of twenty-four 2 Gbit GDDR5 memory chips, twelve on each side of the PCB. The card's VRM appears to be a 6+2-phase design that uses tantalum capacitors, slimline chokes, and driver-MOSFETs. The PCB features a 4-pin PWM fan power output, and a 2-pin LED logo power output that's software controllable.
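The memory math here is easy to sanity-check. A minimal sketch, using only the chip count and the 2 Gbit density stated above:

```python
# Back-of-the-envelope check of the Titan's memory configuration,
# using the figures from the paragraph above.
chips = 24               # twelve 2 Gbit GDDR5 chips per side of the PCB
gbit_per_chip = 2        # stated device density
total_gbit = chips * gbit_per_chip
total_gb = total_gbit / 8  # 8 bits per byte
print(total_gb)            # 6.0, i.e. 6 GB of memory
```

This lines up with the 6 GB figure in the rumored specifications.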

Given the rumored specifications of the GTX Titan, the card could be overkill even for 2560 x 1600, and as such could be designed for 3DVision Surround (three-display) setups. Display outputs include two dual-link DVI, an HDMI, and a DisplayPort.

According to most sources, the card's specifications look something like this:
  • 28 nm GK110-based ASIC
  • 2,688 CUDA cores ("Kepler" micro-architecture)
  • 224 TMUs, 48 ROPs
  • 384-bit GDDR5 memory interface
  • 6 GB memory
  • Clocks:
    o 837 MHz core
    o 878 MHz maximum GPU Boost
    o 6008 MHz memory
  • 250W board power
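The memory interface and clock figures above imply a theoretical bandwidth. A quick sketch, assuming 6008 MHz is the effective (data) rate GDDR5 is usually quoted at:

```python
# Theoretical memory bandwidth implied by the rumored specs above.
bus_width_bits = 384       # 384-bit GDDR5 memory interface
effective_mhz = 6008       # effective data rate, as GDDR5 is usually quoted
bytes_per_transfer = bus_width_bits / 8
bandwidth_gb_s = bytes_per_transfer * effective_mhz * 1e6 / 1e9
print(round(bandwidth_gb_s, 1))  # 288.4 GB/s
```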
Sources: Egypt Hardware, VideoCardz

118 Comments on NVIDIA GeForce GTX Titan Graphics Card Pictured in Full

#2
okidna
by: btarunr
and features two SLI bridge fingers letting you pair four of them to run 3DMark Fire Strike as if it were a console port from last decade.
:roll::roll::roll::roll:

Neat design just like GTX 690, love it :toast:
#3
Rei86
Review, WIZZ, COME ON!!

NDA should be lifted
#4
Crowned Clown
At last! Only thing left to start my new build is Ivy B-E! :toast:
#5
Flibolito
That is beautiful, wow. I wish 4K monitors were available; I would hop on one of these.
#6
NutZInTheHead
That's an awesome looking card. I hope the price is reasonable.
#7
Enmitynz
The single sexiest card I've ever seen. Where is the insert hole for peen? I wanna fuck this burrito!
#8
Delta6326
Now to have this in the Case Labs S3 with custom water...
#9
Animalpak
Ladies and gentlemen... THIS IS OUR METEOR!!

Pure show of strength!
#10
renz496
The look alone looks expensive, lol. So I suppose this thing should compete head-to-head with the 7970 (which is also rated at 250 W)? :p
#11
xenocide
by: renz496
The look alone looks expensive, lol. So I suppose this thing should compete head-to-head with the 7970 (which is also rated at 250 W)? :p
It should crush the HD7970.
#13
The Von Matrices
by: KainXS
review eta wizz?
If he even acknowledged that the card existed, he would be breaking the NDA.
#14
zolizoli
I am really worried about the price... NVIDIA is my favorite forever, but lately they've become GIGA GREEDY. They do the same as Apple: nice design, low-cost hardware, gigantic marketing = astronomical price.
I suspect it will cost the same as a 690 but be slower, and if that's so there is no point in buying it. I don't care about a bit of micro-stuttering or 2 GB less RAM. I'm not a molecular scientist, I'm just an average gamer who loves tech...

That's how it goes nowadays in NVIDIA's lab:
TECHNICIAN: Hey boss, we have a large amount of GK110 chips in storage from the Tesla program. Should I throw them out in the container? We need the space for the next project.
BOSS: No way. There are lots of stupid gamers out there. Call the design department to make a fancy name and design for this garbage, and let's make some cash. If you can pull it off, I'll make you my personal coffee maker!
TECHNICIAN: You got it, BOSS.
#15
Mathragh
The VRMs don't look thát beefy. Does anyone know how this compares to a 7970?
#16
btarunr
Editor & Senior Moderator
by: Mathragh
The VRMs don't look thát beefy. Does anyone know how this compares to a 7970?
It looks sufficient for 250W.
#17
SIGSEGV
Let NVIDIA roll this card out in the market and suck money out of the people who always want the fastest thing in their life. I'd rather save my money and buy a PS4 console in the future. I'm done with NVIDIA.

Anyway, its design looks cool and sweet. It will surely attract many ants to this sugar candy. :laugh:
#18
Mathragh
by: btarunr
It looks sufficient for 250W.
Not trying to bash the card or anything, just curious: if this is to be such a monster card, shouldn't it also have a monster VRM, or at least quite a beefy one? If I were NVIDIA bringing out "the card of cards," I'd make sure that when people go all crazy with this card, the problem won't be the VRMs.

I'm not saying it has bad VRMs, but I'm wondering if anyone with knowledge of these things has anything to say about the VRMs we can see in the pictures.
The only thing that really got my attention is that apparently they didn't feel the need to fill all the space reserved for the VRMs, as there appear to be 2 (or 1.5) empty spots.
I could be totally wrong, however :) :toast:
#19
james888
Aren't 680s voltage-locked/limited in some way? It would be funny if they did that here.
#20
okidna
by: Mathragh
Not trying to bash the card or anything, just curious: if this is to be such a monster card, shouldn't it also have a monster VRM, or at least quite a beefy one? If I were NVIDIA bringing out "the card of cards," I'd make sure that when people go all crazy with this card, the problem won't be the VRMs.

I'm not saying it has bad VRMs, but I'm wondering if anyone with knowledge of these things has anything to say about the VRMs we can see in the pictures.
The only thing that really got my attention is that apparently they didn't feel the need to fill all the space reserved for the VRMs, as there appear to be 2 (or 1.5) empty spots.
I could be totally wrong, however :) :toast:
If I'm not wrong (I'm no "VRM expert" at all), Titan is using a 6+2 VRM config.
More or less the same as the reference design of the 7970 (note that AMD also did not fill all the space reserved for the VRMs).

These two cards also use the same power connector design, 6+8 pin. So IMHO, if this kind of VRM setup is sufficient for the HD 7970, it should be sufficient for Titan.

But just like you said, I could be totally wrong :)
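A quick back-of-the-envelope check of that comparison, assuming the standard PCIe power limits per source (75 W from the slot, 75 W from a 6-pin, 150 W from an 8-pin):

```python
# Power budget available to a card with a 6+8-pin connector layout,
# versus the Titan's rumored 250 W board power.
slot_w, six_pin_w, eight_pin_w = 75, 75, 150  # PCIe spec maximums
available_w = slot_w + six_pin_w + eight_pin_w
board_power_w = 250
print(available_w, available_w >= board_power_w)  # 300 True
```

So the 6+8-pin layout leaves roughly 50 W of headroom over the rated board power, the same margin the HD 7970 works with.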
#21
Lionheart
I must admit, that is one nice looking card :eek:

Time to play the waiting game for Wizzard's review ;)
#22
Mathragh
by: okidna
If I'm not wrong (I'm no "VRM expert" at all), Titan is using a 6+2 VRM config.
More or less the same as the reference design of the 7970 (note that AMD also did not fill all the space reserved for the VRMs).

These two cards also use the same power connector design, 6+8 pin. So IMHO, if this kind of VRM setup is sufficient for the HD 7970, it should be sufficient for Titan.

But just like you said, I could be totally wrong :)
Makes sense :)
#23
buggalugs
by: zolizoli
I am really worried about the price... NVIDIA is my favorite forever, but lately they've become GIGA GREEDY.
Whaddya mean lately? NVIDIA has been that way for 15 years.
#24
Kovoet
Think I'll wait till next year and maybe just go crossfire again.
#25
zolizoli
by: buggalugs
Whaddya mean lately? NVIDIA has been that way for 15 years.
Not really. Even at the time of the GTX 280, it was affordable at release, and it was the fastest single chip for a while.
I think they turned the GREED ENGINE on with the GTX 500 series, which was just a refreshed 400 series.
The 680 is insanely priced; the GK104 wasn't even designed to be high-end. But it performed better than expected, so why not fool the customers and rip them off as long as they can, keep yesterday's tech on a shelf, and when the customers recover financially, sell it as the future's wonder tech?
These greedy corporations are holding back our technological evolution.