Saturday, March 7th 2015

NVIDIA GeForce GTX TITAN-X Pictured Up-close

Here are some of the first close-up shots of NVIDIA's new flagship graphics card, the GeForce GTX TITAN-X, outside Jen-Hsun Huang's Rafiki moment at a GDC presentation. If we were to throw in an educated guess, NVIDIA probably coined the name "TITAN-X" because it sounds like "Titan Next," much like it chose "TITAN-Z" because it sounds like "Titans" (plural, since it's a dual-GPU card). Laid out flat on a table, the card features a matte-black reference cooling solution that looks identical to the one on the original TITAN. Other cosmetic changes include a green glow inside the fan intake, the TITAN logo, and of course, the green glow on the GeForce GTX marking along the top.

The card lacks a back-plate, giving us a peek at its memory chips. The card features 12 GB of GDDR5 memory; looking at the twelve memory chips on the back of the PCB, with no other traces, we reckon the chip features a 384-bit wide memory interface. The 12 GB is achieved using twenty-four 4 Gb chips, twelve on each side of the PCB. The card draws power from a combination of 8-pin and 6-pin power connectors. The display I/O is identical to that of the GTX 980, with three DisplayPorts, one HDMI, and one DVI. Built on the 28 nm GM200 silicon, the GTX TITAN-X is rumored to feature 3,072 CUDA cores. NVIDIA's CEO claimed that the card will be faster than even the company's previous-generation dual-GPU flagship, the GeForce GTX TITAN-Z.
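As a quick sanity check on those memory figures, here is a sketch of the arithmetic, assuming standard GDDR5 clamshell wiring in which each chip has a 32-bit interface and two chips share one channel (the wiring details are our assumption, not confirmed by NVIDIA):

```python
# Back-of-the-envelope check of the article's memory figures.
# Assumes standard GDDR5 wiring: each chip exposes a 32-bit interface,
# and in clamshell mode two chips share one 32-bit channel.
CHIP_DENSITY_GBIT = 4   # 4 Gb per chip, per the article
NUM_CHIPS = 24          # twelve visible on the back, twelve on the front
CHIP_WIDTH_BITS = 32    # per-chip interface width

capacity_gb = NUM_CHIPS * CHIP_DENSITY_GBIT / 8    # gigabits -> gigabytes
bus_width = (NUM_CHIPS // 2) * CHIP_WIDTH_BITS     # clamshell pairs share a channel

print(capacity_gb, bus_width)  # 12.0 GB on a 384-bit bus
```

Both numbers line up with what the PCB shows: 12 GB of memory on a 384-bit interface.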
Source: MaximumPC

85 Comments on NVIDIA GeForce GTX TITAN-X Pictured Up-close

#51
qubit
Overclocked quantum bit
GhostRyderHmm, I wish they would ditch the DVI port on the card and stick with just the DP's and HDMI. Still the best single GPU output choices on the market though with those 3 DP's and an HDMI...


Indeed, though I think the problem with the first round was more that the Titan was viewed by most as the new top-end GTX X80 card, and the purchase was expected to be more satisfying than it ended up being. I mean, who saw that the GTX 780 would be so close in performance for much cheaper, or that the 780 Ti would best it by a good margin, and then another Titan would come along besting the previous one? Even after people learned the Titan was not the full die, it was one of those wtf moments when the other three cards came out for much better prices and better performance, especially if you factor in that later people could get 6 GB 780s.

But whatever, I personally would not buy one except second hand if I really decided I needed a trio unless the price fell more in line. Though I think the best part will be seeing the benches and then being able to compare it to what could later come out as the 1080.
Exactly. History is going to repeat itself with this card. I'm not surprised NVIDIA is doing this again, because the last ones sold so well so why shouldn't they? I know the GTX version is gonna be awesome. I'm also quite keen to see what numbering scheme they give it now that they've used up all the single digits. Why on earth did they skip the 8xx series?!

I think the only real justification for this card is for people that want the uncrippled compute functionality. Or if someone well off just wants it. ;)
Posted on Reply
#52
arbiter
btarunrUnlikely. 1.3 would mean FreeSync support. NVIDIA will drag on to 1.2 until resolution hikes make it obsolete.
Wrong. Adaptive-Sync is an optional part of the 1.2a spec. It's not required for a device to be certified as 1.3.
Posted on Reply
#53
renz496
btarunrUnlikely. 1.3 would mean FreeSync support. NVIDIA will drag on to 1.2 until resolution hikes make it obsolete.
isn't that Adaptive Sync still optional for DP 1.3? which means nvidia can still be fully compliant with DP 1.3 without supporting Adaptive Sync.
Posted on Reply
#54
arbiter
renz496isn't that Adaptive Sync still optional for DP 1.3? which means nvidia can still be fully compliant with DP 1.3 without supporting Adaptive Sync.
Yeah, it's an optional part of DisplayPort. They can't really require it, at least for now, because using it requires a scaler chip in the monitor.
Posted on Reply
#55
the54thvoid
Intoxicated Moderator
qubitI think the only real justification for this card is for people that want the uncrippled compute functionality. Or if someone well off just wants it. ;)
Maxwell is compute crippled as far as DP is concerned. Kepler has been respun for GK210 I think. Kepler is holding the compute ground for NV.
This may well be a thoroughbred gaming card, if so it may be cheaper than we all think. Without the compute benefit of Kepler it would be unwise for Nvidia to sell at the same point as Titans before.
The 12gb is a worrying sign though.
Posted on Reply
#56
radrok
the54thvoidMaxwell is compute crippled as far as DP is concerned. Kepler has been respun for GK210 I think. Kepler is holding the compute ground for NV.
This may well be a thoroughbred gaming card, if so it may be cheaper than we all think. Without the compute benefit of Kepler it would be unwise for Nvidia to sell at the same point as Titans before.
The 12gb is a worrying sign though.
I'd love these to be no more than 750$ a pop, would make 3way justifiable in some way.

All I need is CUDA cores and lots of VRAM for what I do. DP is not needed, at least by me :toast:
Posted on Reply
#57
Chaitanya
mroofiewhat type of compute ??


But ... But dat matte pcb :)
Mostly database-related compute (data mining for a client's project before deploying the software to the cloud) is done on GPU at my workplace.
Posted on Reply
#58
buildzoid
MxPhenom 216Underpowered based on what? Remember it is Maxwell, so it will have lower power requirements than the last Titan(s).
Rumors are that this will have a 250 W TDP. And Nvidia has been putting underpowered VRMs on every card since the GTX 590. The 680 VRM dies at 1.45 V, the 780 and Titan die at 1.5 V, the 780 Ti and Titan Black IDK. The 980 VRM is only OK because it couldn't get much cheaper. But for cards that pull over 200 W, Nvidia has been skimping on VRMs.
Posted on Reply
#59
Animalpak
Outstanding looks !!

Whoever buys this graphics card should leave it just as it is.

The green LED and the black color are really beautiful.

What a shame it would be to strip it down and put a waterblock on it.
Posted on Reply
#60
qubit
Overclocked quantum bit
the54thvoidMaxwell is compute crippled as far as DP is concerned. Kepler has been respun for GK210 I think. Kepler is holding the compute ground for NV.
This may well be a thoroughbred gaming card, if so it may be cheaper than we all think. Without the compute benefit of Kepler it would be unwise for Nvidia to sell at the same point as Titans before.
The 12gb is a worrying sign though.
Hey, you could well be right. We won't know for sure exactly what this card offers until the reviews come out and I'm on tenterhooks to find out!
Posted on Reply
#61
eroldru
Inspired by gamers, never affordable for gamers!
Posted on Reply
#62
radrok
buildzoidRumors are that this will have a 250 W TDP. And Nvidia has been putting underpowered VRMs on every card since the GTX 590. The 680 VRM dies at 1.45 V, the 780 and Titan die at 1.5 V, the 780 Ti and Titan Black IDK. The 980 VRM is only OK because it couldn't get much cheaper. But for cards that pull over 200 W, Nvidia has been skimping on VRMs.
While I agree, the same VRMs we are talking about are still withstanding 1.4v daily for gaming since the voltage unlock in afterburner came out. Been running 1500 MHz easily and reliably, maybe they aren't THAT bad after all.

I used to bash that same power circuitry a lot, probably some members here recall that as well :toast:
qubitHey, you could well be right. We won't know for sure exactly what this card offers until the reviews come out and I'm on tenterhooks to find out!
Damn right qubit (about the tenterhooking :) )! Can't wait to see reviews too! Maybe they are holding off until Nvidia's GTC which is due soon this month.

www.gputechconf.com/ March 17-20
Posted on Reply
#63
qubit
Overclocked quantum bit
radrokDamn right qubit (about the tenterhooking :) )! Can't wait to see reviews too! Maybe they are holding off until Nvidia's GTC which is due soon this month.

www.gputechconf.com/ March 17-20
Wow I'd have loved to go to that.
Posted on Reply
#64
buildzoid
radrokWhile I agree, the same VRMs we are talking about are still withstanding 1.4v daily for gaming since the voltage unlock in afterburner came out. Been running 1500 MHz easily and reliably, maybe they aren't THAT bad after all.

I used to bash that same power circuitry a lot, probably some members here recall that aswell :toast:
I'm pretty sure Nvidia's 6+2 phase is specced for 240 A up to 300 A, because the VRAM eats up a surprising amount of power (IIRC ~40 W), so the 238 W stock Titan really only puts ~190 W to the core before efficiency loss (90% for switching DC-DC). That means the VRM only deals with ~171 W at 1.15 V, so only about 150 A at stock. Clock-to-power scaling is not 1:1 and really depends on the chip, but it's between 1:0.8 and 1:0.5. So 1.4 V * (148 A / 1.15 V) * (1 + (1500/993 - 1) * 0.6) gives you about 235 A, so it's no surprise the VRM has held up, since that's below the spec of the MOSFETs. But give the GPU 1.5 V at the same clocks and the VRM dies, since it has to push 252 A to the core, which it isn't built for. I also read of a case where someone's card died after a couple weeks at 1.45 V, which also makes sense since that's around 243 A, just over the spec. Also, I based all my calculations on the peak power draw figure from W1zz's review instead of the average power draw figure like I should have, but you get the idea.
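The arithmetic in the post above can be sketched as a small function (a rough model built entirely on the poster's own assumptions: 148 A at 1.15 V and 993 MHz as the stock reference point, and a 0.6 clock-to-power scaling factor; none of these are measured values):

```python
def core_current(v_new, mhz_new, v_ref=1.15, amps_ref=148.0,
                 mhz_ref=993.0, clock_power_scale=0.6):
    """Estimate core current: scale the reference current linearly with
    voltage, and partially (factor 0.6) with the clock increase."""
    clock_factor = 1 + (mhz_new / mhz_ref - 1) * clock_power_scale
    return amps_ref * (v_new / v_ref) * clock_factor

print(round(core_current(1.40, 1500)))  # ~235 A, below the assumed ~240 A spec
print(round(core_current(1.50, 1500)))  # ~252 A, over the assumed spec
print(round(core_current(1.45, 1500)))  # ~244 A, right around the limit
```

The model reproduces the figures quoted above, which is all it is meant to do.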
Posted on Reply
#65
MxPhenom 216
ASIC Engineer
buildzoidI'm pretty sure Nvidia's 6+2 phase is specced for 240 A up to 300 A, because the VRAM eats up a surprising amount of power (IIRC ~40 W), so the 238 W stock Titan really only puts ~190 W to the core before efficiency loss (90% for switching DC-DC). That means the VRM only deals with ~171 W at 1.15 V, so only about 150 A at stock. Clock-to-power scaling is not 1:1 and really depends on the chip, but it's between 1:0.8 and 1:0.5. So 1.4 V * (148 A / 1.15 V) * (1 + (1500/993 - 1) * 0.6) gives you about 235 A, so it's no surprise the VRM has held up, since that's below the spec of the MOSFETs. But give the GPU 1.5 V at the same clocks and the VRM dies, since it has to push 252 A to the core, which it isn't built for. I also read of a case where someone's card died after a couple weeks at 1.45 V, which also makes sense since that's around 243 A, just over the spec. Also, I based all my calculations on the peak power draw figure from W1zz's review instead of the average power draw figure like I should have, but you get the idea.
I don't know about you, but if you are pushing a card that much on a reference design, you should probably just get a non-reference design with beefed-up power delivery. That is kind of why board partners make those kinds of cards; however, that is a bit harder to do with a Titan. Being an enthusiast GPU, it would make a whole lot more sense if Nvidia beefed it up to something like an 8+2 with higher-quality components, but this is Nvidia we are talking about.
Posted on Reply
#66
buildzoid
MxPhenom 216I don't know about you, but if you are pushing a card that much on a reference design, you should probably just get a non-reference design with beefed-up power delivery. That is kind of why board partners make those kinds of cards; however, that is a bit harder to do with a Titan. Being an enthusiast GPU, it would make a whole lot more sense if Nvidia beefed it up to something like an 8+2 with higher-quality components, but this is Nvidia we are talking about.
Well, AMD strapped a 350 A VRM to the reference R9 290X, and while its voltage accuracy is kinda awful, it survives up to 1.7 V.
Posted on Reply
#67
HumanSmoke
qubitHey, you could well be right. We won't know for sure exactly what this card offers until the reviews come out and I'm on tenterhooks to find out!
Maxwell seems to scale proportionally to resources and clocks, so the final numbers shouldn't be too hard to guess. A GTX 960 is basically half a GTX 980 (at roughly the same clocks), and the 960's performance is roughly half of the 980's (52% based on W1zz's 4K chart, a resolution which would eliminate CPU limitation).
GM200 should be a GM204 + GM206 in most respects, so ~50% over the GTX 980 given roughly equal clocks; a little less in CPU-bound situations, more in VRAM-bound situations, and a little greater still if using 8 GHz/8 Gbps memory ICs rather than the 7 GHz chips of the 980/960.

                    GM 204    GM 206    Total     Publicized GM 200 spec
Cores               2048      1024      3072      3072
SMs                 16        8         24        24
Bus width (bit)     256       128       384       384
L2 (MB)             2         1         3         3
ROPs                64        32        96        96
TMUs                128       64        192       ?
Die size (mm²)      398       228       ~580*     est. 600

* 398 + (228 minus the ~20% already included in GM 204: PHY, command processor, thread dispatchers, display out, PCI-E interface, SLI interface, NVENC)
Posted on Reply
#68
MxPhenom 216
ASIC Engineer
buildzoidWell, AMD strapped a 350 A VRM to the reference R9 290X, and while its voltage accuracy is kinda awful, it survives up to 1.7 V.
Those cards also consume more power out of the box, but yeah, AMD does some solid power delivery on their reference cards. Also, the HBM that will be on the 390 and 390X makes me interested in jumping ship if the price is right. Not that I really need to; I have a pretty damn good 780, I can clock it to the sky and not look back. I guess that's what happens when you buy parts on launch day, you always get the better-clocking parts. I don't push it higher than 1.3 V though.
Posted on Reply
#69
radrok
I'll be so sad if there won't be any voltage tweak :( I'm happy with softmods too!
Posted on Reply
#70
Vlada011
The Titan X looks amazing. AMD is not capable of launching such cards, maybe only 1-1.5 years after.
Only, without a backplate the back side looks really empty and much cheaper.
Posted on Reply
#71
Bytales
I'm more interested in SLI performance. Supposedly one is faster than a single Titan-Z, so what scores will two of them yield? I only have room for two, and will get two.

Hopefully EVGA will make a custom model with a waterblock already attached, single slot, 2x 8-pin connectors, and highly overclocked.
Posted on Reply
#72
64K
BytalesI'm more interested in SLI performance. Supposedly one is faster than a single Titan-Z, so what scores will two of them yield? I only have room for two, and will get two.

Hopefully EVGA will make a custom model with a waterblock already attached, single slot, 2x 8-pin connectors, and highly overclocked.
If one Titan X is a little faster than a Titan Z then two Titan X should be able to handle even 4K very well. I am hoping Nvidia will release the counterpart Maxwell to the GTX 780Ti and then I will pick that up so long as it's not over $700. $700 was the release price of the 780Ti so I guess that's reasonable as far as Nvidia prices.

I didn't pay much attention to the Kepler Titans, but didn't Nvidia disallow non-reference coolers on the Titan/Titan Black? They may do the same thing again with the Maxwell Titans.
Posted on Reply
#73
radrok
Nvidia allowed swapping the cooler for watercooling; I remember the Hydro Copper Titan was a thing.

They didn't allow custom PCBs for sure, though.
Posted on Reply
#74
MxPhenom 216
ASIC Engineer
Vlada011The Titan X looks amazing. AMD is not capable of launching such cards, maybe only 1-1.5 years after.
Only, without a backplate the back side looks really empty and much cheaper.
wow......

For how much this card will cost, it had better come with a damn backplate. Also, I am pretty sure AMD has released more cards with backplates out of the box than Nvidia.
Posted on Reply
#75
qubit
Overclocked quantum bit
MxPhenom 216wow......

For how much this card will cost, it had better come with a damn backplate. Also, I am pretty sure AMD has released more cards with backplates out of the box than Nvidia.
Yeah, it gets me how they can release such expensive cards and then skimp on a 10 pence backplate.

For example, my MSI GTX 780 Ti GAMING cost me £500 but still doesn't have a backplate.
Posted on Reply