
NVIDIA GeForce Titan X 12 GB

This is interesting.

Hexus gets a 20% OC boost in performance.... No voltage and similar cocks to W1zzard.

Not the best game but an idea of performance.

[image: 26.png]



then there's this:

[image: 72552.png]
 
This looks like a gaming card as double precision is not fully enabled. $1000 is way too much for a gaming card.
 
lol fail .. I was playing with the System Specs feature (to add mouse and keyboard) and forgot to change it back

Yup, that's totally what happened. What he said.
 
Oh well, not the card that will make me ditch my 290... like with the 980, the price is presumptuous for what it gives over that one. I still get 80+ FPS in most of my games (on high/ultra). Let's wait for the 390X to see what the next move will be :D (or even the 380X; I suspect AMD will just refine the 290X and spin the 380X off from it, because it can still compete.)
 
I'm so pleased you included 970 SLI in the performance charts. I'm also so glad I went with SLI 970s...
 
I'm so pleased you included 970 SLI in the performance charts. I'm also so glad I went with SLI 970s...

Yep, that's a solid alternative. :toast:
 
Wow, non-stock Titan X cards this round.

I wouldn't get my hopes up; EVGA was allowed to use the REFERENCE PCB to make a Hydro Copper version of the original Titan too.

It will probably be the only instance of a non-reference-cooled Titan X.

I HOPE I'm proven wrong though.
 
That game is an Ubislop mess of a PC port. :shadedshu:

Yeah I agree, and it's equally messy on consoles too. Wonder if that one will ever be fixed. I certainly hope so, because graphics fidelity in that game is quite gorgeous.
 
Hmm, still not quite 4K ready. ~40 FPS isn't bad, but not quite there.

Like I keep saying: this gen is 1440p single card, 4K dual card. This gen at most was going to get us 75% of the way there (3200x1800 or slightly less?). Single-chip 4K will be 14 nm/16 nm... conceivably even 2nd gen of those (first gen might be shrinks/memory consolidation through 2nd-gen HBM, etc.).

It's also why AMD doesn't need to be as fast, just consistent across different titles. They literally just need to stay above 60 FPS at 1440p in most worst-case scenarios and 30+ at 4K (whatever the scaling generally turns out to be on average... effectively 60 in a dual-card config), even if through overclocking, because in practice that is all that matters. The question most people ask themselves (as most don't use adaptive sync) is whether it will generally stay above 30/60 FPS (or, put in frame times, below 33.3/16.7 ms) and at what price, not whether it will do 38 vs. 35 FPS. This is NVIDIA's GPU that mostly can (and through overclocking it seems will almost always) accomplish that.
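As a quick reference for the fps/frame-time equivalence above, here's a minimal sketch (just the 1000/fps arithmetic, nothing measured or card-specific):

```python
# Frame time in milliseconds for a given average frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# The thresholds discussed above (30/60 FPS), plus the ~40 FPS
# this card lands around at 4K in the review charts.
for fps in (30, 40, 60):
    print(f"{fps} FPS -> {frame_time_ms(fps):.2f} ms per frame")
# 30 FPS -> 33.33 ms, 40 FPS -> 25.00 ms, 60 FPS -> 16.67 ms
```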

Also,

Big thanks to W1zzard for this:

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/33.html

It pretty much proves my point: 8 GB is generally the sweet spot (conceivably for the life of the current consoles). That said, I respectfully disagree with his '640K/4 GB is enough for everybody' conclusion. I think more and more games will push up against that wall, or at least the 6 GB one as Mordor does (scaling from a 900p XB1/1080p PS4 game), as the chart clearly shows which titles consume the most memory: AC, Mordor, COD (even if busted), Ryse, W_D, DR3. What do these games have in common? If you said they were developed primarily towards the memory architecture of the PS4/XB1... you'd be right. That higher memory usage is not going to change, and it will only become more apparent as creators not only target trickier uses of the console memory architecture to squeeze performance out of those otherwise-limited machines, but also start targeting lower resolutions on consoles, which will require greater texture scaling on PC.

For example, Mordor is currently 1080p on PS4 and 900p on XB1. What if next time they targeted 720p on Xbox and 900p on PS4 (which I think will become a very common occurrence for multiplats as time passes, since it allows the XB1 to still remain 'HD')? You end up with 8 GB making a lot more sense, if not the most sense. And while that may occur (worst-case scenario), the compute power of the consoles never changes, so performance scaling in that respect is a known quantity... hence why cards are aimed at the metrics outlined in my top-most comment.
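To put rough numbers on that render-resolution argument, here is a quick sketch of the raw pixel-count ratios between the console targets and PC resolutions mentioned above (pixel counts only; actual VRAM use obviously depends on textures, buffers, and engine choices):

```python
# Pixel counts for the resolutions discussed above, and how much larger
# a PC render target is than a console one (raw pixel-count ratio only).
resolutions = {
    "720p":  (1280, 720),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for console in ("720p", "900p", "1080p"):
    for pc in ("1440p", "4K"):
        print(f"{pc} is {pixels[pc] / pixels[console]:.1f}x the pixels of {console}")
# e.g. 4K is 4.0x the pixels of 1080p and 5.8x the pixels of 900p
```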

At any rate, I was primarily eyeing the upcoming baby brother to this, because I assumed (and rightfully so, it appears) that Mordor would consume just under 5.5 GB at a higher resolution (given the memory use I see before the game starts doing that weird hiccup thing), and that would make sense for many more games with a similar mentality on console (900-1080p). I think it also verifies that 3.5 GB is indeed an issue for that game at some resolutions where 4 GB would not be, but also that 4 GB could be an overall limiting factor where core performance may not be (4 GB 390X?), even if it is more consistent because it avoids the weird switching-to-a-separate-memory-controller issues.

TLDR: If you could continue to add memory usage as more titles are released, it would be most appreciated. I would be willing to bet people would LOVE to know what Witcher will demand at various resolutions...

As always, appreciate the thorough review with nice graphs. The only thing I would mention: please don't forget us over-volters (at least for one SKU based on a given chip), although I can tell it's become less of a priority. The clock-scaling/voltage/leakage graphs you used to do on new parts were REALLY awesome/helpful. I hope that doesn't get left behind completely.
 
please don't forget us over-volters
There is no software voltage control because the voltage controller has no I2C interface, so I guess I'd have to BIOS mod or solder mod, and then try not to blow up my card :D
 
4K users, here is your warhorse.

There could be a 980 Ti at this point with 8 GB of RAM.

I think the RAM is what makes the price so high, and of course the TITAN name.
 
There is no software voltage control because the voltage controller has no I2C interface, so I guess I'd have to BIOS mod or solder mod, and then try not to blow up my card :D

Would it be possible to force a set voltage on the controller, like it was with the OG Titan?
 
Would it be possible to force a set voltage on the controller, like it was with the OG Titan?
No, that controller had I2C, so you could talk to it with software. This one is dumb and just has VID lines going to it, which form a bit pattern that defines the voltage.
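To picture what those VID lines amount to, here is a minimal sketch of a bit pattern selecting a voltage; the pin count, base voltage, and step size are invented purely for illustration and are not the NCP81174's actual VID table:

```python
# Hypothetical illustration of parallel VID pins selecting a voltage.
# Because the bit pattern is set in hardware rather than over a bus like
# I2C, software can't change it -- hence the talk of BIOS/solder mods.
VID_BASE_V = 0.80    # assumed voltage for VID code 0 (made up)
VID_STEP_V = 0.0125  # assumed volts per VID step (made up)

def vid_to_voltage(pins: list[int]) -> float:
    """pins is the VID bit pattern, most significant bit first."""
    code = 0
    for bit in pins:
        code = (code << 1) | bit
    return VID_BASE_V + code * VID_STEP_V

# e.g. 0b10110 = 22 -> 0.80 + 22 * 0.0125 = 1.075 V
print(vid_to_voltage([1, 0, 1, 1, 0]))
```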
 
Oh, I forgot. What is the video decode/encode support on this card? Same as GM206 or GM204?
 
No, that controller had I2C, so you could talk to it with software. This one is dumb and just has VID lines going to it, which form a bit pattern that defines the voltage.

Assuming that's because NVIDIA is now marching towards no overvolting on any cards, so they can get away with making the cheapest possible VRM section. It'll all have to be done by AIBs, if they get a chance to touch the Titan X (which I doubt).

Oh, I forgot. What is the video decode/encode support on this card? Same as GM206 or GM204?

NVENC encode, including HEVC. For decode, I'm assuming PureVideo VP6 or VP7 (likely VP7, like the 960), with the ability to decode HEVC.
 
Costly, I would even say far too expensive depending on the results and production costs. The results are good, but I would not point these cards at 4K gameplay. 4K will therefore require SLI, and of course PhysX needs its own GPU, because we know that if one of the two cards is set aside for PhysX you really need 3 GPUs. This could have been solved long ago in software. Something for the current fad, I would say, given where 20 nm stands.
So: too expensive, performance below what was desired and expected, badly cooled, and little OC headroom. I would say it is more profitable to buy a GTX 980 and max the OC. :( For the rich and stupid. I would not buy one unless it was necessary and I was very rich.
 
Hmm, Maxwell's mojo didn't seem to scale... It's odd: the cooling solution should have been plenty to let it run free, but it seems like they rein in the clocks to curtail power consumption (aka heat). Isn't that what the thermal camera suggests? Then the "matches the Radeon R9 290X noise"... ouch!

As W1zzard said, "So what's all the fuss about? Roughly 50% more number-crunching muscle as the GTX 980, a 50% wider memory bus, and three times more memory." And from all that it gains a 23% FPS increase at 1440p (not earth-shattering), while the average/peak power numbers are up something like 42%/32% respectively. The impression I'm left with is that the excess memory is contributing heat and power draw it doesn't make use of.
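Running those quoted figures through a quick perf-per-watt check (this is just arithmetic on the percentages above, not a new measurement):

```python
# Rough perf-per-watt comparison using the percentages quoted above.
perf_gain = 1.23         # ~23% higher FPS at 1440p vs GTX 980
avg_power_gain = 1.42    # ~42% higher average power draw
peak_power_gain = 1.32   # ~32% higher peak power draw

print(f"Perf/W vs GTX 980 (average power): {perf_gain / avg_power_gain:.2f}x")
print(f"Perf/W vs GTX 980 (peak power):    {perf_gain / peak_power_gain:.2f}x")
# ~0.87x and ~0.93x respectively, i.e. somewhat worse efficiency at stock.
```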

Is it just me, or is it interesting that the Titan uses Hynix memory while the reference GTX 980 came out with Samsung modules? Wonder if that could be contributing to the memory running hot under W1zzard's thermal camera?

Almost like too much of a good thing?
 
I don't agree with the criticism about VRAM. 12 GB is a good amount for the targeted audience (cheap Quadro, raytracing), and as for the criticism towards COD... that it never uses that portion of the data again... hell, free RAM = wasted RAM.

Overall, the card is a disappointment, as expected... this should be the current flagship at the 980's price, not a cut-down, second-tier, historically MX-like card... everything has gone bonkers.
 
@the54thvoid
This is interesting.

Hexus gets a 20% OC boost in performance.... No voltage and similar cocks to W1zzard.

Not the best game but an idea of performance.

Oh man, soda out my nose :toast:.
 
@W1zzard you made a mistake with the controller's name. There is no NCP8114; it's an NCP81174. I think the same controller is used on the MSI Gaming GTX 980 and GTX 970, and I know that one of those has a Vmod guide using the FB pin. The good news is that this VRM is the same oddity that the GTX TITAN BE and GTX 780 Ti had, so it should be mostly OK up to 1.35 V.

Here's the datasheet

Looks simple enough to mod. Cut the ILIM and put a VR (variable resistor) on the VFB pin. If you aren't that hardcore, you can short the shunt resistors near the PCI-e power connectors using CLU (because you can remove it later if you want), and that will also disable the power limit.
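For anyone wondering why a trimmer on the VFB pin raises the voltage, here is the generic feedback-divider math behind that kind of mod; the reference voltage and resistor values below are placeholders, not taken from the NCP81174 datasheet or the Titan X board:

```python
# Generic feedback-divider sketch behind a VFB-pin volt mod. All values
# are assumed for illustration; the point is only to show why paralleling
# a variable resistor onto the bottom leg of the divider raises Vout.
def parallel(r1: float, r2: float) -> float:
    return (r1 * r2) / (r1 + r2)

def vout(vref: float, r_top: float, r_bottom: float) -> float:
    # The regulator holds the FB node at vref, so Vout = vref * (1 + Rtop/Rbottom).
    return vref * (1 + r_top / r_bottom)

VREF  = 0.6      # assumed internal reference, volts
R_TOP = 10_000   # assumed upper divider resistor, ohms
R_BOT = 10_000   # assumed lower divider resistor, ohms

stock  = vout(VREF, R_TOP, R_BOT)
modded = vout(VREF, R_TOP, parallel(R_BOT, 50_000))  # 50 kOhm trimmer added
print(f"stock: {stock:.3f} V, with trimmer: {modded:.3f} V")
# stock: 1.200 V, with trimmer: 1.320 V
```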
 
I'm so pleased you included 970 SLI in the performance charts. I'm also so glad I went with SLI 970s...
Hell, I'm still running two 670s in SLI at 1080p and have yet to see a reason to upgrade. Every game maxed out runs above 60 FPS. If I ever upgrade monitors, MAYBE I might need a new GPU, but this is yet another generation I can skip and still max out everything.
 