Tuesday, February 19th 2013

NVIDIA Announces GeForce GTX Titan, The Fastest GPU in the World

NVIDIA today introduced the new GeForce GTX TITAN, powered by the fastest GPU on the planet and designed to unleash the world's fastest gaming PCs, including personal gaming supercomputers and svelte, quiet, small-form-factor PCs.

"GeForce GTX TITAN is a beast of a GPU -- and the only one in the world powerful enough to play any game at any resolution at any time," said Scott Herkelman, general manager of the GeForce business unit at NVIDIA. "And yet, all of this immense power is housed in a sleek, sexy design, so gamers can also build beautifully-designed PC gaming machines about the size of a gaming console, yet magnitudes more powerful and always upgradeable."

GTX TITAN is built on the same NVIDIA Kepler architecture that powers Oak Ridge National Laboratory's newly launched Titan supercomputer, the No. 1 system on the Top500 list of the world's fastest supercomputers.

By harnessing the power of 3 GeForce GTX TITAN GPUs simultaneously in 3-way SLI mode, gamers can max out every visual setting without fear of a meltdown while playing any of the most demanding PC gaming titles.

Designed with unsurpassed craftsmanship, GeForce GTX TITAN features an array of innovative technologies complemented by sleek materials that contribute to the exotic design of the card, including a high-quality exterior aluminum frame and high efficiency vapor chamber cooling. Overall, GeForce GTX TITAN's aesthetic design evokes the spirit of a supercomputer and the enormous capability within: a blistering-fast GPU and astonishing graphics horsepower that is delivered with the power efficiency that only Kepler-class GPUs can provide.



With its advanced thermal and acoustic characteristics, GeForce GTX TITAN is also perfect for powering the new wave of small form-factor gaming PCs. So gamers no longer have to make the choice between performance and size -- they can have both at the same time.
"GeForce GTX TITAN will allow us to create the nearly-impossible product our customers have wanted for years: a ridiculously fast, tiny system that you barely know is running," said Kelt Reeves, CEO of Falcon Northwest.

The GeForce GTX TITAN:
  • Contains 7 billion transistors
  • Has 2,688 GPU cores -- 75% more than the company's GeForce GTX 680 GPU
  • Delivers 4.5 teraflops of single-precision and 1.3 teraflops of double-precision processing power
  • Supports new GPU Boost 2.0 technology, which automatically boosts graphics performance and offers unlocked voltage and advanced controls for even more gaming control and overclocking customization
  • Can be combined with additional GTX TITANs in SLI mode for even more performance
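The headline numbers in the list above can be sanity-checked with the usual peak-FLOPS arithmetic (cores × 2 FMA ops per clock × clock rate). A quick sketch, assuming the widely reported 837 MHz base clock and the GTX 680's 1,536 cores (neither figure appears in this release); note that GK110's core count is 2,688, which is exactly 75% more than the 680's:

```python
# Sanity check of the GTX TITAN spec-sheet claims.
# Assumptions not stated in the release: 837 MHz base clock,
# 1,536 cores on the GTX 680; TITAN's GK110 has 2,688 cores.

TITAN_CORES = 2688
GTX680_CORES = 1536
BASE_CLOCK_GHZ = 0.837  # assumed base clock

# "75% more cores than the GTX 680"
extra_cores = TITAN_CORES / GTX680_CORES - 1
print(f"{extra_cores:.0%} more cores")             # 75% more cores

# Single precision: one fused multiply-add (2 FLOPs) per core per clock.
sp_tflops = TITAN_CORES * 2 * BASE_CLOCK_GHZ / 1000
print(f"{sp_tflops:.1f} TFLOPS single precision")  # 4.5 TFLOPS single precision
```

The quoted 1.3 teraflops of double precision is likewise plausible for GK110, whose design pairs one double-precision unit with every three single-precision cores, running at a somewhat reduced clock.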
The GeForce GTX TITAN GPU will be available starting on February 25, 2013 from NVIDIA's add-in card partners, including ASUS and EVGA in North America, and additional partners, including Colorful, Galaxy, Gigabyte, INNO 3D, MSI, Palit and Zotac outside the US. Partner participation will vary by region. Pricing is expected to start at $999 USD.

GeForce GTX TITAN will also be sold in fully configured systems from leading U.S.-based system builders, including AVADirect, Cyberpower, Digital Storm, Falcon Northwest, Geekbox, IBUYPOWER, Maingear, Origin PC, Puget Systems, V3 Gaming, Velocity Micro, and other system integrators outside North America.

123 Comments on NVIDIA Announces GeForce GTX Titan, The Fastest GPU in the World

#1
Animalpak
by: Prima.Vera
You serious?? Half of the games I know have problems running not just on 4 but even on 3 cards... :eek:
I think he will use them for benchmarks, because for games it's the gold standard of overkill.
#3
Recus
by: T4C Fantasy
Nope, NVIDIA won't allow a shroud change on the 690 or Titan unless it's liquid cooled; they would get in trouble.
In the first batch, only ASUS and EVGA will launch the GeForce GTX Titan. This will be followed by Colorful, Gainward, Galaxy, Gigabyte, Inno3D, MSI and Palit with more models to come. NVIDIA did not forbid modifying the cards, meaning manufacturers are free to introduce custom models, maybe even with custom cooling.
http://videocardz.com/39721/nvidia-geforce-gtx-titan-released
#4
Jizzler
by: Prima.Vera
You serious?? Half of the games I know have problems running not just on 4 but even on 3 cards... :eek:
Sure. Because half is not all, not even when I convert to metric.
In the first batch, only ASUS and EVGA will launch the GeForce GTX Titan. This will be followed by Colorful, Gainward, Galaxy, Gigabyte, Inno3D, MSI and Palit with more models to come. NVIDIA did not forbid modifying the cards, meaning manufacturers are free to introduce custom models, maybe even with custom cooling.
That's nice. Maybe we can get quad DP models for triple 4K Surround Mode.
#5
Max Mojo
Nearly $1,000 and no backplate included. That's stingy. And what cheap packaging.
Obviously there's not a single woman working in the company ...

Watch LinusTech on YouTube
#6
zolizoli
If I want to pay 1,000 euros (because that's what it's going to cost in the EU) for a SUPER magnesium-cased AWESOME design, then I'll buy jewelry and not a GPU with just a minor performance boost over last gen.
I would say: NVIDIA, give us a naked, ugly PCB with astronomical performance instead of a nice-looking box with an astronomical price (they're even advertising it like it's a freakin' superstar, assuming people are brainwashed enough through the media to buy it).

They even invented SLI so customers wouldn't be limited to just one card per PC, which would keep prices lower and drive more sales.
The only reason I don't want to switch to AMD is that their drivers are always full of bugs.
#7
phanbuey
I wonder why, when it has so many more shaders than the 680, it's only rumored to be 30% faster?

Also, that price is just stupid.
#8
Abate
by: Seyumi
You're right. It's aimed at people who need to spend $4000 on GPUs on their $8000 computer system to play their console port games that look 10% better than on a $250 Xbox 360.
That's their problem and their money.
Why do you care???
Makes no sense at all.. :banghead:
#9
Abate
by: Prima.Vera
Agree. This card should have cost ~$700-750, not more. Good luck with that. I think there will be A LOT of suckers buying this.
If you can't afford it, just don't buy it. It's as simple as that..
How can you bash the others who will buy this?
It's a weird world after all.. My goodness!
#10
Slizzo
Not paying $1k for the card. Waiting continues...
#11
yogurt_21
Remember when AMD came out and said the GTX 280 would be NVIDIA's last monolithic GPU due to market shifts?

We didn't believe them, of course, but it's funny to see NVIDIA sticking like a dinosaur to the old way of thinking many years later.

Don't get me wrong, I love these kinds of cards; they simply don't sell well at all. I really can't see what NVIDIA was thinking here. The 600 series made so much more sense.
#12
Max Mojo
Besides, what puzzles me is the max temp of 95°C. On the other hand, NVIDIA advertises that Titan has to run as cool as possible for maximum performance; otherwise the card won't stay in boost mode and clocks down. This is weird.
#13
MxPhenom 216
Corsair Fanboy
by: Max Mojo
Besides, what puzzles me is the max temp of 95°C. On the other hand, NVIDIA advertises that Titan has to run cool for maximum performance. This is weird.
Where have there been any benchmarks of how hot this GPU runs? I'm pretty sure the max temp of 95°C you're talking about is the max safe temperature to run it at.
#14
Max Mojo
Just quoting the figures. Benchmarks are not allowed until the 21st. But this still seems contradictory in my view. My 680s never reached that max temp. But maybe you're right. Have to wait for benchies.
#17
theoneandonlymrk
by: Slizzo
Not paying $1k for the card. Waiting continues...
I can't either, my rod needs an exhaust more.
.... and I've not yet won the lotto.
I do find it aggravating that in the same spiel NV says "finally there is a single GPU that can play any game at any res," then a few lines later says "combine 3x in SLI to play smoothly at high res." WTF does one do then? Contradiction, hmm.
Clarifies AMD's press release yesterday for me though; more of a "hey, remember us guys" thing.

How do both companies always manage a slight press spoiler each release? They both need to turn some moles in.
Damn fat fingers. Damn phone.
#18
Max Mojo
Temps and noise levels are nearly equal. My 680 runs at 32°C idle and about 76°C under load.
#19
Max Mojo
by: MxPhenom 216
Yeah, it only hits about 80°C in that (the 680 topped out at 79°C), and probably lower. My 680 never hit 70°C, which is the threshold at which it starts to drop clocks in ~13 MHz increments until temps fall back below 70°C.
If you use EVGA Precision, you can manage the clocks so they don't clock down at all.
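The throttling behavior described in the comment above (clocks stepping down in small increments until the GPU falls back below its temperature threshold) can be sketched as a simple control loop. This is a hypothetical illustration, not NVIDIA's actual firmware logic; the 70°C threshold and 13 MHz step come from the comment, and the cooling response per step is invented:

```python
def throttle(clock_mhz, temp_c, threshold_c=70, step_mhz=13, base_mhz=1006):
    """Drop the clock one step per iteration while the GPU is too hot.

    Hypothetical sketch of GPU Boost-style thermal throttling; the real
    firmware logic is more involved (voltage, power target, fan curve).
    base_mhz (the 680's 1006 MHz base clock) is the floor we never go below.
    """
    while temp_c > threshold_c and clock_mhz - step_mhz >= base_mhz:
        clock_mhz -= step_mhz
        temp_c -= 1  # assume each step sheds roughly one degree

    return clock_mhz

# e.g. boosting at 1110 MHz while sitting at 75 °C:
print(throttle(1110, 75))  # settles at 1045
```

Under this toy model a card already below the threshold keeps its full boost clock, which matches the observation that a 680 staying under 70°C never downclocks.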
#20
natr0n
"Hmm I wonder if I can afford this... It says I need 3 hmm... strolls to vault for a spare brick of gold."
#22
syeef
I think they are releasing it just so they can claim "The Fastest GPU in the World" and to beat AMD's 7970, which was making their GTX 680 look bad.
#23
Xzibit
by: syeef
I think they are releasing it just so they can claim "The Fastest GPU in the World" and to beat AMD's 7970, which was making their GTX 680 look bad.
I'm sure the slowdown in the PC sector helped make that decision as well. Orders for Tesla weren't as expected and they had a lot of inventory to re-allocate due to various other reasons.

S/A article back in Oct 2012
#24
MxPhenom 216
Corsair Fanboy
by: Max Mojo
If you use EVGA Precision, you can manage the clocks so they don't clock down at all.
How so? Because I run Precision X.

The downclocking is on the BIOS side, I'm pretty sure. So unless you edit the BIOS, Precision can't do anything.
#25
Max Mojo
by: MxPhenom 216
How so? Because I run Precision X.

The downclocking is on the BIOS side, I'm pretty sure. So unless you edit the BIOS, Precision can't do anything.
Hmm, I have to think back, because my overclocking experience dates from when the EVGA GTX 680 had just been released. As your card is on H2O it should stay below 70°C anyhow.
If you use the on-screen monitor you can watch your clocks while gaming or benching.
On air: I OC'd the GPU/shader/memory clocks to the limit for benching via Precision. Made a fan profile to keep it cool. Have a well-ventilated rig. Played demanding games for hours. Benched for hours. Unigine Heaven or whatever. But there was no downclocking at all.
Did you change power management in the NVIDIA CP to maximum performance?