Monday, January 21st 2013

NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"

2013 started off on a rather dull note for the PC graphics industry. NVIDIA unveiled its game console platform "Project: Shield," while AMD rebranded its eons-old GPUs to the Radeon HD 8000M series. That could all change in late February, with the arrival of a new high-end single-GPU graphics card based on NVIDIA's GK110 silicon, the same big chip at the heart of the company's Tesla K20 compute accelerator.

NVIDIA may have drawn some flak for stretching its "GTX" brand too far into the mainstream and entry-level segments, and wants its GK110-based card to stand out. It is reported that NVIDIA will carve out a new brand for it: GeForce Titan. Incidentally, the current fastest supercomputer in the world bears that name (Titan, a Cray XK7 located at Oak Ridge National Laboratory). The GK110 silicon physically packs 15 SMX units totaling 2,880 CUDA cores, and features a 384-bit-wide GDDR5 memory interface.
Source: SweClockers

203 Comments on NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"

#77
alwayssts
NeoXFOnly way I see this card having relevance to such a name is... if it's 4K-proof in AT LEAST a handful of modern titles... since I hate multi-monitor gaming setups with a passion, and even the puny-ish 2560x1440/1600 are nowhere near as common as they should be... Nor is the hardware or programming skill of most game studios up to snuff.
I'm with you on all that, but be realistic.

It's conceivable at the very edge of this gen, but the big push will be on 20 nm, both because of the timing of such displays reaching a more tangible market and because of TSMC's process capabilities (not to mention the potential need for more bandwidth and/or denser buffers, without building the monstrosities that will likely come between now and then).

Figure 4k is 4x 1080p.

I figure 20 nm will bring designs similar to GK110/8900 aimed at the sweet-spot market, with their shiny new 4K displays, in late 2014 to 2015. That is to say, efficient and with 48 ROPs... obviously on more realistically sized/yielding silicon and in consumer-friendly power envelopes. If that were roughly 2688 units (12 NVIDIA SMX = 2688 with SFUs; AMD 42 CU = 2688) at ~1300 MHz, it would be ~4x something like a 7850 (1024 units x 860 MHz), the baseline for playing most titles at 1080p, which likely won't change much given the rumored new console specs.

Considering the process shrink should bring roughly 2x density and ~20-30% clock hikes at similar voltage, and that GDDR6 (and/or 4 Gb GRAM) if not some other tech may rear its head by that time, it seems a realistic trajectory. See the clock/power skew of 28 nm in my previous post, but note TSMC will lower the voltage target on 20 nm... 1.264 V isn't going to fly anymore, certainly to AMD's disappointment. The process will likely whimper out around where most designs hover for efficiency, 1.15-1.175 V (blame a unified process with an eye on mobile SoCs). That means potentially ~1400-1500 MHz, minus ~10% for stock SKUs... or around 1300 MHz, give or take.

Speculative maths, to be sure. But realistic.
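For what it's worth, the post's scaling arithmetic checks out as a back-of-envelope calculation. A minimal sketch, where every figure (the hypothetical 2688-unit part at ~1300 MHz and the HD 7850 baseline) is the poster's speculation, not a confirmed spec:

```python
# Back-of-envelope shader throughput, per the speculation above.
# All figures are hypothetical, not confirmed specs.

def shader_throughput(units: int, clock_mhz: float) -> float:
    """Relative ALU throughput: unit count times clock (arbitrary units)."""
    return units * clock_mhz

baseline = shader_throughput(1024, 860)     # Radeon HD 7850
speculated = shader_throughput(2688, 1300)  # hypothetical 20 nm part

print(f"Speculated part: ~{speculated / baseline:.1f}x the 7850")
# -> ~4.0x, matching the "4K is 4x 1080p" rule of thumb above
```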
Posted on Reply
#78
The Von Matrices
I'm confused. I always thought that the "Ti" suffix was short for "Titan." If "Ti" isn't short for "Titan", then what is its meaning?
Posted on Reply
#79
Optimis0r
The Von MatricesI'm confused. I always thought that the "Ti" suffix was short for "Titan." If "Ti" isn't short for "Titan", then what is its meaning?
Titanium, according to NVIDIA.
Posted on Reply
#80
Samskip
20mmrainIf you ask me.... Nvidia can keep their overpriced card that can perform 25% better than the HD 8970. I would get four HD 8970s for the price of two Titans and have much more fun.

All speculation until we hear some real proof.... this article is not enough for me yet.
I don't think quad CrossFire would be that much more fun because of all the driver issues in games. And the four HD 8970s would use more power and produce more heat.
But for pure benchmarking, quad CrossFire could give better results, I guess.
Posted on Reply
#81
Fluffmeister
SamskipI don't think quad CrossFire would be that much more fun because of all the driver issues in games. And the four HD 8970s would use more power and produce more heat.
But for pure benchmarking, quad CrossFire could give better results, I guess.
AMD hasn't exactly been generous with prices either; the 7970 was massively overpriced when it launched, too.
Posted on Reply
#82
Prima.Vera
900 bucks for console ports...! Cool story bro!
Posted on Reply
#83
Fluffmeister
Prima.Vera900 bucks for console ports...! Cool story bro!
That's a pretty standard response here; it's only natural for people to get defensive when their card drops down a notch. :laugh:
Posted on Reply
#84
Zubasa
The Von MatricesI'm confused. I always thought that the "Ti" suffix was short for "Titan." If "Ti" isn't short for "Titan", then what is its meaning?
Ti = Titanium
Although titanium is named after the Titans.
Posted on Reply
#85
blibba
N3M3515Actually, according to TechPowerUp charts, if the GTX 780 is 85% of a GTX 690, then relative to the GTX 680, the GTX 780 would be:

45% faster at 2560x1600
29% faster at 1920x1080
21% faster at 1680x1050
6% faster at 1280x800

Avg. of 25% faster.

AMD needs something 25% faster than the 7970 GHz Ed. on average to keep the marginal lead they have.
Those percentages are an average including games that have CPU bottlenecks, and (if Wizz calculates overall performance using average or total FPS) are also weighted towards the games with the highest FPS. You cannot use them to compare FPS in the way that you are doing. In addition, throughput-based metrics give cards like the 690 an artificially inflated result, which would not necessarily be replicated in latency based testing.
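To illustrate blibba's point, here is a toy model of how a CPU bottleneck compresses the measured gap between two GPUs; all the frame rates are made up for illustration:

```python
# Toy model: measured FPS is gated by whichever of GPU and CPU is slower,
# so a CPU cap shrinks the apparent gap between two cards.

def measured_fps(gpu_fps: float, cpu_cap: float) -> float:
    """Frame rate is limited by the slower of the GPU and the CPU."""
    return min(gpu_fps, cpu_cap)

cpu_cap = 120.0                     # hypothetical CPU-bound ceiling
slow_gpu, fast_gpu = 100.0, 170.0   # a 70% gap in raw GPU throughput

gap = measured_fps(fast_gpu, cpu_cap) / measured_fps(slow_gpu, cpu_cap) - 1
print(f"Raw GPU gap: 70%, measured gap: {gap:.0%}")  # -> 20%
```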
Posted on Reply
#86
N3M3515
blibbaThose percentages are an average including games that have CPU bottlenecks, and (if Wizz calculates overall performance using average or total FPS) are also weighted towards the games with the highest FPS. You cannot use them to compare FPS in the way that you are doing. In addition, throughput-based metrics give cards like the 690 an artificially inflated result, which would not necessarily be replicated in latency based testing.
Your point being?
BTW, my calculations are pretty logical, or explain to me otherwise.
Posted on Reply
#87
bogami
Finally. The bastards ate up their profit from the GK104 and decided to offer, after a significant delay, a product whose price is overpaid by 100%. Now they have stretched the Kepler design's life out until GPU production can shrink to 18 nm. Of course, we will also pay for the same patents this year, and I hope the pig bursts from its obesity and gluttony. We know that the Quadro-series driver bypasses DirectX's efficiency problems, gaining anywhere from 30% to 100% back from the CPU inefficiency, and we will have to wait for the Maxwell GPU to get that advantage built in. I hope they choke on their licensing and contractual profits, just as we do paying for good hardware :P
Posted on Reply
#88
blibba
N3M3515Your point being?
BTW, my calculations are pretty logical, or explain to me otherwise.
My previous post was an explanation of the illogicality of yours. But if you can't be bothered with it, the first sentence and the very fact that there are different differences at different resolutions should suffice. It is not so much that the 690 is a different amount faster at different resolutions as it is that the 690 is to a different extent held up by other bottlenecks at different resolutions. In the absence of extra-GPU bottlenecks the 780 (Titan) will be in a worst-case scenario 50% faster than the 680. If memory bandwidth doesn't hold things up, you can call that figure 70%. Of course in reality we'll see a much, much smaller difference, because CPU bottlenecks will hold things up to at least some degree in nearly every game.
Posted on Reply
#89
Crap Daddy
N3M3515Your point being?
BTW, my calculations are pretty logical, or explain to me otherwise.
Here's a better comparison for you, GTX690 being 53% faster than the GE at 1200p

tpucdn.com/reviews/AMD/HD_7970_GHz_Edition/images/perfrel_1920.gif
Now, math is not my strong point but I reckon, going by the presumed 85% performance of the Titan(ium) compared to the GTX 690, this GK110-based card would be 45% faster than the 7970 GE.
Posted on Reply
#91
blibba
Crap DaddyHere's a better comparison for you, GTX690 being 53% faster than the GE at 1200p
On average, where the average includes many games where they perform very closely due to bottlenecks.
Crap DaddyNow, math is not my strong point but I reckon, going by the presumed 85% performance of the Titan(ium) compared to the GTX 690, this GK110-based card would be 45% faster than the 7970 GE.
So no.
Posted on Reply
#92
DarkOCean
And if the 8970 is at least 20% faster than the 7970 GE, then Titan would be ~8% faster than the 8970.
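The arithmetic behind that ~8% figure, assuming Titan lands ~30% above the 7970 GE (the chart math worked out a few posts below) and the 8970 a speculative 20% above it:

```python
# Both inputs are rumors: Titan at ~1.30x a 7970 GE, an 8970 at 1.20x.
titan, hd8970 = 1.30, 1.20
print(f"Titan vs. 8970: {titan / hd8970 - 1:.1%}")  # -> ~8.3%
```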
Posted on Reply
#93
erocker
*
I've decided to save the $900 this card would cost and get an awesome new television and a PS4 or Xbox 720 instead. How can that not be a better choice?

I hope this price point doesn't become the standard for high-end single-GPU cards.
Posted on Reply
#94
N3M3515
blibbaMy previous post was an explanation of the illogicality of yours. But if you can't be bothered with it, the first sentence and the very fact that there are different differences at different resolutions should suffice. It is not so much that the 690 is a different amount faster at different resolutions as it is that the 690 is to a different extent held up by other bottlenecks at different resolutions. In the absence of extra-GPU bottlenecks the 780 (Titan) will be in a worst-case scenario 50% faster than the 680. If memory bandwidth doesn't hold things up, you can call that figure 70%. Of course in reality we'll see a much, much smaller difference, because CPU bottlenecks will hold things up to at least some degree in nearly every game.
Well, 85% of a GTX 690 is nowhere near +50% diff man...
Posted on Reply
#95
N3M3515
Crap DaddyHere's a better comparison for you, GTX690 being 53% faster than the GE at 1200p

tpucdn.com/reviews/AMD/HD_7970_GHz_Edition/images/perfrel_1920.gif

Now, math is not my strong point but I reckon, going by the presumed 85% performance of the Titan(ium) compared to the GTX 690, this GK110-based card would be 45% faster than the 7970 GE.
I'm sorry man, but 153 x 85% = 130, i.e. 30% faster than the 7970 GE, at 1920x1080.
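Spelled out, using the figures from the quoted TPU chart (GTX 690 = 153% of a 7970 GE) and the rumored 85%-of-GTX-690 positioning:

```python
# GTX 690 = 153% of a 7970 GE (TPU chart quoted above);
# Titan rumored at 85% of a GTX 690.
hd7970ge = 100.0
gtx690 = 153.0
titan = 0.85 * gtx690  # = 130.05
print(f"Titan vs. 7970 GE: {titan / hd7970ge - 1:.0%}")  # -> 30%, not 45%
```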
Posted on Reply
#96
Easy Rhino
Linux Advocate
erockerI've decided to save the $900 this card would cost and get an awesome new television and a PS4 or Xbox 720 instead. How can that not be a better choice?

I hope this price point doesn't become the standard for high-end single-GPU cards.
I have an even better option for you. Don't buy a TV and instead spend $300 on a GPU and $600 on games.
Posted on Reply
#97
blibba
N3M3515Well, 85% of a GTX 690 is nowhere near +50% diff man...
You've clearly missed my point, but what you've written here is wrong anyway.

A 680 is 50% of a 690. A 780 is 85% of a 690. 85% is 170% of 50%, a 70% difference.
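blibba's 70% follows directly from his premise. A one-line sketch, assuming the perfect 2x SLI scaling that the next reply disputes:

```python
gtx690 = 1.00
gtx680 = 0.50 * gtx690  # contested assumption: perfect 2x SLI scaling
gtx780 = 0.85 * gtx690  # rumored positioning
print(f"780 over 680: {gtx780 / gtx680 - 1:.0%}")  # -> 70%
```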
Posted on Reply
#98
N3M3515
blibbaYou've clearly missed my point, but what you've written here is wrong anyway.

A 680 is 50% of a 690. A 780 is 85% of a 690. 85% is 170% of 50%, a 70% difference.
GTX 680 IS NOT 50% of a GTX 690 omg....
Posted on Reply
#99
Crap Daddy
erockerI've decided to save the $900 this card would cost and get an awesome new television and a PS4 or Xbox 720 instead. How can that not be a better choice?

I hope this price point doesn't become the standard for high-end single-GPU cards.
I kinda feel this year is a turning point in the GPU business and, in the broader picture, the PC business. Low end will disappear, replaced by integrated graphics; mid-range cards will be hard to sell over $200; and there will be an enthusiast niche covered by the likes of Titanium and whatever AMD comes up with. The rest of us will game on consoles, tablets, laptops and smartphones. It's the same as with the music industry: there are a handful of people who still buy vinyl records, stare at the covers and own expensive hi-fi equipment; they like to clean their records and upgrade the equipment, while the rest enjoy lo-fi MP3s played through portable devices which they will gladly throw away when a new gadget comes along. Mind you, these are priced in the vicinity of a few pizzas or a good night out.

There will be a market for Titanium, same as there's a market for outrageously priced DACs and headphones. Crysis 3 (if one's interested) can be played on a console. No big deal: single player, a couple of days of fun, then forget about it. Life goes on.
Posted on Reply
#100
TheoneandonlyMrK
blibbaA 680 is 50% of a 690
what :roll: nooo, they don't scale that well mate, and definitely not in every title. Refer back to your own argument on avg framerates for the fail in that quote.

And anyway, you're arguing against what one guy's opinion of a guesstimated avg performance chart might be (I'd guess similar to his, imho) :). He may be right, you may be; the arguing's pointless either way :D
Posted on Reply