
NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"

Only way I see this card having relevance to such a name is... if it's 4K-proof for AT LEAST a handful of modern titles... since I hate multi-monitor gaming setups with a passion, and even the puny-ish 2560x1440/1600 resolutions are nowhere near as common as they should be... Nor is the hardware or the programming skill of most game studios up to snuff.

I'm with you on all that, but be realistic.

It's conceivable at the very edge this gen, but the big push will be at 20nm, both because of the timing of 4K displays reaching a more tangible market and because of TSMC's process capabilities (not to mention the potential need for more bandwidth and/or denser buffers without building the monstrosities that will likely come between now and then).

Figure 4k is 4x 1080p.

I figure 20nm will bring similar designs to GK110/8900 aimed at the sweet-spot market with their shiny new 4K displays in late 2014 to 2015. That is to say, efficient and with 48 ROPs... obviously on more realistically sized/yielding silicon and in consumer-friendly power envelopes. If that were roughly 2688 units (12 NVIDIA SMX = 2688 w/ SFU, AMD 42 CU = 2688) at ~1300MHz, it would be ~4x something like a 7850 (1024 x 860MHz), the baseline to play most titles at 1080p, which likely won't change much given the new rumored console specs.

Considering the process shrink should bring roughly 2x density, ~20-30% clock hikes at similar voltage, and GDDR6 (and/or 4Gb GRAM) if not some other tech may rear its head by that time, it seems a realistic trajectory. See the clock/power skew of 28nm in the previous post, but note TSMC will lower the voltage target on 20nm... 1.264V ain't gonna fly anymore, certainly to AMD's disappointment. The process will likely whimper out around where most designs hover because of efficiency, 1.15-1.175V (blame a unified process with an eye focused on mobile SoCs). That means potentially ~1400-1500MHz, minus ~10% for stock SKUs... or around 1300MHz, give or take.

Speculative maths, to be sure. But realistic.
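
If anyone wants to sanity-check the 4x figure, here's a quick Python sketch of the units-times-clock arithmetic. The part names and numbers are just the speculative figures from the post above, not official specs:

```python
# Back-of-envelope shader throughput: units * clock (MHz).
# All figures are the speculative numbers from the post above, not official specs.
def throughput(shaders, clock_mhz):
    return shaders * clock_mhz

baseline_7850 = throughput(1024, 860)       # HD 7850-class part, the rough 1080p baseline
speculative_20nm = throughput(2688, 1300)   # hypothetical 20nm part (12 SMX / 42 CU class)

print(f"Relative throughput: {speculative_20nm / baseline_7850:.2f}x")
# prints ~3.97x -- about the "4x a 7850" you'd want if 4K is ~4x the pixels of 1080p
```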
 
I'm confused. I always thought that the "Ti" suffix was short for "Titan." If "Ti" isn't short for "Titan", then what is its meaning?
 
If you ask me.... Nvidia can keep their overpriced card that performs 25% better than the HD8970. I would get 4 HD8970s for the price of two Titans and have much more fun.

All speculation until we hear some real proof.... this article is not enough for me yet.


I don't think quad CrossFire would be that much more fun because of all the driver issues in games. And the 4 HD8970s would use more power and produce more heat.
But for pure benchmarking, quad CrossFire could give better results, I guess.
 
I don't think quad CrossFire would be that much more fun because of all the driver issues in games. And the 4 HD8970s would use more power and produce more heat.
But for pure benchmarking, quad CrossFire could give better results, I guess.

AMD haven't exactly been generous with prices either; the 7970 was massively overpriced when it launched too.
 
900 bucks for console ports...! Cool story bro!
 
900 bucks for console ports...! Cool story bro!

That's a pretty standard response here; it's only natural for people to get defensive when their card drops down a notch. :laugh:
 
I'm confused. I always thought that the "Ti" suffix was short for "Titan." If "Ti" isn't short for "Titan", then what is its meaning?
Ti = Titanium
Although titanium is named after the Titans.
 
Actually, according to the TechPowerUp charts, if the GTX 780 is 85% of a GTX 690, then compared to the GTX 680 the GTX 780 would be:

45% faster at 2560x1600
29% faster at 1920x1080
21% faster at 1680x1050
6% faster at 1280x800

Avg. of 25% faster.

AMD needs something 25% faster on average than the 7970 GHz Ed. to keep the marginal lead they have.
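
For the curious, here's a quick sketch of how those resolution-by-resolution numbers fall out of the "85% of a GTX 690" assumption. The 690-over-680 ratios below are back-solved from the figures above and just stand in for whatever the TPU charts showed at the time:

```python
# speedup_vs_680 = 0.85 * (perf_690 / perf_680) - 1
# The 690-over-680 ratios are illustrative, back-solved from the post's figures.
ratios_690_over_680 = {
    "2560x1600": 1.71,
    "1920x1080": 1.52,
    "1680x1050": 1.42,
    "1280x800":  1.25,
}
for res, ratio in ratios_690_over_680.items():
    speedup = 0.85 * ratio - 1
    print(f"{res}: ~{speedup:.0%} faster than a GTX 680")
# prints roughly the 45 / 29 / 21 / 6 percent figures listed above
```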


Those percentages are an average including games that have CPU bottlenecks, and (if Wizz calculates overall performance using average or total FPS) are also weighted towards the games with the highest FPS. You cannot use them to compare FPS in the way that you are doing. In addition, throughput-based metrics give cards like the 690 an artificially inflated result, which would not necessarily be replicated in latency based testing.
 
Those percentages are an average including games that have CPU bottlenecks, and (if Wizz calculates overall performance using average or total FPS) are also weighted towards the games with the highest FPS. You cannot use them to compare FPS in the way that you are doing. In addition, throughput-based metrics give cards like the 690 an artificially inflated result, which would not necessarily be replicated in latency based testing.

Your point being?
BTW, my calculations are pretty logical, so explain to me otherwise.
 
Finally, the bastards milked the profit from GK104 and decided to offer a significantly delayed product whose price is inflated by 100%. Now Kepler's design life gets stretched out until GPU production shrinks to 18nm. Of course, we will also pay for the same patents this year, and I hope that pig bursts from obesity and gluttony. We know that the Quadro driver bypasses the DirectX efficiency problem and thereby gains anywhere from 30% to 100% over the CPU inefficiency, and we will have to wait for the Maxwell GPU, which will have that advantage built in. I hope they drool over their licensing and contractual profits the same way we do over good hardware :P
 
Your point being?
BTW, my calculations are pretty logical, so explain to me otherwise.

My previous post was an explanation of the illogicality of yours. But if you can't be bothered with it, the first sentence and the very fact that there are different differences at different resolutions should suffice. It is not so much that the 690 is a different amount faster at different resolutions as it is that the 690 is to a different extent held up by other bottlenecks at different resolutions. In the absence of extra-GPU bottlenecks the 780 (Titan) will be in a worst-case scenario 50% faster than the 680. If memory bandwidth doesn't hold things up, you can call that figure 70%. Of course in reality we'll see a much, much smaller difference, because CPU bottlenecks will hold things up to at least some degree in nearly every game.
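
A toy example of what I mean about averages and bottlenecks; every number here is made up purely to show the shape of the effect:

```python
# A GPU that's 70% faster in isolation shows a much smaller average gain
# once per-game CPU/engine caps come into play. All numbers are invented.
gpu_fps_680 = [40, 55, 70, 120, 160]    # hypothetical GPU-limited FPS on a GTX 680
cpu_caps    = [999, 999, 90, 110, 140]  # hypothetical CPU/engine FPS limits per game

def avg_fps(gpu_scale):
    capped = [min(fps * gpu_scale, cap) for fps, cap in zip(gpu_fps_680, cpu_caps)]
    return sum(capped) / len(capped)

gain = avg_fps(1.7) / avg_fps(1.0) - 1  # 1.7 = the GPU-only "70% faster" case
print(f"Average gain across the suite: ~{gain:.0%}")   # ~21%, well under 70%
```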
 
Your point being?
BTW, my calculations are pretty logical, so explain to me otherwise.

Here's a better comparison for you, GTX690 being 53% faster than the GE at 1200p

[attached chart: TPU relative performance summary at 1920x1200]


Now, math is not my strong point but I reckon going by the presumed 85% performance of the Titan (ium) compared to GTX690, this GK110 based card would be 45% faster than the 7970GE.
 
Here's a better comparison for you, GTX690 being 53% faster than the GE at 1200p

On average, where the average includes many games where they perform very closely due to bottlenecks.

Now, math is not my strong point but I reckon going by the presumed 85% performance of the Titan (ium) compared to GTX690, this GK110 based card would be 45% faster than the 7970GE.

So no.
 
And if the 8970 is at least 20% faster than the 7970 GE, then Titan would be ~8% faster than the 8970.
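
Quick sketch of where that ~8% would come from, if you take the 53% and 85% figures quoted earlier and the 20% assumption at face value:

```python
# All inputs are the figures quoted earlier in the thread, not measurements.
titan_vs_ge  = 0.85 * 1.53   # 85% of a GTX 690 that is 53% faster than the 7970 GHz Ed. -> ~1.30x
hd8970_vs_ge = 1.20          # the "at least 20% faster than the 7970 GE" assumption
print(f"Titan vs 8970: ~{titan_vs_ge / hd8970_vs_ge - 1:.0%} faster")   # ~8%
```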
 
I've decided to save the $900 bucks to buy this card and get a new awesome television and PS4 or Xbox 720 instead. How can that not be a better choice?

I hope that this price point doesn't become the standard for high end single GPU cards.
 
My previous post was an explanation of the illogicality of yours. But if you can't be bothered with it, the first sentence and the very fact that there are different differences at different resolutions should suffice. It is not so much that the 690 is a different amount faster at different resolutions as it is that the 690 is to a different extent held up by other bottlenecks at different resolutions. In the absence of extra-GPU bottlenecks the 780 (Titan) will be in a worst-case scenario 50% faster than the 680. If memory bandwidth doesn't hold things up, you can call that figure 70%. Of course in reality we'll see a much, much smaller difference, because CPU bottlenecks will hold things up to at least some degree in nearly every game.

Well, 85% of a GTX 690 is nowhere near +50% diff man...
 
I've decided to save the $900 bucks to buy this card and get a new awesome television and PS4 or Xbox 720 instead. How can that not be a better choice?

I hope that this price point doesn't become the standard for high end single GPU cards.

I have an even better option for you. Don't buy a TV and instead spend $300 on a GPU and $600 on games.
 
Well, 85% of a GTX 690 is nowhere near +50% diff man...

You've clearly missed my point, but what you've written here is wrong anyway.

A 680 is 50% of a 690. A 780 is 85% of a 690. 85% is 170% of 50%, a 70% difference.
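
Put as a two-line sketch, with the 690 normalised to 1.0 (the 50% and 85% figures being the premise above, not measurements):

```python
gtx_680 = 0.50   # GTX 680 as a fraction of a GTX 690 (the stated premise)
gtx_780 = 0.85   # GTX 780 (Titan) as a fraction of a GTX 690
print(f"780 vs 680: {gtx_780 / gtx_680 - 1:.0%} faster")   # 70%
```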
 
You've clearly missed my point, but what you've written here is wrong anyway.

A 680 is 50% of a 690. A 780 is 85% of a 690. 85% is 170% of 50%, a 70% difference.

GTX 680 IS NOT 50% of a GTX 690 omg....
 
I've decided to save the $900 bucks to buy this card and get a new awesome television and PS4 or Xbox 720 instead. How can that not be a better choice?

I hope that this price point doesn't become the standard for high end single GPU cards.

I kinda feel this year is a turning point in the GPU business and, in the broader picture, the PC business. The low end will disappear, replaced by integrated graphics; mid-range cards will be hard to sell over $200; and there will be an enthusiast niche covered by the likes of Titanium and whatever AMD comes up with. The rest of us will game on consoles, tablets, laptops and smartphones. It's the same as with the music industry: there are a handful of people who still buy vinyl records, stare at the covers and own expensive hi-fi equipment, who like to clean their records and upgrade their gear, while the rest enjoy lo-fi MP3s played through portable devices which they'll gladly throw away when a new gadget comes along. Mind you, these are priced in the vicinity of a few pizzas or a good night out.

There will be a market for Titanium, same as there's a market for outrageously priced DACs and headphones. Crysis 3 (if one's interested) can be played on a console. No big deal: single player, a couple of days of fun, then forget about it. Life goes on.
 