Monday, January 21st 2013
NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"
2013 started off on a rather dull note for the PC graphics industry. NVIDIA launched its game console platform "Project: Shield," while AMD rebranded its eons-old GPUs into the Radeon HD 8000M series. Apparently that could all change in late February, with the arrival of a new high-end single-GPU graphics card based on NVIDIA's GK110 silicon, the same big chip that goes into the company's Tesla K20 compute accelerator.
NVIDIA has drawn some flak for stretching its "GTX" brand too far into the mainstream and entry-level segments, and wants its GK110-based card to stand out. It is reported that NVIDIA will carve out a new brand extension, GeForce Titan. Incidentally, the current fastest supercomputer in the world bears that name (Cray Titan, located at Oak Ridge National Laboratory). The GK110 silicon physically packs 15 SMX units, totaling 2,880 CUDA cores, and features a 384-bit wide GDDR5 memory interface.
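The 2,880 figure follows directly from Kepler's SMX layout, which packs 192 single-precision CUDA cores per SMX. A minimal sanity check of that arithmetic (the function name is illustrative, not an NVIDIA API):

```python
# Sanity check of the article's core count: Kepler architecture
# provides 192 single-precision CUDA cores per SMX unit.
CORES_PER_SMX = 192

def total_cuda_cores(smx_units: int) -> int:
    """Total CUDA cores for a Kepler GPU with the given SMX count."""
    return smx_units * CORES_PER_SMX

print(total_cuda_cores(15))  # full GK110: 15 SMX units -> 2880
```

Note that a 2,688-core configuration, as rumored further down, would correspond to 14 of the 15 SMX units enabled.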
Source:
SweClockers
203 Comments on NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"
Remember, this will be out 14 months after the 7970 released, and it looks like it's the old Nvidia technique of massive die size. One monolithic GPU to rule them all, as it were.
I wish we knew the whole truth about this card. Is this to the GK100 what the 580 was to the 480? Is this the card they couldn't make a year ago? Remember the initial rumours were that Nvidia were going to learn from the past and release the lower end cards first? Maybe it's taken this long to get it right (and they didn't have to rush due in part to the initial low clock and immature driver performance of the 7970).
This looks like the card I wanted 1 1/2 years ago but not at that price. :ohwell:
You beat me to it:
There are already known means to cure each and every disease one can think of. The fact that they are kept hidden from the public, and you have no clue about it, that's another thing, my friend.
One of these means is 5,000 years old. Look up Amaroli on Google; check fasting too.
The ones ruling the world, who purposely try to kill as many of us as possible, don't want this knowledge loose; that's why these methods are not generally accepted, nor researched, but they do work, let me tell ya that.
And nobody needed a 2,880-CUDA-core GPU to come up with these cures, let me tell ya that.
Basically, this Folding@home sh.it is using your resources, your electricity, your money, your work, for God knows what, that only they, the elite few, will have access to in the end.
So you, the poor user, get nothing in the end; your money is being taken, and don't kid yourself, you won't be getting any cancer cure anytime soon.
Poor fools! Imagine all these folding teams, burning electricity, and besides a place in a high-score list, getting nothing in return.
Take the 5870 @ 2.72 TFLOPS vs the GTX 480's measly 1.35 TFLOPS, for example. Doesn't exactly paint an accurate picture of a card's performance.
Edit: Nah... Honestly, they should probably name it GTX 685 (Ti) or something...
:roll:
Time to leave GTX 680 SLI and pimp up my gaming rig? Not sure. On the other hand, the GTX 680 is a great card in single mode, but tends to run a bit warm in SLI; no need to heat my gaming chamber in winter.
Yes, I think I will not be strong enough to resist this card, given that it doubles performance. Early speculation talked about only a 30% increase, which would be a no-go.
Just found this:
GeForce Titan, GeForce Titanium, GeForce GTX 780 Ti Specifications:
Kepler GPU with 2688 CUDA Cores
6GB GDDR5 memory
Core Clock: 732 MHz
Memory Clock: 5200 MHz
MSRP: 899 USD
videocardz.com/39143/geforce-titanium-the-allmighty-gk110-based-graphics-card
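If those rumored figures hold, the same peak-throughput arithmetic gives a rough reality check against the "doubles performance" hope above. The GTX 680 figures below are assumed reference values for comparison, and boost clocks and efficiency are ignored:

```python
# Theoretical FP32 peak (cores * clock * 2 FMA ops per cycle), in TFLOPS.
def peak_tflops(cores: int, clock_mhz: float) -> float:
    return cores * clock_mhz * 2 / 1e6

titan = peak_tflops(2688, 732)    # rumored figures from the post
gtx680 = peak_tflops(1536, 1006)  # GTX 680 base clock (assumed reference)
print(round(titan, 2), round(gtx680, 2))           # 3.94 vs 3.09
print(f"{titan / gtx680 - 1:.0%} more peak FP32")  # ~27%, not double
```

On paper, then, the rumored card is roughly a quarter faster than a GTX 680, though theoretical peak rarely maps cleanly onto gaming performance.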
6 GB of video RAM would be great.
OC two 670s and spank this stupid card back into nonexistence.
And who is buying these as compute cards? They must be higher than a kite.
Or better yet, let 'em redesign the PCB and power delivery....:rockout: Speak for yourself....I'm all in for a free and open net....
Do I agree with all he says....no.
Is there SOME truth to what he says....yes.
I no more want him censored than your ability to tell him to fuck off....
Hmm...
What do you do when you want high profile PR (reviews) from consumer hardware, but would rather sell the die packaged as a Quadro or Tesla board for a whole lot more?
The pricing would likely cover any lack of actual availability, while ensuring (as a reference single-GPU SKU) that it sits comfortably atop the benchmark charts even after the next round of releases. Kind of takes the pressure off the GK114, I would have thought.
So yeah, crap price, but wasn't it always destined to be if the GK110 made it to GeForce branding? Nice flamebait.
It's not flamebait, that's what immediately came to mind when I read the article and saw the name Titan.
:wtf:
Anyway...
February sounds good.
As for AMD, they are well aware of this chip, which IMHO should mean an appropriate reaction at some point.
The next few months will make good reading :)
Also, what's the compute/pro version of this GPU clocked at?
2nd comment: Also, AMD has not tipped their hand on how the HD 8900 series will perform. (And I don't mean the HD 8000M series, which is a refresh. I mean the real HD 8900 series.) So if AMD has something to compete with this card..... Nvidia will not price this card at $899.
3rd comment: Even if AMD cannot compete straight out of the gate.... we have seen how well that works out for us, the end consumers. It could mean another GTX 280/285 vs HD 4870 price battle..... which would be great news for all of us. If you ask me.... Nvidia can keep their overpriced card that performs 25% better than the HD 8970. I would get four HD 8970s for the price of two Titans and have much more fun.
All speculation until we hear some real proof.... this article is not enough for me yet.