Monday, January 21st 2013

NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"

2013 started off on a rather dull note for the PC graphics industry. NVIDIA launched its game-console platform "Project Shield," while AMD rebranded its eons-old GPUs as the Radeon HD 8000M series. Apparently, that could all change in late February, with the arrival of a new high-end single-GPU graphics card based on NVIDIA's GK110 silicon, the same big chip that goes into the company's Tesla K20 compute accelerator.

NVIDIA may have drawn some flak for stretching its "GTX" brand too far into the mainstream and entry-level segments, and wants its GK110-based card to stand out. It is reported that NVIDIA will carve out a new brand extension, GeForce Titan. Incidentally, the current fastest supercomputer in the world bears that name (Cray Titan, located at Oak Ridge National Laboratory). The GK110 silicon physically packs 15 SMX units, totaling 2,880 CUDA cores. The chip features a 384-bit wide GDDR5 memory interface.
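That core count is internally consistent: a Kepler SMX carries 192 CUDA cores, so 15 SMX units give exactly the quoted total. A quick sanity check (the per-SMX figure is Kepler's standard width, not stated in the article):

```python
# GK110 as reported: 15 SMX units; 192 CUDA cores per SMX is Kepler's standard width
smx_units = 15
cores_per_smx = 192

cuda_cores = smx_units * cores_per_smx
print(cuda_cores)  # 2880, matching the reported total
```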

Source: SweClockers

203 Comments on NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"

#1
N3M3515
by: N3M3515
GTX 680 IS NOT 50% of a GTX 690, omg....
98% vs 153%: that's 64%, not 50%
#2
blibba
by: N3M3515
GTX 680 IS NOT 50% of a GTX 690, omg....
It is in the sense that a 780 will be 85% of a 690. I feel like you have understood none of what I have posted here.
#3
theoneandonlymrk
by: blibba
It is in the sense that a 780 will be 85% of a 690. I feel like you have understood none of what I have posted here.
yeah but you're wrong, and some of us do understand

see crap daddy's chart earlier in this thread: 85% of a 690's performance would hit 130% on that chart, or 30% faster than the 7970, simples :cool:

also, I may have been misled, but I thought Nvidia had put more double-precision units in the GK110. are these counted among the 2880 shaders despite being specialist units? not trying to spark a row, I'm just interested and asking
#4
blibba
by: theoneandonlymrk
yeah but you're wrong, and some of us do understand

see crap daddy's chart earlier in this thread: 85% of a 690's performance would hit 130% on that chart, or 30% faster than the 7970, simples :cool:
The chart is utilised entirely inappropriately. The 85% stat is from Nvidia. If you asked them to put a 680 on the same scale, it'd be 55% tops.

In actual FPS (not that that's what we should care about), the 780 will vary from 100% to 170% of the performance of a 7970 depending on other bottlenecks and the demands of the game. It may exceed 170% in titles that favour its architecture (i.e. TWIMTBP titles).
#5
theoneandonlymrk
by: blibba
The chart is utilised entirely inappropriately. The 85% stat is from Nvidia. If you asked them to put a 680 on the same scale, it'd be 55% tops.

In actual FPS (not that that's what we should care about), the 780 will vary from 100% to 170% of the performance of a 7970 depending on other bottlenecks and the demands of the game. It may exceed 170% in titles that favour its architecture (i.e. TWIMTBP titles).
if I asked AMD to do that chart, they would have beaten the 690 with a 7970 :p but those two charts would be useless to me, an unbiased customer :p

and all your waffling can be restated from an AMD-biased stance, i.e. some games favour AMD gfx, soooo that's why we read TPU reviews. that chart's sound in my eyes, bro; Wiz did it....

I'm out anyway, dude, your opinion's all good. I am an optimist too.
#6
N3M3515
by: blibba
The chart is utilised entirely inappropriately. The 85% stat is from Nvidia. If you asked them to put a 680 on the same scale, it'd be 55% tops.

In actual FPS (not that that's what we should care about), the 780 will vary from 100% to 170% of the performance of a 7970 depending on other bottlenecks and the demands of the game. It may exceed 170% in titles that favour its architecture (i.e. TWIMTBP titles).


That's 66% right there.
#8
N3M3515
by: blibba
TPU FPS charts are not even remotely relevant to the discussion we are having. I have explained why this is the case in multiple previous posts in this thread and will not do so again.
I'm talking about this:
by: Naito
Edit 2: Checked original article. Claims 85% of the performance of the GTX 690. Possibility of limited edition card? (when considering naming): "Partner companies must comply with Nvidia's reference design to the letter and may not even put their stickers on graphics cards."
#9
HumanSmoke
by: theoneandonlymrk
also, I may have been misled, but I thought Nvidia had put more double-precision units in the GK110. are these counted among the 2880 shaders despite being specialist units?
The architecture is somewhat different to the GK104, where the 8 FP64 units are distinct from the 1536 shaders (and operate at a 1:1 FP64:FP32 rate). The GK110 has 1920 FP32-only shaders and 960 FP32/64-capable ones (GK110 operates at a 1/3 double-precision rate), so 2880 is the maximum shader/CUDA core/stream processor count of the die.
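The arithmetic behind that breakdown can be sketched as follows; the 1920/960 split is the description given in the post above, not an official NVIDIA figure:

```python
# Shader split as described above (speculative, not an official spec)
fp32_only = 1920        # shaders that run FP32 only
fp32_64_capable = 960   # shaders that can also run FP64

total_cores = fp32_only + fp32_64_capable
dp_fraction = fp32_64_capable / total_cores  # share of cores usable for FP64

print(total_cores)   # 2880
print(dp_fraction)   # ~0.333, i.e. the 1/3 double-precision rate
```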
#10
blibba
by: N3M3515
I'm talking about this:
So am I. TPU average FPS graphs are not remotely relevant to this. I have repeatedly explained this. I will now unsubscribe from this thread, as I feel it is going nowhere. I advise that you read Tech Report's review when the card is launched.
#11
N3M3515
Well, we'll have to agree to disagree :)
#12
jihadjoe
On the bright side, the last time Nvidia had an $800 card was the 8800GTX.
A year later they followed up with the 8800GT, which offered 90% of the performance for half the price.

Just praying that lightning can, and does strike twice. :respect:
#13
TSX420J
Been waiting since last year for this. Was really let down by the dumbed-down 6xx series. Just finished putting together 90% of my PC, and this is the final piece of the puzzle. Can't wait. Hopefully it won't let me down.
#14
TAZ007
Just my humble opinion :)

Having read this thread start to finish, I have to wonder. I live and die by one rule: if it ain't broke, don't fix it. I won't buy a 780 Ti or equivalent unless a game comes out that my PC can't play. Some of you spend all that money on having the latest and greatest, yet all you seem to do is debate on forums and benchmark all day. A tad overkill, don't ya think? All that money just for bragging rights; more money than sense, I think. I can play Far Cry 3 on my 560 Ti 448 on ultra settings, perfectly playable too at 30 to 50 fps, and can play BF3 on high settings and get 50 to 80 FPS, though this game, and only this game in my collection, pushes my 1.25GB of memory to its limit, hence why I play on high settings and not ultra. No, this don't apply to everyone, just the die-hards: the ones with 680 SLI setups getting all excited because they can't wait to get their hands on a new 780 even though they don't really need a 780 at this moment in time. Well, don't moan about the price, because it's people like you that make the price so high in the first place. Nvidia and AMD know this, and if I was them I'd be screwing you for all the money I could, because there is a market for it. And really I should not moan, because as a result I get your old 680 for a lot less than what you paid for it; those are the only numbers I'm interested in, £££ ching ching ;)

Now for the rest of us that have more sense than money, that don't feel sad when our FPS drop below 100 and are quite happy playing games rather than worrying the toss about how much faster an unreleased GPU is going to be, or how high our 3DMark score is going to be: well, I'd just like to say hello to you all. I'm Taz, and if you'd like to shoot the shit outta me on BF3, then look out for T7BSY. But if you're a sad fk that uses an aimbot and other hacks, then GET A LIFE!

Anyway, in short, I got better things to do than argue the toss over something that is yet to be released. Bytales, interesting read if nothing else ;). QUBIT, if you think that 200 FPS will look smoother than 100, you're just kidding yourself to justify your outlay. Most users only have 60Hz screens, and the eye can't see beyond that anyway, and that's a fact, because if that was the case then we would only ever see half the picture when watching normal telly. If you have a 120Hz screen you might have a point, though, and even then most would not be able to tell the difference ;)
#15
qubit
Overclocked quantum bit
@TAZ007

Judging by your very first post on TPU, it's clear that you just like a good flamebait rant, rather than presenting a coherent argument.

Therefore, I won't waste my time explaining to you where you've got it all so wrong about frame rates. I can't believe you replied all that to what was just a humorous remark to another member, lol.
#16
ALMOSTunseen
by: TAZ007
QUBIT, if you think that 200 FPS will look smoother than 100, you're just kidding yourself to justify your outlay. Most users only have 60Hz screens, and the eye can't see beyond that anyway, and that's a fact, because if that was the case then we would only ever see half the picture when watching normal telly. If you have a 120Hz screen you might have a point, though, and even then most would not be able to tell the difference ;)
Ok, let me just put in my 5 cents here. Most would rather have 200 fps over 100 fps because it gives you "security" with frame drops and recording. With frame drops, imagine you're at 100 fps and you drop 50 fps: you're going to notice it. If you're at 200 fps and you drop 50 fps, you're not going to notice it. And if you're recording, say with Fraps, you lock it at 60 fps. The higher your maximum fps is, the smoother the recording AND your gameplay. I have issues where my recording is smooth but my gameplay gets quite laggy.
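The "security" point is easier to see in frame times than in FPS. A rough sketch of the arithmetic, using just the hypothetical numbers from the post above:

```python
def frametime_ms(fps):
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

# Dropping 50 fps from 100 doubles the frame time...
delta_at_100 = frametime_ms(50) - frametime_ms(100)   # 20.0 - 10.0 = 10.0 ms
# ...while dropping 50 fps from 200 adds under 2 ms per frame.
delta_at_200 = frametime_ms(150) - frametime_ms(200)  # ~6.67 - 5.0 = ~1.67 ms

print(round(delta_at_100, 2), round(delta_at_200, 2))  # 10.0 1.67
```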
#17
tokyoduong
by: blibba
You've clearly missed my point, but what you've written here is wrong anyway.

A 680 is 50% of a 690. A 780 is 85% of a 690. 85% is 170% of 50%, a 70% difference.
This is as misleading as Nvidia's slides
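For what it's worth, the two sets of numbers being argued over in this thread aren't computed on the same baseline, which accounts for most of the disagreement. Using the figures quoted in the thread (all rumours or chart readings, not confirmed specs):

```python
# blibba's framing: performance as a fraction of a GTX 690
print(0.85 / 0.50)  # 1.7 -> the rumoured 780 would be 70% faster than a 680

# N3M3515's framing: TPU-style summary numbers (680 = 98%, 690 = 153%)
print(98 / 153)     # ~0.64 -> the 680 at 64% of a 690, not 50%
```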
#18
TAZ007
by: qubit
@TAZ007

Judging by your very first post on TPU, it's clear that you just like a good flamebait rant, rather than presenting a coherent argument.

Therefore, I won't waste my time explaining to you where you've got it all so wrong about frame rates. I can't believe you replied all that to what was just a humorous remark to another member, lol.
Not at all, but I do find arguing the toss about numbers and percentages on something that nobody can possibly know yet pretty pointless. As far as frame rates go, don't take it personal; I just don't agree with you, simple as. I had a 670, I played with and without vsync and saw no difference at all, hence I returned the 670 and kept the card I got now. Anything from 60 FPS to whatever number you choose is wasted in, or on, my eyes if you like. I would love to set up six screens and have someone like you play high or lower with me ;)

by: ALMOSTunseen
Ok, let me just put in my 5 cents here. Most would rather have 200 fps over 100 fps because it gives you "security" with frame drops and recording. With frame drops, imagine you're at 100 fps and you drop 50 fps: you're going to notice it. If you're at 200 fps and you drop 50 fps, you're not going to notice it. And if you're recording, say with Fraps, you lock it at 60 fps. The higher your maximum fps is, the smoother the recording AND your gameplay. I have issues where my recording is smooth but my gameplay gets quite laggy.
Security? lol. Have you read all the posts on here then? Not 100% sure what you mean, but I'm guessing you mean the GPU you have will last you longer in terms of new games coming out that might in time kill the FPS, meaning it's time for a new card?

So are you saying that all those that have 680 SLI setups are now insecure because the 780 Ti is due out? I think not. I find it hard to believe that there is a game out there that would see off two 680s in SLI; even with a 3-monitor setup, a single 680/7970 can play all the games out there.

And last but not least, you reckon I will notice a drop from 200 FPS to 50, but not 100 FPS to 50, lol. I would not notice either; the brain can't count what the eye can't see. That's not to say you might not feel it, but that won't apply to many. Ask any console user: they don't worry about FPS, and they still enjoy the games they play just as much as you or I.

Only thing I would notice is stuttering when frames drop to 5 to 30 fps. Fraps has nothing to do with what we are talking about; anything above 60 FPS is the sweet spot. But again, it's just my opinion, not saying either of you are right or wrong, just saying it does not apply to most of us.

Edit: sorry, I did misread what you were trying to say, so to a point I agree. But here is an example: I can play FARCRY 3 on ultra settings, no AA and SSAO, at 1920 x 1200, and get between 25 to 50 FPS, and this is smooth as; VRAM has not gone over 1GB. Now on BF3, even on high settings, I get 50 to 80 fps, but still, in multiplayer I can run out of VRAM, which can cause me lag or a drop in frames that causes stuttering. But that's not down to GPU power, that's due to lack of VRAM, which hits FPS and causes it to become not so smooth, and that's the only reason you would notice. To put it another way: two overclocked 560 Ti GPUs score 9500 in 3DMark11, so as good as a single 680 or 7970, but even though the FPS are in the hundreds playing BF3, it's still not smooth; you get lag and frame drops due to the lack of VRAM. So IMO it's not all about FPS equaling smooth gameplay, it's about getting the right balance to limit any bottlenecks that you will hit at some point!
#19
Prima.Vera
Just to add a note here: anything above 30fps is perfect for me in games. And from experience I can tell you that anything above 60fps makes no difference; I cannot feel the image being any smoother. The ones that claim this are just lying to themselves... ;)
#20
Calin Banc
by: TAZ007
And last but not least, you reckon i will notice a drop from 200 FPS to 50, but not 100FPS to 50 lol,
He was saying that a drop of 50FPS from 100 (to 50) is not the same as a drop of 50FPS from 200 (to 150 in this case).

by: Prima.Vera
Just to add a note here: anything above 30fps is perfect for me in games. And from experience I can tell you that anything above 60fps makes no difference; I cannot feel the image being any smoother. The ones that claim this are just lying to themselves... ;)
The FPS can move a lot during a game session. If you want the best possible experience, then you cap it at ~60FPS for a 60Hz monitor, or above that for 120Hz. For that, you'll also need a minimum of 60FPS, which is much harder to get than an average of 60fps.
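The average-versus-minimum distinction is easy to illustrate: two runs can share the same average FPS while one dips far below it. A toy example with made-up frame-rate samples:

```python
# Two hypothetical sessions, both averaging exactly 60 fps
steady = [60, 60, 60, 60, 60]
spiky = [95, 90, 20, 40, 55]

for run in (steady, spiky):
    avg, worst = sum(run) / len(run), min(run)
    print(f"average={avg:.0f} fps, minimum={worst} fps")
# The second run also averages 60 fps but bottoms out at 20 fps,
# which is where the stutter would be felt.
```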
#21
Slizzo
by: jihadjoe
On the bright side, the last time Nvidia had an $800 card was the 8800GTX.
A year later they followed up with the 8800GT, which offered 90% of the performance for half the price.

Just praying that lightning can, and does strike twice. :respect:
LESS than half the price. I bought an 8800GT on launch day for $300 from CompUSA.
#22
TheHunter
I don't buy that 6GB VRAM size; whoever started spreading this possible GK110 rumour somehow pulled all that info from the Tesla K20X with its 6GB of RAM, also with the same clocks, which doesn't make any real sense..

I think something like 850-950MHz with 3GB seems more plausible, and I hope it's a real GK110 with all the bells and whistles, not just another GK104 with more cores on it..


Also, $800 is a bit too much for me, but then again it's some random number xD

I would go for a GTX 770, I mean Titan. Hopefully for ~$500 max.
#23
TAZ007
by: Calin Banc
He was saying that a drop of 50FPS from 100 (to 50) is not the same as a drop of 50FPS from 200 (to 150 in this case).



The FPS can move a lot during a game session. If you want the best possible experience, then you cap it at ~60FPS for a 60Hz monitor, or above that for 120Hz. For that, you'll also need a minimum of 60FPS, which is much harder to get than an average of 60fps.
This is what I do when I have a card that has more power than I need. But still, if I don't run Fraps whilst gaming, there is no way on earth I can tell when it's in the hundreds or when it drops down to 50FPS; I can notice changes when the FPS are between 60 and 20. That said, I'm playing FC3 at the mo, on ultra, getting 30 to 50, and that's smooth enough, though in BF3 that would be not so good, but I put that down to the lack of VRAM on my card, so I play that game on high. I got a Palit GTX 680 4GB JetStream that I just got for £265 plus my card, and I will hold my hands up if I find it better with the FPS in the 100-plus, but I can't see that being the case. I will report back if I ever get it; it's stuck in the post, damn snow :(

P.S. yes, I did misread that about the frame drops from 200 ;) sorry, my bad
#24
Calin Banc
I can notice frame drops below 57 or 58FPS or so. FC3 at that FPS (30-50FPS) is, for me, a stuttering mess. That shows once more how relative this can be from one person to another.
#25
TAZ007
by: Calin Banc
I can notice frame drops below 57 or 58FPS or so. FC3 at that FPS (30-50FPS) is, for me, a stuttering mess. That shows once more how relative this can be from one person to another.
well, again, I can crank the AA up and the other settings and get around 20 to 40, and still it's smooth as on FC3, and I'd be more than happy to try and make a vid and run around. The only reason I turn down the AA and select SSAO is because of heat issues, and that's the only reason I'm changing my card, tbh, and £265 for a Palit GTX 680 4GB JetStream was too good to turn down :) Otherwise I'd be just as happy with a 7950 or 670; I'd like the 660 Ti too if it had a little more bandwidth. But again, it's not just the hardware in play here, it's the games you play too. If you have a card with 2 gigs of RAM you should be fine, and a good fast SSD helps too. Without knowing what your hardware is, it's hard to say. Just remember Blu-ray is only 24 FPS, and for the best part it plays smooth; it tends to judder when the camera is panning around, though a few more FPS would help out there, and I'm talking 10 FPS more.