Monday, January 21st 2013

NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"

2013 started off on a rather dull note for the PC graphics industry. NVIDIA launched its game console platform "Project: Shield," while AMD rebranded its eons-old GPUs as the Radeon HD 8000M series. Apparently that could all change in late February, with the arrival of a new high-end single-GPU graphics card based on NVIDIA's GK110 silicon, the same big chip that goes into the company's Tesla K20 compute accelerator.

NVIDIA may have drawn some flak for extending its "GTX" brand too far into the mainstream and entry-level segments, and wants its GK110-based card to stand out. It is reported that NVIDIA will carve out a new brand extension, the GeForce Titan. Incidentally, the current fastest supercomputer in the world bears that name (Cray Titan, located at Oak Ridge National Laboratory). The GK110 silicon physically packs 15 SMX units, totaling 2,880 CUDA cores. The chip features a 384-bit wide GDDR5 memory interface.

Source: SweClockers
Add your own comment

203 Comments on NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"

#1
qubit
Overclocked quantum bit
by: Prima.Vera
I played Crysis as an FPS and NFS Hot Pursuit as a racing game, and both looked OK and were very playable even though they were capped at 30 FPS. I have no issues either, so I think people are overreacting and exaggerating with this FPS crap discussion...
Have you ever played these games vsynced to 60Hz or 120Hz? I suspect you haven't. If you haven't, then that 30Hz cap might well seem ok to you.

In addition, LightBoost makes a very big difference by eliminating motion blur. The effect is nothing short of awesome. You basically get to have your cake and eat it. :D
Posted on Reply
#2
Aquinus
Resident Wat-man
by: qubit
That's really rubbish - laggy and juddery all the way. I'm glad I don't have a console.
So I take it that you find Blu-Ray video "laggy and jittery" as well, because it's capped at either 24p, 25p, 23.976p, or 29.97i and will never exceed 30 FPS (sans 29.97i, but even that might jump because you have to deinterlace the video).

I think that what people describe as "jittery and laggy" is actually the render time from frame to frame. Just because you're running at 30 FPS doesn't mean that every frame is rendered in the same amount of time. If you render 30 frames in 1 second, but the first 25 get rendered in the first three quarters of a second and the last 5 take that last quarter of a second, you have changes in frame rate that cause the jitteriness you describe, and you have the reduced frame rate for that quarter of a second that introduces lag (and if it recurs often, it introduces more jitter).

So all in all, the frame-rate argument is dumb. The only reason 60 FPS and 120 FPS feel "smoother" is that the difference in render time from frame to frame is that much smaller. So I'm willing to bet that, comparing a 30 FPS game with equal frame-to-frame render times against a 60 FPS game without them, the 30 FPS game has the potential to feel just as smooth, if not smoother, because the instantaneous frame rate would be consistent.

The most important thing to take away from this is:
Average frame rate is not the same thing as instantaneous frame rate and the variation between frame render times. Also keep in mind that it will be hard to make it perfect; as scenes change, the render time can change as well.
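That 25-frames-then-5-frames scenario is easy to put numbers on. Here's a minimal Python sketch (the frame times are hypothetical, just the split described above) showing how the average and the instantaneous rates diverge:

```python
# Hypothetical frame times: 25 frames share the first 0.75 s of one second
# of gameplay, and the last 5 frames share the final 0.25 s.
frame_times = [0.75 / 25] * 25 + [0.25 / 5] * 5  # seconds per frame

avg_fps = len(frame_times) / sum(frame_times)   # average over the second
inst_fps = [1.0 / t for t in frame_times]       # per-frame instantaneous rate

print(f"average:  {avg_fps:.1f} FPS")           # ~30.0 FPS
print(f"fastest:  {max(inst_fps):.1f} FPS")     # ~33.3 FPS
print(f"slowest:  {min(inst_fps):.1f} FPS")     # ~20.0 FPS
```

The average says a steady 30 FPS, but the instantaneous rate swings between roughly 33 and 20 FPS, which is exactly the variation that reads as jitter.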
Posted on Reply
#3
qubit
Overclocked quantum bit
by: Aquinus
So I take it that you find Blu-Ray video "laggy and jittery" as well, because it's capped at either 24p, 25p, 23.976p, or 29.97i and will never exceed 30 FPS (sans 29.97i, but even that might jump because you have to deinterlace the video).

I think that what people describe as "jittery and laggy" is actually the render time from frame to frame. Just because you're running at 30 FPS doesn't mean that every frame is rendered in the same amount of time. If you render 30 frames in 1 second, but the first 25 get rendered in the first three quarters of a second and the last 5 take that last quarter of a second, you have changes in frame rate that cause the jitteriness you describe, and you have the reduced frame rate for that quarter of a second that introduces lag (and if it recurs often, it introduces more jitter).

So all in all, the frame-rate argument is dumb. The only reason 60 FPS and 120 FPS feel "smoother" is that the difference in render time from frame to frame is that much smaller. So I'm willing to bet that, comparing a 30 FPS game with equal frame-to-frame render times against a 60 FPS game without them, the 30 FPS game has the potential to feel just as smooth, if not smoother, because the instantaneous frame rate would be consistent.

The most important thing to take away from this is:
Average frame rate is not the same thing as instantaneous frame rate and the variation between frame render times. Also keep in mind that it will be hard to make it perfect; as scenes change, the render time can change as well.
Wrong. Wrong. Wrong. Seriously people, I don't see what's so difficult to understand? I suspect that there's a certain amount of denialism going on here.

I might just write that article on framerate sooner rather than later.
Posted on Reply
#4
Aquinus
Resident Wat-man
by: qubit
Wrong. Wrong. Wrong. Seriously people, I don't see what's so difficult to understand? I suspect that there's a certain amount of denialism going on here.

I might just write that article on framerate sooner rather than later.
Then do it, because it seems like most people disagree with you. I'll read it if you write it. I'm not saying that 60 and 120 FPS aren't smoother. I'm just saying that the jitter people notice at 30 FPS is more likely down to inconsistent render times rather than the rate itself. That has nothing to do with other rates.
Posted on Reply
#5
qubit
Overclocked quantum bit
by: Aquinus
Then do it, because it seems like most people disagree with you. I'll read it if you write it. I'm not saying that 60 and 120 FPS aren't smoother. I'm just saying that the jitter people notice at 30 FPS is more likely down to inconsistent render times rather than the rate itself. That has nothing to do with other rates.
I doubt that "most" people disagree. There's just a few vocal ones that insist on trying to negate what I'm saying, lol.

This isn't rocket science. I've seen all this stuff for myself and the effects are all very obvious, so I know I'm right. There's no way someone can "prove" me wrong with a counter argument, as it's inevitably flawed.
Posted on Reply
#6
NeoXF
by: Crap Daddy
Here's a better comparison for you, GTX690 being 53% faster than the GE at 1200p

http://tpucdn.com/reviews/AMD/HD_7970_GHz_Edition/images/perfrel_1920.gif

Now, math is not my strong point but I reckon going by the presumed 85% performance of the Titan (ium) compared to GTX690, this GK110 based card would be 45% faster than the 7970GE.
Aside from what other people stated about 85% of the GTX 690, I'd also point out that those are pre-Catalyst 12.11b/13.1 WHQL scores, so the difference might be less than 30%... Even so, I take most of these wide-margin calculations with a shovelful of salt.
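For what it's worth, the arithmetic is easy to check. Taking the two figures quoted above purely as assumptions (GTX 690 = 53% faster than the 7970 GE, Titan = 85% of a GTX 690), a quick sketch:

```python
# Assumptions quoted in the thread, not measurements:
# GTX 690 is 53% faster than the 7970 GHz Edition,
# and the GK110 card reaches 85% of GTX 690 performance.
ge = 1.00               # 7970 GHz Edition as the baseline
gtx690 = ge * 1.53      # 53% faster than the GE
titan = gtx690 * 0.85   # 85% of the GTX 690

print(f"Titan vs 7970 GE: {(titan / ge - 1) * 100:.0f}% faster")  # ~30% faster
```

So straight multiplication lands at roughly 30%, not 45%, even before the newer Catalyst drivers are factored in.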
Posted on Reply
#7
Calin Banc
by: Aquinus
So I take it that you find Blu-Ray video "laggy and jittery" as well, because it's capped at either 24p, 25p, 23.976p, or 29.97i and will never exceed 30 FPS (sans 29.97i, but even that might jump because you have to deinterlace the video).

I think that what people describe as "jittery and laggy" is actually the render time from frame to frame. Just because you're running at 30 FPS doesn't mean that every frame is rendered in the same amount of time. If you render 30 frames in 1 second, but the first 25 get rendered in the first three quarters of a second and the last 5 take that last quarter of a second, you have changes in frame rate that cause the jitteriness you describe, and you have the reduced frame rate for that quarter of a second that introduces lag (and if it recurs often, it introduces more jitter).

So all in all, the frame-rate argument is dumb. The only reason 60 FPS and 120 FPS feel "smoother" is that the difference in render time from frame to frame is that much smaller. So I'm willing to bet that, comparing a 30 FPS game with equal frame-to-frame render times against a 60 FPS game without them, the 30 FPS game has the potential to feel just as smooth, if not smoother, because the instantaneous frame rate would be consistent.

The most important thing to take away from this is:
Average frame rate is not the same thing as instantaneous frame rate and the variation between frame render times. Also keep in mind that it will be hard to make it perfect; as scenes change, the render time can change as well.
Again, don't compare a movie with a game; the frame rates are "packed" waaaay differently, Earth-to-Moon different. Besides, I believe we all want that virtual image, that virtual experience, to be more lifelike, no? That also needs a high frame rate, in the same way that a movie is more true to life closer to 60 FPS than to 24.

Plus, a higher frame rate gives better control over your avatar.
Posted on Reply
#8
tokyoduong
by: Calin Banc
Again, don't compare a movie with a game; the frame rates are "packed" waaaay differently, Earth-to-Moon different. Besides, I believe we all want that virtual image, that virtual experience, to be more lifelike, no? That also needs a high frame rate, in the same way that a movie is more true to life closer to 60 FPS than to 24.

Plus, a higher frame rate gives better control over your avatar.
Higher frame rates can certainly lower frame latencies. But as the TechReport reviews have shown, a game can go from 2 ms between frames all the way to 50+ ms between frames and still maintain a high average frame rate. There's no special "packed" way movies do it; it's just a smooth, consistent frame rate throughout. That is why 24 FPS works in theaters and 30/60 FPS works for TVs. Nobody ever complained about that! Your brain will adjust to a consistent frame rate (e.g. you can't see fluorescent lights blinking). Inconsistent frame rates need a minimum latency for you to perceive them as smooth. I think anything below 16.7 ms (60 Hz) is hard to detect. I can tell a slight difference between 60 and 75 Hz, but not too many people can.
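The 16.7 ms figure is just the per-frame time budget at 60 Hz. A small sketch of that conversion (the sample frame times are made up to mirror the 2 ms to 50+ ms spread mentioned above):

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Per-frame time budget at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (24, 30, 60, 75, 120):
    print(f"{hz:>3} Hz -> {frame_interval_ms(hz):.1f} ms per frame")
# 60 Hz works out to 16.7 ms, the smoothness threshold suggested above.

# Made-up frame times mirroring the 2 ms .. 50+ ms spread; flag the
# ones that blow past the 60 Hz budget.
frame_ms = [2, 10, 16, 33, 52]
slow = [t for t in frame_ms if t > frame_interval_ms(60)]
print("frames over the 60 Hz budget:", slow)  # [33, 52]
```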

by: qubit
Wrong. Wrong. Wrong. Seriously people, I don't see what's so difficult to understand? I suspect that there's a certain amount of denialism going on here.

I might just write that article on framerate sooner rather than later.
Write it and stop wasting data and bandwidth posting this crap. You're just being rude and annoying.
Posted on Reply
#9
Slizzo
by: Aquinus
So I take it that you find Blu-Ray video "laggy and jittery" as well, because it's capped at either 24p, 25p, 23.976p, or 29.97i and will never exceed 30 FPS (sans 29.97i, but even that might jump because you have to deinterlace the video).

I think that what people describe as "jittery and laggy" is actually the render time from frame to frame. Just because you're running at 30 FPS doesn't mean that every frame is rendered in the same amount of time. If you render 30 frames in 1 second, but the first 25 get rendered in the first three quarters of a second and the last 5 take that last quarter of a second, you have changes in frame rate that cause the jitteriness you describe, and you have the reduced frame rate for that quarter of a second that introduces lag (and if it recurs often, it introduces more jitter).

So all in all, the frame-rate argument is dumb. The only reason 60 FPS and 120 FPS feel "smoother" is that the difference in render time from frame to frame is that much smaller. So I'm willing to bet that, comparing a 30 FPS game with equal frame-to-frame render times against a 60 FPS game without them, the 30 FPS game has the potential to feel just as smooth, if not smoother, because the instantaneous frame rate would be consistent.

The most important thing to take away from this is:
Average frame rate is not the same thing as instantaneous frame rate and the variation between frame render times. Also keep in mind that it will be hard to make it perfect; as scenes change, the render time can change as well.
by: Aquinus
Then do it, because it seems like most people disagree with you. I'll read it if you write it. I'm not saying that 60 and 120 FPS aren't smoother. I'm just saying that the jitter people notice at 30 FPS is more likely down to inconsistent render times rather than the rate itself. That has nothing to do with other rates.
I swear I'm going to shoot the next person who tries to compare movie frame rates to video game frame rates.

by: Calin Banc
Again, don't compare a movie with a game; the frame rates are "packed" waaaay differently, Earth-to-Moon different. Besides, I believe we all want that virtual image, that virtual experience, to be more lifelike, no? That also needs a high frame rate, in the same way that a movie is more true to life closer to 60 FPS than to 24.

Plus, a higher frame rate gives better control over your avatar.
Apparently we can try to inform these people until we're blue in the face, but they'll keep skipping over what we're saying (and backing up with facts) and go on comparing movies/TV to games...
Posted on Reply
#10
tokyoduong
by: Slizzo
I swear I'm going to shoot the next person who tries to compare movie frame rates to video game frame rates.
Whoa, raging much? I agree with you, but frame rates are not that serious. Let's just solve world hunger or cure cancer :)
Posted on Reply
#11
HammerON
The Watchful Moderator
I think it is time to move on, folks. Feel free to continue this conversation about frame rates in a new thread or in PMs.
Carry on
Posted on Reply
#12
qubit
Overclocked quantum bit
by: tokyoduong
Write it and stop wasting data and bandwidth posting this crap. You're just being rude and annoying.
The only people being rude, annoying and thread-crapping are those like yourself who take a pop at me for no good reason. :slap:

Anyway, this is the wrong thread to talk about frame rates, like HammerON said.
Posted on Reply
#13
Calin Banc
by: tokyoduong
There's no special "packed" way a movies does
I'm only going to say it once and be done with it: yes, there is. The frames overlap in such a way that it fools the eye. I saw that someplace, but I can't find it any more. Believe it or don't, your choice.

Yeah, I can settle for 30 FPS locked if I have to, but I don't like it, even if it's a perfect 33.3 ms every frame. I want a lifelike experience: faster response from me and a prompt response from my character. That comes with a high FPS of at least 60. That rocks my boat.

And why on God's green Earth do people think this card would be overkill? My 2500 @ 4.5 GHz and 7950 (1170/1600 MHz, somewhere equal to a 7970 GHz Edition/GTX 680) can't give a solid 60 FPS in all games, even at 1680x1050, if I choose the highest in-game settings. There is never too much performance; it's just too expensive to get it. :)
Posted on Reply
#14
jihadjoe
The issue I have with that 85% of a 690 comment is that there's no unit of measurement.

Is it 85% in game performance? Then that pretty much says right there how the GK110 card will perform.

Is it 85% of the render/shader performance? In this case the actual in-game performance might be even better than the 690, because there aren't any SLI scaling issues to deal with.

I'm expecting this to play out much like 560Ti vs 580, which is really where GK104 probably stands against GK110.
Posted on Reply
#15
Am*
This had better have all 15 SMX units enabled. This card should've launched a year ago; I doubt anybody is going to want or accept anything less than a perfect card, especially at the rumoured price of $900... NVIDIA have had a year to get their shit together, so this had better be their full 2,880-shader GPU.
Posted on Reply
#16
TAZ007
by: Calin Banc
I'm only going to say it once and be done with it: yes, there is. The frames overlap in such a way that it fools the eye. I saw that someplace, but I can't find it any more. Believe it or don't, your choice.
Yes, film makers use tricks like showing the same frame 4 times. Have a read here and it will help you get a better understanding of both film and games. For the rest, it's sometimes better to agree to disagree; beauty is in the eye of the beholder, and the same can be said for FPS, because we are all different, hence why we feel so strongly about our own view :) it's because that's how we see things ;)
Posted on Reply
#17
HammerON
The Watchful Moderator
by: Calin Banc
I'm only going to say it once and be done with it: yes, there is. The frames overlap in such a way that it fools the eye. I saw that someplace, but I can't find it any more. Believe it or don't, your choice.

Yeah, I can settle for 30 FPS locked if I have to, but I don't like it, even if it's a perfect 33.3 ms every frame. I want a lifelike experience: faster response from me and a prompt response from my character. That comes with a high FPS of at least 60. That rocks my boat.

And why on God's green Earth do people think this card would be overkill? My 2500 @ 4.5 GHz and 7950 (1170/1600 MHz, somewhere equal to a 7970 GHz Edition/GTX 680) can't give a solid 60 FPS in all games, even at 1680x1050, if I choose the highest in-game settings. There is never too much performance; it's just too expensive to get it. :)
by: TAZ007
Yes, film makers use tricks like showing the same frame 4 times. Have a read here and it will help you get a better understanding of both film and games. For the rest, it's sometimes better to agree to disagree; beauty is in the eye of the beholder, and the same can be said for FPS, because we are all different, hence why we feel so strongly about our own view :) it's because that's how we see things ;)
Last warning. Any more discussion about frame rates/movies will lead to an infraction.
Posted on Reply
#18
Mombasa69
Worth the wait

I'll certainly get one, I deliberately skipped the 600 series as they weren't much of an upgrade compared to my 3 570's, and only a retard would upgrade EVERY year.
Posted on Reply
#19
qubit
Overclocked quantum bit
by: Mombasa69
I'll certainly get one, I deliberately skipped the 600 series as they weren't much of an upgrade compared to my 3 570's, and only a retard would upgrade EVERY year.
Would you pay $900 for it though? I'd have to be very well off to spend that much on a card.
Posted on Reply
#20
Calin Banc
If it's close to GTX 690 performance, some will. At least you'll get a much better experience, and there's no need for SLI profiles any more.
Posted on Reply
#21
erocker
by: Calin Banc
If it's close to GTX 690 performance, some will. At least you'll get a much better experience, and there's no need for SLI profiles any more.
I don't get this... You're saying that with every new generation of cards, the price should almost double?

I remember when the 5870 replaced the 4870... The performance was almost double and the price was actually cheaper!
Posted on Reply
#22
Slizzo
by: Calin Banc
If it's close to GTX 690 performance, some will. At least you'll get a much better experience, and there's no need for SLI profiles any more.
I don't think it makes sense at this price point. If it's coming out to pre-empt AMD's 8 series cards, it should be priced around what the GTX 680 was when it was released.

I suppose this also depends on what this card is branded as.
Posted on Reply
#23
NdMk2o1o
by: Mombasa69
I'll certainly get one, I deliberately skipped the 600 series as they weren't much of an upgrade compared to my 3 570's, and only a retard would upgrade EVERY year.
Only a retard would use such a sweeping generalisation :slap: But seriously, I probably do upgrade once a year. I have a Sapphire Vapor-X 7950 (it overclocks to 1200/1450 on stock volts).

I buy mainly mid-to-high end (last few cards: GTX 470, GTX 570, HD 7950), so to some that's a marginal upgrade every generation. But the way I look at it: I sold my 470 for 2/3 the cost of the 570, and the 570 was better at stock, able to far exceed the 470 OC, used less power, ran cooler, etc. The same was true when I sold my 570 and went with my current 7950. I paid a small upgrade fee after selling my card and benefitted from a newer architecture and far greater performance once overclocking is taken into consideration. And of course it's the latest-gen hardware, so it will hold the same kind of resale value as the previous cards when the next generation comes out, enabling me to again upgrade for a marginal expense while having the latest-gen hardware. It's a no-brainer to me.
Posted on Reply
#24
Calin Banc
Nope erocker, the huge rip-off price of the 7xxx series kept me away from it until the magic driver and bundle came along. What I'm saying, guys, is that no matter how high the price is, there are folks who will pay... at least if the performance is right.

At the same time, if this launches at the same price point as the GTX 680, it will mean price drops for every card underneath it and the killing of the GTX 680/690. Sure, that should happen if we're talking about a new series, but at least for now we know too little. If AMD launches at better price/performance, then it lowers prices (they have a margin to play with); if not... not. It's a win-win for them.
Posted on Reply
#25
qubit
Overclocked quantum bit
by: erocker
I don't get this... You're saying that with every new generation of cards, the price should almost double?

I remember when the 5870 replaced the 4870... The performance was almost double and the price was actually cheaper!
Yeah, it's annoying how the two graphics camps appear to have reversed the trend of a fixed (or lower) price point for the top models while delivering better performance every generation. The price has been creeping up for some time now, while performance improvements have only been incremental. Presumably thermal, wattage, and physical GPU and card size limits are to blame for this trend?

I paid the full £400 for my GTX 580 and that was too steep as it is. I'm not paying £500 for the next one, no way.

I still like that kitty. I'm a sucker for them. :D
Posted on Reply
Add your own comment