Discussion in 'News' started by btarunr, Jan 21, 2013.
I suppose this would make my 600 W PSU beg for mercy when overclocked?
I'd love to have it! Not dishing out $900 for console ports though.
Those specs are exactly the same as the Tesla part. Details about Tesla K20 and K20X here:
We know the compute versions always clock lower than their GeForce counterparts, so this GeForce 'Titan' could clock higher in the final revision. But NVIDIA might want to keep the clock low, just like on the Tesla cards, to keep the TDP at 235 W.
Not sure if your GTX 680 SLI is up to snuff; I recommend you upgrade to the GTX/Titan 780.
So I want this for rendering with Octane Render, or possibly V-Ray RT, but I have a sneaky feeling NVIDIA is going to put in some sort of artificial hindrance; how else would they get to sell their absurdly expensive Maximus range... If there are no snags, I'll get three or four of the GeForce Titans.
you sound frazzled.
I am so enlightened now. Why am I folding when I can just drink my own pee and cure every ailment?
As for the graphics card: sell it at US $599 (not $800) or lower and this is an instant winner (for high-resolution gamers and folders alike...). I am only worried about heat, as usual....
Trust me it'll look smoother.
Ah, yeah right. We've got "cures" for all these things, especially the godawful cancer. Sure we do.
Man never went to the moon either and everything is a conspiracy.
So the actual product name will be Titan? Then I guess we won't get a real 700-series chip until the end of the year. This product seems to have gone from wonder-chip game changer, polished for months and months, to a totally random one-off high-end part that changes nothing for anybody. The profits NVIDIA is making from Kepler must be insane. Imagine if the 7000 series had been good: this thing would be only $500 and the 680 (really a 660 Ti) would be $300 or less. Talk about pulling your punches.
You forgot to mention Hemp oil
Ignore the close-minded sheeple; it'll take you about a decade to get through to them.
Actually, according to TechPowerUp charts, if GTX 780 is 85% of GTX 690, then GTX 780 would be, relative to GTX 680:
45% faster at 2560x1600
29% faster at 1920x1080
21% faster at 1680x1050
6% faster at 1280x800
Avg. of 25% faster.
AMD needs something 25% faster on average than the 7970 GHz Edition to keep the marginal lead they have.
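The scaling above can be sketched in a few lines. Note the GTX 690-over-680 ratios here are back-derived assumptions to make the arithmetic concrete, not figures read off the actual TPU charts:

```python
# Hypothetical GTX 690 vs GTX 680 performance ratios per resolution
# (assumed values, back-derived from the percentages quoted above).
ratios_690_over_680 = {
    "2560x1600": 1.71,
    "1920x1080": 1.52,
    "1680x1050": 1.42,
    "1280x800":  1.25,
}

scale = 0.85  # GTX 780 assumed to reach 85% of GTX 690

gains = {}
for res, r in ratios_690_over_680.items():
    gains[res] = scale * r - 1.0  # GTX 780 advantage over GTX 680
    print(f"{res}: ~{gains[res]:.0%} faster")

avg = sum(gains.values()) / len(gains)
print(f"Average: ~{avg:.0%} faster")  # ~25%
```

Under these assumed ratios the per-resolution gains come out to roughly 45/29/21/6 percent, matching the list above.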
If these specs end up being the case, it would be such a fail for NVIDIA. Remember, GK104 is clocked around 1100 MHz, and there's no way a huge chip like GK110 can clock that high.
The max I expect the clock to be is around 800-900 MHz if we're being optimistic. 732 MHz is the Tesla version's clock, which is set lower because Tesla parts have to be guaranteed for 24-hour operation, a warranty GeForce cards don't require.
But if 732 MHz is the clock speed, then NVIDIA is in for trouble, because if we do the calculations:
GTX 680 has 1536 cores at around 1100 MHz: 1536 × 1100 = 1,689,600
GK110 has 2688 cores at 732 MHz: 2688 × 732 = 1,967,616
1,689,600 / 1,967,616 ≈ 0.85, so the GTX 680 has about 85% of GK110's theoretical shader throughput; in other words, GK110 gets about 15% extra. Of course, add another 10% for the added memory bandwidth benefit. So making a chip twice the size of GTX 680 for 15-25% extra performance is very meh, which really makes me doubt it's a 732 MHz part. Otherwise NVIDIA would be much better off making a part closer to 2000 cores with a clock-speed advantage; that's definitely the smarter way to get 30% more performance and be able to sell it at good prices.
And most likely this is what AMD is doing next round: refining GCN for better efficiency to pack more cores into the same power envelope while maintaining the clock-speed advantage.
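The back-of-envelope comparison above can be sketched as follows. Both clock speeds are rumors or assumptions from this thread, not confirmed specs:

```python
# Theoretical shader throughput = cores x clock (MHz).
gtx680_cores, gtx680_clock = 1536, 1100   # ~1100 MHz assumed typical boost
gk110_cores, gk110_clock = 2688, 732      # 732 MHz Tesla K20X clock, assumed

gtx680_tp = gtx680_cores * gtx680_clock   # 1,689,600
gk110_tp = gk110_cores * gk110_clock      # 1,967,616

print(f"GK110 theoretical advantage: ~{gk110_tp / gtx680_tp - 1:.0%}")  # ~16%

# Clock a hypothetical ~2000-core part would need to match GK110 at 732 MHz:
cores_alt = 2000
print(f"~{gk110_tp / cores_alt:.0f} MHz needed")  # ~984 MHz
```

This supports the post's point: a ~2000-core part at a GK104-like clock would land in the same theoretical-throughput ballpark on a much smaller die.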
Agreed, the clock speed sounds way too low. I think this chip has around 7B transistors, so I can imagine that getting a GHz out of it will be challenging. No doubt it would be more comfortable on a smaller process node.
I love you.
I thought TPU was a friendly site. Guess that changes when other people have opinions different from yours. Well.
All I can say is read the initial post by Bytales. It is confrontational, conspiratorial and quite insulting to crunchers (who do what they do to help scientific research for a good cause).
I love the blatant hypocrisy people exhibit online. If the guy has fans here, fair enough. You can all live in the conspiracy theory bunker and think the world is out to get you. But we don't all have to hold hands and hug.
FTR, what does the Titan supercomputer do? Crunch numbers for science, using the HPC variant of the very card we are talking about.
Oh yeah, why don't more people come help out?
kind of a lame name imo
Well, who knows, maybe they'll do something similar to the 200 series for the 700 series and use two process nodes next round: GK110 as the lower-clocked version now, then when 20nm comes out with its low capacity, shrink only the top-end part, market it as a GTX 785 with higher clock speeds and lower TDP, and sell it for an arm and a leg like the GTX 285 did back in the day. That would be a decent strategy to buy them time for Maxwell, and a plain shrink is always a better way to go than a shrink plus a new architecture from a technical engineering standpoint, especially for NVIDIA; they aren't the best at moving to new nodes.
This isn't the thread to discuss whether or not one should crunch or fold. Please take it to PMs.
Edit: On topic - I will enjoy watching some of our extreme overclockers have fun with these cards while I sit this one out. Too expensive for me.
I will wait for more information about the correct specifications, since it seems they based the clocks on the K20X's clocks, which is not right, as this card will not be a passive Tesla card. NVIDIA has usually set the clocks on its Tesla cards back by about 20-25% recently, so I would guess the core clock on the GeForce version will be about 900-950 MHz.
But still, at $900... not a chance I'm gonna buy it.
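The implied GeForce clock from that assumed 20-25% Tesla reduction can be worked out directly (the cut percentages are this thread's guess, not an official figure):

```python
# If Tesla clocks sit 20-25% below the GeForce equivalent (assumption),
# invert that cut to estimate the GeForce clock from K20X's 732 MHz.
tesla_clock = 732  # MHz, Tesla K20X

for cut in (0.20, 0.25):
    geforce_clock = tesla_clock / (1 - cut)
    print(f"{cut:.0%} cut -> ~{geforce_clock:.0f} MHz")
```

That yields roughly 915-976 MHz, in line with the 900-950 MHz guess above.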
ASUS GeForce Titan 780 6GB GDDR5 MARS IV
The only way I see this card having relevance to such a name is if it's 4K-proof for at least a handful of modern titles, since I hate multi-monitor gaming setups with a passion, and even the puny-ish 2560x1440/1600 resolutions are nowhere near as common as they should be. Nor is the hardware, or the programming skill of most game studios, up to snuff.
Also, I usually work these things backwards.
I personally think 7 GHz / 384-bit is a safe bet for 'a' GK110 card. It might not be the clock of the one released, but it's a starting point for what to expect within 300 W.
Working backwards from the 680: a 14-SMX card on a 384-bit bus at the same clocks as the 680 would require exactly that amount of bandwidth.
(2688/1536 = 1.75× the shaders; 384-bit is 1.5× the bus width; 1.75/1.5 ≈ 1.167, and 1.167 × 6000 MHz ≈ 7000 MHz.)
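The backwards-working above can be sketched as a quick calculation. The 14-SMX count and 6 GHz effective baseline are this post's assumptions:

```python
# Keep bandwidth-per-shader equal to GTX 680 while scaling up to GK110.
gtx680_cores, gk110_cores = 1536, 2688
gtx680_bus, gk110_bus = 256, 384      # bus width in bits
gtx680_mem_clock = 6000               # MHz effective (GTX 680 stock)

core_scale = gk110_cores / gtx680_cores   # 1.75x more shaders
bus_scale = gk110_bus / gtx680_bus        # 1.5x wider bus

# Memory clock needed so bandwidth scales with shader count:
mem_clock_needed = gtx680_mem_clock * core_scale / bus_scale
print(f"~{mem_clock_needed:.0f} MHz effective")  # ~7000
```

The wider bus covers most of the shader increase on its own, which is why the required memory clock only rises from 6 GHz to 7 GHz effective.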
Now, the optimal (most efficient) number of units (including SFUs) for gaming with 48 ROPs is somewhere between 2800 and 2900. This would have 3136. AMD may have 2560, so that's kind of a wash.
Then it's obviously about voltage and clock potential/efficiency within a TDP (probably 300 W). AMD may clock their product stock closer to NVIDIA's max clock within the TDP, choosing instead for their max clock at 300 W to be closer to the potential of the 28nm process (1200-1300 MHz). NVIDIA's clock choice may be more power-efficient, say if they scale from ~975 to 1100+ MHz. AMD's choice may be more die-size/cost-efficient.
Point is, at the end of the day, if one part has ~10% more usable units but more bloat, and the other clocks ~10% higher, which is the better part? Does it really matter? They should be relatively close.
If you're assuming the K20X cut-and-paste specs are wrong regarding memory, then might the SMX count be wrong also?
It isn't beyond the realms of possibility that the GeForce version has a full 15 SMX. Tesla and Quadro generally have more functionality fused off than GeForce, presumably to fuse off out-of-spec logic blocks and to reduce power requirements; a GeForce card probably won't be under the same restraint. It's also not unheard of that TSMC's process might have improved, and/or a revision from the first tranche of wafers might have taken place. If the original ~20,000 GPUs going to HPC deployment are 87-93% functional, then I'd assume there must be a percentage of fully functional chips.
Blah blah blah, who cares. What I have now will last the next 7 years anyway; not like we can't all play console ports anyway, right? lol