Wednesday, March 7th 2012

GeForce GTX 680 Features Speed Boost, Arrives This Month, etc., etc.

Here are some key bits of information on the upcoming GeForce GTX 680, a performance single-GPU graphics card based on NVIDIA's 28 nm GK104 GPU. At face value the information is credible, because a large contingent of the media covering the GPU industry is attending the Game Developers Conference, where it could interact with NVIDIA on the sidelines. The source, however, cites people it spoke to at CeBIT.

First, and most interesting: with some models of the GeForce 600 series, NVIDIA will introduce a load-based clock speed-boost feature (think: Intel Turbo Boost), which steps up the graphics card's clock speeds when it is subjected to heavy loads. If there's a particularly stressful 3D scene for the GPU to render, the card overclocks itself and sees the scene through. This ensures higher minimum and average frame rates.
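If the description above holds, the behavior is essentially a feedback loop between a load monitor and the clock generator. Here is a minimal Python sketch of that idea; the clock values, threshold, and step size are entirely made-up assumptions for illustration, not NVIDIA's actual numbers or algorithm:

```python
# Sketch of a load-based clock boost (think: Intel Turbo Boost).
# All numbers are illustrative assumptions, not NVIDIA's real parameters.

BASE_CLOCK_MHZ = 1006   # guaranteed clock speed
MAX_BOOST_MHZ = 1110    # ceiling the card may step up to
STEP_MHZ = 13           # granularity of each clock step (assumed)
LOAD_THRESHOLD = 0.90   # GPU utilization above which we step up

def next_clock(current_mhz: int, gpu_load: float) -> int:
    """Pick the clock for the next interval from current utilization."""
    if gpu_load > LOAD_THRESHOLD and current_mhz < MAX_BOOST_MHZ:
        return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)  # heavy load: step up
    if gpu_load < LOAD_THRESHOLD and current_mhz > BASE_CLOCK_MHZ:
        return max(current_mhz - STEP_MHZ, BASE_CLOCK_MHZ)  # load eased: step down
    return current_mhz

# A particularly stressful scene keeps utilization high, so the clock
# climbs over successive intervals, lifting minimum frame rates:
clock = BASE_CLOCK_MHZ
for load in [0.95, 0.97, 0.99, 0.99]:
    clock = next_clock(clock, load)
print(clock)  # 1058, up from the 1006 MHz base
```

When the scene lightens, the same loop steps the clock back toward base, which is why the feature raises minimum and average frame rates without running flat-out all the time.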

Second, you probably already know this, but GK104 does indeed feature 1,536 CUDA cores, which lend it strong number-crunching muscle that helps with shading, post-processing, and GPGPU.

Third, the many-fold increase in CUDA cores doesn't necessarily translate into a linear increase in performance over the previous generation. The GeForce GTX 680 is about 10% faster than the Radeon HD 7970 in Battlefield 3, yet slower than the HD 7970 in 3DMark 11.

Fourth, the NVIDIA GeForce GTX 680 will indeed launch this month. It won't exactly be a paper launch: small quantities will be available for purchase, though only through select AIC partners. Quantities will pick up in the months that follow.

Fifth, there's talk of GK107, a mid-range GPU based on the Kepler architecture, being launched in April.

Next up, NVIDIA is preparing a dual-GPU graphics card based on the GK104. It is slated for May, and NVIDIA will use the GPU Technology Conference (GTC) as its launch pad.

Lastly, GK110, the crown jewel of the Kepler GPU family, will feature as many as 2,304 CUDA cores. There's absolutely no word on its whereabouts; the fact that NVIDIA is working on a dual-GK104 graphics card indicates that we won't see this chip very soon.

Source: Heise.de

105 Comments on GeForce GTX 680 Features Speed Boost, Arrives This Month, etc., etc.

#1
btarunr
Editor & Senior Moderator
Many thanks to Crap Daddy for the tip.
#2
phanbuey
whoa... dynamic OC? interesting.

also... whatever happened to "CRUSHING THE 7970"? 10% faster and then slower? So much for that hype.
#3
AthlonX2
HyperVtX™
Can't wait, start saving your pennies
#4
Batou1986
by: btarunr
a load-based clock speed-boost feature (think: Intel Turbo Boost)
Am I the only one who sees this as a featureless feature? It's the same thing as Cool'n'Quiet and Intel SpeedStep, only the clock goes up and down.

IMO it's basically like saying here's a 750 hp engine that's listed as 650 hp, but has this awesome feature where you press the red button and it has 750 hp
#6
theoneandonlymrk
by: btarunr
Third, the many-fold increase in CUDA cores doesn't necessarily translate into a linear increase in performance over the previous generation. The GeForce GTX 680 is about 10% faster than the Radeon HD 7970 in Battlefield 3, yet slower than the HD 7970 in 3DMark 11.

Fourth, the NVIDIA GeForce GTX 680 will indeed launch this month. It won't exactly be a paper launch: small quantities will be available for purchase, though only through select AIC partners. Quantities will pick up in the months that follow.
relative to your earlier news post, GK104 (GTX 680) Has 17% Higher Compute Power Than Tahiti (HD7970): Report

all I'm now seeing is that, yes, NVIDIA have merely been hyping their stuff AMD BD style.

17% better compute with equal-to-less performance elsewhere is not in any way the big deal they have been making it out to be. I'll still be buying a lower-end SKU, but I'm not overly impressed by the utter bull NVIDIA (and AMD:)) spout, "it's going to be epic!", "an AMD Tahiti smasher":wtf: maybe the GK110, but not this:rolleyes:

plus what the heck happened to all the "delayed to get volume out" chat, merely trying to spoil AMD's mass sale, since as ever 5 shops are getting 10 cards apiece:wtf:
#7
claylomax
by: btarunr
Lastly, GK110, the crown jewel of the Kepler GPU family, will feature as many as 2,304 CUDA cores. There's absolutely no word on its whereabouts; the fact that NVIDIA is working on a dual-GK104 graphics card indicates that we won't see this chip very soon
Thank you for this info btarunr. So this is the high end Kepler card that's coming at the end of the year, right?
#8
phanbuey
by: theoneandonlymrk
relative to your earlier news post, GK104 (GTX 680) Has 17% Higher Compute Power Than Tahiti (HD7970): Report

all I'm now seeing is that, yes, NVIDIA have merely been hyping their stuff AMD BD style.

17% better compute with equal-to-less performance elsewhere is not in any way the big deal they have been making it out to be. I'll still be buying a lower-end SKU, but I'm not overly impressed by the utter bull NVIDIA (and AMD:)) spout, "it's going to be epic!", "an AMD Tahiti smasher":wtf: maybe the GK110, but not this:rolleyes:

plus what the heck happened to all the "delayed to get volume out" chat, merely trying to spoil AMD's mass sale, since as ever 5 shops are getting 10 cards apiece:wtf:
I don't think their sales would have been much higher anyway... most people didn't buy because of the high price. If anything this should at least bring it down.
#9
btarunr
Editor & Senior Moderator
by: claylomax
Thank you for this info btarunr. So this is the high end Kepler card that's coming at the end of the year, right?
I personally speculate Q3 or Q4 (X'mas), 2012, depending on Tenerife.
#10
theoneandonlymrk
by: Batou1986
Am I the only one who sees this as a featureless feature? It's the same thing as Cool'n'Quiet and Intel SpeedStep, only the clock goes up and down.

IMO it's basically like saying here's a 750 hp engine that's listed as 650 hp, but has this awesome feature where you press the red button and it has 750 hp
or for simple folk: this card goes to 11, you can run it at six, but unlike others that go to 10 this goes all the way to 11 hahaha:roll::toast: effin' brill Nvidia, that's the shittiest sell-out, firmware-update-to-hide-unstable-high-clocks tactic I've ever heard of, people are going to go crazy down the pub for that:nutkick:


by: phanbuey
I don't think their sales would have been much higher anyway... most people didn't buy because of the high price. If anything this should at least bring it down.
agreed, but the prices would have dropped more had they (NV) backed up their big mouth, I hope this is BS and it's better:D
#11
Salsoolo
that's it. been without a video card/PC for 2.5 weeks and can't afford to go without any longer.
I thought by March 15 I'd see cards on the market, but it looks like next month at least.
*orders a 7950*
#12
Fairlady-z
I just bought two MSI HD7970's, as my last set up was Nvidia based and I loved it, but I really like changing things up from time to time. In any case one of these cards from either camp is over kill in most cases.
#14
theoneandonlymrk
by: Fairlady-z
I just bought two MSI HD7970's, as my last set up was Nvidia based and I loved it, but I really like changing things up from time to time. In any case one of these cards from either camp is over kill in most cases.
yeah, for a year or so, then ya buy the game that rips out ya heart :D

next thing ya got ya card out:eek:
#15
xkche
mmmm taking in this info.. I don't see a "price-war" between Nvidia and AMD :mad:
#17
jpierce55
If this is true, the only thing Nvidia could have on AMD is price for performance or power consumption..... the two things AMD is known for being better at. It sounds like the 7xxx and 6xx generations are not exactly winners for consumers.
#18
Crap Daddy
by: Salsoolo
that's it. been without a video card/PC for 2.5 weeks and can't afford to go without any longer.
I thought by March 15 I'd see cards on the market, but it looks like next month at least.
*orders a 7950*
Either way it's better for you to wait. The 7870 at $350 makes the 7950 obsolete.
#19
faramir
by: xkche
mmmm taking in this info.. I don't see a "price-war" between Nvidia and AMD :mad:
Nvidia has nothing to war with. When they deliver something solid rather than just press releases, the prices of AMD's newest cards should go down.
#20
TheMailMan78
Big Member
10%? That's it?...........fail if true.
#21
AthlonX2
HyperVtX™
yeah, 10% with a midrange card.....
#22
Spaceman Spiff
by: xkche
mmmm taking in this info.. I don't see a "price-war" between Nvidia and AMD :mad:
Yup. Limited quantities from nvidia + already high prices from amd = F#*k the consumer! Hooray!:nutkick:
#23
TheMailMan78
Big Member
by: AthlonX2
yeah, 10% with a midrange card.....
It's named as a top-tier card. So I guess Nvidia is up to its name-changing game again. They should just call it a 660 Ti instead of the BS.
#24
Salsoolo
by: Crap Daddy
Either way it's better for you to wait. The 7870 at $350 makes the 7950 obsolete.
what bugs me is that I already planned on waiting, but it's taking too long.

I can wait until March 15, but it looks like we're going into April, and that's too much. And nothing is showing on the market, or will show up, for like a month.

PS: I can't even find 7950 MSI or Sapphire cards.
#25
Benetanegia
by: TheMailMan78
It's named as a top-tier card. So I guess Nvidia is up to its name-changing game again. They should just call it a 660 Ti instead of the BS.
Or maybe not, maybe they are just adjusting to AMD's games. The HD 5800 was the high end and the 5700 midrange, then suddenly the HD 6900 is high end and the 6800 midrange, and then again 7900 and 7800, even though there's no need to leave room at the low end this time around, because there is no low end.

So 680 against 7800 seems right, because what else would the marketing guys do? It's their job to evaluate whether going to HD 6900 makes it look faster, and Nvidia's marketing's job to evaluate whether having a lower number hurt them. :laugh: :rolleyes: