Wednesday, March 7th 2012

GeForce GTX 680 Features Speed Boost, Arrives This Month, etc., etc.

Here are some key bits of information concerning the upcoming GeForce GTX 680, a performance single-GPU graphics card based on NVIDIA's 28 nm GK104 GPU. The information seems credible at face value: a large contingent of the media covering the GPU industry is attending the Game Developers Conference, where it could interact with NVIDIA on the sidelines. The source, however, cites people it spoke to at CeBIT.

First, and most interesting: with some models of the GeForce 600 series, NVIDIA will introduce a load-based clock speed-boost feature (think: Intel Turbo Boost), which steps up the graphics card's clock speeds under heavy loads. If there's a particularly stressful 3D scene for the GPU to render, it overclocks itself and sees the scene through. This ensures higher minimum and average frame-rates.
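
As a rough illustration of how such a load-based boost could work, here is a hypothetical sketch; the clock values, load threshold, and step size below are invented for illustration and are not NVIDIA's actual algorithm:

```python
# Hypothetical sketch of a load-based clock boost. The base clock, boost
# ceiling, step size, and thresholds are invented for illustration; this
# is not NVIDIA's actual algorithm.

BASE_MHZ = 1006       # assumed base clock
MAX_BOOST_MHZ = 1110  # assumed boost ceiling
STEP_MHZ = 13         # assumed per-step clock bump

def next_clock(current_mhz, gpu_load, power_headroom_w):
    """Step the clock up under heavy load while power headroom remains;
    otherwise fall back toward the base clock."""
    if gpu_load > 0.90 and power_headroom_w > 0:
        return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    return max(current_mhz - STEP_MHZ, BASE_MHZ)

# Simulate three heavy frames followed by a light one.
clock = BASE_MHZ
for load, headroom in [(0.95, 20), (0.97, 15), (0.99, 10), (0.50, 25)]:
    clock = next_clock(clock, load, headroom)
```

The point of such a scheme is that the clock rises exactly when the renderer is busiest, which is why it would lift minimum frame-rates rather than just peak numbers.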

Second, you probably already know this, but the GK104 does indeed feature 1,536 CUDA cores, which lend it strong number-crunching muscle for shading, post-processing, and GPGPU.

Third, the many-fold increase in CUDA cores doesn't translate to a linear increase in performance over the previous generation. The GeForce GTX 680 is about 10% faster than the Radeon HD 7970 in Battlefield 3; in the same comparison, it is slower than the HD 7970 in 3DMark 11.

Fourth, the NVIDIA GeForce GTX 680 will indeed launch this month. It won't exactly be a paper launch: small quantities will be available for purchase, though only through select AIC partners. Quantities will pick up in the following months.

Fifth, there's talk of GK107, a mid-range GPU based on the Kepler architecture, being launched in April.

Next up, NVIDIA is preparing a dual-GPU graphics card based on the GK104. It is slated for May, with NVIDIA using the GPU Technology Conference (GTC) as its launch-pad.

Lastly, GK110, the crown-jewel of the Kepler GPU family, will feature as many as 2,304 CUDA cores. There's absolutely no word on its whereabouts, and the fact that NVIDIA is working on a dual-GK104 graphics card indicates that we won't see this chip very soon.

Source: Heise.de

105 Comments on GeForce GTX 680 Features Speed Boost, Arrives This Month, etc., etc.

#1
TheMailMan78
Big Member
erocker said:
Tahiti is a mid-tier "sea-islands".
Tenerife could be the 8900 series too. Right now Tahiti is top tier.....unless AMD goes something like 7995 or some crap.

Benetanegia said:
That's right.

No, it's not shady. They don't have anything better for now, either because they can't make it (TSMC 28nm issues or their own fault), or because when they saw the competition they decided to play it safe and ensure a better situation than another GF100.

No matter what, they don't have anything faster for now. So if what they do have is faster than the competition, they name it accordingly and I guess price it accordingly, although we don't know the price yet, so making assumptions is stupid. Like Crap Daddy said, we are probably screwed, but that's something I've been saying since the minute I saw the HD 7970's performance, and of course I was flamed for it.
I never flamed you for the price. The 7970 is way overpriced.
#2
xkche
I see the performance of the HD 7870 so close to the HD 7950... maybe the HD 7900 is limited by drivers until NVIDIA releases the GTX 600???

Maybe I'm paranoid.... @.@
#3
Casecutter
Crap Daddy said:
Obviously you launch the card that's ready and "adapt".
Exactly, and if this works out and blindsides AMD... kudos.

But here's me thinking… what happened, or is happening, with GK110? Why so late? If GK104 came out this great, why not redeploy with a GK110 "death blow" at any price? Or is it not working out right? How can a bigger die not be working? Can't they correct it? ...

They're providing AMD time to engineer and release a re-spin? Something doesn't make sense here; I mean, is it that revolutionary in size, performance, efficiency, and price that they just aren't compelled to stand the market on its ear?
#4
erocker
Senior Moderator
TheMailMan78 said:
Tenerife could be the 8900 series too. Right now Tahiti is top tier.....unless AMD goes something like 7995 or some crap.
It's all semantics and naming. All the 8900 series will basically be is a beefed-up Tahiti. Same architecture.
#5
Benetanegia
TheMailMan78 said:
Tenerife could be the 8900 series too. Right now Tahiti is top tier.....unless AMD goes something like 7995 or some crap.
And GK104 is top tier now.
I never flamed you for the price. 7970 is way over priced.
I never said you did, but oh, I was flamed by many, because 15% over GTX580 was miraculous and Nvidia would never come up with something much faster and if they did it would cost $1000 and draw 500w and whatnot.
#6
HumanSmoke
Casecutter said:
But, here’s me thinking… What happened or is happening with a GK110? Why so late? If GK104 came out this great, why not redeploy with a GK110 “death blow” at any price? Or, is it not working out right, how can a bigger die not be working, they can't correct it? ...
Could be any number of reasons:
1. The larger GPU is obviously going to need a wider memory bus. Nvidia are lagging in memory controller implementation at present; hardly surprising, since the GDDR5 controller was basically pioneered by AMD. Witness the relatively slow memory clocks of Fermi.
A 384-bit (or wider) bus is likely a necessity for workstation, and particularly HPC, use, and whatever else GK110 is, it will primarily earn back its ROI in the pro market.
2. Likewise cache.
3. Double-precision optimization?
4. Maybe the sheer size of the die is problematic for yield, heat dissipation, etc. Not an unknown factor with large GPUs in general and Nvidia's large monolithic dies in particular.
#7
erixx
Opening another beer before going to bed. Wake me up when we can order this. :)
#9
v12dock
Nvidia is no longer the magical, super-powerful, and mysterious company that worshipers once believed it to be; performance levels are well within what I expected. It's going to be tricky picking a GPU for a build I have coming up in late April.
#10
TheMailMan78
Big Member
Benetanegia said:
And GK104 is top tier now.
Then what will they call the next one? A 780 in the same year? Sorry, I'm not buying it.

Benetanegia said:
I never said you did, but oh, I was flamed by many, because 15% over GTX580 was miraculous and Nvidia would never come up with something much faster and if they did it would cost $1000 and draw 500w and whatnot.
Well..... as has been said, we haven't seen the price or power draw yet. Could be 1000 bucks with a 500W power draw for 10% faster than the 7970, lol. I doubt it..... but NVIDIOTS would pay for it. I wouldn't put it past NVIDIA to charge it knowing this.
#11
OneCool
Sounds like they're stressing a mid-range chip to be top dog.

Something is telling me they're adding voltage to get the clocks up to compete.

"Speed Boost"? Come on!! They already have 3 clock profiles. Why add some other kind of voltage control unless you're worried the damn thing is going to overheat in 3D situations? I can just hear the fan going up and down, up and down :rolleyes:

I hope I'm wrong, but...... we shall see :shadedshu
#12
bear jesus
I have to wonder what effect the "clock speed-boost feature" could have on overclocking and if it could be turned off.

Hopefully GK104 clocks well, because if it is only a relatively small percentage ahead of a stock 7970, then surely the 7970s with high clocks (1.1GHz+) would be very close to it, or in theory even beat it.
Whatever happens, it looks like things could get interesting, if in a kind of unexpected way.

As far as the name goes, obviously after seeing all the dual mid-range GPU cards, NVIDIA chose to make the 680 just 660 SLI on a chip, but the yields failed them, so now the 660 is the 680 and GK100 is the 780 for when AMD brings out the 89xx cards :p
#13
HumanSmoke
TheMailMan78 said:
Then what will they call the next one? A 780 in the same year? Sorry, I'm not buying it.
So when's launch day for the GTX 780? I'd like to get my pre-order in

BTW:
HD 2900 series ....May 2007
HD 3870 series.....Nov 2007

So, not exactly unheard of, even if you use "same year" loosely rather than as a calendar year. If we're talking the same architecture, you might want to check the GF100/GF104 launch timetable.

TheMailMan78 said:
Well..... as has been said, we haven't seen the price or power draw yet. Could be 1000 bucks with a 500W power draw for 10% faster than the 7970, lol. I doubt it..... but NVIDIOTS would pay for it. I wouldn't put it past NVIDIA to charge it knowing this.
Something to be said for building a brand. Maybe if ATi/AMD had shown more than a passing interest in dev support (GiTG) and pro graphics we wouldn't be looking at this situation.

Still, no pleasing some people....as your avatar proclaims.
#14
farquaid
bear jesus said:
I have to wonder what effect the "clock speed-boost feature" could have on overclocking and if it could be turned off.
The question I ask is how well it will work. It would be a really good thing if it can remove dips in the fps. Those happen very suddenly, so I think it would be hard to boost clock speed instantaneously, and if it doesn't boost speed instantaneously it would have to predict the future.
#15
TheMailMan78
Big Member
HumanSmoke said:
So when's launch day for the GTX 780? I'd like to get my pre-order in

BTW:
HD 2900 series ....May 2007
HD 3870 series.....Nov 2007

So, not exactly unheard of, even if you use "same year" loosely rather than as a calendar year. If we're talking the same architecture, you might want to check the GF100/GF104 launch timetable.
And if you owned a 2900 series you would also know what a bitter taste that left in your mouth. Why do you think it's not commonplace anymore? Hmmmmm.

Also, I love all the "but, but AMD does it too" crap. Some of it isn't even remotely the same, yet people use it as an excuse for what NVIDIA is doing. Guess what? This thread is about NVIDIA, not AMD.

There I bit. Ya happy? Do you really wanna troll me?
#16
Crap Daddy
TheMailMan78 said:
There I bit. Ya happy? Do you really wanna troll me?
I heard there's a GTX 780 special edition handmade and signed by Jen-Hsun Huang waiting for you in the lobby at NV headquarters in Santa Clara. I heard it beats the heck out of Tenerife.
#17
TheMailMan78
Big Member
Crap Daddy said:
I heard there's a GTX 780 special edition handmade and signed by Jen-Hsun Huang waiting for you in the lobby at NV headquarters in Santa Clara. I heard it beats the heck out of Tenerife.
Buying plane ticket nowz!
#18
the54thvoid
I still don't really know why folk say the 7970 is overpriced. It's a consumer article made by a private company for profit. The stark reality is that it's better than the 580 by a reasonable margin and can grossly overclock without any hassle to become vastly superior (to me that means 40-50% faster).

The 3GB AMD card is on par with (or cheaper than) the 3GB GTX 580 versions. Likewise, the 6970 was priced reasonably high at launch (although the premium to move to the 580 was not proportional to its superiority). The 7970 has to be priced higher than the previous best-performing single-GPU card; that's just reality.

As for the 680, if it has a lower production cost (than the 580 had), then it is not unreasonable to assume it will sell at a competitive price. Many reports mention it is an efficient chip, unlike Fermi. If that is the case, it does not need an exorbitant price tag. NV marketing knows how to sell (for better or worse, ethically). It is not unreasonable to suggest they release a superior card and use AMD's high pricing to make consumers double-take at AMD's prices: a "hey, look at those AMD douchebags ripping you off" scenario.

As for people harking on about AMD just releasing higher-clocked cards to 'hump' the 680, that's an invalid point. IF GK104 is efficient and conservatively clocked, then it may also be an overclocking dream; we don't know yet. My 580 can run at 950 (a 23% overclock). A 7970 at stock is 925, and a lot of reviewers topped out at 1125 (TPU's review hit 1075). That's a 21% overclock. Okay, so my 580 is a Lightning, but the point is the same: overclocking can be done on both sides.

The 680 will also be the contemporary top-tier NV card. It doesn't matter if it is not the uber-performing card of myth. It is NV's top and possibly the world's top-performing single-GPU card. If all the reasonable rumours are true, GK110 (112, whatever), the daddy Kepler card, IS the be-all and end-all, and NV are in no rush with it. They've seen Tahiti and thought, "oh, is that it?" and focussed on the GK104 launch because they know they can beat it. It's a distinct possibility that whatever AMD come up with, Big GK will win. Reasoning?
GCN is AMD's new design; they'll evolve their compute design, for better or worse, to compete with GK. NV have CUDA well under control. They can shrink it onto the current fab process and make it a monster.

I really think this round of gfx cards is just a set of little 'offerings'. AMD saying "oh, looky at our new compute stuff" and NV saying "oh, looky at our new efficient card". I think Q4 2012 will be when the real shit hits the fan and both camps make the tweaks and redesigns that establish their proper power play.

Oh, Charlie at S/A says TSMC has halted ALL 28nm processes for now due to an issue.
http://semiaccurate.com/2012/03/07/tsmc-suddenly-halts-28nm-production/

Anyway, all of this is just logical personal opinion. I'm just as eager as anyone to see the real benchmarks from reviews.
#19
bear jesus
farquaid said:
The question I ask is how well it will work. It would be a really good thing if it can remove dips in the fps. Those happen very suddenly, so I think it would be hard to boost clock speed instantaneously, and if it doesn't boost speed instantaneously it would have to predict the future.
That is a very good point. If it could respond fast enough and with enough of a boost, it could in theory improve the gameplay experience across the board by at least dampening the fps dips.

I would expect it to act kind of like AMD's PowerTune, but in reverse.
#20
Casecutter
Benetanegia said:
I was flamed by many, because 15% over GTX580 was miraculous and Nvidia would never come up with something much faster and if they did it would cost $1000 and draw 500w and whatnot.
The reference HD 7950 3GB is showing 16% better performance at 2560x1600, with a $550 MSRP, while the GTX 580 1.5GB originally had an MSRP of $500! So AMD gave another 1.5GB of memory, 16% more performance, and better efficiency, and at the time didn't see Nvidia challenging with a GK104, so that price was not totally out of line.
#21
phanbuey
bear jesus said:
That is a very good point. If it could respond fast enough and with enough of a boost, it could in theory improve the gameplay experience across the board by at least dampening the fps dips.

I would expect it to act kind of like AMD's PowerTune, but in reverse.
I wondered the same thing; there is definitely no obvious way of doing it... Turbo Boost makes sense because it can detect when an application is bound by clock speed because it is single-threaded, and then boosts the core running that thread....

Unless it dynamically overclocks the bottlenecking parts of the GPU, I don't see how it could benefit. I mean, it is clear that it will save power by doing this, but power saving always = more latency and reduced perf. Maybe it detects a safe overclock and applies it during games? The only other option is if the card boosts to a clock that isn't stable long-term but is stable for short bursts.
#22
newtekie1
Semi-Retired Folder
Batou1986 said:
Am I the only one who sees this as a featureless feature? It's the same thing as Cool'n'Quiet and Intel SpeedStep, only the clocks go up and down.

IMO it's basically saying here's a 750 hp engine that's listed as 650 hp, but it has this awesome feature where you press the red button and it has 750 hp.
I don't see it as such. I think what they are talking about is more of a short speed boost that, if run constantly, would overheat the card. When really high loads are detected, the card overclocks itself for a short period of time, which would overload the cooler if done for a long time.

The cards already do power saving when not under load, but this detects extremely heavy load and cranks up the speeds to overcome it. For example:

Imagine you are playing an FPS and someone throws a grenade and there is an explosion. This is an instance of high load where a normal card would experience a framerate drop (or lag spike). But the GK104 detects this high load and momentarily boosts the clock speed to help mitigate the lag.

Using your example, it would be a 750HP engine that has to use a 650HP engine's cooling system due to space constraints, but you can push a button and get 750HP for a few seconds.

farquaid said:
Question i ask is how well it will work. It would be a really good thing if can remove dips in the fps. Those happen very sudden so I think it would be hard to instantaneously boost clock speed and if it doesnt boost speed instantaneously it would have to predict the future.
They already have the "Render 3 Frames in Advance" option, so....

But I think it could be a matter of only taking a frame or two to boost the speed.

Frame 1: This frame is really hard to render.
Frame 2: Speed boost kicks in.

We know the cards are already measuring load, so it probably isn't hard to detect hard-to-render frames and give a momentary speed boost.
#23
phanbuey
^^ Can't wait to see the reviews and what this does to aftermarket overclocking.

There is probably a time limit too... what if you're playing a game that gives the card an all-around hard time?
#24
newtekie1
Semi-Retired Folder
phanbuey said:
^^ Can't wait to see the reviews and what this does to aftermarket overclocking.
Actually, now that I think about it, it only really has to detect framerate. The drivers are already monitoring framerate in real time; that is how OSD programs like FRAPS work. So when it detects a drop in framerate, speed boost kicks in for 30 seconds (or whatever) to help through it, then there's some kind of cool-off period between boosts to keep the card from overheating, as well as a maximum temperature beyond which there will be no speed boosts until the card cools off.
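
Something like that could be dead simple, too. Here's a made-up sketch in Python; the target fps, tick counts, and temperature cap are all invented, just to show the boost/cool-off/temp-cap logic described above:

```python
# Hypothetical framerate-triggered boost controller: boost on an fps dip,
# hold for a fixed window, then enforce a mandatory cool-off, and never
# start a boost above a temperature cap. All thresholds are made up.

class BoostController:
    def __init__(self, target_fps=60, boost_ticks=30,
                 cooloff_ticks=30, temp_cap_c=90):
        self.target_fps = target_fps
        self.boost_ticks = boost_ticks      # how long a boost lasts
        self.cooloff_ticks = cooloff_ticks  # mandatory rest between boosts
        self.temp_cap_c = temp_cap_c        # no new boosts above this temp
        self.boost_left = 0
        self.cooloff_left = 0

    def tick(self, fps, temp_c):
        """Return True if the card should run boosted this tick."""
        if self.boost_left > 0:             # a boost is in progress
            self.boost_left -= 1
            if self.boost_left == 0:        # boost window over: start rest
                self.cooloff_left = self.cooloff_ticks
            return True
        if self.cooloff_left > 0:           # resting between boosts
            self.cooloff_left -= 1
            return False
        if fps < self.target_fps and temp_c < self.temp_cap_c:
            self.boost_left = self.boost_ticks - 1  # this tick counts too
            return True
        return False

# Demo: an fps dip triggers a 2-tick boost, then a 2-tick cool-off.
ctl = BoostController(target_fps=60, boost_ticks=2,
                      cooloff_ticks=2, temp_cap_c=90)
states = [ctl.tick(fps, 70) for fps in (50, 65, 50, 50, 50)]
```

It's only a sketch, but it shows why the "fan going up and down" worry above isn't crazy: the controller will happily oscillate if a game keeps the fps pinned below target.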
#25
m1dg3t
I think this speed boost thingy is going to cause them problems :o

the54thvoid said:
I really think this round of gfx cards is just a set of little 'offerings'. AMD saying "oh, looky at our new compute stuff" and NV saying "oh, looky at our new efficient card". I think Q4 2012 will be when the real shit hits the fan and both camps make the tweaks and redesigns that establish their proper power play.
It's always this way; new arch = new sales = $$ for R&D = better chips = more sales! Rinse & repeat :)