Discussion in 'Reviews' started by W1zzard, Mar 16, 2012.
EVGA, ZOTAC, Asus, Gigabyte
.............meanwhile, hd 7970 .........$550......
1.70 GHz core?
Actually, you've got it completely the wrong way around. AMD uses an on-chip power controller that simply drops power according to GPU load, and DOESN'T rely on software in any way, shape, or form. Nvidia throttles the card through on-board power limiters driven by pre-set limits in the drivers. That makes a world of difference in how it works, and in why AMD's approach works and Nvidia's doesn't.
lol EVGA, Gigabyte are now sold out again... aww
not for long.
I think it's a nice idea to include the 78xx series in the benchmarks.
anyway, nice review and thanks!!!
I sat there for about 2 minutes hitting refresh and they were already sold out. Geeze, these people are serious about their cards haha.
On a side note, I noticed EVGA is not offering a lifetime warranty on the 680s. What's up with that?
Good work W1zz...
Meanwhile, I've been trying to check other reviews (in between doing work). Several things:
- Nvidia now appears to have absolutely changed the game, as GPU Boost found untapped potential without any reviews seeing glitchiness during actual play; although most tests aren't covering real gameplay. Also, Adaptive V-Sync is a different thing from GPU Boost, but has anyone really shown how it impacts FPS?
- I thought that first slide saying a "new class of enthusiast card" because of the 195W/two-6-pin GTX 680 was misleading, next to the 7970/580 requiring 6- and 8-pin connectors and a 250W TDP. That's more of an issue for the GTX 680: Nvidia can't/doesn't offer the old stratospheric OCs, and with no way to turn off dynamic clocking you get an algorithm that will "_try_ to respect what you'd like it to do". Those other cards aren't "bad" because they permit options that aren't there for the GTX 680. A 7970 could easily make do with two 6-pins and a much lower TDP had AMD restrained what enthusiasts could expect. Just look at the power consumption: if the CSN didn't kick in under maximum load, they wouldn't be all that different.
- I see why they priced it $50 less: it basically one-ups the 7970, since performance alone isn't a great wow. I think at 2560x1600 the 7970 has more value, as in the games where it matters (low FPS) the 7970 holds more of an advantage; look at Metro, A&P, both Crysis titles, Shogun, or even Skyrim. We still need to see how the bigger 3-panel resolutions perform.
- I don't really see the efficiency, considering the means of using the GPU Boost "Clock-Speed-Nanny" (CSN) and a smaller chip. I thought the whole CSN was going to work more at the CUDA-core level as well; I haven't seen any reviews outlining such details.
- Improper case airflow will curtail your performance, and most folks won't even realize it.
- There are still a bunch of things to determine here, but for most general users this is nice because it promotes plug-and-play and works with many PSUs and cases... It really idiot-proofs this "new level enthusiast".
- I don't know if we'll see any price war from this; the 7970 might see more rebates, but until TSMC gets things moving there's not enough volume for either side to go running scared.
- Now Nvidia needs to keep releasing cards like this down the price structure to really make an impact. Can they continue releasing lower-cost SKUs with clock-boost PCBs/technology at the real mainstream price points? The whole lower-cost-chip/clock-boost combination is what got them to eke out a win over the 7970. If they can't, this will be ho-hum in 4 months.
Haven't bought green team in years. Considering it!
Nice card, but it has that 300€-card feeling all over it. The PCB looks quite empty, the VRM is weakened, 2x 6-pins...
Now, where is my GTX 690? Doesn't matter if it's 2x GK104; I'll take that too if it's priced reasonably.
Impressed with 680. NVIDIA finally figured out how to make a power efficient high end video card, and at a reasonable price.
But I'm still not spending $500 on a video card. If I were buying right now I'd go for a 7870 priced closer to $300. But if I were in that situation, maybe best to wait and see what GTX 670 looks like.
1.) So you are admitting that AMD does the same thing then?
2.) The chip on the GPU only measures those figures; the driver then uses them to estimate power consumption and drops the clocks.
Again, go read the more detailed explanation that W1z gives. The chip on the GPU is nothing more than a controller chip that was already present on AMD's GPUs and that reads the data; the driver is still what does the actual calculations for power consumption and the clock/voltage adjustments.
But I'll stop arguing with you now, because I know you know it all, and AMD definitely "HAS NEVER used a driver to forcefully control power to the card", even though I just showed you they have/do.
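To make the mechanism being argued about concrete: the claim above is that the on-card chip only reports raw sensor readings, while the driver estimates power from them and steps the clocks down against a preset limit. Here's a minimal toy sketch of such a driver-side loop; all names, limits, clock bins, and the power model are invented for illustration and are not actual driver code from either vendor.

```python
# Toy sketch of a driver-side power-limit loop: a controller chip on the
# card reports raw voltage/current, the driver estimates power and drops
# to a lower clock bin when the estimate exceeds a preset limit.
# All numbers and names here are hypothetical.

POWER_LIMIT_W = 195                         # preset limit baked into the driver
CLOCK_STEPS_MHZ = [1110, 1058, 1006, 954]   # descending clock bins

def estimate_power(voltage_v, current_a):
    """Driver-side power estimate from the chip's raw readings."""
    return voltage_v * current_a

def pick_clock(voltage_v, current_a):
    """Pick the highest clock bin whose estimated power fits the limit."""
    power = estimate_power(voltage_v, current_a)
    for i, clock in enumerate(CLOCK_STEPS_MHZ):
        # crude assumption: each bin down saves roughly 10% power
        if power * (0.9 ** i) <= POWER_LIMIT_W:
            return clock
    return CLOCK_STEPS_MHZ[-1]

print(pick_clock(1.1, 160))  # well under the limit: stays in the top bin
print(pick_clock(1.2, 190))  # over the limit: steps down a few bins
```

The point of contention in the thread is only *where* this loop runs: in the driver (software) or entirely inside the on-card controller (hardware).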
What matters is that it successfully competes with the 7970, which is more expensive, and that's what sets the price, not die size; if the 7970 had been cheaper, this would be too.
If this is the power of GK104, I cannot wait to see what GK110 can do.
Finally some good product from the green side! A nice way to force AMD to drop prices. However, I am not very impressed. Basically, the card is already being pushed to its limit while only marginally beating a stock 7970. Well, we all know how much the 7970 can OC.
Now AMD, drop the damn price so I can get a 7970 cheaper.
I agree, I just can't justify dropping $500 on either card right now. The GPU is perfect for a dual-GPU card, so the idea of a 690 already has me drooling.
Reading this review was a real treat, thank you Wizz. An awesome review which explained every detail in depth; can't ask for more.
great review wizz
WTS my left kidney! anybody? j/k
great job green team but i think ima keep my gfx till next series
Anyone else noticed that the 3d stock clock is 1110.5?
.... You have everything set to show what the max was; that's not stock.
As for those saying 7970 overclocks way more take a look at this from guru3d.com:
"on average our card was managing 1250 MHz without any kind of voltage tweaking perfectly fine. The 10K 3Dmark 11 score is certainly testimony of that."
1250 WITHOUT voltage tweaking.
Because it Auto does that for you...
I don't know if I got it wrong, but that's what I understood after reading this:
"Interesting to see is that the feature maintains itself while overclocking. We overclocked the card to roughly 1250 MHz and even then the Dynamic Clock Adjustment technology kicks in, but here's where it will often clock down a little bit. overall though it did not hinder the overclocking experience and on average our card was managing 1250 MHz without any kind of voltage tweaking perfectly fine. The 10K 3Dmark 11 score is certainly testimony of that."
So, if they voltage-tweak the card, I think 1350 MHz would be attainable.
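The guru3d quote describes the boost logic occasionally clamping an overclocked card "down a little bit". A toy model of that behavior, purely for illustration (the bin size, power limit, and linear power-vs-clock assumption are all made up, not Nvidia's actual algorithm):

```python
# Toy model of dynamic clock adjustment under an overclock: the user
# requests an offset clock, and the boost logic clamps it downward in
# small bins while the power estimate sits above the limit.
# All numbers are hypothetical.

BASE_MHZ = 1006        # assumed base clock floor
POWER_LIMIT_W = 195    # assumed board power limit
BIN_MHZ = 13           # assumed boost bin size

def effective_clock(requested_mhz, power_at_request_w):
    """Clock actually run: the request, stepped down one bin at a time
    while the (crudely linear) power estimate exceeds the limit."""
    clock = requested_mhz
    while clock > BASE_MHZ and power_at_request_w * clock / requested_mhz > POWER_LIMIT_W:
        clock -= BIN_MHZ
    return clock

print(effective_clock(1250, 180))  # under the limit: the full 1250 MHz holds
print(effective_clock(1250, 200))  # over the limit: clamped down a few bins
```

This matches the quoted observation that the 1250 MHz overclock mostly held, with the card "clocking down a little bit" only when a heavy load pushed the power estimate over the cap.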
Yeah, the GTX 680 did 1.8 GHz while on LN2.
Clocks like that don't matter for about 90% of people though. The most common max clocks on these cards seem to be 1200 MHz to 1300 MHz.
There are some variables you have to consider when comparing overclocked cards, mainly whether it's a cherry-picked card or not. Most cards won't overclock the same.
Here's a screenshot posted on Facebook by EVGA of the 680 doing 1400 MHz. Also, I think if it's possible, Nvidia could include an option to turn off the dynamic clock thing through the drivers. It's just a thought.
You are really doing great tests; the performance summary and everything overall are awesome. My only concern is the 3DMark11 tests. It would be much better to see the scores in the default Performance mode, because these FPS numbers mean nothing to me; I cannot compare them with my system or others'.