Discussion in 'News' started by btarunr, Mar 19, 2012.
That i3 bench has the i3 bottlenecking it, I guess. I can get around 53 fps with this 775 quad and the GTX 580
A clearer image
In case anyone is wondering:
I think the 7970 and 680 are going to be neck and neck.
Where in the article does it say the stock core clock is 706 MHz? lol
EDIT: I will definitely be getting one of these cards around summer time.
I don't know if you're trolling... :shadedshu
Can you read this?
The GPU-Z capture.
god yeah wtf.
DX10.1 didn't do anything for ATI cards. The GTX 260 Core 216 was still beating them handily.
So, according to some other material that has been released, that means they are capable of a 63% overclock.
That guy's i3 is holding the 680 back a bit, but it still beat a guy's 7970 at 706 MHz, unless the dynamic clock stuff took it to 1006. Either way, once drivers are better (since the minimum FPS seems a bit low), I think this card will take off.
I notice that the driver version is different.
300.99 in the earlier SLI leak and 300.65 in this Gigabyte retail.
It's not that surprising that a retail box would ship with old drivers for a new card that will likely go through several driver updates in the months following release. Gotta pre-package that stuff weeks in advance.
That version of GPU-Z DOES NOT SUPPORT KEPLER. So we don't really know.
So you cannot read...
He said the NV control panel showed the same clocks!
OK, wise guy, look here: same forum, same thread, another apparent owner of a GTX 680 a little later:
The right GPU-z version.
Thanks for this. At least my mind is at peace knowing it supports DX11.1 according to GPU-Z 0.6.0. Now just to confirm whether it supports 4K; if so, I might still go Nvidia for my next GPU upgrade, though the AMD Radeon HD 7970 seems appealing.
I really don't think that's important right now. Things are really heating up over at Overclockers.
Look at the unigine result and compare.
There is one guy showing the card scoring about 2000 in the Heaven benchmark, and another guy showed a GTX 580 SLI score of 2200...
linky. I want to watch/read it.
So GPU-Z reads a 294 mm² die and a release date of Mar. 22, 2012.
Yep! That's what the whole Adaptive V-Sync (also what was termed Dynamic Profiles) does: it manipulates the GPU/memory clocks, and activates or restrains sections of CUDA cores as needed, to keep frame-rate transitions "fluid and/or organic". Basically, when a 3D scene starts pushing frame rates below (or above) the monitor's refresh rate, it augments (or limits) those resources, trying to stay close to 60 FPS. As a scene loads the GPU, the chip/card resources provide exactly as much power as that scene requires for smooth frame rates. So, within milliseconds, the card dynamically juggles various resources against preset profiles.
What's nice is that Nvidia now doesn't need to supply a cooling system built to handle constant 40-50% top-OC spikes; its thermal (BTU) capacity can be scaled back, because the profiles will chiefly maintain the lowest clock that the 3D load and required FPS allow. A 40-50% OC boost might last a few seconds and then drop back down. That permits a theoretically higher max TDP, because they turn the heat up that high very infrequently. The only downside is that the conventional overclocking enthusiasts are used to may no longer be there; if you disable Adaptive V-Sync, the chip sits at a TDP of 160 W consistently, OC'd at, say, 775 MHz.
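The boost behavior described above is basically a feedback loop: measure FPS, nudge the clock up when the scene is heavy, back down when it's light. Here's a minimal sketch of that idea — all the names and numbers (target_fps, base/boost clocks, step size) are illustrative assumptions, not Nvidia's actual GPU Boost algorithm:

```python
# Toy model of a dynamic clock governor like the one described above.
# The clock limits and step size are made-up values for illustration.

def adjust_clock(clock_mhz, measured_fps, target_fps=60,
                 base_mhz=706, boost_mhz=1006, step_mhz=13):
    """Nudge the core clock toward whatever keeps FPS near the target."""
    if measured_fps < target_fps and clock_mhz < boost_mhz:
        # Scene is heavy: spend more power/heat budget on clocks.
        clock_mhz = min(clock_mhz + step_mhz, boost_mhz)
    elif measured_fps > target_fps and clock_mhz > base_mhz:
        # Scene is light: back off and save heat.
        clock_mhz = max(clock_mhz - step_mhz, base_mhz)
    return clock_mhz

# Simulate a heavy scene that gradually recovers: the clock ramps up
# while FPS is below target, then eases back once it overshoots.
clock = 706
for fps in (45, 48, 52, 57, 61, 63):
    clock = adjust_clock(clock, fps)
    print(fps, "fps ->", clock, "MHz")
```

Real hardware does this against power and thermal limits rather than raw FPS alone, but the "juggle resources every few milliseconds against a target" shape is the same.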
NVIDIA implemented a kit-bashed subset of DX10.1 in their "DX10.0" CUDA cards, i.e. NVIDIA actively undermined MS's DX10.1 standard.
I'm still worried about this... it's 256-bit and not 384-bit, and 2 GB and not 3 GB, so if you were to play n-finity (Nvidia's Eyefinity) you won't get the best bang for your buck. For the price to be 15% higher than a 7970, it had damn well better do better than a 7970! I just don't see the comparison here... BTW, what's with the 6-pin connectors?
It's not fugly at all. Also, why would you care how the PCI-E power ports look when you'll never see them once the card is plugged in? That's a pretty lame gripe... but to each their own, I suppose.