
Will you wait for NVIDIA's next-generation Kepler GPU?

  • Yes

    Votes: 2,512 38.5%
  • No

    Votes: 1,494 22.9%
  • My GPU is strong enough to let me skip this new generation

    Votes: 2,512 38.5%

  • Total voters
    6,518
  • Poll closed.
I just swapped my GTX 580 for an HD 7950 on the basis of saving power, heat, and noise. The GTX 580 was supremely powerful and easily handled any game I could throw at it on high settings at 2560x1440, though a few I never bothered maxing out in terms of AA/AF or Ultra eye candy. The HD 7950 does make those few games fast enough, and having twice the RAM surely helps as well. Not to mention the overclockability of this Sapphire card: just for fun, the first thing I tried was 1 GHz on the core, fired up Dirt 3, and before I knew it I'd been playing for almost two hours, rock stable :laugh:

AMD has made incredible strides in efficiency with these cards, and paired with what Intel has managed with Sandy Bridge, high-end gaming has never been so good. As I type this my Kill A Watt says I'm drawing only 83 W, and during that Dirt 3 run it was about 225-230 W. That's significantly lower than the GTX 580: some 90 W lower while gaming and 20-30 W lower at light use/idle. With my fans turned down and this fanless PSU, my system is inaudible. The Sapphire cooler paired with the efficient HD 7950 is just a great match; typically the only way to get a silent GPU is to buy a passive card or to water cool. High-end GPUs just don't usually get this quiet, I'm amazed :laugh:
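For anyone curious what those wattage deltas actually amount to, here is a rough back-of-the-envelope sketch. The 90 W gaming delta and the 20-30 W idle delta come from the post above; the daily usage hours and the electricity price are purely assumed for illustration.

```python
# Rough estimate of annual energy savings from the GTX 580 -> HD 7950 swap.
# Wattage deltas are from the post above; hours and price are assumptions.

GAMING_DELTA_W = 90        # ~90 W lower while gaming (from the post)
IDLE_DELTA_W = 25          # 20-30 W lower at idle/light use (midpoint)
GAMING_HOURS_PER_DAY = 2   # assumed
IDLE_HOURS_PER_DAY = 6     # assumed
PRICE_PER_KWH = 0.12       # assumed electricity price, USD per kWh

def annual_savings_kwh(delta_w: float, hours_per_day: float) -> float:
    """Energy saved per year (kWh) for a given wattage delta."""
    return delta_w * hours_per_day * 365 / 1000

total_kwh = (annual_savings_kwh(GAMING_DELTA_W, GAMING_HOURS_PER_DAY)
             + annual_savings_kwh(IDLE_DELTA_W, IDLE_HOURS_PER_DAY))
print(f"~{total_kwh:.0f} kWh/year, about ${total_kwh * PRICE_PER_KWH:.0f}/year")
```

Under these assumptions the swap saves on the order of 120 kWh a year, so the dollar savings are modest; the real wins, as the post says, are heat and noise.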

So given how fast the GTX 580 is and how easily it handles games on the market today, even if Kepler is significantly faster, Nvidia only gets my upgrade dollars if they manage to out-efficient AMD. I'm not really interested in a box that draws 300 W at idle and 600 W at load; it's just not necessary right now. In a year or two, perhaps with the next round of big-title games (Crysis 3, Stalker 2, Battlefield 4, or whatever is coming)... but for right now, gimme low power, low heat, and low noise if you want my money. :rockout:
 
Wickerman, I almost draw your load wattage at idle :roll:. I agree that Tahiti is very good on perf/watt.
 
This is all an excellent reason why I'm keen to see Kepler. I'll make the same 580 => 7970 jump (and overclock on water for even less noise), but only if Kepler isn't, as you say, far better, or equal but more efficient.

The one problem I see with AMD right now is piss-poor driver support for their 79xx cards, given the posts I'm seeing pop up about 120 Hz tearing issues, CrossFire problems, and other mini-quibbles (some gaming anomalies).
 
That will seriously depend on whether Nvidia starts putting at least 2 GB of memory on their cards as a minimum; otherwise I think I'll keep my EVGA GTX 570, or maybe buy a cheap used EVGA GTX 580.

I wish you had wanted something else too, as they stated the new series will be a lot different and 2 GB of memory will be the minimum.
 
I see no real reason for any more than the 1.5 GB on the 580 currently for my own uses. My resolution never exceeds 1920x1080, and even with all the eye candy on in every game, 2x 580s are more than enough. And yet... I will be getting next gen :P
 
Honestly, I've had nothing but trouble with Nvidia drivers as of late. I've had to keep updating to the latest beta drivers to avoid desktop-manager crashing issues with Splashtop, and that seems to have caused trouble with a lot of different games. BF3 has never worked right (flashing textures), The Darkness 2 crashes, and I remember having minor issues with a few other games that would pop up off and on depending on which driver was used and which version of the PhysX engine (I think PAYDAY: The Heist was particularly sensitive). But the 7950 has only been using the latest pre-release driver, and I've had just one issue: the taskbar flickering randomly, which I solved by swapping my dual-link DVI cable back to the DisplayPort cable I used with my HD 5850 before the GTX 580. Traditionally I'd agree that ATI has always been a step behind Nvidia, and they still lack a lot of features I liked about the Nvidia control panel, but thus far (and I'd hate to jinx it...) I've been rock stable.

And radrok, your system is an absolute beast lol. But with the tech in the new 7900 series, dual-GPU and CrossFire solutions will draw no more power than a single GPU, since they are able to power down the "slave" cards. So the days of high idle power, even on systems like yours, are numbered :toast: A dual-GPU "7990" would probably sit within 5-10 W of a single 7970, depending on its final configuration, if it ever exists.

So is anyone else here in agreement? Would performance be the only factor, or would efficiency play a larger role? If Nvidia's answers to the 7950 and 7970 launch consuming significantly more power and producing enough heat to require more aggressive cooling, would that push anyone away? Or is it still just about the most performance you can get at the lowest price, damning all other factors? :D
 
I honestly won't care about Kepler's power consumption if it yields a lot more performance than the 7970, but I admit the ZeroCore feature is a step in the right direction for multi-GPU platforms and has tempted me to jump on Tahiti. Still, the performance jump isn't there for me, so I'm going to hold off until Kepler comes and then see what to do :toast:
 
WTF, 12%? How'd you come up with that one?

That's how many gray hairs I had on my left nut this morning. 12. Therefore Kepler will be 12% faster than the 7970.
 
My crossfire 5850's still hold up pretty well.
 
780Ti incoming...
 