Discussion in 'Reviews' started by W1zzard, Nov 12, 2013.
To read this review go to: http://www.techpowerup.com/reviews/EVGA/GTX_780_Ti_SC_ACX_Cooler/
Dammit I want one!
But IMO... what money you are SAVING buying a 290X you end up paying for in your electric bill. Obviously that's not true for everyone, but here in the UK people are being forced to choose between skipping meals to pay for heating or vice versa.
But I digress... It's gonna be a harsh winter, and the 95°C of the 290X may well come in handy as a room heater.
My crossfired 6970s that routinely ran up to 85-90°C surely did (thank you XFX)
I used to warm my hands with the exhaust from my old 4890
In Furmark it runs up to a 36W difference though, which absolutely no one can use as an argument for this card being more economical than a 290X. That's less than an old lightbulb.
And if you have to choose to buy food or heating you're probably not in a position to buy a high end GPU anyway.
That... is neat!
Awesome card with that stock speed... but I can't agree with FreedomEclipse's point... We are talking about a 10W difference with the 290X's average usage... that's not even a light bulb's consumption... the difference is negligible... it's not like 250W vs 400W...
On the other hand, as Wizz says, let's see what will happen with the retail cards. On AMD's side things look terrible according to some reviews with retail models being 10+% slower than review samples... (due to throttling from heat and driver changes)
Wtf is the EVGA 780Ti Classified going to do?
780 Ti Classified PCB:
That's a 680 Classified PCB
FYI - I'm comparing the max power draw, not the peak draw
Though, even 11W bulbs are pretty bright nowadays.
Though starving or not, it doesn't make my point any less valid - the 290X still isn't as power efficient as the 780 Ti, however marginal the differences are - it still has an impact on the electricity bill at the end of the month.
I went from an older system - a C2Q with 6970s in Crossfire - and there was quite a substantial drop in my electricity bill when that particular rig was retired and I transferred over to a newer CPU and 680s in SLI
36W then. Which is an unrealistic load (Furmark), and you can get the same savings from... spending less time at the can? Yes, there are savings, but at that point they are negligible. Now if you had like 4 cards and ran Furmark 24/7 it would make a nice difference.
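For perspective, here's a rough back-of-the-envelope calculation of what that 36W gap actually costs; the daily gaming hours and the ~0.15 GBP/kWh UK tariff are assumptions for illustration, not figures from the review.

```python
# Rough yearly cost of a 36 W power-draw difference (illustrative assumptions only).
delta_watts = 36          # worst-case Furmark gap cited in the thread
hours_per_day = 4         # assumed daily gaming time
price_per_kwh = 0.15      # assumed UK tariff, GBP per kWh (circa 2013)

kwh_per_year = delta_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> about {cost_per_year:.2f} GBP/year")
# ~52.6 kWh/year -> about 7.88 GBP/year, i.e. pennies per month
```

Under those assumptions the saving is single-digit pounds per year, which is why the difference reads as negligible at this tier.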
Different systems have different power draws, that is obvious. I don't get your point with that example.
BTW, the jump from the 780 Ti to this particular model is nearly the same as the jump from this model to a 290X.
No need to harp on power savings here, the performance gap is enough. You can talk about power when some custom 290X tries to close the gap with factory overclocks. Then you'll have a ridiculous power consumption difference.
780Ti still has less power draw.
That card is awesome! I wish I had the money for this! Thanks W1zz
Peak is gaming in a loaded condition and closest to real world; looking at Furmark means nothing. What would be more correct on such cards is to pull the power usage over a group of titles (six minimum) and then also translate that into perf/watt.
I find it odd that Wizz uses Crysis 2 at 1920x1080, Extreme profile, for those peak numbers when that isn't even a title that FPS results are provided for. How is perf/watt calculated?
Honestly, do we really care about perf/watt when we are talking about $700+ video cards?
that's up to you to decide. I simply provide the data. I'm usually not looking at it too hard for high-end, unless it affects heat/noise.
We chose Crysis 2 as a standard test representing typical 3D gaming usage because it offers the following: very high power draw; high repeatability; is a current game that is supported on all cards because of its DirectX 9 roots; drivers are actively tested and optimized for it; supports all multi-GPU configurations; test runs in a relatively short time and renders a non-static scene with variable complexity.
If I change the test I'll have to retest all existing cards. I doubt any newer game will make a difference, but I'll give it a try before the next round of rebenches.
Edit: If we used something like BF4 at Ultra settings (which is probably what you are looking for), what will happen to all these cards with 2 GB and below? They will either not run or be bottlenecked by swapping from VRAM into system memory, showing wrong power consumption. What about IGP power consumption?
Don't worry, I do not pick tests randomly
We should "care" cards should provide some degree of improvement (probably something we can't truely expect holding to 28Nm), but not focus on that one limited data point to call it good, bad, or whatever isn’t a proper matrix.
Oh I understand the need for repeatable data, but at the least Crysis 2 should still be a game that FPS results are provided for. As many of these cards fluctuate clocks while spitting out frames, it's hard to correlate a single collected data point and apply it to all titles.
Perhaps a separate article that runs half a dozen newer games at 2560x1440, pulls watts from each of those, translates that into FPS per watt, and then averages it, as sketched below. I think that, at least at the present juncture, it would set a baseline showing whether the current method is valid.
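A minimal sketch of what that kind of averaging could look like; the game titles and the FPS/wattage numbers below are placeholders for illustration, not measurements from the review.

```python
# Hypothetical averaged perf/watt across several titles (placeholder data).
# Each entry: (average FPS, average board power in W) for one game.
results = {
    "Crysis 2":          (90.0, 250.0),
    "Battlefield 4":     (75.0, 240.0),
    "Metro: Last Light": (60.0, 255.0),
}

# Per-title perf/watt, then a simple mean across titles.
per_title = {game: fps / watts for game, (fps, watts) in results.items()}
for game, pw in sorted(per_title.items()):
    print(f"{game}: {pw:.3f} FPS/W")

average = sum(per_title.values()) / len(per_title)
print(f"Averaged perf/watt: {average:.3f} FPS/W")
```

Averaging per-title ratios like this keeps one power-hungry outlier title from dominating the figure the way a single Furmark or Crysis 2 data point can.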
I swear this is a point many companies and people need to look into, because a one-grand PC folding on two GPUs plus an 8-core CPU puts out pretty much the same heat as most 1kW heaters, yet does not use 1kW and has the added bonus of doing more good than just warming fingers.
I really do think it's a big opportunity going missed, as any house heat generator should be made of multiple compute elements designed for high temps and long-term use. Where's the IoT, and why isn't it working on warming my ass (or fingers)?
Might as well get back on topic: the card looks good, can't see any issues bar price, and AIB-sourced R9 290Xs should help the price situation before Xmas
Ha... talking about heat...
I still toast my eggs on my GTX 480
Gee thanks, you're going to give my wife a reason to run the heater!
Hunny I'm trying to save mankind...
I've been curious about the effect on a high-end graphics card if it ran out of memory.
Say something like a GTX 690 with "just" 2GB was averaging 100 FPS, but then ran out of memory on an especially intensive part of the benchmark - what sort of framerate figures would we see?
That's the spirit, you could hunt for aliens or some such as an alternative