Discussion in 'Reviews' started by W1zzard, Jun 27, 2009.
To read this review go to: http://www.techpowerup.com/reviews/Inno3D/GeForce_GTX_295_Platinum/
Excellent review, W1zzard. Thank you for the detailed and very thorough disassembly instructions. The high-resolution shots of the front and back are also going to be helpful to hard modders, or those freaks who set them as their wallpaper.
Awesome review, I really can't imagine any room for improvement.
Don't say that; if you do, he'll go soft and start writing bad reviews.
Anyway, nice review.
There will be a few improvements in future reviews: 2560x1600, card-only power measurements, and several new benchmarks.
I've read somewhere that each display output is linked to its own NVIO chip. It is believed they did this so that the second core could be initialized for Folding and other CUDA apps under Windows Vista/7.
From what I understand, CUDA doesn't use the NVIO at all because it doesn't output anything to a connected device. I'll ask NV tomorrow.
A single NVIO chip would have saved so much space. They could have moved the NF200 to the other spot and also moved the memory MOSFETs to the middle. Liking those Renesas R2J20651NPs though, they seem easy to cool.
Will be tuning in for what NVIDIA has to say.
Correct; however, on Windows Vista/7 you can't use the GPU for CUDA unless it is outputting to a display.
This is why we have to use dummy plugs for Folding on the second GPU.
So the GTX 295 with one PCB uses more power than the GTX 295 with two?
I guess it's because... well... there are still two actual cards on that one PCB.
... I don't know.
The single PCB uses a different power setup than the dual PCB version.
What exactly moves across an SLI bridge? I think it is display output (in some form), so the GPU managing the actual output could interleave the frames (or portions of them) in AFR or tiling mode. Hence I think the second GPU needs display logic too. I can't back this up, though.
Yes, the single-PCB card has significantly higher power consumption figures than the dual-PCB one. Its performance/watt takes a hit as a result.
Very surprised to see DrMOS PWM being so much less efficient than Volterra chippery.
25-45W difference is really not that nice...
Very nice. =)
How do you measure the card power?
It can be done with a (clamp) ammeter and a motherboard that has isolated 12V PCIe slot power traces, or a board with slightly modified PCIe slot pins.
An ammeter that you run the leads into and then out of (in series) will give you the current. The current clamps that you simply clamp around a wire (no splicing, etc.) will only work with AC current. Just an FYI.
Good review for a good card. What do you mean by "ATITools successor," though? Where can we download that?
Great review... those heatsinks remind me of the Sytrin KuFormula ones (the VF1 has two thick heatpipes, but the copper block was really thick).
ATI needs to hurry up and release 5870X2 cards!!!
As always, great review.
I was NOT expecting power draw to go up... that's not so good.
You can use the Hall effect to measure DC.
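To make the arithmetic behind these measurements explicit, here is a minimal sketch of how per-rail clamp readings translate into card power. All voltages and current values below are made-up illustrations for the method, not figures from the review:

```python
# A DC-capable (Hall-effect) clamp gives the current per supply rail;
# card-only power is then just the sum of V * I over all feeds into the card.
# The readings below are hypothetical, not measured data.

def card_power(rail_currents):
    """rail_currents: dict mapping rail voltage (V) to measured current (A)."""
    return sum(volts * amps for volts, amps in rail_currents.items())

# Slot rails (12V and 3.3V), measured at the isolated slot traces:
slot = {12.0: 4.5, 3.3: 0.5}
# External 6-pin and 8-pin PCIe plugs, currents measured inline and summed:
plugs = {12.0: 9.0 + 10.0}

total_w = card_power(slot) + card_power(plugs)
print(total_w)  # roughly 283.65 W with these example readings
```

The point of isolating the slot traces first is that without it, the slot's share of the draw is invisible to an inline meter on the PSU cables.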
Oh yeah, that figures; I didn't know such motherboards existed. Do you mean you could mod any board so that the PCIe slot pins are fed 12V directly from the PSU? Because then an inline ammeter on the PSU feed would give you the whole current easily, as Beertintedgoggles suggests.
@W1zz, I'm off to Wikipedia to read up on the Hall effect.
Yeah, I just bought the Gainward GTX 295 (single-PCB model) two days ago. I'd been waiting two months for this card because I thought it would be nicer to have than the older version, and now I find out that it eats heaps more power. I was expecting it to use at least a tiny bit less, as it was widely speculated that the V2 295 would run cooler. I am actually quite fussy about power consumption, and I even get the occasional emotional beating-up speech about power bills from the person I live with, but I still wanted good GPU performance while only using a single PCIe slot.
Now I am going to have to keep this video card power usage discovery to myself.
I thought I was lucky to get one so early, but now I see it was a bad move.
I feel like the world's biggest sucker and even a bit of a loser now.
If only this review had come out a few days earlier, I would have just got the old version.
I guess I can just cross my fingers and hope that somehow the Gainward version magically only uses the original version's amount of power rather than the Inno3D version's, but I am pretty sure that's just wishful thinking.
I do have to thank the reviewer here, though; at least he had the guts to tell the truth, compared to so many other "wanky suck up" or "kiss ass" tech sites out there these days.
ATX24 power connectors on PCIe motherboards feature additional 12V & 3.3V lines (compared to ATX20) that, afaik, are not used for anything but PCIe slot power, so one might even get a valid reading by measuring that single 12V wire from the ATX connector. Then again, some of those twisted mobo makers may have put other loads on the same 12V plane, so it might give too-high power draw values...
Mobos often have a molex connector near the PCIe slots "to provide more power for GFX". So there's the possibility of cutting off the connection between the molex and the ATX24 plug by cutting the power trace on the PCB and feeding the slot via this onboard molex only.
OR, one could desolder the power pins from the slot, isolate them from the mobo PCB, and solder wires onto them directly. This may be a mighty hard feat to accomplish, though, as those pins are bloody small and the holes the slot is mounted in are not that large.
OR, one could put electrical tape on the five fingers of a PCIe x16 GFX card that deliver 12V (the 3.3V draw is insignificant), then solder a 12V wire onto the card and bypass slot power altogether.
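For anyone weighing the tape-off approach, a rough sanity check on the currents involved, using the commonly cited PCIe x16 slot limits (66 W on +12V, about 10 W on +3.3V); these are assumed spec numbers, not figures from the review:

```python
# Assumed PCIe x16 graphics slot limits: 5.5 A on +12V (66 W) and about
# 9.9 W on +3.3V, adding up to the familiar 75 W slot power budget.
SLOT_12V_VOLTS = 12.0
SLOT_12V_MAX_AMPS = 5.5
SLOT_3V3_WATTS = 9.9
NUM_12V_FINGERS = 5  # 12V edge fingers on the card, per the post above

slot_12v_watts = SLOT_12V_VOLTS * SLOT_12V_MAX_AMPS   # 66.0 W
slot_total_watts = slot_12v_watts + SLOT_3V3_WATTS    # ~75.9 W budget

# If the 12V fingers are taped off and 12V is fed via a soldered wire,
# that single wire must be rated for the full 12V slot current; shared
# normally, each finger carries only a fraction of it:
per_finger_amps = SLOT_12V_MAX_AMPS / NUM_12V_FINGERS

print(slot_12v_watts, slot_total_watts, per_finger_amps)
```

In other words, the bypass wire and solder joint need to handle up to 5.5 A continuously, which is well within a decent-gauge hookup wire but not something to run through a thin trace.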