Discussion in 'Reviews' started by W1zzard, Mar 22, 2011.
Never mind, I've finally looked at 720p -> it is the card. lol
Good going Green Goblin!!!! Oh and great review as always W1zz
The Green team is rushing again to catch up with AMD (ATI) :shadedshu...
WOW is all I gotta say!
Already posted by Eastcoasthandle on page 3, so please try to go through all the pages before posting...
From what I am reading on other sites, the GTX 590 is underclocked. So I do believe it is faster than the 6990. And for power draw, other sites did a comparison between the GTX 590 and two GTX 570s in SLI: the GTX 570s in SLI drew more power than the GTX 590. I believe the GTX 590 uses two GTX 570-class chips.
One other thing: only one other site had a problem with the card burning out, so out of all the reviews, only two had the problem.
Nope on all of the above. Go back and do more reading
Ok I am back in the game. Is this image the easter egg? Because we have never before seen a glimpse of Wiz's test system in a review.
Well guys, you can mock the card all you want but fact is, the GTX 590 gives the best bang for the buck.
You don't know? That's what the new GTX 590s do now: they light up when there's not enough power.
It's quite hard to detect sarcasm on the internet, so I think we could all do with less of it. And if it's not sarcasm, what kind of pot are you smoking to think the GTX 590 gives the best bang for buck?
Probably a divisible-by-8 bug. It normally happens on console ports. Bulletstorm had this effect, where it runs like crap at 1680x1050 because 1050 is not divisible by 8. Making a custom resolution of 1680x1048 would probably fix this.
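To illustrate the workaround described above, here's a minimal sketch of rounding a display dimension down to the nearest multiple of 8. This assumes the poster's theory is correct (the renderer wants dimensions divisible by 8); the function name is just for illustration.

```python
def round_down_to_multiple(value, multiple=8):
    """Round a dimension down to the nearest multiple (8 by default)."""
    return value - (value % multiple)

# 1050 is not divisible by 8; the nearest lower multiple is 1048,
# which matches the suggested 1680x1048 custom resolution.
width, height = 1680, 1050
print(width, round_down_to_multiple(height))  # 1680 1048
```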
Easter Egg Wizz got a new display toy ?
LG Flatron W3000H 30" 2560x1600, sponsored by Zotac. Does this mean we get a GPU-Z contest too?
Also, drivers had nothing to do with the card going pop; he used the 267.71 driver.
It's called a pun.
Best bang for buck
Didn't think I'd have to explain that one.
Northern lights? I dunno
I read that sarcasm from across the obvious room...
I'll always remember this card as the Bang for your buck LuLzzz.
bring those goddamn 460 2win already !!
Beast of a card, but it seems so limited for such a high-end product. I'd like to see non-reference cards with more power phases and perhaps even another PCI-E power connector, just to be sure.
For the time being, though, I think there is a lot of potential for a GF114 SLI-on-a-stick card, preferably with 2 GB each. From the few reviews where they are also tested, they're not far behind the 590 in stock trim. Clock them at 900 MHz, give them 2 GB each, and it will likely still consume less power.
Milla Jovovich telling Bruce Willis how the Nvidia GTX 590 debut test went:
All puns and sarcasm are hard to detect in the interwebs, so please add a /pun or /sarcasm tag to it next time for the benefit of linguistically handicapped people
Couple of thermal pics.
So, you shove 1.2 V up its ass, it blows up, and then you blame Nvidia? 1.2 V is a lot for a stock 580 (they blow up on XS with that), but again you're blaming NV? BS.
The card is advertised as supporting that as a feature. Also, Nvidia claimed to have drivers with power throttling to prevent that exact situation.
It looks like graphics card performance is being held back more by game developers. We'll probably be able to dust off the old GeForce 4200 soon... if only it were PCI-E!!
Interesting too that these cards don't overload one PCI-E slot.
Or do they??? Perhaps this is why they are so close....
Good thing they've migrated from the TNT naming and the 'Detonator' drivers.
I guess all those who preordered will have a blast with their new toy.
You're right, game devs are holding back PC graphics, unfortunately. Also, the 40nm process is running into the brick wall of power dissipation and heat. We need 28nm asap.
And no, these cards cannot overload the PCI-E slot, or they'd burn out your mobo. Therefore, they take all the extra juice from the PCI-E connectors.
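To put some numbers on the point above, here's a quick sketch of the standard PCI-E power budget for a dual-8-pin card like the GTX 590. The wattage figures are the standard PCI-SIG limits (75 W from the x16 slot, 150 W per 8-pin connector), not measured draw.

```python
# Standard PCI-E power limits (per the PCI-SIG spec), in watts.
SLOT_WATTS = 75        # PCI-E x16 slot
EIGHT_PIN_WATTS = 150  # each 8-pin auxiliary connector

# The GTX 590 reference board uses two 8-pin connectors.
connectors = 2
board_limit = SLOT_WATTS + connectors * EIGHT_PIN_WATTS
print(board_limit)  # 375
```

So the slot only ever has to supply up to 75 W; everything beyond that comes from the auxiliary connectors, which is why the slot isn't overloaded.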
Oh and welcome to TPU.
I run a lot more voltage through other cards during testing and this never happens. On the GTX 590, with Nvidia's power-capping feature, which is designed for exactly that purpose, it doesn't work. I think it's my obligation to tell you, no?