Discussion in 'Reviews' started by W1zzard, Nov 7, 2010.
To read this review go to: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580_SLI/
The best review on the Internet; as per usual.
Well done to nVidia on the power consumption and I only hope the 580 makes the cards I'm interested in cheaper to buy!
nice w1z, first sli review, dig it guys
New upcoming drivers will definitely improve SLI scaling
Cards are pretty boss... but too expensive. I'll be just fine with my $480 CrossFired 6870s.
(But if anyone feels generous, I'll take two 580s. )
Hey W1z, I think it would be cool in reviews like this to include some other CrossFire/SLI configurations for comparison as well. In this one, for instance, it would be nice to see two HD 5970s in there to see how things compete (but I know that would be impossible if you don't have two HD 5970s).
Other than that, another great review!
+1 on that. Maybe some 6870 crossfire comparison, or later a 6970 crossfire comparison.
Great review as always no complaints here.
Yes. A comparison with multi-card 6870s, 6850s, and GTX 460s (1 GB and 768 MB) would provide more bang-per-buck and bang-per-watt information.
Thanks for the review.
Value and Conclusion
I definitely agree, and I see this as a good move forward for Nvidia.
SLI GTX 580 and only 33 fps in Metro 2033 at 2560 res. $1000 for 33 fps. That must be the shittiest-coded game in history. Its programmers should be shot.
It's no different than how Crysis was 2 years ago. Both developers should be shot!
Very much agree. They both have developers...
Very nice review, extremely informative and detailed.
Looks like the GTX 580 is an awesome card. Now I will wait and see what Cayman can do. I am leaning more towards Nvidia but then again we shall see. I keep hopping over the fence all the time.
Can't agree with that really:
580 SLI = 2 GPU's
5970 CF = 4 GPU's
I imagine that, like Crysis, everything is rendered in real time, which actually makes Metro and Crysis very, very well coded.
All those subtle effects the games have add up, you see. They make them more lifelike, or more dynamic (whatever effect they're after), but they do eat up resources.
It's just a case of diminishing returns, though. Other games look nearly as good, so people assume bad coding.
But it's really just a case of people not seeing all those subtle things!
It isn't shitty coding, it is the fact that they have actually used DX11 heavily, including heavy tessellation. The main problem is actually memory at that resolution, which is why if you look W1z had to drop down to 0xAA just to get usable results. Whenever a game pushes a huge amount of detail along with a huge amount of tessellation you are going to run into poor performance at extreme resolutions.
I say the developers should be praised, because without people like them pushing the envelope of performance and demanding more from the hardware, we wouldn't see advances in hardware, and we'd all be playing crappy DX9 console ports at 720p...
How many GPUs doesn't matter to me. The two cards are priced similarly and perform similarly, so it'd be nice to see how they do when paired. Kind of like a "this is what $1000 buys you from nVidia, and this is what $1000 buys you from ATi." That is information someone looking to put out $1000 might be interested in, IMO.
Once again, a great, thorough review. I'm sure driver improvements will have SLI performing much better. It is rather odd, though, that in Metro it does OK at 2560 but is severely crippled at 1920. I'm beginning to think even Cryostasis is far better optimized than Metro.
A grand is a lot, but it's not a whole lot of money. Spread over a year it's less than 100 USD per month, plus around 50-100 USD per month in electricity depending on how you use them.
This market is for serious gamers who just want to play games full out, or someone who only upgrades every few years. That's how I view the fastest-of-the-fast GPU market.
2 GPUs vs. 4 GPUs doesn't matter, because it's two cards either way, on top of being at almost the same price point. I agree with newtekie.
What other planet do you live on?
If someone develops a game that completely ignores the capability of the hardware available to the average person, then that is also shitty coding.
That is why there are lower settings in the game. If they forced you to play at maximum settings, then they would be ignoring the capability of the hardware available to the average person. Instead, they are using all the feature sets available to them as much as possible, and that is a good thing.
What is the point of DX11 if it isn't used? Why still code games with DX9/10 when we have DX11 GPUs? That is essentially what you are wanting them to do, right? You don't want them to use DX11 to its fullest.
It's called a port from PS3/Xbox 360, which is what PC gamers have been receiving of late.
Show me where this game is playable at any resolution anyone here wants to play at, without spending $1000 on video cards.
What is the point of arguing if you're going to ignore facts?
I have no problem playing it with a single $190 GTX 460 and a $50 Celeron at 1920x1080...
One of us is ignorant of the facts, but it ain't me.
My mistake. 27fps is playable. Have fun with that.