
NVIDIA GeForce GTX 580 SLI

W1zzard

NVIDIA's GeForce GTX 580 has claimed the single GPU performance throne today. We take two of these cards for a spin to see what performance users can expect from this $1000 GPU combination. The review will also give insight into potential performance numbers for 3D Vision Surround.

 
The best review on the Internet, as per usual.


Thanks.



Well done to nVidia on the power consumption, and I only hope the 580 makes the cards I'm interested in cheaper to buy! :toast:
 
Nice W1z, first SLI review, dig it guys
 
Upcoming drivers will definitely improve SLI scaling:toast:
 
Cards are pretty boss... but too expensive. I'll be just fine with my $480 CrossFired 6870s.
(But if anyone feels generous, I'll take two 580s. :D)
 
Hey W1z, I think it would be cool in reviews like this to include some other CrossFire/SLI configurations for comparison as well. Like maybe in this one it would be nice to see two HD 5970s in there to see how things compete (but I know this would be impossible if you don't have two HD 5970s).:toast:

Other than that, another great review!
 
+1 on that. Maybe some 6870 crossfire comparison, or later a 6970 crossfire comparison.
 
Great review as always, no complaints here.
 
+1 on that. Maybe some 6870 crossfire comparison, or later a 6970 crossfire comparison.

Yes. A comparison with multi-card 6870s, 6850s, and GTX 460s (1 GB and 768 MB) would provide more bang-per-buck and bang-per-watt information; see the sketch below.

Thanks for the review.
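As a sketch of what that "bang per buck / bang per watt" comparison would boil down to numerically (all fps, price, and wattage values below are placeholders, not results from this review):

```python
# Hypothetical sketch of "bang per buck / bang per watt": given average fps,
# street price, and board power for any setup, just divide them out.
# Every number below is a placeholder, not data from this review.

def value_metrics(fps, price_usd, watts):
    """Return (fps per dollar, fps per watt)."""
    return fps / price_usd, fps / watts

setups = {
    "Setup A (placeholder)": (60.0, 500.0, 300.0),   # fps, USD, watts
    "Setup B (placeholder)": (75.0, 1000.0, 450.0),
}

for name, (fps, price, watts) in setups.items():
    per_dollar, per_watt = value_metrics(fps, price, watts)
    print(f"{name}: {per_dollar:.3f} fps/$, {per_watt:.3f} fps/W")
```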
 
Value and Conclusion

NVIDIA GeForce GTX 580 SLI has it all: screaming performance and additional features. If one GTX 580 wasn't enough, two of them in SLI will certainly help you bulldoze through just about any game, at any resolution. If 2560 x 1600 isn't enough, you can actually span your display head across multiple physical displays using NVIDIA's 3D Vision Surround feature. The added horsepower will definitely help with stereoscopic 3D gaming, when your monitor is displaying twice the information.
I definitely agree, and I see this as a good move forward for Nvidia.
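For a rough sense of why the extra horsepower matters, here is a back-of-envelope pixel-throughput comparison; the triple-1080p Surround setup is an assumption for illustration, not the review's test configuration:

```python
# Back-of-envelope pixel throughput: 3D Vision Surround across three 1080p panels,
# with stereoscopic 3D rendering one view per eye, vs. a single 2560x1600 panel.
# The panel setup is assumed for illustration only.

single = 2560 * 1600               # pixels per frame on one 30" panel
surround = 3 * 1920 * 1080         # pixels per frame across a triple-1080p span
surround_stereo = surround * 2     # stereo 3D doubles the rendered views

print(f"Single 2560x1600 frame: {single:,} px")
print(f"Surround frame:         {surround:,} px ({surround / single:.1f}x)")
print(f"Surround + stereo 3D:   {surround_stereo:,} px ({surround_stereo / single:.1f}x)")
```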
 
SLI GTX 580 and only 33 fps in Metro 2033 at 2560 res. $1000 for 33 fps. That game must be the worst-coded game in history. Its programmers should be shot.
 
It's no different than how Crysis was 2 years ago. Both developers should be shot! :D
 
It's no different than how Crysis was 2 years ago. Both developers should be shot! :D

Very much agree. They both have developers...:roll:

Very nice review, extremely informative and detailed. :rockout:

Looks like the GTX 580 is an awesome card. Now I will wait and see what Cayman can do. I am leaning more towards Nvidia but then again we shall see. I keep hopping over the fence all the time.
 
Hey W1z, I think it would be cool in reviews like this to include some other CrossFire/SLI configurations for comparison as well. Like maybe in this one it would be nice to see two HD 5970s in there to see how things compete (but I know this would be impossible if you don't have two HD 5970s).:toast:

Other than that, another great review!

Can't agree with that really:

580 SLI = 2 GPUs
5970 CF = 4 GPUs

;)
 
SLI GTX 580 and only 33 fps in Metro 2033 at 2560 res. $1000 for 33 fps. That game must be the worst-coded game in history. Its programmers should be shot.

I imagine that, like Crysis, everything is rendered in real time, which actually makes Metro and Crysis very, very well coded :laugh:

All those subtle effects the games have add up, you see. They make them more lifelike, or more dynamic (whatever effect they're after), but they do eat up resources.

It's just a case of diminishing returns, though; other games look nearly as good, so people assume bad coding.

But it's really just a case of people not, well, seeing all those subtle things! :laugh:
 
SLI GTX 580 and only 33 fps in Metro 2033 at 2560 res. $1000 for 33 fps. That game must be the worst-coded game in history. Its programmers should be shot.

It isn't shitty coding, it is the fact that they have actually used DX11 heavily, including heavy tessellation. The main problem is actually memory at that resolution, which is why, if you look, W1z had to drop down to 0xAA just to get usable results.;) Whenever a game pushes a huge amount of detail along with a huge amount of tessellation, you are going to run into poor performance at extreme resolutions.

I say the developers should be praised, because without people like them pushing the envelope of performance and demanding more from the hardware, we wouldn't see advances in hardware and we'd all be playing crappy DX9 console ports at 720p...
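To put rough numbers on the memory point, here is a back-of-envelope framebuffer estimate at 2560x1600; the sample sizes and render-target counts are illustrative assumptions, not Metro 2033's actual renderer internals:

```python
# Rough framebuffer sizing at 2560x1600: MSAA stores several color/depth samples
# per pixel, and multiple render targets multiply the color cost again.
# Byte sizes and target counts are assumptions, not Metro 2033's real internals.

WIDTH, HEIGHT = 2560, 1600
BYTES_COLOR = 4   # assumed RGBA8 color sample
BYTES_DEPTH = 4   # assumed 32-bit depth/stencil sample, shared across targets

def framebuffer_mib(msaa_samples, render_targets):
    samples = WIDTH * HEIGHT * msaa_samples
    return samples * (BYTES_COLOR * render_targets + BYTES_DEPTH) / (1024 ** 2)

for aa in (1, 4):
    for targets in (1, 4):
        print(f"{aa}x AA, {targets} render target(s): {framebuffer_mib(aa, targets):7.1f} MiB")
# A GTX 580 has 1.5 GB per GPU, and these buffers compete with textures and geometry.
```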

Can't agree with that really:

580 SLI = 2 GPUs
5970 CF = 4 GPUs

;)

How many GPUs they use doesn't matter to me. The two cards are priced similarly and perform similarly, so it'd be nice to see how they do when paired. Kind of like a "This is what $1000 buys you from nVidia, and this is what $1000 buys you from ATi." That is information that someone looking to put out $1000 might be interested in, IMO.
 
Once again, a great, thorough review. I'm sure driver improvements will have SLI performance doing much better. It is rather odd, though, that in Metro it does OK at 2560 but is severely crippled at 1920. I'm beginning to think even Cryostasis is far better optimized than Metro.
 
It isn't shitty coding, it is the fact that they have actually used DX11 heavily, including heavy tessellation. The main problem is actually memory at that resolution, which is why, if you look, W1z had to drop down to 0xAA just to get usable results.;) Whenever a game pushes a huge amount of detail along with a huge amount of tessellation, you are going to run into poor performance at extreme resolutions.

I say the developers should be praised, because without people like them pushing the envelope of performance and demanding more from the hardware, we wouldn't see advances in hardware and we'd all be playing crappy DX9 console ports at 720p...



How many GPUs they use doesn't matter to me. The two cards are priced similarly and perform similarly, so it'd be nice to see how they do when paired. Kind of like a "This is what $1000 buys you from nVidia, and this is what $1000 buys you from ATi." That is information that someone looking to put out $1000 might be interested in, IMO.

A grand is a lot, but it's not a whole lot of money: spread over a year it's less than $100 per month, plus maybe $50-100 per month in electricity depending on how you use them.

This market is for serious gamers who just want to play games flat out, or for someone who only upgrades every few years. That's how I view the fastest-of-the-fast GPU market.

2 GPUs vs. 4 GPUs doesn't matter, because it's two cards either way, on top of being almost at the same price point. I agree with newtekie:rolleyes:
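A quick sanity check on the per-month math; the power draw, gaming hours, and electricity rate below are illustrative assumptions, not measured figures:

```python
# Amortizing $1,000 of cards over a year, plus a rough electricity estimate.
# Power draw, gaming hours, and the $/kWh rate are assumptions; the real bill
# depends heavily on how much you play and on local rates.

card_cost_usd = 1000.0     # two GTX 580s
months = 12
amortized = card_cost_usd / months

system_watts = 700         # assumed full-load draw of the whole SLI rig
hours_per_day = 6          # assumed daily gaming time
usd_per_kwh = 0.15         # assumed electricity rate

kwh_per_month = system_watts / 1000 * hours_per_day * 30
electricity = kwh_per_month * usd_per_kwh

print(f"Cards:       ~${amortized:.0f} per month over one year")
print(f"Electricity: ~${electricity:.0f} per month at {hours_per_day} h/day")
```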
 
It isn't shitty coding, it is the fact that they have actually used DX11 heavily, including heavy tessellation.

If someone develops a game that completely ignores the capability of the hardware available to the average person, then that is also shitty coding.
 
If someone develops a game that completely ignores the capability of the hardware available to the average person, then that is also shitty coding.

That is why there are lower settings in the game. If they forced you to play at the maximum settings, then they would be ignoring the capability of the hardware available to the average person.:slap: They are using all the feature sets available to them as much as possible; that is a good thing.

What is the point of DX11 if it isn't used? Why still code games with DX9/10 when we have DX11 GPUs? That is essentially what you want them to do, right? You don't want them to use DX11 to its fullest.
 
If someone develops a game that completely ignores the capability of the hardware available to the average person, then that is also shitty coding.

It's called a port from PS3/Xbox 360, which is what PC gamers have been receiving of late.
 
That is why there are lower settings in the game. If they forced you to play at the maximum settings, then they would be ignoring the capability of the hardware available to the average person.:slap: They are using all the feature sets available to them as much as possible; that is a good thing.

What is the point of DX11 if it isn't used? Why still code games with DX9/10 when we have DX11 GPUs? That is essentially what you want them to do, right? You don't want them to use DX11 to its fullest.

Show me where this game is playable at any resolution that anyone here wants to play at, without spending $1000 on video cards.

What is the point of arguing if you're going to ignore facts?:nutkick:
 
Show me where this game is playable at any resolution that anyone here wants to play at, without spending $1000 on video cards.

What is the point of arguing if you're going to ignore facts?:nutkick:

I have no problem playing it with a single $190 GTX 460 and a $50 Celeron at 1920x1080...

One of us is ignorant of the facts, but it ain't me.:nutkick:
 
I have no problem playing it with a single $190 GTX 460 and a $50 Celeron at 1920x1080...

One of us is ignorant of the facts, but it ain't me.:nutkick:

My mistake. 27 fps is playable. Have fun with that.:confused:
 