
NVIDIA GeForce GTX 580 1.5 GB

Discussion in 'Reviews' started by W1zzard, Nov 7, 2010.

  1. the54thvoid

    the54thvoid

    Joined:
    Dec 14, 2009
    Messages:
    3,372 (1.90/day)
    Thanks Received:
    1,603
    Location:
    Glasgow - home of formal profanity
    If 300W is the PCI-e limit, I don't see it being an issue. Under current protocols for meeting specs, I don't think any game code that exceeded it would be 'valid'. The design spec is, after all, 300W; why design games that require more power than a single card can deliver within specification? Given the console domination of game design, we're still not even getting DX11 up to a good standard yet.

    In the future I don't see it happening either, as manufacturing processes shrink.

    As for the car analogy, most super-sport production cars have speed limiters built in (150 mph for many BMWs/Mercedes etc.), so we do already buy cars with metaphorical chokes.
     
  2. HTC

    HTC

    Joined:
    Apr 1, 2008
    Messages:
    2,240 (0.93/day)
    Thanks Received:
    303
    You were right. I'm man enough to admit when I'm wrong, and judging by W1zzard's quote below, I was indeed wrong.

    It's all about interpretation: until now, W1zzard hadn't stated what you have been claiming as fact (the card really does react to Furmark and OCCT), and that's what I was clinging to.

    The thing is, when I'm convinced I'm right, I'll argue and argue, and then argue some more ... until someone proves me wrong, just like W1zzard did.
     
  3. bear jesus

    bear jesus New Member

    Joined:
    Aug 12, 2010
    Messages:
    1,535 (1.00/day)
    Thanks Received:
    200
    Location:
    Britland
    The one thing that makes me think you could be right about that is the fact that so many people quoted the 480's Furmark power draw, rather than its real in-game power usage, when complaining about how much power the card used. I assume so many people did that because they were talking about the maximum power the card could possibly draw, and with this limiter in place the card seems much better when you quote the absolute maximum.
     
  4. LAN_deRf_HA

    LAN_deRf_HA

    Joined:
    Apr 4, 2008
    Messages:
    4,555 (1.90/day)
    Thanks Received:
    952
    That's a horrible idea. Sometimes it takes a good 5 hours for a game to crash from a bad overclock; OCCT will find it in 10-20 minutes, and then you don't need to worry about finding stability with hours of testing for each individual program. And "the furry donut" is only good for heating up your card or telling you you're way past the stability limit; it's not sensitive enough for real stress testing, at least not with current cards. If that, or programs based on it, is the only test you use, you're not going to have a truly stable overclock. Then you'll get crashes in games and blame the games or the drivers when it's really user error.
     
  5. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,038 (6.15/day)
    Thanks Received:
    6,101
    It's cool, man. I don't hold a grudge or anything, and it wasn't like I was really angry or anything. And I'm the same way when I'm convinced I'm right.:toast:


    The problem I have with the whole idea that nVidia did it to give false power consumption readings is that if they wanted to do that, they would have done a better job of it. The power consumption with the limiter on under Furmark is something like 150W, which is lower than game power consumption. So it makes it pretty obvious what is going on, and anyone taking power consumption numbers would have instantly picked up on that. If they were really trying to provide false power consumption numbers, they would have tuned it so that power consumption under Furmark was at least at a semi-realistic level.
     
    Crunching for Team TPU 50 Million points folded for TPU
  6. HTC

    HTC

    Joined:
    Apr 1, 2008
    Messages:
    2,240 (0.93/day)
    Thanks Received:
    303
    Agreed. Furmark and other such programs "find" a bad OC quicker, but that doesn't mean they're foolproof.

    Sometimes you run the stress programs for several hours on your OCs and it all checks out fine, and then, while playing some game, you get crashes. Who's to blame: the game? The VGA drivers? Most of the time it's the OCs, be they CPU related or GPU related.
     
  7. a_ump

    a_ump

    Joined:
    Nov 21, 2007
    Messages:
    3,612 (1.43/day)
    Thanks Received:
    376
    Location:
    Smithfield, WV
    Yeah, for me to feel my overclock is stable usually takes 1-3 days of messing around: stress tests, gaming, everything. It's not stable when you can game or when you can pass a stress test; it's stable when it can do everything :p. If anything starts being faulty after an OC, I always bounce back to square one.
     
  8. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,038 (6.15/day)
    Thanks Received:
    6,101
    That is one of the things about Furmark I've noticed: it doesn't use a whole lot of VRAM. So if your RAM overclock is slightly unstable, it will almost never find it. That is usually when I fire up Unigine at full tessellation settings to really fill that VRAM up.:toast:
     
    Crunching for Team TPU 50 Million points folded for TPU
  9. AddSub

    AddSub

    Joined:
    Aug 9, 2006
    Messages:
    1,001 (0.33/day)
    Thanks Received:
    152
    A total of 307 comments? Make it 308 now. Only a passionate hate of nVidia can make a thread grow this fast and this large. Whatever, this card is pretty much as fast as two 5870 GPUs (a 5970) as per the following really cool link, and all without the CrossFire scaling issues, since sadly (for CrossFire users, that is) SLI is still the better tech of the two.

    Till the next round then, although I don't think AMD will stick around that long, since their Abu Dhabi sugar daddies... ummm, investors, yeah that's it, "investors", well they aren't doing too well themselves. Let's see, who's got half a dozen to a dozen billion dollars (US) sitting around to be spent in this time of global economic downturn in order to bail out and save AMD yet again? IBM? Microsoft? Sony? Fat chance!

    Let me put it this way for hard-core nVidia haters: come Christmas time 2011 (maybe even a few months earlier the way things are going), it's either nVidia GPU or nVidia GPU when it comes to your upgrade purchases.
     
    Frag Maniac says thanks.
  10. LAN_deRf_HA

    LAN_deRf_HA

    Joined:
    Apr 4, 2008
    Messages:
    4,555 (1.90/day)
    Thanks Received:
    952
    It's interesting with OCCT: the VRAM testing part never found any errors at all. It was letting me crank it all the way up to 4000 MHz effective. The OCCT GPU test, though, was able to find VRAM errors, probably because both clocks are really tied together in the 4xx series. The VRAM test must just be showing what the memory chips can do, not what the controller can handle.
     
  11. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,370 (2.55/day)
    Thanks Received:
    1,221
    I don't hate Nvidia any more than I hate Ford cars and trucks. My company car is a Ford Explorer.


    I use what works best for me, and right now that is ATI for the money.


    Back to your comment about AMD: they have paid off millions of dollars of debt, which is why they were not showing a profit. If you understand balance sheets and finance, you would understand this.

    Yep, and some don't have the limiters.

    A game doesn't have any more to do with power consumption than a movie has to do with power use. Game specs don't list how many watts you need to run them. Nvidia chooses the power consumption of a card based on the cooler's ability and other specs. They made a card that pulls 350+ watts in a real-world performance test, then put a self-limiting throttle on it to keep it from pulling that amount. They claim they have the most powerful card, and in some games they do, but when pushed to the max by a program designed to do so, it has to self-limit to stay within standards. Like a dragster with a chute that deploys itself when you go full throttle, or a block of wood under the pedal.
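    In the abstract, the throttle being argued about is just a control loop: measure board power, and if it's over the budget, pull the clocks back until it isn't. Below is a toy sketch with made-up numbers; this is emphatically not NVIDIA's actual driver or hardware implementation, just the general idea under discussion.

    ```python
    # Toy illustration of a power limiter as a control loop. Made-up numbers;
    # NOT NVIDIA's actual mechanism, only the general concept being argued about.

    POWER_BUDGET_W = 300.0   # the spec ceiling under discussion
    STEP = 0.05              # adjust clocks by 5% per control tick

    def control_tick(measured_power_w: float, clock_scale: float) -> float:
        """Return the new clock multiplier (1.0 = full speed)."""
        if measured_power_w > POWER_BUDGET_W:
            return max(0.5, clock_scale - STEP)   # over budget: throttle down
        return min(1.0, clock_scale + STEP)       # headroom: creep back toward full speed

    scale = 1.0
    for tick in range(8):
        draw = 350.0 * scale                      # pretend draw tracks clocks linearly
        scale = control_tick(draw, scale)
        print(f"tick {tick}: ~{draw:.0f} W -> clock scale {scale:.2f}")
    ```

    Run it and the clock multiplier settles around whatever keeps the draw near the 300W budget, which is the behaviour being complained about.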
     
    Last edited: Nov 15, 2010
    10 Million points folded for TPU
  12. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,038 (6.15/day)
    Thanks Received:
    6,101
    Furmark is hardly a real-world performance test. It is a torture test more than even a benchmark, though it does have a benchmark function built in. And even then it isn't a real-world benchmark; it is a synthetic benchmark.

    And according to W1z it doesn't pull 350+ watts, it pulls ever so slightly over 300W.
     
    Crunching for Team TPU 50 Million points folded for TPU
  13. CDdude55

    CDdude55 Crazy 4 TPU!!!

    Joined:
    Jul 12, 2007
    Messages:
    8,179 (3.07/day)
    Thanks Received:
    1,277
    Location:
    Virginia
    It's the way you come across; a lot of your posts read like the stereotypical rabid, ignorant fanboy. You should really be aware of that, because if you really are unbiased, comments like ''Nfags can leave this thread, I hope Nvidia doesn't drop their prices and you continue to get assraped by them.'' aren't actually in favor of your being unbiased... just saying.
     
    Steevo says thanks.
  14. Frag Maniac

    Frag Maniac

    Joined:
    Nov 9, 2010
    Messages:
    2,643 (1.83/day)
    Thanks Received:
    551
    Actually, at launch the 580 did have serious scaling issues; it was stomped by the 480 in dual-GPU SLI in almost every test. Here's hoping driver maturation will sort that out, because right now that's the only thing keeping me from buying one, other than maybe trying to hit a holiday sale.
     
  15. Steevo

    Steevo

    Joined:
    Nov 4, 2005
    Messages:
    8,370 (2.55/day)
    Thanks Received:
    1,221
    :toast: Yes, I do get a bit heated when some people piss me off.


    Part of the reason why is that in the last few years the digital dream bullcrap all these companies have been promising hasn't come true. I get really pissed when any of them start spewing numbers, then hold users back from enjoying and using the hardware they purchased by limiting it, just to spin the wheel again at a later date with nothing more than a new paint job. ATI has failed me, Adobe has failed me, Canon has failed me, Intel is promising crap they can't deliver, Motorola has failed me, Nvidia has failed to deliver, and Microsoft is not pushing people toward standardized programming.


    I bought a Canon high-def camcorder; it records M2TS, the same as Blu-ray. According to ATI we should be processing that on stream processors with this shiny new... but it's this $600 software, then install this patch, then install this set of codecs, then you have to export it to this format, then burn it.

    Intel is still fiddle-F'ing around with crap they are too large and clumsy to do right the first three times.

    My phone still doesn't support Flash, and they promise "it's coming, just wait". Sounds like Atari, who still haven't fixed their TDU game, or many other issues with games that just get thrown to the side.

    Nvidia pushes proprietary crap like PhysX, which works in all of 13 GPU-enabled titles, despite others showing it works just as fast on the CPU when moved up from antiquated code, besides it now being a part of DX11. Also, Nvidia and Adobe seem to be stuck in a 69 swap meet: they disable hardware stream acceleration when an ATI card is present. Some forum members have learned how to bypass it, and wonder of all wonders, it still works, using the ATI GPU to perform the calculations and not CUDA, according to them, as long as it doesn't get shut down.


    So this shiny new future is bullshit. It's the same crap we have had from day one. I'm tired of spending thousands of dollars to be told I still have it wrong.
     
    CDdude55 and motasim say thanks.
    10 Million points folded for TPU
  16. motasim

    motasim New Member

    Joined:
    Aug 20, 2010
    Messages:
    205 (0.13/day)
    Thanks Received:
    21
    Location:
    Mostar, Bosnia & Herzegovina
    ... well put ... :rockout:


    ... if nVidia becomes the only choice of discrete GPU (although I know that it's never going to happen), I think that it'll be the day on which I switch to Intel integrated graphics, or better still AMD Fusion ... in fact, with its current management; I believe that nVidia will eventually be acquired by Intel ... again; I'm not Red nor Green, but I hate it when idiot fan boys try to transform every single discussion on these forums into an nVidia/ATI trashing circus ...
     
    Last edited: Nov 15, 2010
    N3M3515 and the54thvoid say thanks.
  17. the54thvoid

    the54thvoid

    Joined:
    Dec 14, 2009
    Messages:
    3,372 (1.90/day)
    Thanks Received:
    1,603
    Location:
    Glasgow - home of formal profanity
    Odd. I'm an ATI owner and I've been praising the GTX 580, trying not to buy one so I can gauge the competition when it comes out in December. Your post is ignorant with regard to scaling:

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580_SLI/24.html GTX 580
    1 GTX 580 is 77% of GTX 580 SLI (all resolutions)
    http://www.techpowerup.com/reviews/ATI/Radeon_HD_6870_CrossFire/23.html HD 6870
    1 HD 6870 is 73% of HD 6870 CrossFire (all resolutions)

    So the 6 series scales better in a dual-GPU config (a quick sketch of the arithmetic is below). Granted, on the 5 series the SLI option is better, but the 6 series nailed it well.
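    For anyone who wants to sanity-check those percentages, here's a minimal sketch of the arithmetic. The only real inputs are the 77% and 73% figures from the summary pages linked above; everything else is just how "single card as % of dual" converts into a scaling gain.

    ```python
    # Sketch of how the "1 card is X% of 2 cards" figures relate to scaling.
    # 77% and 73% come from the review summary pages linked above.

    def scaling_gain_pct(single_as_pct_of_dual: float) -> float:
        """Extra performance the second card adds, given single-card % of dual."""
        return 100.0 * (100.0 / single_as_pct_of_dual - 1.0)

    print(f"GTX 580 SLI gain:       +{scaling_gain_pct(77.0):.0f}%")   # ~ +30%
    print(f"HD 6870 CrossFire gain: +{scaling_gain_pct(73.0):.0f}%")   # ~ +37%
    # The lower the "single as % of dual" figure, the more the second card adds,
    # i.e. the better the dual-GPU scaling.
    ```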

    As for hard-core NVidia haters (not a nice term to use; hate is such a strong word), I think at Christmas we'll get a fair choice. My personal feeling is that the 6970 indeed isn't faster than a 580. I think if it were faster there would be some leaks from AMD PR to say: look, our card is better, hold off buying that 580. But if it doesn't perform as well, there's nothing to leak; better to stay quiet.
    Hope I'm wrong, because if I'm not, the 580s will go up in price.

    I think, though, that you're way off base. Most people do tend to take sides, but 'hating' isn't part of it. It says more about your own predisposition against AMD. But at least you wear your colours on your sleeve. It makes you prone to erroneous statements (a la the one above re: scaling).
     
    N3M3515 says thanks.
  18. HalfAHertz

    HalfAHertz

    Joined:
    May 4, 2009
    Messages:
    1,893 (0.95/day)
    Thanks Received:
    380
    Location:
    Singapore
    It's always easier to have a stress-testing program open, don't get me wrong. But what I was trying to say is that it's pointless and needless to run it for 5+ hours. I usually set a clock, test for a couple of minutes, go higher, test for a couple of minutes, go higher, test for a couple of minutes. The moment I get artifacts, I go back 10 MHz and try again (roughly the loop sketched below). Once I'm bored of that, I fire up a game, and if it crashes, I just go back 10-20 MHz on both RAM and core and try again...
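    Not claiming this is the one true method, but the loop described above looks roughly like this sketch. The two helper functions are made-up stand-ins for whatever overclocking utility and stress tool you actually use, and the step sizes are just the values mentioned above.

    ```python
    # Rough sketch of the step-up / back-off tuning loop described above.
    # set_core_clock() and run_stress_test() are made-up placeholders; wire
    # them to your real OC utility and stress tool.
    import random

    def set_core_clock(mhz: int) -> None:
        print(f"applying core clock: {mhz} MHz")      # placeholder for the real tool

    def run_stress_test(minutes: int) -> bool:
        # Placeholder: True means no artifacts or crash during the short run.
        return random.random() > 0.3

    STEP_UP = 20    # MHz added after each clean short test
    BACK_OFF = 10   # MHz removed as soon as artifacts appear

    def find_candidate_clock(start_mhz: int, max_mhz: int) -> int:
        clock = start_mhz
        while clock < max_mhz:
            set_core_clock(clock)
            if run_stress_test(minutes=2):   # a couple of minutes, not 5+ hours
                clock += STEP_UP             # clean run: push higher
            else:
                clock -= BACK_OFF            # artifacts: step back and stop
                break
        return clock

    # Games are still the final judge: if one crashes, drop another 10-20 MHz
    # on both core and RAM and repeat. 772 MHz is the GTX 580 stock core clock.
    print("candidate clock:", find_candidate_clock(start_mhz=772, max_mhz=950))
    ```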

    I agree that it's unrealistic to think that a game can go over the 300W limit, because of the way game code is written and because of the randomness the human player creates.
    The gameplay is always random, which means the environment is always created in real time. Thus every scene has to go through the entire pipeline and spend a finite amount of time in each step of it.
    To be fair, stress-testing tools are more like advanced HPC calculations or even folding, where a specific part is stressed over and over for long periods of time.

    Edit:
    And if we're talking about corporate takeovers, I think Nvidia will be snatched up first, not because they're in danger of going down or anything crazy like that, but because it would be a smart purchase. Their cards are doing great in the HPC space, and it would be a smart move for someone like IBM or Oracle (or even HP or Dell) to snatch them up while Nvidia hasn't gathered too much momentum and is still cheap. That would allow them to add Nvidia to their server farm line-up and have an ace up their sleeve compared to the opposition.
     
    Last edited: Nov 15, 2010
  19. GC_PaNzerFIN

    GC_PaNzerFIN

    Joined:
    Oct 9, 2009
    Messages:
    511 (0.28/day)
    Thanks Received:
    500
    Location:
    Finland
    Do I run Furmark 24/7? No.
    Does it break if I do run Furmark without the limiter? No.
    Does the limiter kick in during games, even with overvolting and overclocking? No.
    Does it prevent someone from breaking the card if they don't know what they are doing with voltages? Quite possibly.
    Card priced right compared to previous gens? Yes.
    Fastest single GPU at least for the moment? Yes.
    Does it run relatively quiet and at reasonable temps? Yes.
    Do I need new card? Yes.

    = Ordered GTX 580

    Seriously, this bullshit whining about limiters in programs like Furmark is silly; it is not even a new thing, and even AMD has done driver-level limiters. There is a huge total of 0 people for whom it is a problem, except in their heads, and it's yet another thing to bash NV about with no intention of ever even looking in the direction of their cards.

    Oh, and just to be sure: I have had over 10 ATi and 10 NV cards in the past 2 years, go figure.

    If the 580 isn't for you then please move along; I am sure the HD 69xx will come and deliver too. But stop this nonsense, please.

    /End of rant
     
  20. Stoogie New Member

    Joined:
    Nov 15, 2010
    Messages:
    5 (0.00/day)
    Thanks Received:
    0
    Location:
    Australia
    WTF at the 5970 scores in WoW?

    Compare these two:

    http://www.techpowerup.com/reviews/ATI/Radeon_HD_6870_CrossFire/18.html

    http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/20.html

    The 5970 is in totally different places in these two tests, while the other GPUs are at the exact same FPS.

    Are we 100% sure that this site is trustworthy?

    I looked into this regarding the 6870's CF performance in WoW; however, the score seems to be half that of just one card. I believe this is a mistake on your end, TechPowerUp, when you benchmarked the 6870 cards.

    Please give a logical explanation for the two entirely different results for the same benchmark.
     
    Last edited: Nov 15, 2010
  21. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    14,962 (3.92/day)
    Thanks Received:
    11,768
    Don't trust this site!! Read the test setup page before making accusations.
     
  22. the54thvoid

    the54thvoid

    Joined:
    Dec 14, 2009
    Messages:
    3,372 (1.90/day)
    Thanks Received:
    1,603
    Location:
    Glasgow - home of formal profanity
    Can I swear?

    BASTARDS!

    Overcockers, sorry, OverclockersUK, are price gouging for sure. They only have the ASUS board in stock and it's £459.99. They'll do this until the HD 6970 comes out, the same way the 6850 and 6870 prices generally increased almost immediately.
     
  23. Stoogie New Member

    Joined:
    Nov 15, 2010
    Messages:
    5 (0.00/day)
    Thanks Received:
    0
    Location:
    Australia
    So the Catalyst 10.10 drivers fixed the issue from the 10.7 drivers?
     
  24. Stoogie New Member

    Joined:
    Nov 15, 2010
    Messages:
    5 (0.00/day)
    Thanks Received:
    0
    Location:
    Australia
    If so, can you re-benchmark WoW with the 10.10 drivers for 6870s in CF?

    Edit: my initial post was copied from an overclockers site; maybe I should've removed the trust bit lol XD my bad
     
  25. W1zzard

    W1zzard Administrator Staff Member

    Joined:
    May 14, 2004
    Messages:
    14,962 (3.92/day)
    Thanks Received:
    11,768
    Just go by the 5970 numbers and the 5970 vs. 6870 relative performance in other games.

    And please go to that forum and tell them what's going on with the numbers, so there's no need to cry conspiracy.
     
    Stoogie says thanks.
