
NVIDIA GeForce GTX 580 1.5 GB

Joined
Dec 14, 2009
Messages
6,580
Likes
5,813
Location
Glasgow - home of formal profanity
System Name New Ho'Ryzen
Processor Ryzen 1700X @ 3.82Ghz
Motherboard Asus Crosshair VI Hero
Cooling TR Le Grand Macho & custom GPU loop
Memory 16Gb G.Skill 3200 RGB
Video Card(s) GTX1080ti (Heatkiller WB) @ 2Ghz core/1.5(12)Ghz mem
Storage Samsung 960 Pro M.2 512GB
Display(s) Dell Ultrasharp 27" (2560x1440)
Case Lian Li PC-V33WX
Audio Device(s) On Board
Power Supply Seasonic Prime Titanium 850
Software W10
Benchmark Scores Look, it's a Ryzen on air........ What's the point?
What happens once a game comes along that uses more than 300W? Does NVIDIA just expect users to live with underperformance, or do they expect you to upgrade cards?


Would you buy a car with a wood block under the throttle?
If 300W is the PCI-e limit, I don't see it being an issue. Under current protocols for meeting specs, I don't think any game code would be 'valid' if it did that. The design spec is, after all, 300W. Why design games that require more power than a single card can deliver by specification? Given the console domination of game design, we're still not even getting DX11 up to a good standard yet.

In the future I don't see it happening either, as manufacturing processes shrink.

As for the car analogy, most super-sport production cars have speed limiters built in (150mph for many BMWs/Mercedes, etc.), so we do buy cars with metaphorical chokes built in.
 
Joined
Apr 1, 2008
Messages
2,817
Likes
604
System Name HTC's System
Processor Ryzen 5 1600
Motherboard Asrock Taichi
Cooling NH-C14
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 2 * Samsung 1 TB HD103UJ
Display(s) LG 27UD58
Case Corsair Obsidian 650D
Audio Device(s) Onboard
Power Supply Corsair TX750
Mouse Logitech Performance MX
Software Ubuntu 16.04 LTS
Oh Jesus Christ!!! I quoted him saying exactly that! Do you have a hard time reading? I'm not quoting him again; you can read my post above to see where he said it, or better yet, read the review!:shadedshu

Again, why are you taking a statement from the overclocking section about temperature and trying to say it has anything to do with the Over Current Protection? Did you not get that the temperature protection is a totally different thing from the current protection? I believe I already told you this.

You can read pretty much the same statement about GTX480s as well: http://www.techpowerup.com/reviews/MSI/N480GTX_GTX_480_Lightning/31.html

But I guess you will still want to go on about the temp limit like it is the same thing as the current limit, but now I'm sure you'll say the GTX480 must have had it too...right? Because this is about the 3rd time you've gone on about the temperature limit like it is related to the current limit.:shadedshu

Again, temperature is not the same as current, they are two different things and hence two different protection systems.

Correct, he did say that, and I bolded the important part. And it has to be put in context with the fact that he said the OCP is only activated when the driver detects OCCT and Furmark.

Again, I don't know why you can't be bothered to read the review, as W1z already said this in it. The driver detects Furmark and OCCT and activates the limiter. It is limited in those two programs specifically because those are the only two programs the driver detects. In all other programs the driver doesn't monitor the overcurrent sensors.
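The driver behaviour described here (the limiter only arms itself for the two detected programs) can be sketched roughly like this. This is purely illustrative: the real detection logic lives inside NVIDIA's driver and isn't public, and the names and threshold below are made up.

```python
# Illustrative sketch of application-detection-based over-current protection.
# KNOWN_STRESS_APPS, CURRENT_LIMIT_AMPS and the sensor reading are all
# hypothetical; the real logic is NVIDIA's and not public.

KNOWN_STRESS_APPS = {"furmark.exe", "occt.exe"}
CURRENT_LIMIT_AMPS = 25.0  # made-up threshold for the 12V input rails

def should_throttle(process_name: str, sensor_amps: float) -> bool:
    """Clocks get cut only when a known stress app is running AND the
    over-current sensors read past the limit; for any other program the
    sensors are never even consulted."""
    if process_name.lower() not in KNOWN_STRESS_APPS:
        return False  # driver doesn't monitor the sensors at all
    return sensor_amps > CURRENT_LIMIT_AMPS
```

Which would match why games never trip the limiter, no matter how much power they draw.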
You were right. I'm man enough to admit when I'm wrong, and judging by W1zzard's quote below, I was indeed wrong.

What I was stating was that NVIDIA never said that only OCCT and Furmark triggered the OCP cap.
That's exactly what NVIDIA told me.
It's all about interpretation: until now, W1zzard hadn't stated what you had been claiming as fact (that the card really does react to Furmark and OCCT), and that's what I was clinging to.

The thing is, when I'm convinced I'm right, I'll argue and argue, and then argue some more ... until someone proves me wrong, just like W1zzard did.
 
Joined
Aug 12, 2010
Messages
1,534
Likes
200
Location
Britland
System Name Gaming temp// HTPC
Processor AMD A6 5400k // A4 5300
Motherboard ASRock FM2A75 PRO4// ASRock FM2A55M-DGS
Cooling Xigmatek HDT-D1284 // stock phenom II HSF
Memory 4GB 1600mhz corsair vengeance // 4GB 1600mhz corsair vengeance low profile
Storage 64gb sandisk pulse SSD and 500gb HDD // 500gb HDD
Display(s) acer 22" 1680x1050
Power Supply Seasonic G-450 // Corsair CXM 430W
They did it either to mislead the public about power use, or to protect the card from being used to the full in an optimized way.
The one thing that makes me think you could be right about that is the fact that so many people, when complaining about how much power the 480 used, quoted its power usage in Furmark rather than its real in-game power usage. I assume so many people did that because they were talking about the maximum power the card could possibly use, and this limit makes the card seem much better when quoting the absolute max.
 
Joined
Apr 4, 2008
Messages
4,659
Likes
1,009
System Name Obelisc
Processor i7 3770k @ 4.8 GHz
Motherboard Asus P8Z77-V
Cooling H110
Memory 16GB(4x4) @ 2400 MHz 9-11-11-31
Video Card(s) GTX 780 Ti
Storage 850 EVO 1TB, 2x 5TB Toshiba
Case T81
Audio Device(s) X-Fi Titanium HD
Power Supply EVGA 850 T2 80+ TITANIUM
Software Win10 64bit
Well, only stress-testing programs trigger it, so I don't see a need to disable it. I mean, if all you're using the card for is gaming, and you want to overclock, then why not stress test with a demanding game instead? Sometimes I've had OCs which would have corruptions and artefacts in The Furry Donut™, but would work perfectly in games.
That's a horrible idea. Sometimes it takes a good 5 hours for a game to crash from a bad overclock; OCCT will find it in 10-20 minutes, and then you don't need to worry about finding stability with hours of testing for each individual program. And "the furry donut" is only good for heating up your card or telling you you're way past the stability limit; it's not sensitive enough for real stress testing, at least not with current cards. If that, or programs based on it, is the only test you use, you're not going to have a truly stable overclock; you'll get crashes in games and blame the games or the drivers when it's really user error.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
24,274
Likes
10,358
Location
Indiana, USA
Processor Intel Core i7 4790K@4.6GHz
Motherboard AsRock Z97 Extreme6
Cooling Corsair H100i
Memory 32GB Corsair DDR3-1866 9-10-9-27
Video Card(s) ASUS GTX960 STRIX @ 1500/1900
Storage 480GB Crucial MX200 + 2TB Seagate Solid State Hybrid Drive with 128GB OCZ Synapse SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Corsair 650D Black
Audio Device(s) Onboard is good enough for me
Power Supply Corsair HX850
Software Windows 10 Pro x64
You were right. I'm man enough to admit when I'm wrong, and judging by W1zzard's quote below, I was indeed wrong.





It's all about interpretation: until now, W1zzard hadn't stated what you had been claiming as fact (that the card really does react to Furmark and OCCT), and that's what I was clinging to.

The thing is, when I'm convinced I'm right, I'll argue and argue, and then argue some more ... until someone proves me wrong, just like W1zzard did.
It's cool, man, I don't hold a grudge or anything, and it wasn't like I was really angry. And I'm the same way when I'm convinced I'm right.:toast:

The one thing that makes me think you could be right about that is the fact that so many people, when complaining about how much power the 480 used, quoted its power usage in Furmark rather than its real in-game power usage. I assume so many people did that because they were talking about the maximum power the card could possibly use, and this limit makes the card seem much better when quoting the absolute max.

The problem I have with the whole idea that nVidia did it to give false power consumption readings is that if they wanted to do that, they would have done a better job of it. Power consumption with the limiter on under Furmark is like 150W, which is lower than in-game power consumption. So it makes it pretty obvious what is going on, and anyone taking power consumption numbers would have instantly picked up on that. If they were really trying to provide false power consumption numbers, they would have tuned it so that consumption under Furmark was at least at a semi-realistic level.
 
Joined
Apr 1, 2008
Messages
2,817
Likes
604
System Name HTC's System
Processor Ryzen 5 1600
Motherboard Asrock Taichi
Cooling NH-C14
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 2 * Samsung 1 TB HD103UJ
Display(s) LG 27UD58
Case Corsair Obsidian 650D
Audio Device(s) Onboard
Power Supply Corsair TX750
Mouse Logitech Performance MX
Software Ubuntu 16.04 LTS
That's a horrible idea. Sometimes it takes a good 5 hours for a game to crash from a bad overclock; OCCT will find it in 10-20 minutes, and then you don't need to worry about finding stability with hours of testing for each individual program. And "the furry donut" is only good for heating up your card or telling you you're way past the stability limit; it's not sensitive enough for real stress testing, at least not with current cards. If that, or programs based on it, is the only test you use, you're not going to have a truly stable overclock; you'll get crashes in games and blame the games or the drivers when it's really user error.
Agreed. Furmark and other such programs "find" a bad OC quicker, but that doesn't mean it's foolproof.

Sometimes you run the stress programs for several hours on your OCs and it all checks out fine, and then, while playing some game, you get crashes. Who's to blame: the game? The VGA drivers? Most of the time it's the OCs, be they CPU-related or GPU-related.
 
Joined
Nov 21, 2007
Messages
3,684
Likes
402
Location
Smithfield, WV
System Name Felix777
Processor Core i5-3570k@stock
Motherboard Biostar H61
Memory 8gb
Video Card(s) XFX RX 470
Storage WD 500GB BLK
Display(s) Acer p236h bd
Case Haf 912
Audio Device(s) onboard
Power Supply Rosewill CAPSTONE 450watt
Software Win 10 x64
Yeah, for me to feel my overclock is stable usually takes 1-3 days of messing around: stress tests, gaming, everything. It's not stable when you can game, or when you can pass a stress test; it's stable when it can do everything :p. If anything starts being faulty after an OC, I always bounce back to square one.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
24,274
Likes
10,358
Location
Indiana, USA
Processor Intel Core i7 4790K@4.6GHz
Motherboard AsRock Z97 Extreme6
Cooling Corsair H100i
Memory 32GB Corsair DDR3-1866 9-10-9-27
Video Card(s) ASUS GTX960 STRIX @ 1500/1900
Storage 480GB Crucial MX200 + 2TB Seagate Solid State Hybrid Drive with 128GB OCZ Synapse SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Corsair 650D Black
Audio Device(s) Onboard is good enough for me
Power Supply Corsair HX850
Software Windows 10 Pro x64
Agreed. Furmark and other such programs "find" a bad OC quicker, but that doesn't mean it's foolproof.

Sometimes you run the stress programs for several hours on your OCs and it all checks out fine, and then, while playing some game, you get crashes. Who's to blame: the game? The VGA drivers? Most of the time it's the OCs, be they CPU-related or GPU-related.
That is one of the things about Furmark I've noticed: it doesn't use a whole lot of VRAM. So if your RAM overclock is slightly unstable, it will almost never find it. That is usually when I fire up Unigine at full tessellation settings to really fill that VRAM up.:toast:
 
Joined
Aug 9, 2006
Messages
1,003
Likes
156
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
A total of 307 comments? Make it 308 now. Only a passionate hate of nVidia can make a thread grow this fast and this large. Whatever, this card is pretty much as fast as two 5870 GPUs (a 5970) as per the following really cool link, and all without the CrossFire scaling issues, since sadly (for CrossFire users, that is) SLI is still the better tech of the two.

Till the next round then, although I don't think AMD will stick around for that long since their Abu Dhabi sugar daddies... ummm, investors, yeah that's it, "investors", well they aren't doing too well themselves. Let's see, who's got half a dozen to a dozen billion dollars (US) sitting around to be spent in this time of global economic downturn in order to bail out and save AMD yet again? IBM? Microsoft? Sony? Fat chance!

Let me put it this way for hard-core nVidia haters: come Christmas time 2011 (maybe even a few months earlier the way things are going) it's either nVidia GPU or nVidia GPU when it comes to your upgrading purposes.
 
Joined
Apr 4, 2008
Messages
4,659
Likes
1,009
System Name Obelisc
Processor i7 3770k @ 4.8 GHz
Motherboard Asus P8Z77-V
Cooling H110
Memory 16GB(4x4) @ 2400 MHz 9-11-11-31
Video Card(s) GTX 780 Ti
Storage 850 EVO 1TB, 2x 5TB Toshiba
Case T81
Audio Device(s) X-Fi Titanium HD
Power Supply EVGA 850 T2 80+ TITANIUM
Software Win10 64bit
That is one of the things about Furmark I've noticed: it doesn't use a whole lot of VRAM. So if your RAM overclock is slightly unstable, it will almost never find it. That is usually when I fire up Unigine at full tessellation settings to really fill that VRAM up.:toast:
It's interesting with OCCT: the VRAM-testing part never found any errors at all. It was letting me crank it all the way up to 4000 MHz effective. The OCCT GPU test, though, was able to find VRAM errors, probably because both clocks are really tied together in the 4xx series. The VRAM test must just be showing what the chips can do, not what the controller can handle.
 
Joined
Nov 4, 2005
Messages
9,946
Likes
2,309
System Name MoFo 2
Processor AMD PhenomII 1100T @ 4.2Ghz
Motherboard Asus Crosshair IV
Cooling Swiftec 655 pump, Apogee GT,, MCR360mm Rad, 1/2 loop.
Memory 8GB DDR3-2133 @ 1900 8.9.9.24 1T
Video Card(s) HD7970 1250/1750
Storage Agility 3 SSD 6TB RAID 0 on RAID Card
Display(s) 46" 1080P Toshiba LCD
Case Rosewill R6A34-BK modded (thanks to MKmods)
Audio Device(s) ATI HDMI
Power Supply 750W PC Power & Cooling modded (thanks to MKmods)
Software A lot.
Benchmark Scores Its fast. Enough.
A total of 307 comments? Make it 308 now. Only a passionate hate of nVidia can make a thread grow this fast and this large. Whatever, this card is pretty much as fast as two 5870 GPUs (a 5970) as per the following really cool link, and all without the CrossFire scaling issues, since sadly (for CrossFire users, that is) SLI is still the better tech of the two.

Till the next round then, although I don't think AMD will stick around for that long since their Abu Dhabi sugar daddies... ummm, investors, yeah that's it, "investors", well they aren't doing too well themselves. Let's see, who's got half a dozen to a dozen billion dollars (US) sitting around to be spent in this time of global economic downturn in order to bail out and save AMD yet again? IBM? Microsoft? Sony? Fat chance!

Let me put it this way for hard-core nVidia haters: come Christmas time 2011 (maybe even a few months earlier the way things are going) it's either nVidia GPU or nVidia GPU when it comes to your upgrading purposes.
I don't hate Nvidia any more than I hate Ford cars and trucks. My company car is a Ford Explorer.


I use what works best for me, and right now it is ATI for the money.


Back to your comment about AMD: they have paid off millions of dollars of their debt, which is why they were not showing a profit. If you understand balance sheets and finance, you would understand this.

If 300W is the PCI-e limit, I don't see it being an issue. Under current protocols for meeting specs, I don't think any game code would be 'valid' if it did that. The design spec is, after all, 300W. Why design games that require more power than a single card can deliver by specification? Given the console domination of game design, we're still not even getting DX11 up to a good standard yet.

In the future I don't see it happening either, as manufacturing processes shrink.

As for the car analogy, most super-sport production cars have speed limiters built in (150mph for many BMWs/Mercedes, etc.), so we do buy cars with metaphorical chokes built in.
Yep, and some don't have the limiters.

A game doesn't dictate power consumption any more than a movie does. Game specs don't list how many watts you need to run them. Nvidia chooses the power consumption of a card based on the cooler's ability and other specs. They made a card that pulls 350+ watts in a real-world performance test, then put a self-limiting throttle on it to keep it from pulling that amount. They claim they have the most powerful card, and in some games they do, but when pushed to the max by a program designed to do so, it has to self-limit to maintain standards. Like a dragster with a chute that deploys itself when you go full throttle. Or a block of wood under the pedal.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
24,274
Likes
10,358
Location
Indiana, USA
Processor Intel Core i7 4790K@4.6GHz
Motherboard AsRock Z97 Extreme6
Cooling Corsair H100i
Memory 32GB Corsair DDR3-1866 9-10-9-27
Video Card(s) ASUS GTX960 STRIX @ 1500/1900
Storage 480GB Crucial MX200 + 2TB Seagate Solid State Hybrid Drive with 128GB OCZ Synapse SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Corsair 650D Black
Audio Device(s) Onboard is good enough for me
Power Supply Corsair HX850
Software Windows 10 Pro x64
They made a card that pulls 350+ watts in a real-world performance test.
Furmark is hardly a real-world performance test. It is a torture test more than a benchmark, though it does have a benchmark function built in. And even then it isn't a real-world benchmark; it is a synthetic one.

And according to W1z it doesn't pull 350+ watts; it pulls ever so slightly over 300W.
 

CDdude55

Crazy 4 TPU!!!
Joined
Jul 12, 2007
Messages
8,178
Likes
1,269
Location
Virginia
System Name CDdude's Rig!
Processor AMD Athlon II X4 620
Motherboard Gigabyte GA-990FXA-UD3
Cooling Corsair H70
Memory 8GB Corsair Vengence @1600mhz
Video Card(s) XFX HD 6970 2GB
Storage OCZ Agility 3 60GB SSD/WD Velociraptor 300GB
Display(s) ASUS VH232H 23" 1920x1080
Case Cooler Master CM690 (w/ side window)
Audio Device(s) Onboard (It sounds fine)
Power Supply Corsair 850TX
Software Windows 7 Home Premium 64bit SP1
I don't hate Nvidia any more than I hate Ford cars and trucks. My company car is a Ford Explorer.


I use what works best for me, and right now it is ATI for the money.
It's the way you come across; a lot of your posts read like the stereotypical rabid, ignorant fanboy. You should really be aware of that, because if you really are unbiased, comments like ''Nfags can leave this thread, I hope Nvidia doesn't drop their prices and you continue to get assraped by them.'' aren't actually in your favor of being unbiased... just saying.
 
Joined
Nov 9, 2010
Messages
5,157
Likes
1,704
Processor Intel i7 950 @ 3.2GHz
Motherboard ASUS P6X58D-E
Cooling Corsair H50 push/pull
Memory Kingston HyperX 1600 8GB
Video Card(s) Sapphire HD 7970 OC
Storage Plextor M5P 128GB/WD Black 2x1TB,1x6TB/Seagate 1TB
Display(s) Panasonic TC-L32U3
Case Antec DF-85
Audio Device(s) Yamaha RX-V371 AVR
Power Supply XFX 850w Black Edition
Mouse Logitech G402
Keyboard Logitech K120
Software W10 Pro 64 bit
...this card is pretty much as fast as two 5870 GPU's (5970) as per the following...and all without all the CrossFire scaling issues...
Actually, at launch the 580 did have serious scaling issues. It was stomped by the 480 in dual-GPU SLI in almost every test. Here's hoping driver maturation will sort that out, because right now that's the only thing keeping me from buying one, other than maybe trying to hit a holiday sale.
 
Joined
Nov 4, 2005
Messages
9,946
Likes
2,309
System Name MoFo 2
Processor AMD PhenomII 1100T @ 4.2Ghz
Motherboard Asus Crosshair IV
Cooling Swiftec 655 pump, Apogee GT,, MCR360mm Rad, 1/2 loop.
Memory 8GB DDR3-2133 @ 1900 8.9.9.24 1T
Video Card(s) HD7970 1250/1750
Storage Agility 3 SSD 6TB RAID 0 on RAID Card
Display(s) 46" 1080P Toshiba LCD
Case Rosewill R6A34-BK modded (thanks to MKmods)
Audio Device(s) ATI HDMI
Power Supply 750W PC Power & Cooling modded (thanks to MKmods)
Software A lot.
Benchmark Scores Its fast. Enough.
It's the way you come across; a lot of your posts read like the stereotypical rabid, ignorant fanboy. You should really be aware of that, because if you really are unbiased, comments like ''Nfags can leave this thread, I hope Nvidia doesn't drop their prices and you continue to get assraped by them.'' aren't actually in your favor of being unbiased... just saying.
:toast: Yes, I do get a bit heated when some people piss me off.


Part of the reason is that in the last few years the digital-dream bullcrap all these companies have been promising hasn't come true. I get really pissed when any of them start spewing numbers, then hold users back from enjoying and using the hardware they purchased by limiting it, just to spin the wheel again later with nothing more than a new paint job. ATI has failed me, Adobe has failed me, Canon has failed me, Intel is promising things it can't deliver, Motorola has failed me, Nvidia has failed to deliver, and Microsoft is not pushing people toward standardized programming.


I bought a Canon high-def camcorder; it records M2TS, the same as Blu-ray. According to ATI we should be processing that on stream processors with this shiny new....but first this $600 software, then install this patch, then install this set of codecs, then you have to export it to this format, then burn it.

Intel is still fiddle-F'ing around with things they are too large and clumsy to do right the first three times.

My phone still doesn't support Flash, and they promise "it's coming, just wait". Sounds like Atari, who still haven't fixed their TDU game, or the many other issues with games that just get thrown to the side.

Nvidia pushes proprietary crap like PhysX, which works on all of 13 GPU-enabled titles, despite others showing it runs just as fast on a CPU when moved up from antiquated code, besides it now being part of DX11. Also, Nvidia and Adobe seem to be stuck in a 69 swap meet: they disable hardware stream acceleration when an ATI card is present, yet some forum members have learned how to bypass it, and wonder of wonders, it still works, using the ATI GPU to perform the calculations and not CUDA, according to them, as long as it doesn't get shut down.


So this shiny new future is bullshit. It is the same crap we have had from day one. I'm tired of spending thousands of dollars to be told I still have it wrong.
 
Joined
Aug 20, 2010
Messages
209
Likes
22
Location
Mostar, Bosnia & Herzegovina
System Name Micro Mule
Processor Intel i7 950 Stock + Noctua NH-C14
Motherboard Asus Rampage III Gene MicroATX
Cooling Noctua 120mm/80m Fans
Memory Crucial Ballistix 6GB DDR3 1600MHz
Video Card(s) Asus nVidia GTX 580
Storage Samsung 850 Pro SSD, WD Caviar Black 2TB HDD
Display(s) LG 42LD650 42" LCD HDTV
Case Silverstone Fortress FT03
Audio Device(s) Creative SB X-Fi Titanium HD + Sennheiser PC360 Headset
Power Supply Corsair AX850 - 850W Modular Gold
Software Windows 7 Ultimate 64 bit
:toast: Yes, I do get a bit heated when some people piss me off.


Part of the reason is that in the last few years the digital-dream bullcrap all these companies have been promising hasn't come true. I get really pissed when any of them start spewing numbers, then hold users back from enjoying and using the hardware they purchased by limiting it, just to spin the wheel again later with nothing more than a new paint job. ATI has failed me, Adobe has failed me, Canon has failed me, Intel is promising things it can't deliver, Motorola has failed me, Nvidia has failed to deliver, and Microsoft is not pushing people toward standardized programming.


I bought a Canon high-def camcorder; it records M2TS, the same as Blu-ray. According to ATI we should be processing that on stream processors with this shiny new....but first this $600 software, then install this patch, then install this set of codecs, then you have to export it to this format, then burn it.

Intel is still fiddle-F'ing around with things they are too large and clumsy to do right the first three times.

My phone still doesn't support Flash, and they promise "it's coming, just wait". Sounds like Atari, who still haven't fixed their TDU game, or the many other issues with games that just get thrown to the side.

Nvidia pushes proprietary crap like PhysX, which works on all of 13 GPU-enabled titles, despite others showing it runs just as fast on a CPU when moved up from antiquated code, besides it now being part of DX11. Also, Nvidia and Adobe seem to be stuck in a 69 swap meet: they disable hardware stream acceleration when an ATI card is present, yet some forum members have learned how to bypass it, and wonder of wonders, it still works, using the ATI GPU to perform the calculations and not CUDA, according to them, as long as it doesn't get shut down.


So this shiny new future is bullshit. It is the same crap we have had from day one. I'm tired of spending thousands of dollars to be told I still have it wrong.
... well put ... :rockout:


A total of 307 comments? Make it 308 now. Only a passionate hate of nVidia can make a thread grow this fast and this large. Whatever, this card is pretty much as fast as two 5870 GPUs (a 5970) as per the following really cool link, and all without the CrossFire scaling issues, since sadly (for CrossFire users, that is) SLI is still the better tech of the two.

Till the next round then, although I don't think AMD will stick around for that long since their Abu Dhabi sugar daddies... ummm, investors, yeah that's it, "investors", well they aren't doing too well themselves. Let's see, who's got half a dozen to a dozen billion dollars (US) sitting around to be spent in this time of global economic downturn in order to bail out and save AMD yet again? IBM? Microsoft? Sony? Fat chance!

Let me put it this way for hard-core nVidia haters: come Christmas time 2011 (maybe even a few months earlier the way things are going) it's either nVidia GPU or nVidia GPU when it comes to your upgrading purposes.
... if nVidia becomes the only choice of discrete GPU (although I know that's never going to happen), I think that'll be the day I switch to Intel integrated graphics, or better still AMD Fusion ... in fact, with its current management, I believe nVidia will eventually be acquired by Intel ... again, I'm neither Red nor Green, but I hate it when idiot fanboys try to transform every single discussion on these forums into an nVidia/ATI trashing circus ...
 
Joined
Dec 14, 2009
Messages
6,580
Likes
5,813
Location
Glasgow - home of formal profanity
System Name New Ho'Ryzen
Processor Ryzen 1700X @ 3.82Ghz
Motherboard Asus Crosshair VI Hero
Cooling TR Le Grand Macho & custom GPU loop
Memory 16Gb G.Skill 3200 RGB
Video Card(s) GTX1080ti (Heatkiller WB) @ 2Ghz core/1.5(12)Ghz mem
Storage Samsung 960 Pro M.2 512GB
Display(s) Dell Ultrasharp 27" (2560x1440)
Case Lian Li PC-V33WX
Audio Device(s) On Board
Power Supply Seasonic Prime Titanium 850
Software W10
Benchmark Scores Look, it's a Ryzen on air........ What's the point?
A total of 307 comments? Make it 308 now. Only a passionate hate of nVidia can make a thread grow this fast and this large. Whatever, this card is pretty much as fast as two 5870 GPUs (a 5970) as per the following really cool link, and all without the CrossFire scaling issues, since sadly (for CrossFire users, that is) SLI is still the better tech of the two.

Till the next round then, although I don't think AMD will stick around for that long since their Abu Dhabi sugar daddies... ummm, investors, yeah that's it, "investors", well they aren't doing too well themselves. Let's see, who's got half a dozen to a dozen billion dollars (US) sitting around to be spent in this time of global economic downturn in order to bail out and save AMD yet again? IBM? Microsoft? Sony? Fat chance!

Let me put it this way for hard-core nVidia haters: come Christmas time 2011 (maybe even a few months earlier the way things are going) it's either nVidia GPU or nVidia GPU when it comes to your upgrading purposes.
Odd. I'm an ATI owner and I've been praising the GTX 580, trying not to buy one so I can gauge the competition when it comes out in December. Your post is ignorant with regard to scaling:

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580_SLI/24.html GTX 580
1 GTX 580 is 77% of GTX 580 SLI (all resolutions)
http://www.techpowerup.com/reviews/ATI/Radeon_HD_6870_CrossFire/23.html HD 6870
1 HD 6870 is 73% of HD 6870 CrossFire (all resolutions)

So the 6 series scales better in a dual-GPU config. Granted, on the 5 series the SLI option is better, but the 6 series nailed it well.
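For concreteness, the scaling comparison works out as follows; the 0.77 and 0.73 fractions are the all-resolution numbers quoted from the two linked reviews.

```python
# Second-card gain implied by the single-vs-dual percentages quoted above
# (77% for the GTX 580 in SLI, 73% for the HD 6870 in CrossFire).

def second_card_gain(single_as_fraction_of_dual: float) -> float:
    """If one card delivers fraction f of the dual-card setup's performance,
    the second card adds (1/f - 1) on top of the first."""
    return 1.0 / single_as_fraction_of_dual - 1.0

gtx580_sli = second_card_gain(0.77)  # ~0.30, i.e. ~30% gain from SLI
hd6870_cfx = second_card_gain(0.73)  # ~0.37, i.e. ~37% gain from CrossFire
```

So by these numbers the 6870 gets roughly 37% out of its second card versus roughly 30% for the 580.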

As for hard-core NVidia haters (not a nice phrase to use; hate is such a strong word), I think at Christmas we'll get a fair choice. My personal feeling is that the 6970 indeed isn't faster than a 580. I think if it were faster, there would be some leaks out of AMD PR saying: look, our card is better, hold off buying that 580. But if it doesn't perform as well, there's nothing to leak; safer to stay quiet.
I hope I'm wrong, because if I'm not, the 580s will go up in price.

I think, though, that you're way off base. Most people do tend to take sides, but 'hating' isn't part of it. It says more about your own predisposition against AMD. But at least you wear your colours on your sleeve. It makes you prone to erroneous statements (a la the one above re: scaling).
 
Joined
May 4, 2009
Messages
1,940
Likes
409
Location
Singapore
System Name penguin
Processor i3-4160
Motherboard Asus H81 Mini-ITX
Cooling Stock
Memory 2x4GB Kingston 1600MHz
Video Card(s) Saphire Radeon 7850 2GB
Storage Plextor M5S 120GB+1TB Seagate
Display(s) 23' Dell
Case CM Elite 130
Audio Device(s) stock
Power Supply Corsair CX430m
Software W7/Lubuntu
That's a horrible idea. Sometimes it takes a good 5 hours for a game to crash from a bad overclock; OCCT will find it in 10-20 minutes, and then you don't need to worry about finding stability with hours of testing for each individual program. And "the furry donut" is only good for heating up your card or telling you you're way past the stability limit; it's not sensitive enough for real stress testing, at least not with current cards. If that, or programs based on it, is the only test you use, you're not going to have a truly stable overclock; you'll get crashes in games and blame the games or the drivers when it's really user error.
It's always easier to have a stress-testing program open, don't get me wrong. But what I was trying to say was that it's pointless and needless to run it for 5+ hours. I usually set a clock, test for a couple of minutes, go higher, test for a couple of minutes, and so on. The moment I get artifacts, I go back 10 MHz and try again. Once I'm bored of that, I fire up a game, and if it crashes, I just go back 10-20 MHz on both RAM and core and try again...
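The step-up/step-back routine described above can be sketched like this. The `is_stable` stub stands in for an actual artifact scan and its 950 MHz ceiling is invented; only the 10 MHz stepping mirrors the post.

```python
# Sketch of the incremental overclocking routine: raise the clock in small
# steps, and at the first sign of artifacts settle one step back.
# is_stable() is a stub; a real run would watch for artifacts or a crash.

STEP_MHZ = 10  # matches the 10 MHz step-back described above

def is_stable(clock_mhz: int) -> bool:
    return clock_mhz <= 950  # hypothetical artifact-free ceiling

def find_max_clock(start_mhz: int, step: int = STEP_MHZ) -> int:
    clock = start_mhz
    while is_stable(clock + step):
        clock += step
    return clock  # highest clock that still passed the quick test
```

As the replies point out, whatever this quick loop settles on still needs hours of mixed testing before it can be called stable.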

I agree that it's unrealistic to think a game can go over the 300W limit, because of the way game code is written and because of the randomness the human player creates.
Gameplay is always random, which means the environment is always created in real time. Every scene has to go through the entire rendering pipeline and spend a finite amount of time in each step of it.
To be fair, stress testing tools are more like advanced HPC calculations or even folding, where a specific part of the chip is hammered over and over for long periods of time.

Edit:
And if we're talking about corporate takeovers, I think Nvidia will be snatched up first. Not because they're in danger of going under or anything crazy like that, but because it would be a smart purchase. Their cards are doing great in the HPC space, and it would be a smart move for someone like IBM or Oracle (or even HP and Dell) to snatch them up while Nvidia hasn't gained too much momentum and is still cheaper. That would let the buyer add them to their server farm line-up and have an ace up their sleeve compared to the opposition.
 
Last edited:
Joined
Oct 9, 2009
Messages
675
Likes
656
Location
Finland
System Name :P~
Processor Intel Core i7-5930K (ES)
Motherboard Asus Rampage V Extreme/3.1
Cooling Phanteks PH-TC14PE
Memory 32GB Corsair Vengeance LPX 2400 MHz
Video Card(s) Asus GTX 1080 Strix
Storage 400GB Intel 750 PCI-E SSD, 512GB Crucial MX100 SSD, 3TB WD RED HDD
Display(s) QNIX QX2710LED OC @ 96 Hz 27"
Case Corsair Obsidian 750D
Audio Device(s) Audioquest Dragon Red + Sennheiser HD 650
Power Supply Corsair HX1000i + Cablemod sleeved cables kit
Mouse Logitech G500s
Keyboard Logitech Ultra X Flat Premium
Software Windows 10 64-bit
Do I run Furmark 24/7? No.
Does it break if I do run Furmark without the limiter? No.
Does the limiter kick in during games, even with overvolting and overclocking? No.
Does it prevent someone from breaking the card if they don't know what they're doing with voltages? Quite possibly.
Is the card priced right compared to previous gens? Yes.
Fastest single GPU at least for the moment? Yes.
Does it run relatively quiet and at reasonable temps? Yes.
Do I need new card? Yes.

= Ordered GTX 580

Seriously, this bullshit whining about limiters in programs like Furmark is silly. It's not even a new thing; even AMD has done driver-level limiters. There is a huge total of 0 people to whom it is a problem, except in their heads, and it's yet another thing to bash NV about by people with no intention of ever even looking in the direction of their cards.

Oh, and just to be sure: I have had over 10 ATi and 10 NV cards in the past 2 years, go figure.

If the 580 isn't for you then please move along; I am sure the HD 69xx will come and deliver too. But stop this nonsense, please.

/End of rant
 

Stoogie

New Member
Joined
Nov 15, 2010
Messages
5
Likes
0
Location
Australia
Wtf at the 5970 scores in WoW?

Compare these two:

http://www.techpowerup.com/reviews/ATI/Radeon_HD_6870_CrossFire/18.html

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/20.html

The 5970 is in totally different places in these 2 tests, while the other GPUs are at the exact same fps.

Are we 100% sure that this site is trustworthy?

I looked into the 6870 CF performance in WoW, and the score seems to be half that of a single card. I believe this is a mistake on your end, TechPowerUp, when you benchmarked the 6870 cards.

Please give a logical explanation for the two entirely different results in the same benchmark.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
17,021
Likes
17,869
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
The 5970 is in totally different places in these 2 tests [reviews], while the other GPUs are at the exact same fps.

Are we 100% sure that this site is trustworthy?
Don't trust this site!! Read the test setup page before making accusations.
 
Joined
Dec 14, 2009
Messages
6,580
Likes
5,813
Location
Glasgow - home of formal profanity
System Name New Ho'Ryzen
Processor Ryzen 1700X @ 3.82Ghz
Motherboard Asus Crosshair VI Hero
Cooling TR Le Grand Macho & custom GPU loop
Memory 16Gb G.Skill 3200 RGB
Video Card(s) GTX1080ti (Heatkiller WB) @ 2Ghz core/1.5(12)Ghz mem
Storage Samsung 960 Pro M.2 512GB
Display(s) Dell Ultrasharp 27" (2560x1440)
Case Lian Li PC-V33WX
Audio Device(s) On Board
Power Supply Seasonic Prime TItanium 850
Software W10
Benchmark Scores Look, it's a Ryzen on air........ What's the point?
Can I swear?

BASTARDS!

Overcockers, sorry, OverclockersUK are price gouging for sure. They only have the ASUS board in stock and it's £459.99. They'll do this until the HD 6970 comes out, the same way the 6850 and 6870 prices increased almost immediately after launch.
 

Stoogie

New Member
Joined
Nov 15, 2010
Messages
5
Likes
0
Location
Australia
If so, can you re-benchmark WoW with the 10.10 drivers for 6870s in CF?

Edit: my initial post was copied from an overclockers site; maybe I should've removed the trust bit lol XD my bad
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
17,021
Likes
17,869
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
If so, can you re-benchmark WoW with the 10.10 drivers for 6870s in CF?

Edit: my initial post was copied from an overclockers site; maybe I should've removed the trust bit lol XD my bad
Just go by the 5970 numbers and the 5970 vs 6870 relative performance in other games.

And please go to that forum and tell them what's going on with the numbers, so there's no need to cry conspiracy.
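W1zzard's cross-check is simple arithmetic: take the 5970's WoW number from the consistent review and scale it by the 6870 CF vs 5970 ratio seen in the other games. The figures below are placeholders for illustration, not numbers from either review:

```python
def estimate_fps(reference_fps, relative_ratio):
    """Scale a known result (e.g. the 5970's WoW fps) by another card's
    average relative performance observed in other titles."""
    return reference_fps * relative_ratio

fps_5970_wow = 100.0        # hypothetical 5970 WoW result
ratio_6870cf_vs_5970 = 1.1  # hypothetical: 6870 CF ~10% faster elsewhere
print(estimate_fps(fps_5970_wow, ratio_6870cf_vs_5970))
```

If the measured CF number lands far below such an estimate (here it was reportedly about half of a single card), a driver or scaling problem in that one test is the more likely explanation than a problem with the rest of the review.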