
NVIDIA GeForce GTX 480 Fermi

TheMailMan78

Big Member
Joined
Jun 3, 2007
Messages
22,599 (3.66/day)
Location
'Merica. The Great SOUTH!
System Name TheMailbox 5.0 / The Mailbox 4.5
Processor RYZEN 1700X / Intel i7 2600k @ 4.2GHz
Motherboard Fatal1ty X370 Gaming K4 / Gigabyte Z77X-UP5 TH Intel LGA 1155
Cooling MasterLiquid PRO 280 / Scythe Katana 4
Memory ADATA RGB 16GB DDR4 2666 16-16-16-39 / G.SKILL Sniper Series 16GB DDR3 1866: 9-9-9-24
Video Card(s) MSI 1080 "Duke" with 8Gb of RAM. Boost Clock 1847 MHz / ASUS 780ti
Storage 256Gb M4 SSD / 128Gb Agility 4 SSD, 500Gb WD (7200)
Display(s) LG 29" Class 21:9 UltraWide® IPS LED Monitor 2560 x 1080 / Dell 27"
Case Cooler Master MASTERBOX 5t / Cooler Master 922 HAF
Audio Device(s) Realtek ALC1220 Audio Codec / SupremeFX X-Fi with Bose Companion 2 speakers.
Power Supply Seasonic FOCUS Plus Series SSR-750PX 750W Platinum / SeaSonic X Series X650 Gold
Mouse SteelSeries Sensei (RAW) / Logitech G5
Keyboard Razer BlackWidow / Logitech (Unknown)
Software Windows 10 Pro (64-bit)
Benchmark Scores Benching is for bitches.
I've been trying to find any info suggesting that what I'm about to lay out has changed, but I couldn't find any, so here we go:

First of all, I'm not saying that Fermi isn't too power hungry or hot, but it's definitely not as dramatic as many have claimed. Most claims are based on FurMark readings, especially those claiming that Nvidia is lying about TDP, and those readings are absolutely misleading when it comes to any cross-brand comparison. The reason is simple: ATI cards throttle down under FurMark to keep power and heat from going too high:

http://www.techpowerup.com/index.php?69799

Renaming FurMark will no longer help either, as AMD successfully "fixed" that "problem" as of Catalyst 9.8:

http://www.geeks3d.com/20090914/catalyst-9-8-and-9-9-improve-protection-against-furmark/

But that's not all! HD5xxx cards have hardware protection (throttling when a limit is exceeded) against stress tests like FurMark, and although that's a good thing for the product, since no game will stress the cards as much as FurMark does, the numbers are totally misleading: FurMark numbers don't represent the absolute maximum load the way they do on Nvidia cards.

http://www.geeks3d.com/20090925/ati...on-against-power-virus-like-furmark-and-occt/

That feature in the HD5xxx series is fantastic, don't get me wrong, but the fact remains that such protection invalidates any comparison made under FurMark load.
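Conceptually, the protection is just a feedback loop: sample power and temperature, step the core clock down when a reading crosses a limit, and step back up once it recovers. Here is a minimal sketch in Python; the limits and clock table are made up for illustration, since AMD's actual implementation is not public:

[CODE]
# Hypothetical sketch of driver/firmware-side throttling, NOT AMD's actual code.
# Idea: when a sampled reading exceeds a limit, drop the clock until it recovers.

POWER_LIMIT_W = 150.0                   # assumed board power limit
TEMP_LIMIT_C = 100.0                    # assumed thermal trip point
CLOCK_STEPS_MHZ = [750, 600, 450, 300]  # progressively lower core clocks

def next_step(power_w: float, temp_c: float, step: int) -> int:
    """Move one step down the clock table if a limit is exceeded, else recover."""
    if power_w > POWER_LIMIT_W or temp_c > TEMP_LIMIT_C:
        return min(step + 1, len(CLOCK_STEPS_MHZ) - 1)  # throttle down
    return max(step - 1, 0)                             # step back up

# Example: a FurMark-like load pushes readings over the limits, so the "driver"
# lowers the clock until the readings fall back under them.
step = 0
for power, temp in [(120, 80), (180, 95), (175, 102), (140, 90)]:
    step = next_step(power, temp, step)
    print(f"{power:>3} W, {temp:>3} C -> {CLOCK_STEPS_MHZ[step]} MHz")
[/CODE]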

HD5970 throttling back:

http://www.legionhardware.com/articles_pages/ati_radeon_hd_5970_overclocking_problems,4.html

Hmm, but Nvidia doesn't have that feature, so it will in fact run hot and go WAY past its listed power draw, which equals a lie. Plus, that's a 5970 you're talking about. Dual GPU? I'm afraid you're grasping at straws here, man.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Hmm, but Nvidia doesn't have that feature, so it will in fact run hot and go WAY past its listed power draw, which equals a lie. Plus, that's a 5970 you're talking about. Dual GPU? I'm afraid you're grasping at straws here, man.

It's not a lie, since only under FurMark will it go beyond the specified TDP. AMD has put in protection so that FurMark does not stress the GPU to those limits. If anyone is lying in that respect, it's AMD, because FurMark is not showing the real max consumption of their cards, while on Nvidia's it is. Except that it is not lying either, since a card will never reach those limits in any REAL application. The same goes for Nvidia cards: they will never reach those levels under gaming or CUDA apps or whatever you throw at them, as long as it's not a synthetic app specifically designed to stress the GPU that far. Any real application does much more than just stress the shader processors: the SPs do their work, but that work has to go somewhere, be processed there, and then move on elsewhere, etc. That's why AMD's raw flop numbers are totally meaningless for real apps: although the SPs can essentially work that hard, the data generated would never be able to get out and be useful. And that's what FurMark does: it stresses the shaders without the generated data ever needing to be useful or to leave the SPs.
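One way to put a number on that idea is arithmetic intensity: the math operations executed per byte of data that actually has to leave the shaders. A toy sketch, with all figures assumed purely for illustration:

[CODE]
# Toy illustration of arithmetic intensity (math ops per byte moved).
# A power-virus kernel keeps its results in registers, so its ops/byte is
# effectively unbounded; a real workload must read inputs and write outputs.

def arithmetic_intensity(flops: float, bytes_moved: float) -> float:
    """FLOPs executed per byte of memory traffic; higher = more ALU-bound."""
    if bytes_moved == 0:
        return float("inf")  # nothing leaves the ALUs - the FurMark situation
    return flops / bytes_moved

# Made-up game-like shader: 100 ops per pixel, reads two 4-byte texels,
# writes one 4-byte result.
game_like = arithmetic_intensity(flops=100, bytes_moved=2 * 4 + 4)

# FurMark-like loop: thousands of ops whose results never need to be stored.
virus_like = arithmetic_intensity(flops=10_000, bytes_moved=0)

print(f"game-like shader : {game_like:.1f} FLOPs/byte")
print(f"power-virus loop : {virus_like} FLOPs/byte")
[/CODE]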

Taking the above into account, and the links I posted, which cards are worse under FurMark before throttling kicks in? Well, both the HD4850 and the HD5970 went way above 100°C before throttling kicked in, in just 40 seconds of FurMark!!! God knows how far they could go after some minutes at full load. On the other hand, the GTX480 stays at around 95°C even with no throttling going on, so Nvidia took measures in the hardware itself to keep the card cool, while AMD used artificial measures. Both are what I would call legit measures, since both are rightly "assuming" that nobody will reach those limits under any real condition, and after several years of these cards being out there, it's obvious they are right. HD4850s have not died while gaming, right? Fermi won't either.

Now, if you have to run FurMark all day... yeah, you'd need a card that artificially cripples its own performance to keep from burning up inside your PC. Maybe Nvidia should release similar protection in their next drivers? Would you all be happy? Thing is, I doubt it. In fact, I bet that although AMD did it first AND is still doing it, if Nvidia did the same in their next drivers and Fermi's power consumption dropped, especially in FurMark, we would see many, many complaints about how Nvidia is cheating, because, well, it's Nvidia. Sad but true.
 
Joined
Mar 23, 2005
Messages
4,061 (0.58/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
Do you know why FurMark, along with other synthetic benchmarks, is one of the best ways to determine a GPU's power consumption? So far it's the only way to measure a card's true power consumption in a consistent manner, because the load is constant: it won't change or surprise you by varying the GPU's stress level the way real-world gaming does. That is the one thing that's great about synthetic benchmarks: they're the best way to measure a card's behaviour against previous-generation cards under identical load.

I believe the results speak for themselves ;)
While there are a few games in which the GTX 480 was faster, there are many resolutions in our test games where the HD 5870 comes out on top. Clearly, the GTX 480 is not the world's fastest single-GPU card.

3DMark06 Canyon Flight test, 1,280 x 1,024 0xAA 16xAF, Peak Temperature
Power Consumption (Idle and Gaming)

http://www.bit-tech.net/hardware/2010/03/27/nvidia-geforce-gtx-480-1-5gb-review/10

QUOTE:
We've found that synthetic benchmarks such as FurMark thrash the GPU constantly, which simply isn't reflective of how a GPU will be used when gaming.

It's such a hardcore test that any GPU under test is almost guaranteed to hit its thermal limit, the mark at which the card's firmware will kick in, speeding up the fan to keep the GPU within safe temperature limits.

As the test is so demanding and GPU limited, we've set 3DMark to run the test at 1,280 x 1,024 with 0xAA and 16xAF (enabled in the driver), constantly looping the test for thirty minutes and recording the maximum power consumption and GPU Delta T (the difference between the temperature of the GPU and the ambient temperature in our labs).
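Their method boils down to a sampling loop: run the fixed test on repeat, track the maxima, and report peak power plus GPU Delta T. A minimal sketch of that protocol; read_power_w() and read_gpu_temp_c() are hypothetical stand-ins for the wattmeter and the GPU sensor, not real APIs:

[CODE]
import time

AMBIENT_C = 22.0      # lab ambient temperature quoted in the review
DURATION_S = 30 * 60  # thirty minutes, as in the article

def record_peaks(read_power_w, read_gpu_temp_c, duration_s=DURATION_S):
    """Sample power and GPU temperature until duration_s elapses; keep maxima."""
    max_power, max_temp = 0.0, 0.0
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        max_power = max(max_power, read_power_w())
        max_temp = max(max_temp, read_gpu_temp_c())
        time.sleep(1.0)  # one sample per second is plenty for catching peaks
    return max_power, max_temp - AMBIENT_C  # (peak watts, GPU Delta T)

# With the GTX 480 numbers quoted below (382 W peak system draw, 94 C GPU
# temperature), this would return (382.0, 72.0).
[/CODE]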
 
Joined
Mar 23, 2005
Messages
4,061 (0.58/day)
Location
Ancient Greece, Acropolis (Time Lord)
System Name RiseZEN Gaming PC
Processor AMD Ryzen 7 5800X @ Auto
Motherboard Asus ROG Strix X570-E Gaming ATX Motherboard
Cooling Corsair H115i Elite Capellix AIO, 280mm Radiator, Dual RGB 140mm ML Series PWM Fans
Memory G.Skill TridentZ 64GB (4 x 16GB) DDR4 3200
Video Card(s) ASUS DUAL RX 6700 XT DUAL-RX6700XT-12G
Storage Corsair Force MP500 480GB M.2 & MP510 480GB M.2 - 2 x WD_BLACK 1TB SN850X NVMe 1TB
Display(s) ASUS ROG Strix 34” XG349C 180Hz 1440p + Asus ROG 27" MG278Q 144Hz WQHD 1440p
Case Corsair Obsidian Series 450D Gaming Case
Audio Device(s) SteelSeries 5Hv2 w/ Sound Blaster Z SE
Power Supply Corsair RM750x Power Supply
Mouse Razer Death-Adder + Viper 8K HZ Ambidextrous Gaming Mouse - Ergonomic Left Hand Edition
Keyboard Logitech G910 Orion Spectrum RGB Gaming Keyboard
Software Windows 11 Pro - 64-Bit Edition
Benchmark Scores I'm the Doctor, Doctor Who. The Definition of Gaming is PC Gaming...
There's no denying the facts: it's evident that the GTX 480 & 470 are hot, power hungry, and sound like a jet engine. And we've got 25+ reviews proving it, with different test methods. There's no point defending something that cannot be defended. All we can do right now is live with the results and wait for a possible Fermi refresh, if and when it gets released, but we won't see anything for another 6+ months IMO. Until then, everybody enjoy your nice cool-running HD 5800 series cards, O.K.?
Thermals and Power Consumption
Living with a card's thermal characteristics, power consumption and noise levels is just as important as its graphics horsepower, and it's here that the GeForce GTX 480 really runs into trouble.

Power consumption at idle was the highest we've seen from a single GPU card at 186W system power draw, 18W more than the HD 5870. At load, though, it entered a whole new dimension for a single GPU card, sucking down a massive 382W while looping the canyon flight demo in 3DMark 06. That's a full 106W more than the Radeon HD 5870 in the same test, 30W more than the dual GPU Radeon HD 5970 and only 6W less than the dual GPU GeForce GTX 295!

Sucking down all that power has clear consequences for the card's thermal output, and while the GTX 480 idles at a balmy 20°C above room temperature in our 22°C air-conditioned labs, with low and utterly un-intrusive fan noise to match, things change for the worse at full load. The GPU temperature rapidly rises to a heady 94°C, 72°C above the ambient room temperature, at which point the fan speeds up to whatever speed is necessary to keep the GPU from getting any hotter.
The result is a graphics card that runs extremely hot at full load, and coupled with its unique external heatsink it could easily be rebranded the GTX 480 Griddle Edition: the heatsink hit 67°C in our test rig (which, bear in mind, is a roomy Antec Twelve Hundred), hot enough to burn your skin. Nvidia recommends spacing the cards at least two expansion slots apart in an SLI configuration, and even then we suspect there will be a raft of watercooled editions of the GTX 480 to counteract the massive thermal demands of the GPU.

Adding insult to injury, the GTX 480 is also extremely noisy under load, easily matching the racket of the HD 5970 and comparable to a DVD-ROM drive at full speed while striving to keep the GPU at 94°C. The 65mm paddle fan was easily the loudest component in our Antec 1200 test chassis and was clearly audible from 6ft away through a closed side panel.
http://www.bit-tech.net/hardware/2010/03/27/nvidia-geforce-gtx-480-1-5gb-review/12
And I would have to agree 100%:
The higher price, the 100W of extra power consumption, scorchingly hot temperatures and a much noisier stock cooler are all extremely detrimental to its desirability. The HD 5870 remains a far better choice if you're a gamer; while we've yet to see how the GTX 480 performs with CUDA apps and Folding, at this stage Fermi looks like a flop. :D

GTX 480
Performance - 9/10
Features - 6/10
Value - 6/10
Overall - 6/10
 
Joined
May 4, 2009
Messages
1,970 (0.36/day)
Location
Bulgaria
System Name penguin
Processor R7 5700G
Motherboard Asrock B450M Pro4
Cooling Some CM tower cooler that will fit my case
Memory 4 x 8GB Kingston HyperX Fury 2666MHz
Video Card(s) IGP
Storage ADATA SU800 512GB
Display(s) 27' LG
Case Zalman
Audio Device(s) stock
Power Supply Seasonic SS-620GM
Software win10
@Benetanegia: Haha, well, let's see what I prefer... to have a fried GPU, or to be "cheated" and have my GPU throttle down to safe temps... hmm, what a hard decision. We've had throttling CPUs since the P4 days and nobody complained; I don't think anyone ever will, either, considering the consequences otherwise. I think Nvidia's solution is simpler, but just as effective.

About FurMark - yes, it is a power virus because, just as you said, it stresses the SPs beyond what they were meant to do, bypassing the rest of the GPU pipeline and overloading them with calculations - a load that isn't useful in any real-life situation. You wouldn't see that in any other GPU application.

And why do you insist on saying that ATI's GFLOPS numbers are wrong? I remember we had a similar discussion before. They aren't wrong; the only problem is that you'd need smart coding to get at the low-level hardware functionality. Remember, the SPs are in groups of 5 - 1 for complex calculations and 4 for simple calcs. But if you want to do double-precision calculations, you can gang the 4 simple ones together and simulate a second complex SP, effectively reducing the SP count to 640 - which is why the DP GFLOPS figure is only 2/5 of the SP GFLOPS number. The numbers stated by ATI are indeed achievable, but only with smart coding written specifically for their architecture.
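As a worked example of that arithmetic, here are the theoretical numbers for the HD 4870 (800 SPs at 750 MHz, figures that come up later in the thread; a multiply-add counts as 2 FLOPs), applying the 2/5 DP ratio argued above. The exact DP ratio differs between ATI architectures, so treat the second figure as the post's reasoning spelled out, not a datasheet value:

[CODE]
SP_COUNT = 800            # RV770 stream processors (160 groups of 5)
CORE_CLOCK_GHZ = 0.750
FLOPS_PER_ALU_CYCLE = 2   # one multiply-add = 2 floating-point ops

sp_gflops = SP_COUNT * FLOPS_PER_ALU_CYCLE * CORE_CLOCK_GHZ
print(f"peak single precision: {sp_gflops:.0f} GFLOPS")  # 1200 -> "1.2 TFlops"

# Per the post's argument: in each group of 5, the complex unit plus the four
# simple units ganged together behave like 2 DP units, i.e. 2/5 of peak.
dp_gflops = sp_gflops * 2 / 5
print(f"peak double precision: {dp_gflops:.0f} GFLOPS")  # 480
[/CODE]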

Edit: SuperXP, stop spamming your negative propaganda :p I remember the single-slot HD4850s and the original 4870 X2 also ran at 90+ degrees, and nobody complained as much...
 
Joined
Feb 12, 2007
Messages
1,192 (0.19/day)
Location
scotland
System Name spuds K8-X2
Processor amd athlon X2 4200+ toledo s939 2794mhz 254x11 1.4 vcore
Motherboard MSI K8N Neo4-F v1.0 (MS-7125) nforce4 sata2 mod, laptop cpu heatpipe copper nb cooler
Cooling akasa evo "blue" + 90mm fan, 2x120mm front, 250mm side, 120mm rear, 120mm in psu, pci slot exhaust.
Memory OCZ Platinum XTC DDR PC3200 4GB(4x1024) @254mhz 3-3-3-8 2T
Video Card(s) sapphire HD3870 512mb GDDR4 vf900cu, several ramsinks on components / nvidia 7300gt 256mb secondary
Storage hitachi 160gb (slightly fried) / hitachi 120gb ATA / Seagate 160gb / 2x ps3 seagate 60gb
Display(s) CTX EX1300F 20" flat CRT, 1280x1024@100hz / 19" benq FP91G X / 19" hanns-g (all free)
Case mesh server/gaming black case, 9x 5.25' drive bays, silvestone auto fan controller
Audio Device(s) onboard realtek alc850 7.1/soundblaster LIVE! ct4780 + kxaudio - sony home theatre surround
Power Supply winpower 650w, system draws around 470-500w under load(+all screens)
Software win7 64bit
Benchmark Scores ~16m trips/sec using mty trip generator. triple monitor gaming using SoftTH. 3840x1024
Sometimes not disengaging safety features is a good thing...


But that's not all! HD5xxx cards have hardware protection (throttling when a limit is exceeded) against stress tests like Furmark and although that's a good thing for the product, since no single game will stress the cards as much as Furmark, numbers are totally misleading. Furmark numbers don't represent absolute max load as they do on Nvidia cards.
 
Last edited:

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
@Benetanegia: Haha, well, let's see what I prefer... to have a fried GPU, or to be "cheated" and have my GPU throttle down to safe temps... hmm, what a hard decision. We've had throttling CPUs since the P4 days and nobody complained; I don't think anyone ever will, either, considering the consequences otherwise. I think Nvidia's solution is simpler, but just as effective.

I have no option but to wonder why you have to take everything personally... I talk about ATI fanboys in one post and you reply with what you think. I talk about how people would complain and you reply with what you'd prefer. I'm thinking of an F word, and it's not f--k.

And why do you insist on saying that ATI's GFLOPS numbers are wrong? I remember we had a similar discussion before. They aren't wrong; the only problem is that you'd need smart coding to get at the low-level hardware functionality. Remember, the SPs are in groups of 5 - 1 for complex calculations and 4 for simple calcs. But if you want to do double-precision calculations, you can gang the 4 simple ones together and simulate a second complex SP, effectively reducing the SP count to 640 - which is why the DP GFLOPS figure is only 2/5 of the SP GFLOPS number. The numbers stated by ATI are indeed achievable, but only with smart coding written specifically for their architecture.

I never said they were wrong. They are not achievable in any real application; not even AMD's internal apps achieve anything beyond 75% or so, and that's in very, very specific apps. Why do I insist? I wasn't dwelling on the matter; I mentioned it because normal usage is way below the raw "potential", and that's why under normal usage an HD4850 would not go much higher than 90°C, but in FurMark, where artificial stressing pushes the usage close to its potential, well, we don't know how high it could reach; all we know is that it would pass 105°C and get fried. The reason is simple, and it's what I was getting at when I mentioned it. Typical AMD shader usage is around 40%, which is around 7% higher than the usage found under SGEMM. That's real usage. In FurMark it probably reaches something close to 100%; tbh I have no idea, and probably nobody knows the exact number or even an approximation except AMD. From 33-40% to 100% is a long way, though, enough to put temps through the roof.

As for the performance side of things, if it can't be achieved in normal scenarios, it can't be achieved, period. Sure, you can create an app that uses the 4 simple ALUs and the 1 complex one and bla bla bla, but that's not an application; that's a benchmark, a demo, a showcase. No real application (not even games, transcoding, SGEMM...) will come close to being able to do that; real apps need what they need at the exact moment they need it, and AMD's architecture simply isn't suited for that. Period. You can argue as much as you want.
 
Joined
May 4, 2009
Messages
1,970 (0.36/day)
Location
Bulgaria
System Name penguin
Processor R7 5700G
Motherboard Asrock B450M Pro4
Cooling Some CM tower cooler that will fit my case
Memory 4 x 8GB Kingston HyperX Fury 2666MHz
Video Card(s) IGP
Storage ADATA SU800 512GB
Display(s) 27' LG
Case Zalman
Audio Device(s) stock
Power Supply Seasonic SS-620GM
Software win10
Uhm, I was actually trying to agree with you in the above post. I was basically saying that I don't care exactly how they prevent a GPU from frying - be it a throttling function or a powerful fan - as long as my expensive GPU doesn't turn into an expensive paperweight :eek:

I'm no mathematician nor a coder, so I don't really know how hard it is to write code utilizing all that hardware, but I'll tell you one thing - there are many people much smarter than you and I who can, and who will if given enough incentive to do so...
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
I'm no mathematician nor a coder, so I don't really know how hard it is to write code utilizing all that hardware, but I'll tell you one thing - there are many people much smarter than you and I who can, and who will if given enough incentive to do so...

You don't get what I'm trying to say, but it's probably my fault; it's usually difficult for me to explain such complicated things in a foreign language. It's not just whether you can use all the shaders; it's that using all those shaders doesn't always deliver a true benefit. Take games, for example: average shader (ALU) usage has been established (Beyond3D, devnet...) to be around 3.8 out of 5 on AMD SPs, which is 76%, but even that number is not exact or true by any means. Let me explain: of course 76% of the shaders are working, but not all of them are producing genuine results; many of them are duplicating work (I couldn't find a better word than genuine).

This becomes obvious as soon as you realize that 76% of 1.2 TFlops (HD4870) is 912 GFlops, way more than the theoretical 708 GFlops of a GTX285 or 536 GFlops of a GTX260, and those don't have 100% efficiency either, not at all. Basically, the HD4870 is calculating twice as much for the same task; otherwise, if every flop operation were genuine, that would mean ~900 GFlops is required for that level of performance, and the GTX cards would be seriously bottlenecked by their shaders.

What most probably happens is that, like I said, the AMD card is duplicating many of the calculations, and it makes sense if you think about it: when you have many spare ALUs and your bandwidth is more limited, it doesn't make sense to store some results in VRAM, even if you know you will need them later, because you know you will have spare ALUs too, so you just recalculate things (most things) as they come. Nvidia, on the other hand, prefers efficiency over throughput, so they store the output, and as a result they need better caches and interconnects. Like I have always said: two different ways of achieving the same thing.
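The arithmetic behind that paragraph, spelled out with the figures quoted above (the utilization-scaling helper is just for illustration):

[CODE]
def effective_gflops(peak_gflops: float, alu_utilization: float) -> float:
    """Peak throughput scaled by how busy the ALUs actually are."""
    return peak_gflops * alu_utilization

hd4870_busy = effective_gflops(peak_gflops=1200, alu_utilization=0.76)
gtx285_peak = 708  # theoretical figure quoted above (MAD issue only)

print(f"HD 4870 at 76% ALU usage: {hd4870_busy:.0f} GFLOPS")  # ~912
print(f"GTX 285 theoretical peak: {gtx285_peak} GFLOPS")

# 912 > 708 even before counting Nvidia's own inefficiencies. If every one of
# those HD 4870 FLOPs were unique, useful work, the GeForce cards would be
# hopelessly shader-bound; since real performance is similar, many of the
# Radeon's FLOPs must be duplicated work.
[/CODE]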
 
Joined
Sep 1, 2009
Messages
1,183 (0.22/day)
Location
CO
System Name 4k
Processor AMD 5800x3D
Motherboard MSI MAG b550m Mortar Wifi
Cooling Corsair H100i
Memory 4x8Gb Crucial Ballistix 3600 CL16 bl8g36c16u4b.m8fe1
Video Card(s) Nvidia Reference 3080Ti
Storage ADATA XPG SX8200 Pro 1TB
Display(s) LG 48" C1
Case CORSAIR Carbide AIR 240 Micro-ATX
Audio Device(s) Asus Xonar STX
Power Supply EVGA SuperNOVA 650W
Software Microsoft Windows10 Pro x64

shevanel

New Member
Joined
Jul 27, 2009
Messages
3,464 (0.64/day)
Location
Leesburg, FL
We relayed this information on to NVIDIA and they informed us that our dual monitor idle temp problem will be solved by another new VBIOS that will be released this week that will ramp up the fan speed starting in the 70s instead of the 80s.

Yeah, wonderful solution... ramp that fan up, boys!
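For what it's worth, the fix described above amounts to sliding the fan curve's ramp-up point down by about 10°C. A minimal sketch with made-up curve points (the real curve lives in the VBIOS and isn't public):

[CODE]
OLD_RAMP_START_C = 80.0  # fan begins ramping "in the 80s" on the old VBIOS
NEW_RAMP_START_C = 70.0  # "in the 70s" on the new one
MAX_FAN_AT_C = 95.0      # assumed temperature where the fan hits 100%
IDLE_DUTY = 0.40         # assumed idle fan speed (40%)

def fan_duty(temp_c: float, ramp_start_c: float) -> float:
    """Linear fan curve: idle duty below ramp_start_c, 100% at MAX_FAN_AT_C."""
    if temp_c <= ramp_start_c:
        return IDLE_DUTY
    frac = min(1.0, (temp_c - ramp_start_c) / (MAX_FAN_AT_C - ramp_start_c))
    return IDLE_DUTY + (1.0 - IDLE_DUTY) * frac

for temp in (65, 75, 85, 94):
    print(f"{temp} C: old {fan_duty(temp, OLD_RAMP_START_C):.0%}, "
          f"new {fan_duty(temp, NEW_RAMP_START_C):.0%}")
[/CODE]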
 
Joined
May 15, 2005
Messages
3,516 (0.51/day)
System Name Red Matter 2
Processor Ryzen 5600X
Motherboard X470 Gaming Pro Carbon
Cooling Water is Masterliquid 240 Pro
Memory GeiL EVO X 3600mhz 32g also G.Skill Ripjaw series 5 4x8 3600mhz as backup lol
Video Card(s) Gigabyte Gaming Radeon RX 6800
Storage EVO 860. Rocket Q M.2 SSD WD Blue M.2 SSD Seagate Firecuda 2tb storage.
Display(s) ASUS ROG Swift PG32VQ
Case Phantek P400 Glass
Audio Device(s) EVGA NU Audio
Power Supply EVGA G3 850
Mouse Roccat Military/ Razer Deathadder V2
Keyboard Razer Chroma
Software W10
Thanks, kids. Thanks for pissing off the admin so much that he's leaving TPU. Hope you're all proud of yourselves.
 
Joined
Dec 27, 2007
Messages
8,519 (1.43/day)
Location
Kansas City
System Name The Dove Box Rev 3.0
Processor i7 8700k @ 4.7GHz
Motherboard Asus Maximus X APEX
Cooling Custom water loop
Memory 16GB 3600 MHz DDR4
Video Card(s) 2x MSI 780 Ti's in SLI
Storage 500GB Samsung 850 PCIe SSD, 4TB
Display(s) 27" Asus 144Hz
Case Enermax Fulmo GT
Audio Device(s) ON BOARD FTW
Power Supply Corsair 1200W
Keyboard Logitech G510
Software Win 10 64x
No shit, way to go. Bash on the reviewer and look at what happens.
 
Last edited by a moderator:

trickson

OH, I have such a headache
Joined
Dec 5, 2004
Messages
7,595 (1.07/day)
Location
Planet Earth.
System Name Ryzen TUF.
Processor AMD Ryzen7 3700X
Motherboard Asus TUF X570 Gaming Plus
Cooling Noctua
Memory Gskill RipJaws 3466MHz
Video Card(s) Asus TUF 1650 Super Clocked.
Storage CB 1T M.2 Drive.
Display(s) 73" Soney 4K.
Case Antech LanAir Pro.
Audio Device(s) Denon AVR-S750H
Power Supply Corsair TX750
Mouse Optical
Keyboard K120 Logitech
Software Windows 10 64 bit Home OEM
Great review.
I was thinking of getting one; now I just may :D
 

freaksavior

To infinity ... and beyond!
Joined
Dec 11, 2006
Messages
8,095 (1.28/day)
System Name ZeroUptime | M.A.S.S / MM1
Processor Xeon 2659 v3 / Xeon 2683 v4 / ARM A14
Motherboard Asus X99-E-10G WS / ASRock x99 usb 3.1 / Apple
Cooling NZXT Kraken / Noctua NH-L12 / Apple
Memory 16Gb DDR4 / 32Gb DDR4 / 16GB HBLM
Video Card(s) Powercooler ATI vega 64 / GT 7300 / ARM
Storage Samsung 970 512 Evo NVMe / A lot. / 256 + 512 External TB3
Display(s) Acer Predator X34 / Headless / Acer X34 Non predator
Case NZXT H630 |Rosewill 8bay 4u server chasiss / MMM1
Audio Device(s) Onboard / Onboard / Onboard
Power Supply Corais HX850 | Corsair TX750 / Internal 250w
Mouse g502 proteus core / Headless / g502 proteus core
Keyboard Corsair K95 Cherry Blue / Headless / K65 Cherry Red
Software Windows 10 / ESXI / Big Sur 11.2.2
Honestly, after seeing the review, I have no idea which card I want my girlfriend to buy me.

I really like the non-reference-cooler 5870s, but it looks like the GTX 480 does just about as well, and we all know how the driver game goes.

BTW, thanks W1zzard :D
 
Joined
Feb 19, 2007
Messages
12,453 (1.99/day)
Location
Yankee lost in the Mountains of East TN
Processor 5800x(2)/5700g/5600x/5600g/2700x/1700x/1700
Motherboard MSI B550 Carbon (2)/ MSI z490 Unify/Asus Strix B550-F/MSI B450 Tomahawk (3)
Cooling EK AIO 360 (2)/EK AIO 240, Arctic Cooling Freezer II 280/EVGA CLC 280/Noctua D15/Cryorig M9(2)
Memory 32 GB Ballistix Elite/32 GB TridentZ/16GB Mushkin Redline Black/16 GB Dominator
Video Card(s) Asus Strix RTX3060/EVGA 970(2)/Asus 750 ti/Old Quadros
Storage Samsung 970 EVO M.2 NVMe 500GB/WD Black M.2 NVMe 500GB/Adata 500gb NVMe
Display(s) Acer 1080p 22"/ (3) Samsung 22" 1080p
Case (2) Lian Li Lancool II Mesh/Corsair 4000D /Phanteks Eclipse 500a/Be Quiet Pure Base 500/Bones of HAF
Power Supply EVGA Supernova 850G(2)/EVGA Supernova GT 650w/Phantek Amps 750w/Seasonic Focus 750w
Mouse Generic Black wireless (5)
Keyboard Generic Black wireless (5)
Software Win 10/Ubuntu
Stop the pointless arguing. If the negativity continues, I'll be handing out major custom infractions.
 
Last edited:
Joined
May 4, 2009
Messages
1,970 (0.36/day)
Location
Bulgaria
System Name penguin
Processor R7 5700G
Motherboard Asrock B450M Pro4
Cooling Some CM tower cooler that will fit my case
Memory 4 x 8GB Kingston HyperX Fury 2666MHz
Video Card(s) IGP
Storage ADATA SU800 512GB
Display(s) 27' LG
Case Zalman
Audio Device(s) stock
Power Supply Seasonic SS-620GM
Software win10
You don't get what I'm trying to say, but it's probably my fault; it's usually difficult for me to explain such complicated things in a foreign language. It's not just whether you can use all the shaders; it's that using all those shaders doesn't always deliver a true benefit. Take games, for example: average shader (ALU) usage has been established (Beyond3D, devnet...) to be around 3.8 out of 5 on AMD SPs, which is 76%, but even that number is not exact or true by any means. Let me explain: of course 76% of the shaders are working, but not all of them are producing genuine results; many of them are duplicating work (I couldn't find a better word than genuine). This becomes obvious as soon as you realize that 76% of 1.2 TFlops (HD4870) is 912 GFlops, way more than the theoretical 708 GFlops of a GTX285 or 536 GFlops of a GTX260, and those don't have 100% efficiency either, not at all. Basically, the HD4870 is calculating twice as much for the same task; otherwise, if every flop operation were genuine, that would mean ~900 GFlops is required for that level of performance, and the GTX cards would be seriously bottlenecked by their shaders. What most probably happens is that, like I said, the AMD card is duplicating many of the calculations, and it makes sense if you think about it: when you have many spare ALUs and your bandwidth is more limited, it doesn't make sense to store some results in VRAM, even if you know you will need them later, because you know you will have spare ALUs too, so you just recalculate things (most things) as they come. Nvidia, on the other hand, prefers efficiency over throughput, so they store the output, and as a result they need better caches and interconnects. Like I have always said: two different ways of achieving the same thing.

OK, now I see what you mean. So basically, if I got what you're saying, the built-in scheduler sucks and does some of the calculations multiple times, thus wasting SP cycles?
 

Andy77

New Member
Joined
May 7, 2009
Messages
119 (0.02/day)
Hm... long story? I hope it was not because of the 9.12!

When I first saw the review I was like "9.12? WTF?!?!" But then I thought, well, he has lots of cards and lots of tests, and they go way back, so there was no reason to bother him about it, knowing there would be kids who'd do just that, and I hoped that somehow 10.3 would be released in time... and it was! :) Now I need one for the 5970... I'll probably find something out there in time.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit

Fitseries3

Eleet Hardware Junkie
Joined
Oct 6, 2007
Messages
15,508 (2.57/day)
Location
Republic of Texas
Am i the only person who gets this?

the march 26th "event" was another conundrum** to help nvidia further delay the REAL release of the retail version of the 4 series cards so that nvidia could buy more time to fix the issues at hand.

the "review" samples that were given out were known to be faulty in several ways but it gave everyone something to talk about to shut the fuck up about "when is fermi comming out? its 6months late"

yes, maybe it looks "bad" that the cards are hot and draw a ton of power, but it alleviates one problem and starts another.

the big thing i see here is... NO ONE CAN EVER BE HAPPY ABOUT A DAMN THING.

if the gtx480 was 30x faster than the 5970 you would still bitch 'cause the price is too high. but why is the price so high? because it's bleeding-edge technology, and that's the price you pay.

i notice a lot of you guys bitching about "well you shoulda used the 10.x driver for ati... it's better". yes... perhaps it is, but why is that? because ati has had time to fix and optimize their drivers for better performance. has nvidia had time to do that? NO. does it cross your mind that perhaps the older driver was used so that both ati's and nvidia's offerings could be compared as they were released?

if you are comparing 2 brand-new cars off the showroom floor, would it be "fair" to let company A fix a bunch of their problems before the comparison while company B is forced to be judged on what they brought to the plate as it stands? NO.

these reviews are done with immature cards and immature drivers. why do you expect so much from them?

perhaps i'm being an asshole here, but i just want to remind you that you should take these early reviews with a grain of salt.

if you think you can do a better review, then do so yourself. oh wait... you can't... you don't have any gtx480s or gtx470s.

give the man some respect.




**A conundrum is a logical postulation that evades resolution; an intricate and difficult problem.
 

mdsx1950

New Member
Joined
Nov 21, 2009
Messages
2,064 (0.39/day)
Location
In a gaming world :D
System Name Knight-X
Processor Intel Core i7-980X [4.2Ghz]
Motherboard ASUS P6T7 WS
Cooling CORSAIR Hydro H70 + 2x CM 2000RPM 120mm LED Fans
Memory Corsair DOMINATOR-GT 12GB (6 x 2GB) DDR3 2000MHz
Video Card(s) AMD Radeon HD6970 2GB CrossFire [PhysX - EVGA GTX260(216SP) 896MB]
Storage 2x Corsair Perf. 512GB | OCZ Colossus 1TB | 2xOCZ Agilities 120GB SSDs
Display(s) Dell UltraSharp™ 3008WFP [2560x1600]
Case Cooler Master HAF-X + NZXT Temp LCD
Audio Device(s) ASUS Xonar D2X | ONKYO HTS9100THX 7.1
Power Supply Silverstone ST1500 1500W [80 PLUS Silver]
Software Windows 7 Ultimate X64 (Build 7600)
Benchmark Scores Heaven 2.0 with Extreme Tessellation at 1080p - 96FPS

DOM

Joined
May 30, 2006
Messages
7,628 (1.17/day)
Location
TX, USA
Processor Intel i7 4770K
Motherboard Asrock
Cooling Water
Memory Team Xtreem LV 16GB (2x8GB)
Video Card(s) EK Full WB HD7970
Display(s) CROSSOVER 27Q LED-P 27"
Case Danger Den Torture Rack
Audio Device(s) Onboard
Power Supply CORSAIR Professional Series Gold AX1200
Software W 10 Pro
Am i the only person who gets this?

the march 26th "event" was another conundrum** to help nvidia further delay the REAL release of the retail version of the 4 series cards so that nvidia could buy more time to fix the issues at hand.

the "review" samples that were given out were known to be faulty in several ways but it gave everyone something to talk about to shut the fuck up about "when is fermi comming out? its 6months late"

yes, maybe it looks "bad" that the cards are hot and draw a ton of power, but it alleviates one problem and starts another.

the big thing i see here is... NO ONE CAN EVER BE HAPPY ABOUT A DAMN THING.

if the gtx480 was 30x faster than the 5970 you would still bitch 'cause the price is too high. but why is the price so high? because it's bleeding-edge technology, and that's the price you pay.

i notice a lot of you guys bitching about "well you shoulda used the 10.x driver for ati... it's better". yes... perhaps it is, but why is that? because ati has had time to fix and optimize their drivers for better performance. has nvidia had time to do that? NO. does it cross your mind that perhaps the older driver was used so that both ati's and nvidia's offerings could be compared as they were released?

if you are comparing 2 brand-new cars off the showroom floor, would it be "fair" to let company A fix a bunch of their problems before the comparison while company B is forced to be judged on what they brought to the plate as it stands? NO.

these reviews are done with immature cards and immature drivers. why do you expect so much from them?

perhaps i'm being an asshole here, but i just want to remind you that you should take these early reviews with a grain of salt.

if you think you can do a better review, then do so yourself. oh wait... you can't... you don't have any gtx480s or gtx470s.

give the man some respect.




**A conundrum is a logical postulation that evades resolution; an intricate and difficult problem.

Fit, don't waste your time; some people will never change, and that's a fact. Look at the world we live in today: people bitch about everything.

And he does retest every time he does a new review, so IDK what all the crying was about. If I had the money I'd have two of every card to play with, but I don't :ohwell:
 
Joined
May 19, 2007
Messages
7,662 (1.24/day)
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5