
ASUS Radeon R9 Fury STRIX 4 GB

Joined
Apr 25, 2013
Messages
127 (0.05/day)
I intentionally mentioned the reference Sapphire card: it's a better comparison for the Strix's power-phase efficiency. At 1000 MHz, the Sapphire reference draws more power than the Strix does at 1020(?) MHz. That's what a good power phase is for. It's clear you don't understand the point of a good power circuit.
A good power circuit = more consistent voltage. Yes, it could lead to better power consumption, but not by much.
But you didn't read my post, right?
"The higher power limit allows the card to stay at its boost clock in more circumstances, hence the higher power consumption."
The Sapphire has a higher stock voltage and power limit. That's why it draws more power even at a lower clock.
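As a rough illustration of why a higher stock voltage can outweigh a small clock deficit, dynamic power scales roughly with frequency times voltage squared. A minimal sketch, with made-up clocks, voltages, and the scaling constant `k` (none of these are measured values from the review):

```python
# Toy model: dynamic power scales roughly with frequency * voltage^2.
# All numbers are hypothetical, chosen only to illustrate the argument.
def dynamic_power_watts(freq_mhz, voltage, k=0.185):
    """Very rough dynamic-power estimate: P ~ k * f * V^2."""
    return k * freq_mhz * voltage ** 2

strix = dynamic_power_watts(1020, 1.15)     # higher clock, lower voltage
sapphire = dynamic_power_watts(1000, 1.25)  # lower clock, higher voltage

print(round(strix, 1))     # 249.6
print(round(sapphire, 1))  # 289.1
# The higher stock voltage outweighs the 20 MHz clock deficit.
```

The voltage term is squared, so even a modest voltage bump costs more than a small clock increase saves.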
 
Joined
Apr 18, 2013
Messages
451 (0.19/day)
Is it not a tad ignorant to claim "awful drivers" if you've never used the product?
What makes you even think they're awful?
I've been using my HD6950 for years now and have had no driver issues of any kind
(unlike my previous Nvidia 8800 GTS (G92), and the 7900 GTO before it, where settings would reset, the control panel crashed on launch, and I had to download extra software to be able to tweak things; personally I always found Nvidia's software a lot more crude and unsophisticated than CCC, an opinion born of experience).
It's strange that people who don't know you speak as if they've known you your entire life. It's even stranger that they try to humiliate you in the process.

Over the past 12 months my friends (who do in fact own AMD GPUs) have had the following problems (some of them still unresolved):

1) No fan speed management; the fan runs at full speed all the time.
2) Unable to uninstall AMD drivers (also see below).
3) Windows BSOD'ing after ostensibly uninstalling the drivers (in fact they didn't uninstall cleanly, hence the error; I could only uninstall them in safe mode).
4) Firefox running painfully slowly because of the driver's broken Direct2D acceleration.
5) CCC requiring the .NET Framework, which taxes your CPU/HDD.
6) CCC weighing God knows how much.

I don't even want to touch AMD's brain-damaged website. Meanwhile, practically anyone can use NVIDIA's.

I've only had problems with leaked NVIDIA drivers, over ten years ago (NVIDIA has since fixed the leak).
 

nem

Joined
Oct 22, 2013
Messages
165 (0.07/day)
Location
Cyberdyne CPU Sky Net
You could always post a welcome and honest apology to @W1zzard for being hideously quick to judge and criticising him. He's probably one of the most flexible and thorough reviewers out there.
yeah i quite agree, lets see like arguin the reviews this point ? this is unsense we claim new reviews to FURY X and FURY , its seem like this some kind of favor to the GeForce... ¬¬
 
Joined
Dec 14, 2009
Messages
7,400 (2.04/day)
Location
Glasgow - home of formal profanity
System Name Newer Ho'Ryzen
Processor Ryzen 3700X
Motherboard Asus Crosshair VI Hero
Cooling TR Le Grand Macho
Memory 16Gb G.Skill 3200 RGB
Video Card(s) RTX 2080ti MSI Duke @2Ghz ish
Storage Samsung 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Lian Li PC-V33WX
Audio Device(s) On Board
Power Supply Seasonic Prime TItanium 850
Software W10
Benchmark Scores Look, it's a Ryzen on air........ What's the point?
A good power circuit = more consistent voltage. Yes, it could lead to better power consumption, but not by much.
But you didn't read my post, right?
"The higher power limit allows the card to stay at its boost clock in more circumstances, hence the higher power consumption."
The Sapphire has a higher stock voltage and power limit. That's why it draws more power even at a lower clock.
I do see what you're saying, but my point was that better power-phase circuitry doesn't mean the card is built for higher clocks and better performance. The fact that the Sapphire card has the reference PCB and is clocked higher kind of proves my point. Yes, if you want to overclock you'll want good power circuitry; I entirely understand that. But in this situation the extra phases deliver more stable current, which gives better power efficiency. Then again, I once owned a card with very basic power circuitry that, under water and with a soft volt-mod, gave me around 1300 MHz, and that was two years ago. Moar phases don't create the beast; the chip does.

yeah , i quite agree, lets see like arguin the reviews this point ? this is unsense we claim new reviews to FURY X and FURY , in seem like this some kind of favor to side of nVIDIA really! ¬¬
Umm, what? Your post is utter "unsense".
 
Joined
May 13, 2008
Messages
538 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
So, end of the day, what did we learn from the Fiji launches?

IMHO, it justified some of the viewpoints I've long held about AMD's architecture...do you guys agree or do you see things differently?

1. AMD really needs clock speeds in the ~1400-1500 MHz (capable) range for an extremely compelling part, given the properties of today's games.

2. AMD needs to get their CU/ROP ratio under control. 1 CU : 1 ROP is not optimal; it's closer to something like 16 ROPs : (14/)15 CUs, or 24 ROPs : 22 CUs. While compute is great, there comes a point where it's a liability. This is something nvidia learned going from Kepler to Maxwell.

3. AMD does not seem to benefit from HDL (high-density libraries), or whatever else (outsourcing? loss of key engineers?) has crept into their designs since Hawaii. Whatever process they use (HPM?) likely saves them space and may in theory allow higher clocks at lower voltage, which could be crucial for Fiji's design in one way (it's the largest it can be at 28nm) or another (Nano might be compelling at something like 850-900 MHz core/400 MHz HBM at 0.9 V core/1 V memory versus the 970/980). Still, the voltage required for decent clock speed/performance isn't great, nor is the overall scaling. While the newer 390(X) parts do better than the initial 290-series run, they clock worse per volt than any of the original 7000 series by a decent margin (10%?). Also, even allowing that Maxwell has a 20-25% deeper pipeline (or whatever changed so that its clock-speed scaling now resembles ARM's A57; I always assumed a design aimed presumptively at 20nm), nvidia's scaling has stayed consistent from Kepler to Maxwell. What is going on with AMD's clock-speed problems?

End of the day, I think AMD's arch could be a good one, give or take a few tweaks and changes in philosophy; GCN needs to be rebalanced. While I certainly have no idea what design rules apply (i.e., whether 16 CUs take up just as much space as 15 within a shader engine given the overall chip parameters, or whether AMD can reconfigure such an engine for 12-24 ROPs, etc.), something needs to be done in that regard for efficiency. By the same token, changes in process/design need to be applied to let the chips' clock speeds scale, even at the cost of chip size (remember the decap ring in RV790?), or they are flat-out doomed. While there is always an argument for adding more units and lowering clock speed/voltage, especially as we move toward more mobile-oriented (low-voltage) processes, ATi has always been at the top of its game when using fewer units with greater clock-speed potential than the competition; it saves space/cost (and, on former processes, stock power consumption) while also making the card the 'overclocker's choice'. This philosophy has surely helped nvidia succeed as well. Look at how many comments in this review alone note that nvidia's arch, even at a stock 1000-<1300 MHz and using fewer units (say 2080-2560 sp+sfu in GM204, or 3360-3840 in GM200), can consistently overclock super high (even if sometimes drawing lots of power): ~1500 MHz.
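The CU/ROP balance argued for above can be made concrete with a quick ratio check; the "suggested" ratios are this post's estimates, not official figures, and Fiji's 64 CU / 64 ROP configuration is the full Fury X chip:

```python
# Compare CU-per-ROP ratios: Fiji's 1:1 versus the ratios suggested above.
def cu_per_rop(cus, rops):
    return cus / rops

fiji = cu_per_rop(64, 64)         # Fiji (Fury X): 64 CUs, 64 ROPs
suggested_a = cu_per_rop(15, 16)  # the post's ~15 CUs : 16 ROPs estimate
suggested_b = cu_per_rop(22, 24)  # the post's ~22 CUs : 24 ROPs estimate

print(fiji)                   # 1.0
print(round(suggested_a, 3))  # 0.938
print(round(suggested_b, 3))  # 0.917
```

In other words, the argument is that the sweet spot is slightly fewer CUs per ROP than Fiji's 1.0.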


This, in part, is what scares me about 14LPP/16nmFF+ with respect to AMD. Samsung's 14nm is seemingly smaller and cheaper (~10%), but likely offset by lower clock potential versus TSMC and 16nmFF+. While I could very much see AMD dropping a Fiji shrink under 225 W (typical first parts from AMD on a new process are around 188 W, half of 375 W) that capitalizes completely on die savings and extra perf/V, perhaps even dropping two such small chips (and 2x HBM2) on a single interposer for a crazy-awesome part, the compelling nature of such a chip, let's say (for argument's sake) Fiji at 1400 MHz/625 MHz with 8 GB, only goes so far. I seriously fear (for competition's sake) that nvidia will use 14/16nm to both increase floating point (and/or decrease dedicated special-function hardware) to create a part similar to AMD's while maximizing clock speed.

Say, for instance, 224-240 sp (+/- 32 sfu) in an SMP (shader module, Pascal), up from 128 sp (+32 sfu) in Maxwell or 192 (+32 sfu) in Kepler. In a 16-module design (e.g., GP104, something replacing GM200 for the slightly lower-end performance market), the result is something like 3584 (4096) or 3840 sp, similar to or more efficient than Fiji. While nvidia may not capitalize completely on die savings (as AMD surely will) nor on absolute power consumption (I could see them doing another '980' made to draw >225 W), we could conceivably see something that continues to follow the clock-scaling path of ARM processors (which on 14/16nm are planned for 2 GHz+). While certainly rumors, this theory is backed by the voices in the wind saying nvidia ran back to TSMC for its next designs after Samsung's (/GF's) yields were terrible. For AMD, that has to be an incredibly scary thought. I very much doubt they want a 970-vs-290X rematch (but imagine the 970 wasn't gimped and the 290X drew less power).
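The speculative module arithmetic above is just multiplication; a sketch, where every configuration is this post's guesswork rather than an announced spec:

```python
# Speculative shader totals for a 16-module design, per the figures above.
def total_shaders(modules, sp_per_module):
    return modules * sp_per_module

print(total_shaders(16, 224))  # 3584
print(total_shaders(16, 240))  # 3840
print(total_shaders(16, 128))  # 2048, a Maxwell-sized module for comparison
```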

TLDR: I've always appreciated AMD's strengths in engineering, design choices, and pushing technology forward....but something needs to change. I surely hope it does by next generation.
 

nem

Joined
Oct 22, 2013
Messages
165 (0.07/day)
Location
Cyberdyne CPU Sky Net
meanwhile the review of GeForces are maked with the last drives ... ¬¬

#W1zzard pls !



 
Joined
Dec 16, 2012
Messages
3,439 (1.36/day)
Location
Jyväskylä, Finland
System Name Classified
Processor AMD Ryzen 5 2600
Motherboard Asus TUF B450 Plus Gaming
Cooling Custom loop by Alphacool
Memory G.Skill Value 16GB DDR4-2400
Video Card(s) Asus Radeon R9 290 Direct Cu II OC
Storage 2x256GB, 240GB & 480GB SSDs, 500GB & 2TB HDDs
Display(s) 2x 1920x1080 (23" & 22")
Case Corsair Carbide Air 740
Audio Device(s) Sound Blaster Z
Power Supply Seasonic Focus+ Gold 750W
Mouse Logitech G400s
Keyboard Dell keyboard
Software Windows 10 Pro
Benchmark Scores 19545 in Fire Strike with 290 Crossfire OC
No VGA signal should be a thumbs up. It's 2015 already; who the hell would connect a VGA device to these?

I was already stunned when the Palit/Gainward custom HD4870X2 had a physical VGA connector, and that was some 7 years ago.
 
Joined
Sep 7, 2011
Messages
2,785 (0.93/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
yeah i quite agree, lets see like arguin the reviews this point ? this is unsense we claim new reviews to FURY X and FURY , its seem like this some kind of favor to the GeForce... ¬¬
meanwhile the review of GeForces are maked with the last drives ... ¬¬
If your location truly is Cyberdyne, the Human race is under no threat whatsoever...unless you're planning on subjugating us using broken syntax.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
26,547 (5.20/day)
Location
Indiana, USA
Processor Intel Core i7 9900K@5.0GHz
Motherboard AsRock Z370 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB Corsair DDR4-3000
Video Card(s) ASUS Strix GTX 1080Ti
Storage 500GB Crucial MX500 + 8TB with 1TB SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
No VGA signal should be a thumbs up. It's 2015 already; who the hell would connect a VGA device to these?
I do. No VGA is a thumbs down for me. My main rig, with SLI 970s, still has a VGA monitor connected to it.
 

nem

Joined
Oct 22, 2013
Messages
165 (0.07/day)
Location
Cyberdyne CPU Sky Net
If your location truly is Cyberdyne, the Human race is under no threat whatsoever...unless you're planning on subjugating us using broken syntax.
you did forgot I'm a robot.. :B
 
Joined
Sep 7, 2011
Messages
2,785 (0.93/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Joined
Mar 24, 2011
Messages
2,311 (0.73/day)
Location
Essex Jct, VT
Processor AMD Ryzen 5 2600
Motherboard Gigabyte B450 Aurorus Elite
Cooling Stock
Memory 16GB (2x8GB) Corsair Vengence LPX DDR4
Video Card(s) Gigabyte GTX 1060 Windforce OC 6GB
Storage Samsung EVO 850 256GB / Samsung EVO 860 500GB / WD Caviar Black 1TB
Display(s) AOC G2590FX
Case NZXT H500 Mid-Tower
Audio Device(s) Onboard
Power Supply Corsair RM650x 650W Fully Modular
Software Windows 10
I see no reason to get this rather than a 980 or 980 Ti, to be honest. Its price/performance doesn't stack up well against either card...
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
19,777 (3.49/day)
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
meanwhile the review of GeForces are maked with the last drives ... ¬¬
Are you asking why a review posted on July 7 was not using a driver posted on July 9? I don't have a time machine yet.
 
Joined
Oct 22, 2014
Messages
7,165 (3.87/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E5-2680 10c/20t 2.8GHz @ 3.0GHz
Motherboard Asrock X79 Extreme 11
Cooling Coolermaster 240 RGB A.I.O.
Memory G. Skill 16Gb (4x4Gb) 2133Mhz
Video Card(s) Nvidia GTX 710
Storage Sandisk X 400 256Gb
Display(s) AOC 22" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Home Premium 64 bit
Are you asking why a review posted on July 7 was not using a driver posted on July 9? I don't have a time machine yet.
I'd love to read the review when you get one :respect:
 

nem

Joined
Oct 22, 2013
Messages
165 (0.07/day)
Location
Cyberdyne CPU Sky Net
well dont surpized since W
Are you asking why a review posted on July 7 was not using a driver posted on July 9? I don't have a time machine yet.
o_O?

 
Joined
Sep 29, 2011
Messages
208 (0.07/day)
Location
Ottawa, Canada
System Name Current Rig
Processor AMD Ryzen 7 1700@3.95GHz
Motherboard Asus X370 Crosshair VI
Cooling Arctic Cooling 240mm
Memory 2x8GB DDR4-3200 G.Skill Trident Z RGB
Video Card(s) Gigabyte Windforce R9 290 (bios flashed to 1050MHz core)
Storage 1TB SSD
Display(s) 3x22" LG Flatron (eyefinity)
Case Cooler Master Storm Striker
Power Supply Antec True Power 750w
You mean like Nvidia with their ultimate fuck-you to customers: the faulty laptop chips that rendered my €1500 laptop useless after a year, and in my country they didn't offer an extended warranty or anything.
This happened to me, too. I bought a Toshiba Tecra M3 and sold it to my mother-in-law about a year later. The damn nVidia graphics chip died less than a month afterwards, and I found out later that it was one of the 'Bump-gate' scandal chips that nVidia washed its hands of, never reimbursing many, many customers. Then I had a GTX 670 that wouldn't power up all three monitors in a three-monitor setup without going through about 15 steps after every driver update. Then there's PhysX, and now HairWorks' dirty trick of deliberately making games run like crap on AMD graphics cards by cranking the tessellation levels up to 11, even though it looks identical at half that level. Finally, nVidia sells tons of GTX 970 cards advertised as '4GB' cards, only for it to be revealed that they really have 3.5GB of 'fast' memory and 0.5GB of 'slow' memory, because that allowed a cheaper memory subsystem design and squeezed a few extra bucks out of their customers.

After all this, why would people continue to trust them? AMD may have made mistakes in the past, but at least they weren't straight-up, naked attempts to rip off their own customers.
 

nem

Joined
Oct 22, 2013
Messages
165 (0.07/day)
Location
Cyberdyne CPU Sky Net
This happened to me, too. I bought a Toshiba Tecra M3 and sold it to my mother-in-law about a year later. The damn nVidia graphics chip died less than a month afterwards, and I found out later that it was one of the 'Bump-gate' scandal chips that nVidia washed its hands of, never reimbursing many, many customers. Then I had a GTX 670 that wouldn't power up all three monitors in a three-monitor setup without going through about 15 steps after every driver update. Then there's PhysX, and now HairWorks' dirty trick of deliberately making games run like crap on AMD graphics cards by cranking the tessellation levels up to 11, even though it looks identical at half that level. Finally, nVidia sells tons of GTX 970 cards advertised as '4GB' cards, only for it to be revealed that they really have 3.5GB of 'fast' memory and 0.5GB of 'slow' memory, because that allowed a cheaper memory subsystem design and squeezed a few extra bucks out of their customers.

After all this, why would people continue to trust them? AMD may have made mistakes in the past, but at least they weren't straight-up, naked attempts to rip off their own customers.
so what about the premium support and high quality of nVIDIA then ? :B
 
Joined
Sep 7, 2011
Messages
2,785 (0.93/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
19,777 (3.49/day)
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
That's the image you posted; its date is July 7, and the Zotac 980 Ti review uses the 15.15 beta driver.

_This_ article, the Fury review, posted on July 10, uses 15.7.
 
Joined
Sep 29, 2011
Messages
208 (0.07/day)
Location
Ottawa, Canada
System Name Current Rig
Processor AMD Ryzen 7 1700@3.95GHz
Motherboard Asus X370 Crosshair VI
Cooling Arctic Cooling 240mm
Memory 2x8GB DDR4-3200 G.Skill Trident Z RGB
Video Card(s) Gigabyte Windforce R9 290 (bios flashed to 1050MHz core)
Storage 1TB SSD
Display(s) 3x22" LG Flatron (eyefinity)
Case Cooler Master Storm Striker
Power Supply Antec True Power 750w
well, amd is in deep trouble, so it has to surface somewhere. like the chaotic pricing model in their last spasmodic attempt to grab market share and generate revenue.. i don't see it as likely for anyone to buy a g-card with a heatsink larger than the g-card itself and zero oc potential.. now i have to wonder even more what exactly amd expects from the nano model.......... :confused:

edit: oops, i thought the pcb was smaller, i skipped the first few pages of the review :D ok.. the heatsink ISN'T larger in length and width than the card/pcb.. ok, ok............ meh
Knee-jerk anti-AMD response perhaps? I don't recall people accusing nVidia of being in deep trouble when they asked $3000 for the first Titan. Greedy, yes.
 
Joined
Sep 28, 2012
Messages
289 (0.11/day)
System Name 12 Wheeler Dump Truck
Processor AMD Ryzen 5 2600
Motherboard Gigabyte x470 Aorus Ultra
Cooling All stocks
Memory 32 GB Team Delta DDR4 3000Mhz
Video Card(s) XFX RX Vega 56 CrossfireX
Storage Samsung 960 Evo 500GB + 12TB Toshiba
Display(s) Viewsonic XG3240C
Case Phanteks Eclipse P400
Audio Device(s) crappy onboard to heavily modded Logitech z5500
Power Supply Seasonic Prime Ultra Titanium 650TR
Mouse Logitech G 304 wireless
Keyboard Logitech G710+
Software running many VM's
Benchmark Scores Who need bench when everything already fast?
Price is just a little too steep :shadedshu:
Oh well... I think I'm gonna wait a little while, and in the meantime I might go tri-fire :laugh:
 
Joined
Sep 29, 2011
Messages
208 (0.07/day)
Location
Ottawa, Canada
System Name Current Rig
Processor AMD Ryzen 7 1700@3.95GHz
Motherboard Asus X370 Crosshair VI
Cooling Arctic Cooling 240mm
Memory 2x8GB DDR4-3200 G.Skill Trident Z RGB
Video Card(s) Gigabyte Windforce R9 290 (bios flashed to 1050MHz core)
Storage 1TB SSD
Display(s) 3x22" LG Flatron (eyefinity)
Case Cooler Master Storm Striker
Power Supply Antec True Power 750w
Actually I've never owned a single AMD GPU, but I'd like to give credit where it's due.

Riva TNT2 -> MX 440 8x -> GeForce FX 5600 (owned for just a month; I hated it) -> GeForce 6600 -> GeForce 7600 GT -> GeForce 8800 GT -> (currently) Gigabyte GeForce GTX 660 -> I intend to buy a GeForce 960 Ti if it gets released; if not, I will wait for Pascal.

The reason I've always avoided ATI/AMD is their awful drivers. They're still awful.
Strange that you're an authority on AMD drivers when, by your own admission, you've "never owned a single AMD GPU". I, on the other hand, have had mostly AMD Radeon cards over the last 7 years. Since a Radeon 4850, I've had a Radeon 6950 (briefly), a GTX 670 (for about a month, which I sold because the nVidia drivers kept dropping my third monitor after every driver update, and it took a 15-step process to get it back), a Radeon 7950 (BIOS flashed with a 7970 BIOS to 1000 MHz; thanks TechPowerUp!), and now a Gigabyte Radeon R9 290 (non-OC, but BIOS flashed with the OC BIOS to 1040 MHz; again, thanks TechPowerUp for the BIOS!).

Frankly, the 'bad AMD drivers' argument is obsolete. AMD drivers have been very good for several years now. It's true that AMD sometimes lags behind nVidia in optimizing for a specific game, but the reverse is also true: nVidia's properly working Tomb Raider driver came out nearly a month after AMD's. Point is, AMD drivers are now excellent for the most part.
 