
ASUS Radeon R9 Fury STRIX 4 GB

Joined
Apr 25, 2013
Messages
127 (0.03/day)
I intentionally mentioned the reference Sapphire card. It's a better comparison for the power phase efficiency of the Strix. At 1000 MHz, the Sapphire reference draws more power than the Strix does at 1020(?) MHz. That's what the power phases are good for. It's clear you don't understand the point of a good power circuit.
A good power circuit = more consistent voltage. Yes, it could lead to better power consumption, but not by much.
But you didn't read my post, right?
"The higher power limit allows the card to stay at its boost clock in more circumstances, hence the higher power consumption."
The Sapphire has a higher stock voltage and power limit. That's why it draws more power even at a lower clock.
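To illustrate the point with a rough sketch (the voltages below are made-up, illustrative numbers, not measurements from either card): dynamic power scales roughly with clock and with the square of voltage, so a card with a higher stock voltage and a looser power limit can out-draw one running a slightly higher clock at a lower voltage.

```python
# Rough illustration (not measured data): dynamic power scales roughly as
# P ~ C * f * V^2, so a card running a slightly lower clock but a higher
# stock voltage can still draw more power overall.

def relative_dynamic_power(freq_mhz: float, vcore: float) -> float:
    """Return a unitless figure proportional to f * V^2."""
    return freq_mhz * vcore ** 2

# Hypothetical numbers for the sake of the argument:
sapphire = relative_dynamic_power(1000, 1.25)  # reference PCB, higher stock voltage
strix    = relative_dynamic_power(1020, 1.20)  # slightly higher clock, lower voltage

print(f"Sapphire relative power: {sapphire:.0f}")
print(f"Strix relative power:    {strix:.0f}")
print(f"Sapphire draws ~{(sapphire / strix - 1) * 100:.1f}% more despite the lower clock")
```

With these illustrative values the lower-clocked card still comes out drawing around 6% more.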
 
Joined
Apr 18, 2013
Messages
1,260 (0.32/day)
Location
Artem S. Tashkinov
Is it not a tad ignorant to claim "awful drivers" if you never used the product?
What makes you even think they are awful?
I've been using my HD6950 for years now and I have had no issues of any kind with the drivers
(as opposed to my previous Nvidia 8800GTS (G92) and the 7900 GTO before it, where settings reset, the control panel crashed when trying to start it, and I had to download extra software to be able to tweak stuff, etc.; personally I always found Nvidia's software to feel a lot more crude and unsophisticated vs CCC, an opinion born of experience).

It's strange that people who don't know you speak as if they've known you your entire life. It's even stranger that they try to humiliate you in the process.

Over the past 12 months my friends (who in fact own AMD GPUs) have had the following problems (some of them are still not resolved):

1) No fan speed management, it's running at full speed all the time.
2) Unable to uninstall AMD drivers (also see below).
3) Windows BSOD'ing after ostensibly uninstalling drivers (in fact they didn't uninstall cleanly thus the error - I could only uninstall them in safe mode).
4) Firefox working crazily slowly because of the driver's broken Direct2D acceleration.
5) CCC requiring the .NET Framework, which taxes your CPU/HDD.
6) CCC weighing God knows how much.

I don't even want to touch AMD's brain-damaged website. At the same time, practically a literal idiot can use NVIDIA's.

I've only had problems with leaked NVIDIA drivers, over ten years ago (NVIDIA has since fixed the leak).
 

nem

Joined
Oct 22, 2013
Messages
165 (0.04/day)
Location
Cyberdyne CPU Sky Net
You could always post a welcome and honest apology to @W1zzard for being hideously quick to judge and criticise him. He's probably one of the most flexible and thorough reviewers out there.
yeah i quite agree, lets see like arguin the reviews this point ? this is unsense we claim new reviews to FURY X and FURY , its seem like this some kind of favor to the GeForce... ¬¬
 
Last edited:

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,378 (2.37/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
A good power circuit = more consistent voltage. Yes, it could lead to better power consumption, but not by much.
But you didn't read my post, right?
"The higher power limit allows the card to stay at its boost clock in more circumstances, hence the higher power consumption."
The Sapphire has a higher stock voltage and power limit. That's why it draws more power even at a lower clock.

I do see what you are saying, but my point was that better power phase circuitry doesn't mean the card is built for higher clocks and better performance. The fact that the Sapphire card has the reference PCB and is clocked higher kind of proves my point. Yes, if you want to overclock you will want good power circuitry - I entirely understand that point - but in this situation the extra phases create more stable current, which gives better power efficiency. Then again, I once owned a card with very 'basic' power circuitry, but under water and soft volt-modded it gave me around 1300 MHz, and that was 2 years ago. Moar phases doesn't create the beast - the chip does.

yeah , i quite agree, lets see like arguin the reviews this point ? this is unsense we claim new reviews to FURY X and FURY , in seem like this some kind of favor to side of nVIDIA really! ¬¬

Umm, what? Your post is utter "unsense".
 
Joined
May 13, 2008
Messages
658 (0.11/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
So, end of the day, what did we learn from the Fiji launches?

IMHO, it justified some of the viewpoints I've long held about AMD's architecture...do you guys agree or do you see things differently?

1. AMD really needs clock speeds in the ~1400-1500 MHz (capable) range for an extremely compelling part, given the properties of today's games.

2. AMD needs to get their CU/ROP ratio under control. 1 CU : 1 ROP is not optimal...it's closer to something like 16 ROPs : (14/)15 CUs or 24 ROPs : 22 CUs (see the sketch after this list). While compute is great, there comes a point where it's a liability. This is something nvidia learned going from Kepler to Maxwell.

3. AMD is not benefited by HDL (high-density libraries), or whatever other jazz (outsourcing/lack of key engineers?) has gotten into their designs since Hawaii. Whatever process they're using (HPM?) likely saves them space and/or may in theory run higher clocks at lower voltage, which for Fiji's design may be crucial in one way (it's as large as 28nm allows) or another (Nano might be compelling at something like 850-900 MHz core/400 MHz HBM at 0.9 V core/1 V memory vs the 970/980), but the underlying voltage required for decent clock speed/performance isn't great, nor is the overall scaling. While the newer 390(X) parts do better than the initial 290-series run, these parts clock worse per volt than any of the original 7000 series by a decent margin (10%?). Also, even allowing for Maxwell having a 20-25% deeper pipeline (or whatever changed so that their clock-speed scaling is now more similar to the ARM A57; I always assumed a design presumptively aimed at 20nm), nvidia's scaling has stayed the same from Kepler to Maxwell. What is going on with AMD's clock-speed problems?
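As a rough illustration of point 2 (the unit counts are the public specs as I understand them; the "ideal" ratio is just my rule of thumb, not an established figure):

```python
# Rough illustration of the CU:ROP balance in point 2. Unit counts are the
# public specs as I understand them; the "ideal" ratio is a rule of thumb,
# not an established figure.

gpus = {
    "Fiji (Fury/Fury X)": {"cus": 64, "rops": 64},  # 1 CU : 1 ROP
    "Hawaii (290X)":      {"cus": 44, "rops": 64},  # more ROPs per CU
}

ideal_rops_per_cu = 16 / 15  # roughly the "16 ROPs : 15 CUs" rule of thumb

for name, u in gpus.items():
    ratio = u["rops"] / u["cus"]
    print(f"{name}: {u['rops']} ROPs / {u['cus']} CUs = {ratio:.2f} ROPs per CU "
          f"(rule of thumb ~{ideal_rops_per_cu:.2f})")
```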

At the end of the day, I think AMD's arch could be a good one, give or take a few tweaks and changes in philosophy; GCN needs to be rebalanced. While I certainly have no idea what design rules apply, i.e. whether 16 CUs take up just as much space as 15 in a setup engine within the confines of the overall chip parameters, and/or whether AMD can reconfigure such an engine to have 12-24 ROPs, etc., something needs to be done in that regard for efficiency. By the same token, changes in process/design need to be applied to allow the chips' clock speeds to scale, even at the cost of chip size (remember the decap ring in RV790?), or they are flat-out doomed. While there is always an argument for adding more units and lowering clock speed/voltage, especially as we move forward to more mobile-oriented (low-voltage) processes, ATi has always been at the top of their game when using fewer units and having greater clock-speed potential than their competition; it saves space/cost (and, on former processes, stock power consumption) while also making it the 'overclocker's choice'. This philosophy has surely helped nvidia succeed as well. Look at how many references you see in this review alone noting that nvidia's arch, even at a stock 1000-<1300 MHz and using fewer units (say 2080-2560 SP+SFU in GM204 or 3360-3840 in GM200), can consistently overclock super high (even if sometimes while drawing lots of power): ~1500 MHz.


This, in part, is what scares me about 14LPP/16nmFF+ with respect to AMD. Samsung's 14nm is seemingly smaller and cheaper (~10%), but that is likely offset by clock potential vs TSMC's 16nmFF+. I could very much see AMD dropping a Fiji shrink that comes in under 225W (typical first parts from AMD on a new process are around 188W; half of 375W) and capitalizes completely on the die savings and extra perf/V, perhaps even dropping two such small chips (and 2x HBM2) on a single interposer for a crazy-awesome part, but the compelling nature of such a chip, let's say (for argument's sake) Fiji at 1400/625 MHz and 8GB, only goes so far. I seriously fear (for competition's sake) that nvidia will use 14/16nm both to increase floating point (and/or decrease unique special-function hardware) to create a part similar to AMD's while maximizing clock speed.

Say, for instance, 224-240 SP (+/- 32 SFU) per SMP (shader module, Pascal), up from 128 SP (+32 SFU) in Maxwell or 192 (+32 SFU) in Kepler. In a 16-module design (e.g. GP104, something replacing GM200 for the slightly lower-end performance market)...the result is something like either 3584 (4096) or 3840 SP...similar to or more efficient than Fiji. While nvidia may not capitalize completely on die savings (as AMD surely will) nor on absolute power consumption (I could see them doing another '980' which is made to draw >225W), we could conceivably see something that continues to follow the clock-scaling path of ARM processors (which on 14/16nm are planned for 2GHz+). While certainly rumor, this theory is backed by the voices in the wind saying nvidia ran back to TSMC for their next designs after Samsung's (/GF's) yields turned out terrible. For AMD, that has to be an incredibly scary thought....I very much doubt they want a 970 vs 290X rematch (but imagine a 970 that wasn't gimped and a 290X that drew less power).
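Quick sanity check on that shader-module math (the per-module widths are pure speculation on my part, not confirmed Pascal specs):

```python
# Sanity check on the speculative Pascal shader-module (SMP) math above.
# The per-module widths are guesses, not confirmed specs.

modules = 16  # assumed GP104-class part with 16 modules

for sp_per_module in (224, 240):
    sfu_per_module = 32
    total_sp = modules * sp_per_module
    total_with_sfu = modules * (sp_per_module + sfu_per_module)
    print(f"{sp_per_module} SP + {sfu_per_module} SFU per module x {modules} modules "
          f"-> {total_sp} SP ({total_with_sfu} counting SFUs)")

# For reference: Maxwell SMM = 128 SP + 32 SFU, Kepler SMX = 192 SP + 32 SFU.
```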

TLDR: I've always appreciated AMD's strengths in engineering, design choices, and pushing technology forward....but something needs to change. I surely hope it does by next generation.
 
Last edited:

nem

Joined
Oct 22, 2013
Messages
165 (0.04/day)
Location
Cyberdyne CPU Sky Net
meanwhile the review of GeForces are maked with the last drives ... ¬¬

#W1zzard pls !



 
Last edited:

Chloefile

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
10,878 (2.64/day)
Location
Finland
System Name 4K-gaming
Processor AMD Ryzen 7 5800X
Motherboard Gigabyte B550M Aorus Elite
Cooling Custom loop (CPU+GPU, 240 & 120 rads)
Memory 32GB Kingston HyperX Fury @ DDR4-3466
Video Card(s) PowerColor RX 6700 XT Fighter
Storage ~4TB SSD + 6TB HDD
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Fractal Design Define Mini C
Audio Device(s) Asus TUF H3 Wireless
Power Supply EVGA Supernova G2 750W
Mouse Logitech MX518 Legendary
Keyboard Roccat Vulcan 121 AIMO
VR HMD Oculus Rift CV1
Software Windows 11 Pro
Benchmark Scores It runs Crysis remastered at 4K
No VGA signal should be a thumbs up; it's 2015 already - who the hell would connect a VGA device to these?

I was already stunned when the Palit/Gainward custom HD4870X2 had a physical VGA connector, and that was like 7 years ago.
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
yeah i quite agree, lets see like arguin the reviews this point ? this is unsense we claim new reviews to FURY X and FURY , its seem like this some kind of favor to the GeForce... ¬¬
meanwhile the review of GeForces are maked with the last drives ... ¬¬

If your location truly is Cyberdyne, the Human race is under no threat whatsoever...unless you're planning on subjugating us using broken syntax.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.25/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
No VGA signal should be a thumbs up; it's 2015 already - who the hell would connect a VGA device to these?

I do. No VGA is a downside to me. My main rig, with SLI 970s, still has a VGA monitor connected to it.
 

nem

Joined
Oct 22, 2013
Messages
165 (0.04/day)
Location
Cyberdyne CPU Sky Net
If your location truly is Cyberdyne, the Human race is under no threat whatsoever...unless you're planning on subjugating us using broken syntax.
you did forgot I'm a robot.. :B
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Joined
Mar 24, 2011
Messages
2,356 (0.50/day)
Location
VT
Processor Intel i7-10700k
Motherboard Gigabyte Aurorus Ultra z490
Cooling Corsair H100i RGB
Memory 32GB (4x8GB) Corsair Vengeance DDR4-3200MHz
Video Card(s) MSI Gaming Trio X 3070 LHR
Display(s) ASUS MG278Q / AOC G2590FX
Case Corsair X4000 iCue
Audio Device(s) Onboard
Power Supply Corsair RM650x 650W Fully Modular
Software Windows 10
I see no reason to get this rather than a 980 or 980 Ti, to be honest. The price/performance doesn't stack up well against either card...
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
26,957 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
meanwhile the review of GeForces are maked with the last drives ... ¬¬
Are you asking why a review posted on July 7 was not using a driver posted on July 9? I don't have a time machine yet.
 
Joined
Oct 22, 2014
Messages
13,210 (3.83/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
Are you asking why a review posted on July 7 was not using a driver posted on July 9? I don't have a time machine yet.
I'd love to read the review when you get one :respect:
 
  • Like
Reactions: nem

nem

Joined
Oct 22, 2013
Messages
165 (0.04/day)
Location
Cyberdyne CPU Sky Net
well dont surpized since W
Are you asking why a review posted on July 7 was not using a driver posted on July 9? I don't have a time machine yet.

o_O?

 
Joined
Sep 29, 2011
Messages
217 (0.05/day)
Location
Ottawa, Canada
System Name Current Rig
Processor Intel 12700K@5.1GHz
Motherboard MSI Pro Z790-P
Cooling Arctic Cooling Liquid Freezer II 360mm
Memory 2x16GB DDR5-6000 G.Skill Trident Z RGB
Video Card(s) MSI Gaming X Trio 6800 16GB
Storage 1TB SSD
Case Cooler Master Storm Striker
Power Supply Antec True Power 750w
Keyboard IBM Model 'M"
You mean like Nvidia, with their ultimate fuck-you to customers with their faulty laptop chips that rendered my €1500 laptop useless after a year, and in my country they didn't offer an extended warranty or anything.

This happened to me, too. I bought a Toshiba Tecra M3 and sold it to my mother-in-law about a year later. The damn nVidia graphics chip died less than a month afterwards, and I found out later that it was one of the 'Bump-gate' scandal chips that nVidia washed their hands of, never reimbursing many, many customers. Then I had a GTX670 card that wouldn't power up all three monitors in a three-monitor setup without going through about 15 steps each time there was a driver update. Then there's PhysX, and now Hairworks' dirty trick of deliberately making games run like crap on AMD graphics cards by cranking the tessellation levels up to 11, even though it looks identical at half that level. Finally, nVidia sells tons of GTX970 cards, advertising them as '4GB' cards, only to have it revealed that really they've got 3.5GB of 'fast' memory and 0.5GB of 'slow' memory, because it allows them to use a cheaper memory subsystem design and squeeze a few extra bucks out of their customers.

After all this, why would people continue to trust them? AMD may have made mistakes in the past, but at least their mistakes weren't straight-up, naked attempts to rip off their own customers.
 
  • Like
Reactions: nem

nem

Joined
Oct 22, 2013
Messages
165 (0.04/day)
Location
Cyberdyne CPU Sky Net
This happened to me, too. I bought a Toshiba Tecra M3 and sold it to my mother-in-law about a year later. The damn nVidia graphics chip died less than a month afterwards, and I found out later that it was one of the 'Bump-gate' scandal chips that nVidia washed their hands of, never reimbursing many, many customers. Then I had a GTX670 card that wouldn't power up all three monitors in a three-monitor setup without going through about 15 steps each time there was a driver update. Then there's PhysX, and now Hairworks' dirty trick of deliberately making games run like crap on AMD graphics cards by cranking the tessellation levels up to 11, even though it looks identical at half that level. Finally, nVidia sells tons of GTX970 cards, advertising them as '4GB' cards, only to have it revealed that really they've got 3.5GB of 'fast' memory and 0.5GB of 'slow' memory, because it allows them to use a cheaper memory subsystem design and squeeze a few extra bucks out of their customers.

After all this, why would people continue to trust them? AMD may have made mistakes in the past, but at least their mistakes weren't straight-up, naked attempts to rip off their own customers.
so what about the premium support and high quality of nVIDIA then ? :B
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
26,957 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
That's the image you posted; the date is July 7, and the Zotac 980 Ti review uses 15.15 beta.

_This_ article, the Fury review, posted on July 10, uses 15.7
 
Joined
Sep 29, 2011
Messages
217 (0.05/day)
Location
Ottawa, Canada
System Name Current Rig
Processor Intel 12700K@5.1GHz
Motherboard MSI Pro Z790-P
Cooling Arctic Cooling Liquid Freezer II 360mm
Memory 2x16GB DDR5-6000 G.Skill Trident Z RGB
Video Card(s) MSI Gaming X Trio 6800 16GB
Storage 1TB SSD
Case Cooler Master Storm Striker
Power Supply Antec True Power 750w
Keyboard IBM Model 'M"
Well, AMD is in deep trouble, so it has to surface somewhere, like the chaotic pricing model in their last spasmodic attempt to grab market share and generate revenue.. I don't see it as likely for anyone to buy a g-card with a heatsink larger than the g-card itself and zero OC potential.. now I have to wonder even more what exactly AMD expects from the Nano model.......... :confused:

edit: oops, I thought the PCB was smaller, I skipped the first few pages of the review :D ok.. the heatsink ISN'T larger in length and width than the card/PCB.. ok, ok............ meh

Knee-jerk anti-AMD response perhaps? I don't recall people accusing nVidia of being in deep trouble when they asked $3000 for the first Titan. Greedy, yes.
 
  • Like
Reactions: nem
Joined
Sep 28, 2012
Messages
963 (0.23/day)
System Name Poor Man's PC
Processor AMD Ryzen 5 7500F
Motherboard MSI B650M Mortar WiFi
Cooling ID Cooling SE 206 XT
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) Sapphire Pulse RX 6800 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Mi Gaming Curved 3440x1440 144Hz
Case Cougar MG120-G
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
Price is just a little too steep :shadedshu:
Oh well... I think I'm gonna wait a little while, and for the time being I might go tri-fire :laugh:
 
Joined
Sep 29, 2011
Messages
217 (0.05/day)
Location
Ottawa, Canada
System Name Current Rig
Processor Intel 12700K@5.1GHz
Motherboard MSI Pro Z790-P
Cooling Arctic Cooling Liquid Freezer II 360mm
Memory 2x16GB DDR5-6000 G.Skill Trident Z RGB
Video Card(s) MSI Gaming X Trio 6800 16GB
Storage 1TB SSD
Case Cooler Master Storm Striker
Power Supply Antec True Power 750w
Keyboard IBM Model 'M"
Actually I've never owned a single AMD GPU but I'd like to see the credit where it's due.

Riva TNT2 -> MX 440 8x -> GeForce FX 5600 (owned for just a month - I hated it) -> GeForce 6600 -> GeForce 7600 GT -> GeForce 8800 GT -> (currently) Gigabyte GeForce GTX 660 -> I intend to buy a GeForce 960 Ti if it gets released; if it doesn't, I will wait for Pascal

The reason why I've always avoided ATI/AMD is their awful drivers. They still are awful.

Strange that you are an authority on AMD drivers when, by your own admission, you've "never owned a single AMD GPU". I, on the other hand, have had mostly AMD Radeon cards over the last 7 years. Since a Radeon 4850, I've had a Radeon 6950 (briefly), a GTX670 (for about a month, which I sold because the nVidia drivers kept dropping my 3rd monitor after every driver update, and it required a 15 step process to get it back), a Radeon 7950 (bios flashed using a 7970 bios to 1000MHz - thanks TechPowerUp!), and now I've got a Gigabyte Radeon R9 290 (non-OC, but bios flashed with the OC bios to 1040MHz - again, thanks TechPowerUp for the bios!).

Frankly, the 'bad AMD drivers' argument is obsolete. AMD drivers have been very good for several years now. It's true that AMD sometimes lags behind nVidia in optimizing for a specific game, but the reverse is also true: nVidia's driver that worked properly with Tomb Raider came out nearly a month after AMD's. Point is, AMD drivers are now excellent for the most part.
 