
Anybody regret going red?

Joined
Dec 28, 2012
Messages
3,478 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
I like my 390, no complaints or regrets from me... granted, I spend most of my time on my tower working, not gaming. However, when I do game, nothing happens that makes me want to tweak anything. When I overclock, it's because I want to, not because I need it. The only exception to that statement might be Eyefinity, as it is a little bit of a push for the 390 but, it's not like I'm doing that often. The 3 monitors are more for work than play so I end up gaming at 1080p most of the time. When your GPU isn't running full tilt, there is almost never a reason to complain about it unless there are graphical glitches, which I only encounter when I overclock the piss out of it.

AMD has its strong points just as nVidia does. I just think that AMD GPUs are better at compute and have better longevity. nVidia tends to drop driver support earlier than AMD but, that may just be hardware changes. Most GCN-based GPUs still have driver support, even GCN 1.0 cards. My aging 6870s just had driver support dropped after 6 years of service. That's not too shabby. So, as someone who can no longer upgrade his machine often due to fiscal constraints that come with life (having a family,) how much I get out of my hardware is far more important than it used to be.

tl;dr: No regrets. What AMD has to offer appeals to me even if it's not always the fastest GPU you can buy.
Devil's advocate, the 6000 series competitor, the 500 series, is still getting driver updates. Heck, even the 400 series, the 5000 series competitor, is still getting regular updates. By that metric, AMD's performance in terms of support is much worse than nvidia's.

And AMD really had no choice but to continue optimizing GCN 1, since they were still selling it 4 years later, while nvidia had moved on to Maxwell. And Kepler didn't have to wait 3 years for full performance; it was performing that well days after a game came out.

To answer the OG question, I've tried red team 4 times personally. First was the 9800 Pro (constantly defaulted to the wrong driver, causing a black screen), then the 2600 XT (driver refused to install until SP3 was uninstalled), then the Llano APUs (features in CCC would randomly disappear and/or reappear on different driver revisions, and auto hybrid Crossfire destroyed performance), then the 5770s (Crossfire issues in many newer games took weeks, sometimes months, longer than nvidia to fix, and random issues would pop up from time to time). Every time, driver issues popped up that were far rarer in nvidia's camp. The 5770s (in a machine I built and supported for my friend) were closest to being stable, but my 550 Tis in SLI had far fewer issues, and what issues did pop up were fixed much faster than AMD's.

This was back in 2012/2013, so take it with some salt. Things may be different now, but IME, AMD has never been as stable as nvidia.
770 - 280€
970 - 329€
1070 - 450€+

"Same tier". lol.
Here in the US, the 770 launched at $400, the 970 launched at $330, and now the 1070 launches at $379 ($450 for the sucker edition). Considering the 1070 comes in on a much more expensive process, I don't understand the hatred for the price. It IS the same tier as the 970 and 770. Even the 670 launched at $400, so why is everyone complaining?
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
It's technically TressFX:
https://en.wikipedia.org/wiki/TressFX#Version_3.0

It runs on DirectCompute and is open source. TressFX, as far as I can recall, has never shown favoritism toward either brand; on the other hand, HairWorks (NVIDIA's competing API, which is proprietary and closed source) caused a ruckus with Witcher 3's release because AMD cards had framerates fall >60% when it was enabled, compared to the GTX 980's 30%.

Correct me if I'm wrong but I think you posted that RotTR graph to counter the Hitman graph. The reason why AMD does well in Hitman and not so well in RotTR is because Hitman uses async compute which boosts AMD and hurts NVIDIA.

I'd be interested in seeing an image quality comparison on RotTR too. AMD cards could easily be simulating more hair than NVIDIA cards do, explaining the gap in performance. The tessellation effects (which TressFX uses) can be overridden in graphics drivers.
The reason why Hairworks sucks on AMD cards is simply the excessive use of tessellation; Nvidia knows that this is AMD's weak spot. Even their own cards suffer a bit when it's activated.

RotTR in DX12 is utter crap; that's why AMD cards perform badly there at the moment. The DX11 chart is way more balanced between the competitors.

Devil's advocate, the 6000 series competitor, the 500 series, is still getting driver updates. Heck, even the 400 series, the 5000 series competitor, is still getting regular updates. By that metric, AMD's performance in terms of support is much worse than nvidia's.
It's not; AMD still supports HD 5000/6000 as legacy cards with new drivers. At least AMD doesn't promise things they don't follow through on (GTX 400/500 getting DX12 support). Kepler support is slipping as we speak; Doom and some other new games have pretty bad performance on Kepler.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I made sure to close every program before doing the flash, even disabled my antivirus first. There was nothing running that could have interfered with it as far as I know, I terminated a few extra processes in Task Manager too, just to be safe.
I personally thought ATIFlash didn't work on Windows 10; it didn't for me previously.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
The reason why Hairworks sucks on AMD cards is simply the excessive use of tessellation; Nvidia knows that this is AMD's weak spot. Even their own cards suffer a bit when it's activated.

RotTR in DX12 is utter crap; that's why AMD cards perform badly there at the moment. The DX11 chart is way more balanced between the competitors.
I did massive edits to that post. Done editing it now.

Crystal Engine has a history of being biased in favor of NVIDIA, DX12 or no.


It's not; AMD still supports HD 5000/6000 as legacy cards with new drivers. At least AMD doesn't promise things they don't follow through on (GTX 400/500 getting DX12 support). Kepler support is slipping as we speak; Doom and some other new games have pretty bad performance on Kepler.
Fun fact: I installed Windows 10 on a system with an HD 4### card. No problems. Obviously it runs at DirectX 12 feature level 10_1 because it doesn't have the hardware to do better than that.
 
Joined
Dec 28, 2012
Messages
3,478 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
The reason why Hairworks sucks on AMD cards is simply the excessive use of tessellation; Nvidia knows that this is AMD's weak spot. Even their own cards suffer a bit when it's activated.

RotTR in DX12 is utter crap; that's why AMD cards perform badly there at the moment. The DX11 chart is way more balanced between the competitors.


It's not; AMD still supports HD 5000/6000 as legacy cards with new drivers. At least AMD doesn't promise things they don't follow through on (GTX 400/500 getting DX12 support). Kepler support is slipping as we speak; Doom and some other new games have pretty bad performance on Kepler.
Um, wat? From AMD's own site: "AMD Radeon™ R5 235X, Radeon™ R5 235, Radeon™ R5 230, Radeon™ R5 220, Radeon™ HD 8470, Radeon™ HD 8350, Radeon™ HD 8000 (D/G variants), Radeon™ HD 7000 Series (HD 7600 and below), Radeon™ HD 6000 Series, and Radeon™ HD 5000 Series Graphics products have been moved to a legacy support model and no additional driver releases are planned. This change enables us to dedicate valuable engineering resources to developing new features and enhancements for graphics products based on the GCN Architecture."
http://support.amd.com/en-us/download/desktop/legacy?product=legacy3&os=Windows 10 - 64

http://techreport.com/news/29362/amd-ends-driver-support-for-non-gcn-radeon-cards

http://betanews.com/2015/11/24/amd-kills-gpu/

http://www.tomshardware.com/news/amd-retires-non-gcn-gpu-lineup,30643.html


The only drivers available for download are the 15.7.1 driver (7/29/2015) and the 16.2.1 beta driver (3/1/2016), nothing newer.

Meanwhile, nvidia has the newest 368.22 driver available for anything 400 series and up. Performance in DOOM on Kepler is poor, true, but other new games, like Forza 6, can be maxed out at 60 FPS if you have a 4GB Kepler card. Kepler is simply not being optimized for, so you will occasionally have these kinds of problems. DOOM won't run at all on the 5000 and 6000 series (but it DOES run on the 500 series), and neither of them has DX12 either.

EDIT: YouTube videos show the 770 running DOOM pretty well, considering it is a 4+ year old GPU. The poor performance at launch seems to have been fixed.
 

Ebo

Joined
May 9, 2013
Messages
778 (0.19/day)
Location
Nykoebing Mors, Denmark
System Name the little fart
Processor AMD Ryzen 2600X
Motherboard MSI x470 gaming plus
Cooling Noctua NH-C14S
Memory 16 GB G.Skill Ripjaw 2400Mhz DDR 4
Video Card(s) Sapphire RX Vega 56 Pulse
Storage 1 Crucial MX100 512GB SSD,1 Crucial MX500 2TB SSD, 1 1,5TB WD Black Caviar, 1 4TB WD RED HD
Display(s) IIyama XUB2792QSU IPS 2560x1440
Case White Lian-Li PC-011 Dynamic
Audio Device(s) Asus Xonar SE pci-e card
Power Supply Thermaltake DPS G 1050 watt Digital PSU
Mouse Steelseries Sensei
Keyboard Corsair K70
Software windows 10 64 pro bit
Never had any problems with ATI/AMD cards, which I have had since the R9700 PRO back in the day. I will be upgrading also, but I will wait for the top tier with HBM2 to come out.
Benchmarks don't interest me at all; I want real-world experience.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Joined
Dec 28, 2012
Messages
3,478 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
I'm sure a friend of mine got it within 2 months of release for 280.


Oh, give me a break.
And I'm sure your friend didn't. 770s were solidly $400 until the 290/290X dropped, and even then it was only down to $330, with most custom ones coming in at $340-$350. $280 would only be possible as a used card or a fire sale just before Maxwell dropped.

And I am not giving you a break. 28nm was dirt cheap by the time Maxwell came out. 14nm is brand new, and new processes are more expensive than old ones. That has been true for over 20 years. Unless you mean to tell me that 14nm is somehow cheaper than 28nm from 2011?
 
Joined
Oct 8, 2012
Messages
1,445 (0.34/day)
Location
Israel
Processor AMD Ryzen 7 5800X
Motherboard B550 Aorus PRO V2
Cooling Corsair H115i RGB Platinum
Memory Gskill Trident Z Neo 3600 2x16GB
Video Card(s) MSI RTX 3070 Trio X
Storage WD Blue SN550 1TB NVME/Supertalent Teranova 2x1TB
Display(s) Acer XV340CK
Case Corsair 4000D Airflow
Power Supply Corsair HX850i
Mouse Logitech G502 Lightspeed
Keyboard Logitech G513 Tactile
Software Windows 10 Pro
Oh, give me a break.
It's actually the die size that drives the production costs up, and yes, it's significant enough to cause the higher pricing that we are seeing.
 
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
It's actually the die size that drives the production costs up, and yes, it's significant enough to cause the higher pricing that we are seeing.
Process yield would also impact the cost per mm² of die. Regardless of whether the contract is on a good-die percentage basis or a flat wafer cost, a lower yield adds to the cost. There is also a higher level of ROI amortization to consider:
But perhaps the biggest issue is cost. The average IC design cost for a 28nm device is about $30 million, according to Gartner. In comparison, the IC design cost for a mid-range 14nm SoC is about $80 million. “Add an extra 60% (to that cost) if embedded software development and mask costs are included,” Gartner’s Wang said. “A high-end SoC can be double this amount, and a low-end SoC with re-used IP can be half of the amount.”

On top of that, it takes 100 engineer-years to bring out a 28nm chip design. “Therefore, a team of 50 engineers will need two years to complete the chip design to tape-out. Then, add 9 to 12 months more for prototype manufacturing, testing and qualification before production starts. That is if the first silicon works,” he said. “For a 14nm mid-range SoC, it takes 200 man-years. A team of 50 engineers will need four years of chip design time, plus add nine to 12 months for production.”

If that’s not enough, there is also a sizable jump in manufacturing costs. In a typical 11-metal level process, there are 52 mask steps at 28nm. With an 80% fab utilization rate at 28nm, the loaded manufacturing cost is about $3,500 per 300mm wafer, according to Gartner.

At 1.3 days per lithography layer, the cycle time for a 28nm chip is about 68 days. “Add one week minimum for package testing,” Wang said. “So, the total is two-and-half months from wafer start to chip delivery.”

At 16nm/14nm, there are 66 mask steps. With an 80% fab utilization rate at 16nm/14nm, the loaded cost is about $4,800 per 300mm wafer, according to Gartner. “It takes three months from wafer start to chip delivery,” he added.
[Source]
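
To make the die-size point concrete, here's a back-of-the-envelope sketch that spreads those loaded wafer costs over the dies that actually yield. The die area, defect densities, and the simple Poisson yield model are illustrative assumptions on my part, not figures from the Gartner piece:

```python
# Back-of-the-envelope: spread a loaded wafer cost over good dies.
# Die area and defect densities are illustrative guesses, not figures
# from the Gartner article; yield uses a simple Poisson model.
import math

def gross_dies(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Approximate dies per wafer, with a correction for edge loss."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def cost_per_good_die(wafer_cost: float, die_area_mm2: float,
                      defects_per_cm2: float) -> float:
    """Loaded wafer cost divided by the number of dies that yield."""
    yield_fraction = math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)
    return wafer_cost / (gross_dies(die_area_mm2) * yield_fraction)

# ~314 mm^2 die (roughly GP104-sized); assume a mature 28nm line at
# 0.1 defects/cm^2 vs a younger 16nm/14nm line at 0.2 defects/cm^2.
print(f"28nm @ $3,500/wafer: ${cost_per_good_die(3500, 314, 0.1):,.0f} per die")
print(f"16nm @ $4,800/wafer: ${cost_per_good_die(4800, 314, 0.2):,.0f} per die")
```

Under those made-up but plausible yields, the per-die cost roughly doubles even though the wafer itself only costs about 37% more, which is the point: die size and early-process yield, not just the wafer price, drive the cost.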
Oh, give me a break.
You mean stop posting factual analysis that directly contradicts your unsubstantiated comments? Now why would anyone on a tech forum be OK with you posting half-baked uninformed nonsense without subjecting it to scrutiny?
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Devil's advocate, the 6000 series competitor, the 500 series, is still getting driver updates. Heck, even the 400 series, the 5000 series competitor, is still getting regular updates. By that metric, AMD's performance in terms of support is much worse than nvidia's.

And AMD really had no choice but to continue optimizing GCN 1, since they were still selling it 4 years later, while nvidia had moved on to Maxwell. And Kepler didn't have to wait 3 years for full performance; it was performing that well days after a game came out.

To answer the OG question, I've tried red team 4 times personally. First was the 9800 Pro (constantly defaulted to the wrong driver, causing a black screen), then the 2600 XT (driver refused to install until SP3 was uninstalled), then the Llano APUs (features in CCC would randomly disappear and/or reappear on different driver revisions, and auto hybrid Crossfire destroyed performance), then the 5770s (Crossfire issues in many newer games took weeks, sometimes months, longer than nvidia to fix, and random issues would pop up from time to time). Every time, driver issues popped up that were far rarer in nvidia's camp. The 5770s (in a machine I built and supported for my friend) were closest to being stable, but my 550 Tis in SLI had far fewer issues, and what issues did pop up were fixed much faster than AMD's.

This was back in 2012/2013, so take it with some salt. Things may be different now, but IME, AMD has never been as stable as nvidia.

Here in the US, the 770 launched at $400, the 970 launched at $330, and now the 1070 launches at $379 ($450 for the sucker edition). Considering the 1070 comes in on a much more expensive process, I don't understand the hatred for the price. It IS the same tier as the 970 and 770. Even the 670 launched at $400, so why is everyone complaining?
Now you've done it. If you're not already sitting, please sit down before reading this post. I'm not sure if it was the post @TheinsanegamerN wrote or the beer but, work with me here for a minute while I lay it all out for you.

Alright everyone, it's story time. Time to get nostalgic!

<dissertation>
I hear you but, let me contrast your experience with mine and maybe we'll uncover why, despite both having owned several ATi/AMD cards, we've come to different conclusions.

When I first started building my own tower, I started with a Radeon 9200. In fact, I still have it in the attic with my first CPU, motherboard, and chassis. It wasn't the fastest but, it was 10x better than Intel's integrated graphics which I was using prior to that machine, so I loved it. Heck, it even overclocked without external power (Woo!) After that, I had a GeForce 6800 OC. It was better than the 9200 but, I was kind of expecting more out of it and mine didn't overclock very well. I ended up selling it to a friend at school. After that I had an x800 SE which had some pipelines cut but, overclocked really well, which I was very happy with; I also sold that to a friend when I was done with it.

After that I had a GeForce 7900 GT where the VRAM eventually failed, and XFX replaced it with a GeForce 8600 GTS which was without a doubt the best card I've ever overclocked. I don't remember the exact numbers very well but, I think the core was at 950MHz and shaders at 1950MHz, and performance was up something like 30% over stock. I got a Radeon HD 2600 XT to compare it to since I could get it pretty cheap and found them to be on par at stock but, the 8600 GTS wiped the floor after being overclocked. I think that 8600 GTS is buried in the attic, and I still have the 2600 XT sitting behind me as a memento of the fact that it had GDDR4, which very few GPUs used.

After that I upgraded to a Radeon HD 4850 which was pretty nice. It didn't overclock well but, at the time I didn't really need it to. I ended up giving that away in a machine to a friend. I replaced the 4850 with a Radeon HD 6870 on release day. I still have that GPU and it still runs great; it's sitting in a box with static-proof wrap. I paired it with a second 6870 3 years later which gave me (at the time,) 7970-like performance at half of the cost (since I already had the first 6870,) so long as I could deal with CFX and drivers (which I did, to primarily positive effect.)

About 9 months ago, I decided it was time for an upgrade and chose the Radeon R9 390. The 6870s were suffering because they only had 1GB of VRAM but, they had plenty of compute power that was going untouched, so I opted for the 390 with 8GB over the 970 with 3.5GB+.5GB despite the 970's OC advantage, thinking that the extra VRAM would give me more longevity should I need to buy a second one and run CFX again. While the 390 was en route to my house, my newer 6870 failed catastrophically, so clearly it was a good time to upgrade. Since then I've used the 390 primarily for work but, when I game with it, I so far haven't been disappointed.

So that's my story; the only commonality is that we both owned the 2600 XT, and I think we can both agree that it wasn't a very good card. It could have been great with that GDDR4 but, it ended up being pretty weak. For what it's worth though, the 2600 XT (at least mine,) didn't have external power where the 8600 GTS did, so it may not really be a fair comparison. I would argue that you shouldn't let your experience with the 2600 XT form your opinion of AMD/ATi. I feel like between the 9000 series Radeons and the x*** Radeons there was a shift where ATi's higher end got better than its mainstream GPUs, which is where nVidia had a clear advantage. I do think that the 8600 GTS over the 2600 XT was a clear indicator of that. However, the 4850 was a nice card, and while the 4870 ran kind of hot (I came across one of those in my travels, gave that away too,) the area between mainstream and high-end has always (to me,) seemed to be a strength for AMD for the last decade or so. It hasn't always been the best but, it's been comparable.

So, I will leave you with this: not every GPU in either camp is great but, whether one was worth the money depends on the way you use it and what happens if something goes wrong. Experiences vary greatly and I will tell a short story that's an example of that: remember that XFX GeForce 8600 GTS that overclocked like hell? It used a proprietary MOLEX-to-PCI-E-like connector on the GPU which provided both +12v and +5v. Without knowing this at the time, I used a PCI-E power connector to power it since the key was the same (for those of you who don't know, PCI-E power connectors only provide +12v.) The GPU was fine but, it ended up frying the DDC on one of my monitors. I bet not many other people will have that kind of story but, I'm sure it has influenced my upgrade decisions since. That isn't to say it's nVidia's fault but, it was a negative experience that has pushed me away from nVidia (to a lesser extent than...) and XFX to this day.

tl;dr: We may all own cards from both camps but, depending on experience, you could own the same cards and come to different conclusions. Either way, it's dumb not to acknowledge that both camps have worked hard to produce good GPUs, even if some of them fall short of consumers' expectations depending on what you bought, when you bought it, and whether something went wrong.
</dissertation>

You can blame Sierra Nevada's Torpedo Extra IPA for this rant. :toast:
 
Joined
Mar 18, 2008
Messages
5,717 (0.97/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
I think @xkm1948 had a similar issue with his Fury X. I recall the problem getting solved somehow but, I'm not exactly sure how. Maybe he can chime in.

The screen corruption problem is completely fixed with the new official UEFI bios pushed out by AMD. I suspect there were some efforts in the newer drivers as well.
 

OneMoar

There is Always Moar
Joined
Apr 9, 2010
Messages
8,744 (1.71/day)
Location
Rochester area
System Name RPC MK2.5
Processor Ryzen 5800x
Motherboard Gigabyte Aorus Pro V2
Cooling Enermax ETX-T50RGB
Memory CL16 BL2K16G36C16U4RL 3600 1:1 micron e-die
Video Card(s) GIGABYTE RTX 3070 Ti GAMING OC
Storage ADATA SX8200PRO NVME 512GB, Intel 545s 500GBSSD, ADATA SU800 SSD, 3TB Spinner
Display(s) LG Ultra Gear 32 1440p 165hz Dell 1440p 75hz
Case Phanteks P300 /w 300A front panel conversion
Audio Device(s) onboard
Power Supply SeaSonic Focus+ Platinum 750W
Mouse Kone burst Pro
Keyboard EVGA Z15
Software Windows 11 +startisallback
The screen corruption problem is completely fixed with the new official UEFI bios pushed out by AMD. I suspect there were some efforts in the newer drivers as well.
Flickering/screen corruption is always a thing with AMD.
It's like shingles: you can treat it and cover it all you want, but it's still in you...
 
Joined
Mar 18, 2008
Messages
5,717 (0.97/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Just saying, my old HD5870 still has driver support up to the 16.3.2 beta. AMD stopped releasing WHQL drivers for 5000 series cards, but the beta drivers are still good. I have my HTPC running my old HD5870 on Windows 10 64-bit.
 
Joined
Mar 18, 2008
Messages
5,717 (0.97/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Flickering/screen corruption is always a thing with AMD.
It's like shingles: you can treat it and cover it all you want, but it's still in you...

Really?

Let me check my memory.

Radeon 9700 days no problem at all
Radeon 1650XT days no problem at all
HD3870~HD4870~HD5870 no problem at all
APU 6800K no problem

As a matter of fact, the screen corruption only happened on the Fury X due to aggressive power saving. But now it is all fixed.
 

OneMoar

There is Always Moar
Joined
Apr 9, 2010
Messages
8,744 (1.71/day)
Location
Rochester area
System Name RPC MK2.5
Processor Ryzen 5800x
Motherboard Gigabyte Aorus Pro V2
Cooling Enermax ETX-T50RGB
Memory CL16 BL2K16G36C16U4RL 3600 1:1 micron e-die
Video Card(s) GIGABYTE RTX 3070 Ti GAMING OC
Storage ADATA SX8200PRO NVME 512GB, Intel 545s 500GBSSD, ADATA SU800 SSD, 3TB Spinner
Display(s) LG Ultra Gear 32 1440p 165hz Dell 1440p 75hz
Case Phanteks P300 /w 300A front panel conversion
Audio Device(s) onboard
Power Supply SeaSonic Focus+ Platinum 750W
Mouse Kone burst Pro
Keyboard EVGA Z15
Software Windows 11 +startisallback
Really?

Let me check my memory.

Radeon 9700 days no problem at all
Radeon 1650XT days no problem at all
HD3870~HD4870~HD5870 no problem at all
APU 6800K no problem

As a matter of fact, the screen corruption only happened on the Fury X due to aggressive power saving. But now it is all fixed.
Really? Because:
5750 - flickering
6870 - screen corruption
7870 - flickering
7970 - screen corruption/can't enable PowerPlay because loldriver crashes running Quake 3
 
Joined
Mar 18, 2008
Messages
5,717 (0.97/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Really? Because:
5750 - flickering
6870 - screen corruption
7870 - flickering
7970 - screen corruption/can't enable PowerPlay because loldriver crashes running Quake 3

I missed all the bad generations. :D
 

OneMoar

There is Always Moar
Joined
Apr 9, 2010
Messages
8,744 (1.71/day)
Location
Rochester area
System Name RPC MK2.5
Processor Ryzen 5800x
Motherboard Gigabyte Aorus Pro V2
Cooling Enermax ETX-T50RGB
Memory CL16 BL2K16G36C16U4RL 3600 1:1 micron e-die
Video Card(s) GIGABYTE RTX 3070 Ti GAMING OC
Storage ADATA SX8200PRO NVME 512GB, Intel 545s 500GBSSD, ADATA SU800 SSD, 3TB Spinner
Display(s) LG Ultra Gear 32 1440p 165hz Dell 1440p 75hz
Case Phanteks P300 /w 300A front panel conversion
Audio Device(s) onboard
Power Supply SeaSonic Focus+ Platinum 750W
Mouse Kone burst Pro
Keyboard EVGA Z15
Software Windows 11 +startisallback
I have personally never owned a post-4xx AMD card that hasn't had some sort of PowerPlay-related fuckup.
AMD's power management scheme is horrible and has been horrible for generations.
 
Joined
Jan 2, 2012
Messages
1,079 (0.24/day)
Location
Indonesia
Processor AMD Ryzen 7 5700X
Motherboard ASUS STRIX X570-E
Cooling NOCTUA NH-U12A
Memory G.Skill FlareX 32 GB (4 x 8 GB) DDR4-3200
Video Card(s) ASUS RTX 4070 DUAL
Storage 1 TB WD Black SN850X | 2 TB WD Blue SN570 | 10 TB WD Purple Pro
Display(s) LG 32QP880N 32"
Case Fractal Design Define R5 Black
Power Supply Seasonic Focus Gold 750W
Mouse Pulsar X2
Keyboard KIRA EXS
It's technically TressFX 3.0, which runs on DirectCompute and is open source. TressFX, as far as I can recall, has never shown favoritism toward either brand; on the other hand, HairWorks (NVIDIA's competing API, which is proprietary and closed source) caused a ruckus with Witcher 3's release because AMD cards had framerates fall >60% when it was enabled, compared to the GTX 980's 30% (see below, TressFX is far more friendly than HairWorks).


Correct me if I'm wrong but I think you posted that RotTR graph to counter the Hitman graph. The reason why AMD does well in Hitman and not so well in RotTR is because Hitman uses async compute which boosts AMD and hurts NVIDIA.

I'd be interested in seeing an image quality comparison on RotTR too. AMD cards could easily be simulating more hair than NVIDIA cards do, explaining the gap in performance. The tessellation effects (which TressFX uses) can be overridden in graphics drivers. It probably should have just been disabled for all cards because, when enabled, it inflicts a major penalty on framerate. It's better to isolate TressFX in its own test.


Here's a benchmark that isolates PureHair:
http://www.hardocp.com/article/2016..._graphics_features_performance/6#.V0yKnI-cFaQ

FPS with it "off" relative to "on" (i.e., turning it off gains about 7.4%):
GTX 980 Ti: 107.395%
R9 390X: 107.383%

The cost is damn near identical, as it should be.


Looking back at some history, the Crystal Engine has always favored NVIDIA:
Tomb Raider - http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-titan-x-review,15.html
Deus Ex: Human Revolution - http://www.tomshardware.com/reviews/deus-ex-human-revolution-performance-benchmark,3012-6.html

Background: Square Enix Montreal started working on DXHR before Tomb Raider started production. Deus Ex: Mankind Divided is actually switching from the Crystal Engine of Tomb Raider (DX11)/Rise of the Tomb Raider (DX12) to the same engine Hitman Absolution (DX11) used and Hitman 2016 (DX12 + async) is using, which heavily favors AMD. Ironic, isn't it?

I do remember that enabling TressFX made my FPS tank back when I was still using a GTX 560 Ti; it took about 2 weeks until NVIDIA released a new driver to improve TressFX performance.
They even acknowledged the performance issues: http://techreport.com/news/24463/nvidia-acknowledges-tomb-raider-performance-issues

And about Hairworks in Witcher 3, if you look at this review: http://www.hardocp.com/article/2015/08/25/witcher_3_wild_hunt_gameplay_performance_review/6
it's actually not that bad for AMD cards (certainly not a >60% FPS drop). The worst drops in average FPS among those cards are the GTX 960 (46.8%) for NVIDIA and the R9 390 (44.5%) for AMD.

Quick recap and comparison (average FPS drop, HairWorks OFF vs ON, in %) based on the HardOCP article:

TitanX drop by 10.76 %
980Ti drop by 19.96 %
FuryX drop by 24 %

980 drop by 34.16 %
Fury drop by 27.47 % (less FPS drop compared to 980)
R9 390X drop by 37.52 %

970 drop by 41.69 %
R9 390 drop by 44.50 %

960 drop by 46.83 %
R9 380 drop by 44.04 % (less FPS drop compared to 960)

No big difference in FPS drop percentage except for TitanX vs FuryX (13.24%), but those aren't directly comparable cards price-wise ($1000 vs $650).
FuryX vs 980Ti is separated by a small margin (4.04%), the Fury managed to do better than the 980, and 980 vs Fury as well as 970 vs 390 are also separated by small margins (less than 5%).
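
If anyone wants to sanity-check the percentages, the drop is just (FPS off − FPS on) / FPS off. A quick sketch; the FPS pairs below are hypothetical stand-ins, not HardOCP's raw averages:

```python
# Drop% = (FPS with HairWorks off - FPS with it on) / FPS off * 100.
# The FPS pairs are hypothetical stand-ins, not HardOCP's data.
cards = {
    "GTX 980 Ti": (75.0, 60.0),  # (HairWorks off, HairWorks on)
    "R9 390":     (61.0, 33.9),
}
for name, (fps_off, fps_on) in cards.items():
    drop = (fps_off - fps_on) / fps_off * 100.0
    print(f"{name}: {drop:.2f}% average FPS drop with HairWorks on")
```

Plugging in pairs like those reproduces drops in the same ballpark as the list above (~20% for the 980 Ti, ~44% for the R9 390).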
 
Joined
Sep 12, 2015
Messages
413 (0.13/day)
Location
Corn field in Iowa
System Name Vellinious
Processor i7 6950X
Motherboard ASUS X99-A II
Cooling Custom Liquid
Memory 32GB GSkill TridentZ 3200 14
Video Card(s) 2 x EVGA GTX 1080 FTW
Storage 512 GB Samsung 950 Pro, 120GB Kingston Hyper X SSD, 2 x 1TB WD Caviar Black
Case Thermaltake Core X9, stacked
Power Supply EVGA SuperNova 1000P2, EVGA SuperNova 750G2
Mouse Razer Naga Molten Edition
Keyboard TT eSports Challenger Ultimate
Benchmark Scores Timespy-1080 SLI-15972
Bought an 8GB 290X to play with for a few months while I waited for the new tech to come out this summer. I've been pleasantly surprised by how well it overclocked and ran. It's been a fun card to play with.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I do remember that enabling TressFX made my FPS tank back when I was still using a GTX 560 Ti; it took about 2 weeks until NVIDIA released a new driver to improve TressFX performance.
They even acknowledged the performance issues: http://techreport.com/news/24463/nvidia-acknowledges-tomb-raider-performance-issues

And about Hairworks in Witcher 3, if you look at this review: http://www.hardocp.com/article/2015/08/25/witcher_3_wild_hunt_gameplay_performance_review/6
it's actually not that bad for AMD cards (certainly not a >60% FPS drop). The worst drops in average FPS among those cards are the GTX 960 (46.8%) for NVIDIA and the R9 390 (44.5%) for AMD.

Quick recap and comparison (average FPS drop, HairWorks OFF vs ON, in %) based on the HardOCP article:

TitanX drop by 10.76 %
980Ti drop by 19.96 %
FuryX drop by 24 %

980 drop by 34.16 %
Fury drop by 27.47 % (less FPS drop compared to 980)
R9 390X drop by 37.52 %

970 drop by 41.69 %
R9 390 drop by 44.50 %

960 drop by 46.83 %
R9 380 drop by 44.04 % (less FPS drop compared to 960)

No big difference in FPS drop percentage except for TitanX vs FuryX (13.24%), but those aren't directly comparable cards price-wise ($1000 vs $650).
FuryX vs 980Ti is separated by a small margin (4.04%), the Fury managed to do better than the 980, and 980 vs Fury as well as 970 vs 390 are also separated by small margins (less than 5%).
If you look at the link, it doesn't even mention TressFX. It just says that NVIDIA had late access to the game, so they couldn't preempt the title with optimized drivers.

Doing more research, it appears that Tomb Raider on NVIDIA had a wide gamut of problems with or without TressFX enabled.


All of your numbers do paint a clear bias in favor of NVIDIA cards with HairWorks, where TressFX doesn't have that bias; additionally, TressFX has a much lower cost than HairWorks. Knowing both of those facts, why would anyone use HairWorks over TressFX?
 
Joined
Oct 8, 2012
Messages
1,445 (0.34/day)
Location
Israel
Processor AMD Ryzen 7 5800X
Motherboard B550 Aorus PRO V2
Cooling Corsair H115i RGB Platinum
Memory Gskill Trident Z Neo 3600 2x16GB
Video Card(s) MSI RTX 3070 Trio X
Storage WD Blue SN550 1TB NVME/Supertalent Teranova 2x1TB
Display(s) Acer XV340CK
Case Corsair 4000D Airflow
Power Supply Corsair HX850i
Mouse Logitech G502 Lightspeed
Keyboard Logitech G513 Tactile
Software Windows 10 Pro
Really? Because:
5750 - flickering
6870 - screen corruption
7870 - flickering
7970 - screen corruption/can't enable PowerPlay because loldriver crashes running Quake 3
I have gone through two 7970s and two 7950s and never had those issues.
 
Joined
Nov 2, 2013
Messages
461 (0.12/day)
System Name Auriga
Processor Ryzen 7950X3D w/ aquacomputer cuplex kryos NEXT with VISION - acrylic/nickel
Motherboard Asus ROG Strix X670E-E Gaming WiFi
Cooling Alphacool Res/D5 Combo •• Corsair XR7 480mm + Black Ice Nemesis 360GTS radiators •• 7xNF-A12 chromax
Memory 2x 32GB G.Skill Trident Z5 Neo RGB @ 6200MHz, 30-40-40-28, 1.35V (F5-6000J3040G32GX2-TZ5NR)
Video Card(s) MSI RTX 4090 Suprim Liquid X w/ Bykski waterblock
Storage 2TB WD Black SN850X ••• 2TB Corsair M510 ••• 40TB QNAP NAS via SFP+ NIC
Display(s) Alienware AW3423DWF (3440x1440, 10-bit @ 139Hz)
Case Thermaltake Core P8
Power Supply Corsair AX1600i
Mouse Razer Viper V2 Pro (FPS games) + Logitech MX Master 2S (everything else)
Keyboard Keycult No2 rev 1 w/Amber Alps and TX stabilizers on a steel plate. DCS 9009 WYSE keycaps
Software W10 X64 Pro
Benchmark Scores https://valid.x86.fr/c3rxw7
I regret paying almost 500 for my nano months ago

Here is an analogy: I paid $60,000 (USD) for a 2016 Audi A6 several months ago, but the new 2017 A6 and Benz E-Class are WAY better than my car in terms of performance and features. Do I cry about it? No. I needed a car then, so I got the car. You needed your GPU then, so you got your GPU.

Companies have to make something every year to stay in business. It's a fact of life; deal with it. I knew that a better model for the same money would come out some months down the road. You should have known that faster GPUs were coming in the summer; pretty sure everyone knew. Now stop crying and go enjoy your card.
 
Joined
Jun 10, 2005
Messages
1,775 (0.26/day)
Location
Singapore
System Name Half-fucked overclockedd
Processor Intel Core i7 2600k 3.40Ghz @ 4.20Ghz
Motherboard Gigabyte P67 UD7 B3
Cooling Antec Kuhler H2O 920
Memory G.Skill RipjawsX DDR3 8GB X2 1866Mhz (Model F3-2133C9D-16GXH)
Video Card(s) Gigabyte AORUS 1080Ti Extreme Edition
Storage Samsung 840 Pro 256GB / Western Digital Black Cavier 2TB X2
Display(s) Dell U2715H 2560X1440
Case NZXT Phantom
Audio Device(s) Creative Sound Blaster Recon3D Fatal1ty Professional
Power Supply Cooler Master Silent Pro Gold 1000W
Mouse Logitech G510
Keyboard Tesoro Excalibur Spectrum
Software Microsoft Windows 10 Professional
I have major flickering issues on R9 290 Crossfire playing TW3 / Dragon Age: Inquisition. Heck, it annoys the heck out of me when I stop and stare at the environment, only to see mountains flickering in and out :shadedshu::shadedshu:

Been a fan of AMD from the 9800 PRO till now, even urging all my friends and neighbors to go AMD. But when Nvidia released the GTX 1080...... it makes me cringe, wanting to jump to the green camp :banghead::banghead:
 
Joined
Jun 28, 2008
Messages
1,109 (0.19/day)
Location
Greenville, NC
System Name Champ's 1440P Rig
Processor Intel i7-4770K @ 4.6 GHz
Motherboard AsRock Z97 Extreme6
Cooling Corsair H60
Memory Corsair Vengeance 16GB 1600 Mhz 4x4 Blue Ram
Video Card(s) Nvidia 1080 FE
Storage Samsung 840 Evo 256 GB/RAID 0 Western Digital Blue 1 TB HDDs
Display(s) Acer XG270HU
Case Antec P100
Power Supply Corsair CX850M
Mouse Logitech G502
Keyboard TT eSports Poseidon
Software Windows 10
I have major flickering issues on R9 290 Crossfire playing TW3 / Dragon Age: Inquisition. Heck, it annoys the heck out of me when I stop and stare at the environment, only to see mountains flickering in and out :shadedshu::shadedshu:

Been a fan of AMD from the 9800 PRO till now, even urging all my friends and neighbors to go AMD. But when Nvidia released the GTX 1080...... it makes me cringe, wanting to jump to the green camp :banghead::banghead:

I did also. I thought it was my board at first.
 