> They do say that the card is "optimized for every stage of this workflow". If the card had inferior performance in final game testing, then that statement is misleading.

Actually, look more closely at what they said. You are putting words in AMD's mouth. They basically said that devs can use this card from development through testing without having to switch out cards for the testing stage.