| Component | Part |
| --- | --- |
| Processor | Intel Core i5 10400F |
| Motherboard | Gigabyte B460M Aorus Pro |
| Memory | 2x8GB G.Skill Aegis 2666 MHz |
| Video Card(s) | PowerColor Red Dragon V2 RX 580 8GB 100 watt 1100 MHz core |
| Storage | 512GB WD Blue + 256GB WD Green |
| Case | Cooler Master Silencio S400 |
| Audio Device(s) | Topping D10 |
| Power Supply | Chieftec A90 550W (GDP-550C) |
| Mouse | Steel Series Rival 100 |
| Keyboard | Hama SL 570 |
| Software | Windows 10 Enterprise |
At least Radeon cards (GCN era) have a TDP, a small power limit, and a maximum power limit. They try not to deviate from the set wattage; even if there are spikes, they are very fast and average out to no more than the power limit. There's also an amperage (AMP) limiter. I can't really imagine any workload that could bypass so many protections, and that's just the vBIOS; there are probably some hardware protections too. I modify my RX 580's BIOS a lot (haven't modified it for more wattage yet) and there are lots of things to tweak (and fuck up). So no, it isn't (or at least wasn't) Furmark specifically that killed cards; cards are made to not allow any workload that demands more power than the manufacturer sets. Setting the wattage limit high lets the card run at full boost speed in Furmark and lets it consume far more power than it does in any game. But then again, the same effect can be observed in Unigine tests and likely other synthetic benches.

Furmark downclocks the card massively.
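To illustrate the "spikes average out" point: a power governor doesn't react to single instantaneous samples, it steers the clock so the *average* power over a sampling window stays at the limit. A minimal sketch of that idea (purely illustrative, not real vBIOS/firmware code; the 185 W limit and 13 MHz DPM step are made-up values):

```python
# Illustrative sketch of a windowed power governor. Brief spikes above the
# limit don't trigger throttling as long as the window average stays under it.

POWER_LIMIT_W = 185.0  # hypothetical board power limit
STEP_MHZ = 13          # hypothetical clock step between DPM states

def govern(samples_w, clock_mhz):
    """Return the adjusted core clock after observing a window of power samples."""
    avg = sum(samples_w) / len(samples_w)
    if avg > POWER_LIMIT_W:
        clock_mhz -= STEP_MHZ        # average exceeded the limit: throttle
    elif avg < POWER_LIMIT_W * 0.95:
        clock_mhz += STEP_MHZ        # comfortable headroom: boost back up
    return clock_mhz

# A brief spike to 220 W still averages out below 185 W, so no throttle:
print(govern([180, 220, 160, 170], 1340))  # → 1340
# A sustained 200 W average does trigger a step down:
print(govern([200, 210, 190, 200], 1340))  # → 1327
```

Real implementations are far more involved (multiple rails, current/temperature limits, voltage-frequency curves), but the averaging behavior is the reason fast transients don't count against the limit.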
At a 400 W power limit, a 3090 or 3080 that would normally run at 1920 MHz will run at around 1300 MHz in Furmark.
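A rough back-of-the-envelope model shows why a fixed power cap forces lower clocks in a denser workload. Assuming dynamic power scales roughly as f·V² with voltage scaling roughly linearly with frequency (so power ~ f³ at a given activity level), and treating Furmark's higher power draw per clock as a made-up "activity factor" (the 1.55 below is an assumption for illustration, not a measured value):

```python
# Toy model: power ≈ activity * (f / f_ref)^3 * P_ref.
# Solve for the clock f that just fits under a power cap.

def clock_at_cap(cap_w, ref_clock_mhz, ref_power_w, activity=1.0):
    """Highest clock (MHz) whose modeled power stays under cap_w."""
    return ref_clock_mhz * (cap_w / (activity * ref_power_w)) ** (1 / 3)

# A game that just fits 400 W at 1920 MHz keeps its full clock:
print(round(clock_at_cap(400, 1920, 400, activity=1.0)))   # → 1920
# A workload drawing ~1.55x the power per clock must clock down to fit:
print(round(clock_at_cap(400, 1920, 400, activity=1.55)))
```

The exact drop depends on the real voltage-frequency curve and how much denser the workload actually is; the point is only that the governor trades clocks for power, which is exactly the Furmark downclock described above.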
You can destroy the card if it doesn't.
Proof? Try renaming furmark.exe to Quake3.exe or UnrealTournament.exe
Don't blame me for what happens.