
Gigabyte R9 380 G1 Gaming 4GB Constantly Throttling [solved]

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.12/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Well, you ignored my post. Again: the R9 380 simply isn't a good card for resolutions below 1920x1080. I would change the monitor, use VSR, or get a GTX 960, which is less (or not at all) CPU limited at the low resolution you game at. Again: AMD's driver is single threaded while NV's uses two threads, so NV cards are far less CPU limited; this is just on top of what you already know, and you shouldn't ignore it. AMD cards are good all in all, and the drivers are good enough too; this is rather a specific problem with your PC plus the low resolution. Btw, I don't think it's the hard drive. You only lose FPS to a hard drive if RAM has to be refilled from it mid-game, and that's rarely the case, though maybe more in Skyrim than in other games.

PS. A card that's bottlenecked by the CPU will clock down, because it's bored by the low resolution and starved of data from the CPU. The lower the resolution, the stronger the CPU's single-threaded performance must be, for Radeon drivers at least, to not bottleneck the GPU. You use an i5 2400 @ 3.8 GHz at 1440x900, which is basically the opposite of what Radeon drivers like. Optimal with an R9 380 would be 1080p or 1440p with the same CPU or a slightly stronger one. The resolution alone makes a big difference: driving 1920x1080 needs so much more graphics power than 1440x900 that the bottleneck would shift from CPU to GPU, and I don't think (with the fixed BIOS and all) that the GPU would still clock down then.
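A toy way to picture the bottleneck shift described above (all numbers invented for illustration, not measured): per-frame CPU cost is roughly fixed, GPU cost scales with pixel count, and the slower of the two stages sets the frame time.

```python
# Toy frame-time model of a CPU vs GPU bottleneck (illustrative numbers only).
def frame_time_ms(width, height, cpu_ms=10.0, gpu_ms_1080p=12.0):
    """Return (frame time in ms, fraction of the frame the GPU is busy)."""
    # GPU cost assumed proportional to pixel count, normalized to 1080p.
    gpu_ms = gpu_ms_1080p * (width * height) / (1920 * 1080)
    # The slower stage (CPU submission or GPU rendering) sets the pace.
    ft = max(cpu_ms, gpu_ms)
    return ft, gpu_ms / ft

for w, h in [(1440, 900), (1920, 1080), (2560, 1440)]:
    ft, busy = frame_time_ms(w, h)
    print(f"{w}x{h}: {1000 / ft:.0f} FPS, GPU busy ~{busy:.0%}")
```

With these made-up costs the GPU sits only ~75% busy at 1440x900 (CPU bound, so it idles and may downclock) and 100% busy from 1080p upward, which is the shift the post describes.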
 
Joined
Jan 13, 2016
Messages
660 (0.22/day)
Location
127.0.0.1, London, UK
System Name Warranty Void Mk.IV
Processor AMD Ryzen 5 5600
Motherboard Asus X470-I Strix
Cooling Arctic Freezer 240 + 2x Be Quiet! Pure Wings 2 140mm / Silverstone 120mm Slim
Memory Crucial Ballistix Elite 3600MHz 2x8GB CL16 - Tightened Sub-timings
Video Card(s) EVGA RTX 2080 XC Ultra
Storage WD SN550 / MX300 / MX500
Display(s) AOC CU34G2 / LG 29UM69G-B - Auxilary
Case CM NR200p / Silverstone RVZ03-B
Audio Device(s) Realtek ALC 1220+SupremeFX
Power Supply Corsair CX550M 550W / Silverstone SX650-G 650W
Mouse Logitech G302/G502/G203 / RPG: Corsair Nightsword
Keyboard CM Masterkeys Pro M / Asus Sagaris GK100
VR HMD Oculus Rift S
Software Windows 10 Pro x64 - LTSB
Well, you ignored my post. Again: the R9 380 simply isn't a good card for resolutions below 1920x1080. I would change the monitor, use VSR, or get a GTX 960, which is less (or not at all) CPU limited at the low resolution you game at. Again: AMD's driver is single threaded while NV's uses two threads, so NV cards are far less CPU limited; this is just on top of what you already know, and you shouldn't ignore it. AMD cards are good all in all, and the drivers are good enough too; this is rather a specific problem with your PC plus the low resolution. Btw, I don't think it's the hard drive. You only lose FPS to a hard drive if RAM has to be refilled from it mid-game, and that's rarely the case, though maybe more in Skyrim than in other games.

PS. A card that's bottlenecked by the CPU will clock down, because it's bored by the low resolution and starved of data from the CPU. The lower the resolution, the stronger the CPU's single-threaded performance must be, for Radeon drivers at least, to not bottleneck the GPU. You use an i5 2400 @ 3.8 GHz at 1440x900, which is basically the opposite of what Radeon drivers like. Optimal with an R9 380 would be 1080p or 1440p with the same CPU or a slightly stronger one. The resolution alone makes a big difference: driving 1920x1080 needs so much more graphics power than 1440x900 that the bottleneck would shift from CPU to GPU, and I don't think (with the fixed BIOS and all) that the GPU would still clock down then.

I haven't seen a single piece of proof that lower resolutions make performance worse than it could be, and I've never had an issue with this. Unless you are referring to what I was experiencing with the GPU usage constantly going down. Even if I forced max clocks with RadeonPro, the stutters happen and there's no getting around that.

You almost hit the nail on the head though: resolution does affect VRAM usage.

Also, AMD and Nvidia GPUs using different numbers of CPU threads? I have never in my days come across anyone discussing this. Please elaborate; maybe I am just stupid and don't know what you are talking about.

I do know that Radeon GPUs tend to run much better at higher resolutions than GeForce ones, but that's where the differences end for me.

You forget that a lot of what I run uses graphical modifications that already tax the GPU to 100%, but there are more "stressful" locations for the camera to point at, which in turn cause big drops in frames. Raising the resolution would push me ever closer to the VRAM limit, because I am already at 3.2 GB usage.
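As a rough sanity check on how much extra VRAM a resolution bump itself actually costs (the 32 bytes/pixel figure below is an assumed lump sum for color, depth, and a few post-processing targets; textures, which usually dominate, don't scale with resolution at all):

```python
# Back-of-envelope estimate: only render targets grow with resolution.
def render_target_mb(width, height, bytes_per_pixel=32):
    # bytes_per_pixel is an assumed lump for all per-pixel buffers combined.
    return width * height * bytes_per_pixel / 1024**2

low = render_target_mb(1440, 900)
high = render_target_mb(1920, 1080)
print(f"1440x900: ~{low:.0f} MB, 1920x1080: ~{high:.0f} MB, extra: ~{high - low:.0f} MB")
```

Even with generous assumptions, the per-pixel buffers only grow by a few tens of MB; a climb from 3.2 GB toward the 4 GB limit would mostly come from mods streaming higher-resolution assets rather than from the framebuffer itself.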

The issue was really prominent in Skyrim because I had a bad configuration for my stability mod, which made the VRAM usage and GPU load go all over the place. The mod in question loads assets into allocated VRAM instead of RAM, which gives a large performance boost. Read about "ENBoost" if you want to know more about how it works.

For other games, yes, I can raise the resolution, but there's no point since I am already running post-processing that removes, or at least blurs, jaggies. I do know that DSR/VSR could be more cost-effective for performance.

I am still new to the notion of using a Radeon, so I am still trying to tailor everything to my specific setup, using what other people put out as guidelines; and even then, if I find something sketchy, I will test it myself just in case.

Btw, I am running my CPU at stock (the BCLK OC is only for benchmarks), because the performance improvement in games is 0-3 FPS, which doesn't help much, nor does it help stability, since I am already running at a 59 FPS cap in most of the games I have tried. Going back to the resolution setting: the performance drop that comes with upping the pixel count means lower frame rates, plus I have to account for the lag spikes that come from the game engines themselves.

Edit: Fallout 4 with everything at Ultra, 1680x1050, except God Rays set to Off and Shadows at Medium, gives me about 55 FPS on average. Don't ask me why that resolution; Fallout 4 doesn't support my aspect ratio, so I only get to choose between this and some other, lower resolution. I'm not even going to bother with the launcher because it resets the settings. Maybe putting the resolution directly into the .ini files would work; I'll have to look that up later. I don't really like playing at anything higher than the monitor's native resolution because of performance issues, but maybe this isn't the case for the 380 4GB. I know it has so much untapped power, but it's just not being used, and upping the resolution only goes so far...
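For what it's worth, the launcher-independent route usually means editing the prefs file directly; assuming a standard install, the relevant entries look roughly like this (path and key names from memory, so verify against your own file before relying on them):

```ini
; Documents\My Games\Fallout4\Fallout4Prefs.ini -- back it up first
[Display]
iSize W=1680
iSize H=1050
```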

That's why I use post-processing to get the most out of GPUs, and I've been doing it this way since 2009.

Even though there are multiple places where people document that 16x AF has almost no performance impact when forced through the driver, there's still one thing to remember: you're applying it to an old DX9 title that doesn't even have proper multithreading. So you're adding extra load to something that is already cursed to run badly. This mostly applies to game engines left unoptimized by the developers that use them (Real Virtuality and NetImmerse; why NetImmerse and not any other engine Bethesda Game Studios uses? Because Gamebryo and the Creation Engine are, at their core, essentially the engine that dates back to 2002 and the original Xbox release of Morrowind, just heavily modified to suit their needs).

So tell me again why I should put extra workload on something that is trying its best to run at 100% but just doesn't, when the game engine likes to just stop and think for a second.

My GTX 660 could run something like BioShock Infinite on Ultra at 1920x1080 easily. I didn't need any AA or AF since the game has so much bloom that it blurs everything out anyway. Pretty much every non-open-world game ran fine then and runs fine now.

I am not going back to edit all my posts; I'm keeping them for reference. There are 4-6 games in my library that run like absolute crap on any GPU. For the others, the GPU works as intended (as far as I can tell) now that I've figured out the fan issue I was having before.

Seems like I was just reassuring myself that developers or user error are to blame, and not the hardware, most of the time at least. I have to remember that I am using something closer to a 7950 than a 7970, but that's where the boost clocks should come in.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.25/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
GPU-Z seems to report that I am still using the F1 BIOS (015.049.000.009.000000) even after a successful flash to the F51 BIOS with ATIFlash.

The Revision is not the BIOS version; it is the revision of the GPU on the card. It will always say F1, and there is no way to change that.
 
Joined Jan 13, 2016
The Revision is not the BIOS version; it is the revision of the GPU on the card. It will always say F1, and there is no way to change that.
Ah, thanks for clarifying that; I got a little confused. On Nvidia it seemed to mirror whatever the BIOS version/revision was.
 

Kanan
Also, AMD and Nvidia GPUs using different numbers of CPU threads? I have never in my days come across anyone discussing this. Please elaborate; maybe I am just stupid and don't know what you are talking about.
The reason AMD GPUs scale better with resolution has to do with this, and with the fact that they are simply better suited to higher resolutions (high bandwidth, more VRAM, Fiji aside). I've read here and in reviews over and over again that AMD drivers are pretty limited in DX11 applications because the driver only uses one core while the NV driver uses two, so a higher resolution and a strong-IPC processor help reduce this problem on AMD GPUs. That's why Intel CPUs are better for AMD GPUs; as dumb as it sounds, AMD CPUs and GPUs aren't such a good combination. That's why Fiji is only really on par with the 980 Ti at 4K, and why the 290/390(X) are better suited to 1440p: they reach higher usage there, whereas NV GPUs just run at 100% all the time. AMD GPUs only do that if you can max them out, either with a very demanding game (like Crysis 3 at 1080p and up) or simply with a high resolution. They can close in on the 980 Ti / Titan X because of this.

Compare 1080p with 1440p and 4K, higher resolution "helps" AMD GPUs to close in:
https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html

I don't really like playing at anything higher than the monitor's native resolution because of performance issues, but maybe this isn't the case for the 380 4GB. I know it has so much untapped power, but it's just not being used, and upping the resolution only goes so far...
I thought you had the R9 380 4 GB from Gigabyte? Well, I wasn't aware that you play modded games; that makes things way more complicated than I thought.

Even if I forced max clocks with RadeonPro, the stutters happen and there's no getting around that.
If you run into a CPU bottleneck, increasing GPU clocks doesn't help at all. It's like having a car with 1000 PS and not enough fuel to drive it. That's why I thought your GPU was downclocking in the first place: it's bored, it needs more data, and it clocks down to save energy.

Btw, I am running my CPU at stock (the BCLK OC is only for benchmarks), because the performance improvement in games is 0-3 FPS, which doesn't help much, nor does it help stability, since I am already running at a 59 FPS cap in most of the games I have tried. Going back to the resolution setting: the performance drop that comes with upping the pixel count means lower frame rates, plus I have to account for the lag spikes that come from the game engines themselves.
I'd run the CPU at max overclock if I were you. Radeon GPUs need all the CPU power they can get, and your CPU isn't the strongest besides. If 3.8 GHz is its max OC, I'd use it; you don't have any handicaps in doing so, I guess? So I don't see a reason not to run it at max. 1-3 FPS doesn't sound like much, but it helps with minimum FPS too; don't underestimate it. Maybe you can do some benchmarking with the OC and without (in-game with Fraps, which puts out average, min, and max FPS), then you'll know for sure.
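The OC-vs-stock comparison suggested here is easy to make objective; a small sketch (the frametime logs below are invented placeholders for what a Fraps-style tool would record):

```python
# Compare two frametime logs (milliseconds per frame) by average, minimum,
# and 1%-low FPS instead of eyeballing the difference.
def fps_stats(frametimes_ms):
    fps = sorted(1000.0 / ft for ft in frametimes_ms)
    worst = fps[: max(1, len(fps) // 100)]  # slowest 1% of frames
    return {"avg": sum(fps) / len(fps), "min": fps[0], "low1": sum(worst) / len(worst)}

stock = [16.7, 16.9, 17.1, 33.0, 16.8] * 20       # occasional 33 ms stutter frame
overclocked = [16.3, 16.5, 16.6, 25.0, 16.4] * 20
for name, log in (("stock", stock), ("OC", overclocked)):
    s = fps_stats(log)
    print(f"{name}: avg {s['avg']:.1f} FPS, min {s['min']:.1f}, 1% low {s['low1']:.1f}")
```

The 1%-low figure is what usually moves with a CPU overclock even when the average barely changes, which is the "helps with minimum FPS" point above.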

You forget that a lot of what I run uses graphical modifications that already tax the GPU to 100%, but there are more "stressful" locations for the camera to point at, which in turn cause big drops in frames.
Well, what I said was directed at games that have problems reaching 100% GPU usage; that can only be because of a CPU bottleneck or because the game simply isn't demanding enough, but in the latter case you wouldn't have the FPS problems you've encountered.

PS. The whole reason AMD did Mantle was better utilization of their GPUs, because they tend to be underused in DX11 games due to how their GPUs are architected compared to NV's. The reason they improve so much more going from DX11 to DX12 is the same, whereas for NV not much changes from DX11 to DX12; their GPUs are already utilized very well under DX11.

R9 390X better than GTX 980:
http://www.pcgameshardware.de/DirectX-12-Software-255525/Specials/Spiele-Benchmark-1172196/
 
Joined Jan 13, 2016
The reason AMD GPUs scale better with resolution has to do with this, and with the fact that they are simply better suited to higher resolutions (high bandwidth, more VRAM, Fiji aside). I've read here and in reviews over and over again that AMD drivers are pretty limited in DX11 applications because the driver only uses one core while the NV driver uses two, so a higher resolution and a strong-IPC processor help reduce this problem on AMD GPUs. That's why Intel CPUs are better for AMD GPUs; as dumb as it sounds, AMD CPUs and GPUs aren't such a good combination. That's why Fiji is only really on par with the 980 Ti at 4K, and why the 290/390(X) are better suited to 1440p: they reach higher usage there, whereas NV GPUs just run at 100% all the time. AMD GPUs only do that if you can max them out, either with a very demanding game (like Crysis 3 at 1080p and up) or simply with a high resolution. They can close in on the 980 Ti / Titan X because of this.

Compare 1080p with 1440p and 4K, higher resolution "helps" AMD GPUs to close in:
https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html
While I seem to agree with this, the gaps would be much smaller with an R9 380 and wouldn't make much difference.

I thought you had the R9 380 4 GB from Gigabyte? Well, I wasn't aware that you play modded games; that makes things way more complicated than I thought.
I owned a GTX 660 before; for 100 euro that was a good deal, but it couldn't run Fallout 4 on Ultra, and I knew that before the game ever came out.

If you run into a CPU bottleneck, increasing GPU clocks doesn't help at all. It's like having a car with 1000 PS and not enough fuel to drive it. That's why I thought your GPU was downclocking in the first place: it's bored, it needs more data, and it clocks down to save energy.
I'd run the CPU at max overclock if I were you. Radeon GPUs need all the CPU power they can get, and your CPU isn't the strongest besides. If 3.8 GHz is its max OC, I'd use it; you don't have any handicaps in doing so, I guess? So I don't see a reason not to run it at max. 1-3 FPS doesn't sound like much, but it helps with minimum FPS too; don't underestimate it. Maybe you can do some benchmarking with the OC and without (in-game with Fraps, which puts out average, min, and max FPS), then you'll know for sure.
Well, the CPU cores are nowhere near 70% usage in sustained gaming at stock clocks. So I figure I do not need the OC unless it's a single-threaded game, and even then the performance increase wouldn't be as much as I'd want; plus I'd have to run this OC all the time, not per game or via some sort of profile.

I'll keep all that in mind. My i5-2400 destroys any AMD CPU/APU offering in single-threaded work; that's why I switched from a Phenom X6 1605T @ 4 GHz. Plus the Phenom was power hungry and the VRMs on the motherboard got pretty hot; with the Intel setup I have no such problems, and I was even able to scale down to a smaller case. I could overclock the CPU as much as I wanted, but the FPS gains were in the 0.1 range.

Going from the AMD CPU gave a huge boost, around +20 FPS. I didn't trust the tech sites at the time because it seemed so unreal that the 1st and 2nd generation Intel Core parts had such a lead over anything AMD had to offer.

Arma, Dying Light, Skyrim, Fallout, and Battlefield 3, to name a few, ran a lot better and smoother.

Well, what I said was directed at games that have problems reaching 100% GPU usage; that can only be because of a CPU bottleneck or because the game simply isn't demanding enough, but in the latter case you wouldn't have the FPS problems you've encountered.
I think I read that as such anyway, no worries. :D
 

Kanan
Well, the CPU cores are nowhere near 70% usage in sustained gaming at stock clocks. So I figure I do not need the OC unless it's a single-threaded game, and even then the performance increase wouldn't be as much as I'd want; plus I'd have to run this OC all the time, not per game or via some sort of profile.
70% is fine, but could still not be enough for the GPU. Don't forget that "70% CPU usage" is across all cores, while the AMD driver only uses one core, so a stronger core may help the GPU; what's really important is how high the usage is on core #0. I'd really do some real game benchmarks if I were you, with the OC and without. Can you not OC the CPU without it always running at max clock? If not, I understand your problem; I'd use software overclocking then, easier than always restarting the PC and doing it in the BIOS.
 
Joined Jan 13, 2016
70% is fine, but could still not be enough for the GPU. Don't forget that "70% CPU usage" is across all cores, while the AMD driver only uses one core, so a stronger core may help the GPU; what's really important is how high the usage is on core #0. I'd really do some real game benchmarks if I were you, with the OC and without. Can you not OC the CPU without it always running at max clock? If not, I understand your problem; I'd use software overclocking then, easier than always restarting the PC and doing it in the BIOS.
I can force max clocks with the "High Performance" power plan in Windows; that makes the processor run at its highest turbo or non-turbo frequency and unparks all cores. Software OC is really flaky on non-K parts, so I'm out of luck there.

I might do some game runs to test and monitor overall system usage, and see what FPS the frequency gains provide when the CPU is at its maximum.

At the time I bought my processor, the price of an i5 2500K was something I couldn't take a bite at. At 4.5 GHz it is significantly more powerful than the other i5s of that era; with that overclock it still beats next-gen i5s, and then some.
 
Joined
Jan 16, 2016
Messages
16 (0.01/day)
WTF is Gigabyte waiting for to fix this problem? The G1s were supposed to be top of the line.
 
Joined Jan 13, 2016
WTF is Gigabyte waiting for to fix this problem? The G1s were supposed to be top of the line.
That title just went to their "XTREME" series cards. Besides, I know Gigabyte for dropping support for some products after less than a year. This is even more true when they release a product as a new revision, which they are famous for doing a lot; my own motherboard has a second revision that got better updates.

Back to topic edit:
I went into the BIOS, disabled C6 states and Turbo Boost, allowed real-time OS clock changes, left most of the voltage options on Auto, and set the core frequency multiplier to x38. Currently at 3.8 GHz and running games with increases of up to 5 FPS in most cases where my max FPS is below 59. The Skyrim and Fallout 4 stutter hasn't changed at all; only average frame rates are better in scenes that run below 50 FPS.
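As a cross-check on the multiplier math (stock 100 MHz base clock assumed; on non-K Sandy Bridge the multiplier, not the BCLK, is the only practical knob):

```python
# Sandy Bridge core clock is simply BCLK x multiplier.
bclk_mhz = 100.0  # stock base clock, effectively fixed on non-K parts
# i5-2400 base multiplier, stock max turbo, and the x38 cap used above
for mult in (31, 34, 38):
    print(f"x{mult}: {bclk_mhz * mult / 1000:.1f} GHz")
```

Which is why the x38 setting lands exactly on 3.8 GHz.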

Fallout, instead of averaging 45-59 FPS, is now at 50-59, which is slightly smoother.

I am going to try to find a way to force turbo clocks of 3.8 GHz for cores 0 and 1 and leave cores 2 and 3 unchanged.

Edit 2: The really strange thing is that Windows reported the max core clock as 5.8 GHz, which is insane; at one point, while still booting into the desktop, the reading did go that high. For a second there I thought I was going to shut off the PC immediately, but it seems to be completely fine. I'm not running this 24/7 though. :roll:

AIDA64 seems to be very slow at reporting current CPU clock speeds, so I am using Task Manager to monitor that change; what I'm really interested in is the voltage the CPU is running at.

Task manager, you had ONE job.

Never Trust Task Manager #NTTM :nutkick: Or Micro$oft for that matter...

1.28 V on the CPU core is a little too much for 3.6/3.8 GHz... probably... and running two cores at 3.8 turbo doesn't seem such a good idea. Maybe I should create a separate thread for i5-2400 overclocking and stability.

--==All cores, non-Turbo x38 mult.==--
Core voltage: 1.28-1.308 V
CPU VID: reported as 1.35 V
Core temp @ idle: 37°C - Core temp @ load: 60°C

--==Cores 0, 1 Turbo x38 mult.==--
Core voltage: 1.28-1.29 V
CPU VID: reported as 1.35 V
Core temp @ idle: 34°C - Core temp @ load: 53°C

This is why I love Sandy Bridge CPUs so much (I always wanted one but didn't have enough money for that top-of-the-line i5); even the non-K offerings are pretty decent. I got my i5-2400 from a guy on the other side of the country; I can only assume the CPU saw only office use and no gaming.

I haven't messed with the base clock; it's best to leave it at 100 MHz for stability reasons. That clock ties into a lot of things, including the Intel DMI link, and under full load the south bridge already reaches high temps, which is very undesirable. Raising it for benchmarks might be okay, but for everyday use it's just better to leave the BCLK alone on non-K pre-Skylake CPUs.

I'm still holding my breath for good DirectX 12 capable titles; I bought the R9 380 for full DX12 support. Microsoft, stop making me upgrade everything when I don't want to!
 
Joined Jan 16, 2016
This thread is about the R9 380 G1; the CPU is not the problem.

How can I flash the F51 BIOS? I tried the Gigabyte flash app but I get an error.
 
Joined Jan 13, 2016
This thread is about the R9 380 G1; the CPU is not the problem.

How can I flash the F51 BIOS? I tried the Gigabyte flash app but I get an error.
Use ATIWinFlash instead; it's pretty self-explanatory. If you still get an error or an unsuccessful flash, it's better to create a new thread for that.
 

Kanan
That title just went to their "XTREME" series cards. Besides, Gigabyte is known for dropping support for some products after less than a year. This is even more true when they release a new board revision, which they do a lot; my own motherboard has a second revision that got better updates.

Back to topic edit:
I went into the BIOS, disabled C6 states and Turbo Boost, allowed real-time OS clock changes, left most of the voltage options on Auto, and set the core frequency multiplier to x38. Currently at 3.8GHz, games gain up to 5FPS in most cases where my maximum FPS is below 59. The stutter in Skyrim and Fallout 4 hasn't changed at all; only average frame rates are better in scenes that run below 50FPS.

Fallout 4 has gone from averaging 45-59FPS to 50-59FPS, which is slightly smoother.

I am going to try to find a way to force turbo clocks of 3.8GHz on Cores 0 and 1 while leaving Cores 2 and 3 unchanged.

Edit 2: The really strange thing is that Windows reported a max core clock of 5.8GHz, which is insane; at one point while booting into the desktop, the core clock supposedly went that high. For a second there I thought I was going to shut off the PC immediately, but it seems to be completely fine. I'm not running this 24/7 though. :roll:

AIDA64 seems to be very slow at reporting current CPU clock speeds, so I am using Task Manager to monitor that change; what I am really interested in is the voltage the CPU is running at.

Task manager, you had ONE job.

Never Trust Task Manager #NTTM :nutkick: Or Micro$oft for that matter...
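Since none of these tools agree, a crude cross-check helps: sample the reported clocks from whichever monitor you have, then flag the dips. A minimal Python sketch - the helper name and the 5% tolerance band are my own assumptions, not any monitoring tool's API:

```python
# Hypothetical helper: given a list of sampled core clocks in MHz
# (from Task Manager, AIDA64, HWiNFO, etc.), flag samples that fall
# below the expected sustained clock -- a crude throttle detector.
def find_throttle_events(samples_mhz, expected_mhz, tolerance=0.05):
    """Return (index, clock) pairs where the clock dropped more than
    `tolerance` (as a fraction) below the expected sustained clock."""
    floor = expected_mhz * (1.0 - tolerance)
    return [(i, mhz) for i, mhz in enumerate(samples_mhz) if mhz < floor]

# Example: an i5-2400 expected to hold 3.8 GHz on all cores.
samples = [3792, 3805, 3391, 3798, 3104, 3801]
events = find_throttle_events(samples, expected_mhz=3800)
print(events)  # -> [(2, 3391), (4, 3104)]: two dips below the 5% band
```

If a dip lines up with a voltage or temperature spike in the other tool's log, that is a better lead than trusting any single reading.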

A CPU core voltage of 1.28V is a little too much for 3.6/3.8GHz... probably... and running two cores at 3.8 turbo doesn't seem like such a good idea. Maybe I should create a separate thread for i5-2400 overclocking and stability.

--==All cores non Turbo x38 Mult.==--
Core Voltage: 1.28-1.308v
CPU VID, reported as 1.35v
Core temp @ Idle: 37c - Core temp @ Load: 60c

--==Core 0, 1 Turbo x38 Mult.==--
Core Voltage: 1.28-1.29v
CPU VID, reported as 1.35v
Core temp @ Idle: 34c - Core temp @ Load: 53c

This is why I love Sandy Bridge CPUs so much (I always wanted one but never had enough money for the top-of-the-line i5); even the non-K offerings are pretty decent. I got my i5-2400 from a guy on the other side of the country, and I can only assume the CPU saw nothing but office use, no gaming.

I haven't messed with the base clock; it's best to leave it at 100MHz for stability reasons. That clock ties into a lot of things through Intel's DMI, and under full load the south bridge already reaches high temperatures, which is undesirable. Raising it for benchmarks might be okay, but for everyday use it's better to leave the base clock alone on non-K pre-Skylake CPUs.

I'm still holding my breath for good DirectX 12 capable titles; I bought the R9 380 for full DX12 support. Microsoft, stop making me upgrade everything when I don't want to!
See, I told you it would help to overclock the CPU. ;) Though a clock that high can't be real - the PC would instantly crash and the CPU might die.
 
Joined
Jan 16, 2016
Messages
16 (0.01/day)
Use ATIWinFlash instead, it's pretty self-explanatory to use. If you're still getting an error or an unsuccessful flash, it's better to create a new thread for that.

Thanks, I flashed it. The throttling problem is still there, but the fan curve is less aggressive than before.
 
Joined
Jan 13, 2016
Messages
660 (0.22/day)
Location
127.0.0.1, London, UK
System Name Warranty Void Mk.IV
Processor AMD Ryzen 5 5600
Motherboard Asus X470-I Strix
Cooling Arctic Freezer 240 + 2x Be Quiet! Pure Wings 2 140mm / Silverstone 120mm Slim
Memory Crucial Ballistix Elite 3600MHz 2x8GB CL16 - Tightened Sub-timings
Video Card(s) EVGA RTX 2080 XC Ultra
Storage WD SN550 / MX300 / MX500
Display(s) AOC CU34G2 / LG 29UM69G-B - Auxiliary
Case CM NR200p / Silverstone RVZ03-B
Audio Device(s) Realtek ALC 1220+SupremeFX
Power Supply Corsair CX550M 550W / Silverstone SX650-G 650W
Mouse Logitech G302/G502/G203 / RPG: Corsair Nightsword
Keyboard CM Masterkeys Pro M / Asus Sagaris GK100
VR HMD Oculus Rift S
Software Windows 10 Pro x64 - LTSB
Thanks, I flashed it. The throttling problem is still there, but the fan curve is less aggressive than before.
Are there any specific games that seem to do this? Or is it on everything that taxes GPU?
 
Joined
Jan 16, 2016
Messages
16 (0.01/day)
AC Unity, Unigine Heaven and Valley throttle even with the power limit at +20% (with the stock power limit the throttling is even more noticeable); GTA V doesn't throttle with the +20% power limit.
 
Joined
Jan 13, 2016
Messages
660 (0.22/day)
Location
127.0.0.1, London, UK
System Name Warranty Void Mk.IV
Processor AMD Ryzen 5 5600
Motherboard Asus X470-I Strix
Cooling Arctic Freezer 240 + 2x Be Quiet! Pure Wings 2 140mm / Silverstone 120mm Slim
Memory Crucial Ballistix Elite 3600MHz 2x8GB CL16 - Tightened Sub-timings
Video Card(s) EVGA RTX 2080 XC Ultra
Storage WD SN550 / MX300 / MX500
Display(s) AOC CU34G2 / LG 29UM69G-B - Auxilary
Case CM NR200p / Silverstone RVZ03-B
Audio Device(s) Realtek ALC 1220+SupremeFX
Power Supply Corsair CX550M 550W / Silverstone SX650-G 650W
Mouse Logitech G302/G502/G203 / RPG: Corsair Nightsword
Keyboard CM Masterkeys Pro M / Asus Sagaris GK100
VR HMD Oculus Rift S
Software Windows 10 Pro x64 - LTSB
AC Unity, Unigine Heaven and Valley throttle even with the power limit at +20% (with the stock power limit the throttling is even more noticeable); GTA V doesn't throttle with the +20% power limit.
Benchmarks shouldn't do that, unless you haven't picked settings that are high enough. And AC Unity is just a badly written game, so you're probably stuck with running it like that unless there's some fix on the internet.
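To pin down exactly when and how hard the card drops clocks, GPU-Z's "Log to file" sensor log can be analyzed offline. A hedged sketch in Python - the column header "GPU Core Clock [MHz]" and the 970MHz floor are assumptions here; check the headers in your own log file, as they vary by GPU-Z version:

```python
import csv
import io

# Hedged sketch: compute what fraction of a GPU-Z style sensor log the
# core clock spent below a floor. Column header and floor are assumptions.
def clock_drop_ratio(log_text, clock_column="GPU Core Clock [MHz]", floor_mhz=970):
    rows = list(csv.reader(io.StringIO(log_text)))
    header = [h.strip() for h in rows[0]]   # headers may carry padding spaces
    idx = header.index(clock_column)
    clocks = [float(r[idx]) for r in rows[1:] if r and r[idx].strip()]
    below = sum(1 for c in clocks if c < floor_mhz)
    return below / len(clocks)

# Illustrative log excerpt, not real capture data.
sample_log = """Date, GPU Core Clock [MHz], GPU Temperature [C]
2016-01-20 20:00:01, 980.0, 71
2016-01-20 20:00:02, 925.0, 74
2016-01-20 20:00:03, 980.0, 74
2016-01-20 20:00:04, 860.0, 75
"""
print(clock_drop_ratio(sample_log))  # -> 0.5: half the run spent below 970 MHz
```

Running this over a Heaven loop versus a GTA V session would show in numbers what the +20% power limit slider is (or isn't) buying you.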
 
Joined
Jan 16, 2016
Messages
16 (0.01/day)
Benchmarks shouldn't do that, unless you haven't picked settings that are high enough. And AC Unity is just a badly written game, so you're probably stuck with running it like that unless there's some fix on the internet.

I tried an R9 285 in the same PC, in the same games and benchmarks at the same settings, and it didn't throttle a single time.
 
Last edited:

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.21/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Posting quickly on my way to work - my Gigabyte R9 290 has two separate BIOSes on it, the first being a 'silent' mode that throttles and the second being louder and faster.

Have you tried running the other stock BIOS to see how it behaves?
 
Joined
Jan 16, 2016
Messages
16 (0.01/day)
Posting quickly on my way to work - my Gigabyte R9 290 has two separate BIOSes on it, the first being a 'silent' mode that throttles and the second being louder and faster.

Have you tried running the other stock BIOS to see how it behaves?

There's no dual BIOS on the G1 Gaming.
 
Joined
Jun 24, 2010
Messages
278 (0.06/day)
System Name MSI GT72S 6QE
Processor Core i7 6820HK
Motherboard Intel Sunrise Point CM236
Cooling 2 fans
Memory 2x 8 GB SO-DIMM DDR4-RAM (2133 MHz)
Video Card(s) NVIDIA GeForce GTX 980M - 8192 MB
Storage 1 ssd 3 hard drives
Display(s) 17.3 inch 16:9, 1920x1080 pixel, LG Philips LP173WF4-SPF1 (LGD0469), IPS, Full HD
Case ??!!!
Audio Device(s) Realtek ALC899
Power Supply most beautiful brick you have ever seen
What a horror story this thread is. I personally stay away from Gigabyte boards, Nvidia or AMD alike.
Just curious: what was the price difference between the R9 380 and the GTX 960 in your country? How much did you save?
 
Joined
Jan 13, 2016
Messages
660 (0.22/day)
Location
127.0.0.1, London, UK
System Name Warranty Void Mk.IV
Processor AMD Ryzen 5 5600
Motherboard Asus X470-I Strix
Cooling Arctic Freezer 240 + 2x Be Quiet! Pure Wings 2 140mm / Silverstone 120mm Slim
Memory Crucial Ballistix Elite 3600MHz 2x8GB CL16 - Tightened Sub-timings
Video Card(s) EVGA RTX 2080 XC Ultra
Storage WD SN550 / MX300 / MX500
Display(s) AOC CU34G2 / LG 29UM69G-B - Auxiliary
Case CM NR200p / Silverstone RVZ03-B
Audio Device(s) Realtek ALC 1220+SupremeFX
Power Supply Corsair CX550M 550W / Silverstone SX650-G 650W
Mouse Logitech G302/G502/G203 / RPG: Corsair Nightsword
Keyboard CM Masterkeys Pro M / Asus Sagaris GK100
VR HMD Oculus Rift S
Software Windows 10 Pro x64 - LTSB
What a horror story this thread is. I personally stay away from Gigabyte boards, Nvidia or AMD alike.
Just curious: what was the price difference between the R9 380 and the GTX 960 in your country? How much did you save?
4GB GTX 960s range from 225-250 euros here, and R9 380s from 230-255 euros. I got mine for about 230; the price is now 247. I thought I was getting the better deal until I found out how the card performs - too little, too late. This thread is one of the main reasons I made an account on TPU.

I've been around computers all my life, and only now am I seeing anomalies like this; the race to regulate power usage has hurt this card more than it has helped.

A dual BIOS would have been nice, but even as a G1 board it lacks one. The differences between the Windforce and the G1 are a backplate, indicator LEDs for the fans, and a different fan and shroud configuration; it still retains two of the fans and the ability to stop the fans spinning completely when temperatures drop below roughly 45-50c.

The really good thing I can say about the card is that it's cool and quiet, and post-processing shaders run like a hot knife through butter. It isn't a bad card per se; it just has a few kinks here and there that leave a bad taste in your mouth.

I noticed that the heatsink looks almost identical to the one on the Gigabyte GTX 660 I owned until a few months ago.

The combination of the Radeon Crimson 16.1 hotfix driver for Windows 10 x64 and the F51 BIOS seems to have worked to some degree, but I am still unhappy that I had to deal with these issues for almost 3 weeks.
 
Last edited:
Joined
Aug 29, 2005
Messages
7,062 (1.04/day)
Location
Asked my ISP.... 0.0
System Name Lynni PS \ Lenowo TwinkPad T480
Processor AMD Ryzen 7 7700 Raphael \ i7-8550U Kaby Lake-R
Motherboard ASRock B650M PG Riptide Bios v. 2.02 AMD AGESA 1.1.0.0 \ Lenowo 20L60036MX Bios 1.47
Cooling Noctua NH-D15 Chromax.Black (Only middle fan) \ Lenowo WN-2
Memory G.Skill Flare X5 2x16GB DDR5 6000MHZ CL36-36-36-96 AMD EXPO \ Willk Elektronik 2x16GB 2666MHZ CL17
Video Card(s) Asus GeForce RTX™ 4070 Dual OC GPU: 2325-2355 MEM: 1462| Nvidia GeForce MX™ 150 2GB GDDR5 Micron
Storage Gigabyte M30 1TB|Sabrent Rocket 2TB| HDD: 10TB|1TB \ SKHynix 256GB 2242 3x2
Display(s) LG UltraGear 27GP850-B 1440p@165Hz | LG 48CX OLED 4K HDR | AUO 14" 1440p IPS
Case Fractal Design Meshify 2 Tempered Glass White/Black | Lenowo T480 chassis
Audio Device(s) Steelseries Arctis Pro Wireless
Power Supply Be Quiet! Pure Power 12 M 750W Goldie | 65W
Mouse Logitech G305 Lightspeedy Wireless | Lenowo TouchPad & Logitech G305
Keyboard Akko 3108 DS Horizon V2 Cream Yellow | T480 UK Lumi
Software Win11 Pro 23H2 UK
Benchmark Scores 3DMARK: https://www.3dmark.com/3dm/89434432? GPU-Z: https://www.techpowerup.com/gpuz/details/v3zbr
If the power target and GPU temperature are linked, and the card is set to prioritize the GPU temperature, you might find your problem there.

I noticed with my Asus GTX Titan that it would only hold its stock boost of 992MHz depending on the temperature, with its mediocre stock cooler even after a re-paste - but that was because it was hitting the temperature target, not the power target. As soon as I changed the prioritization to the power target, it kept the boost. :)
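The arbitration described above - clocks get pulled down by whichever prioritized limiter (power or temperature) is currently exceeded - can be sketched as a toy function. This is an illustration only, not AMD's or Nvidia's actual boost algorithm; the step sizes, targets, and floor are all made up:

```python
# Toy model of boost-limit arbitration: the requested clock is reduced by
# whichever prioritized target (power or temperature) is currently exceeded.
# Step sizes and floors are illustrative, not any vendor's real algorithm.
def effective_clock(requested_mhz, power_w, power_target_w, temp_c, temp_target_c,
                    prioritize="temperature", step_mhz=13):
    clock = requested_mhz
    if prioritize == "temperature" and temp_c > temp_target_c:
        clock -= step_mhz * (temp_c - temp_target_c)      # one bin per degree over
    elif prioritize == "power" and power_w > power_target_w:
        pct_over = (power_w / power_target_w - 1.0) * 100
        clock -= step_mhz * round(pct_over)               # one bin per percent over
    return max(clock, 300)  # never below an idle/2D floor

# Temp-prioritized card 5C over target: clocks drop despite power headroom.
print(effective_clock(980, power_w=170, power_target_w=190,
                      temp_c=85, temp_target_c=80))            # -> 915
# Re-prioritized to power, and under the power target: full boost is kept.
print(effective_clock(980, power_w=170, power_target_w=190,
                      temp_c=85, temp_target_c=80, prioritize="power"))  # -> 980
```

The second call is the Titan scenario above: once the power target is the limiter and there is headroom under it, the temperature excess no longer drags the clock down.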
 
Joined
Jan 13, 2016
Messages
660 (0.22/day)
Location
127.0.0.1, London, UK
System Name Warranty Void Mk.IV
Processor AMD Ryzen 5 5600
Motherboard Asus X470-I Strix
Cooling Arctic Freezer 240 + 2x Be Quiet! Pure Wings 2 140mm / Silverstone 120mm Slim
Memory Crucial Ballistix Elite 3600MHz 2x8GB CL16 - Tightened Sub-timings
Video Card(s) EVGA RTX 2080 XC Ultra
Storage WD SN550 / MX300 / MX500
Display(s) AOC CU34G2 / LG 29UM69G-B - Auxiliary
Case CM NR200p / Silverstone RVZ03-B
Audio Device(s) Realtek ALC 1220+SupremeFX
Power Supply Corsair CX550M 550W / Silverstone SX650-G 650W
Mouse Logitech G302/G502/G203 / RPG: Corsair Nightsword
Keyboard CM Masterkeys Pro M / Asus Sagaris GK100
VR HMD Oculus Rift S
Software Windows 10 Pro x64 - LTSB
If the power target and GPU temperature are linked, and the card is set to prioritize the GPU temperature, you might find your problem there.

I noticed with my Asus GTX Titan that it would only hold its stock boost of 992MHz depending on the temperature, with its mediocre stock cooler even after a re-paste - but that was because it was hitting the temperature target, not the power target. As soon as I changed the prioritization to the power target, it kept the boost. :)
Having the power target set higher than 0 seems to let it keep boost clocks for longer, but not long enough to matter. The games themselves run at below-average FPS for the resources they demand from the GPU. I wonder how The Witcher 3 runs - that game favors GeForce GPUs far more - and how far I could push it, but sadly I don't own it, so all I can do is trust benchmarks.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.12/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Having the power target set higher than 0 seems to let it keep boost clocks for longer, but not long enough to matter. The games themselves run at below-average FPS for the resources they demand from the GPU. I wonder how The Witcher 3 runs - that game favors GeForce GPUs far more - and how far I could push it, but sadly I don't own it, so all I can do is trust benchmarks.
In reviews it performs well enough. The GTX 960 isn't worth it; problems aside, you made the right choice getting an R9 380 - it is much more future-proof and simply performs better in almost all games. The GTX 960 is a budget design with 1024 shaders and a 128-bit bus sold at a premium. I don't think everyone has the same problems you're having with your R9 380; maybe your Gigabyte card is bad, or it's your particular situation with mods etc. - but it seems you fixed it, and that's good to hear.
 
Top