
5800x Limiting maximum boost Frequency, undervolting with PBO

Joined: Nov 15, 2021
Messages: 16 (0.01/day)
Processor AMD Ryzen 7 5800X
Motherboard Asus X570i
Cooling EKWB 240 AIO
Memory 2x8 GB 3200 MHz
Video Card(s) AMD Midnight Black 6800XT
Storage 970 evo, 980 pro, 840 evo
Display(s) DELL S2719DGF
Case iQunix ZX-1
Power Supply Corsair SF 750
Mouse Logitech G Pro Wireless
Keyboard Cooler Master Masterkeys Pro S
Was tempted to reply to one of the many existing 5800X undervolting/overclocking/PBO guides or discussions, but I have not seen this question before:
Is it currently possible to set a limit on PBO's maximum boost frequency? A manual negative frequency offset, if you will?

My 'issue' is the following: when using PBO, with the curve optimizer set manually (tested and everything) and PPT 110, TDC 75, EDC 95, in mixed-load usage (gaming, opening 20 new tabs/opening the browser, using CAD) the CPU will try a temporary boost to 4.8, sometimes even 4.85 GHz, with an SoC SVI2 voltage of 1.425-1.45 V, causing temps to go higher (85-89 °C) than in the Cinebench all-core synthetic load, where it settles at 4.55-4.6 GHz @ ~1.38 V. That results in an average CR20 score of roughly 6000 and a CR23 of around 15k, staying below 76 °C. I realize these voltages are the maximum read values, but I am no voltage expert; even with a polling period of 100 ms in HWiNFO the 1.38 V seems relatively constant.
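For what it's worth, here is roughly how I sanity-check whether that 1.38 V is really constant or just a transient maximum: I export the HWiNFO sensor log to CSV and look at the distribution instead of the max. A minimal sketch of that idea only; the file name and the column header are assumptions, so check the header row of your own log:

```python
# Minimal sketch (my own quick script): summarize the Vcore samples from a HWiNFO CSV log.
# The file name and column header are assumptions - adjust to your own log.
import pandas as pd

LOG = "hwinfo_log.csv"                          # hypothetical CSV export from HWiNFO logging
VCOL = "CPU Core Voltage (SVI2 TFN) [V]"        # assumed column name

df = pd.read_csv(LOG, encoding="latin-1", low_memory=False)
v = pd.to_numeric(df[VCOL], errors="coerce").dropna()

print(f"samples:  {len(v)}")
print(f"max:      {v.max():.3f} V")             # what HWiNFO shows as 'Maximum'
print(f"mean:     {v.mean():.3f} V")
print(f"95th pct: {v.quantile(0.95):.3f} V")    # closer to the 'relatively constant' value under load
```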

Disabling 'Core Performance Boost' in the BIOS stops PBO from working; I suspect there is no workaround for this? To my knowledge they are not the same thing? It stops the processor from boosting above 3.8 GHz, PPT never hits >70 W, and CR20 reaches an average of ~4800.

Hard-locking the voltage and frequency ratio to 1.275 V and 44x would be fine if it weren't for the high idle wattage that results from it.
Is it possible to have all the good things? My aim is to limit the maximum boost frequency to 4.6 GHz.
I'm using an Asus X570i, latest BIOS, EKWB 240 AIO. Maybe @Mussels can help, as he has an Asus motherboard and I saw him mentioning the boost behaviour to 4.8+ GHz while gaming.
 
I am able to adjust the max boost and Vcore with ZenStates without having a continuous/locked Vcore; when not under load, the voltages and clock speeds go down.
I am not sure whether the software will work on Zen 3, but if it does you should be able to get somewhat higher boost clocks with lower temps at idle.

Ok, never mind, looks like there is no Zen 3 support. :(
  • Support for Zen, Zen+ and Zen2
  • Redesign Settings and add Auto and Manual OC panels
  • Add SMU version detection
  • Fix performance bias options
 
SoC SVI2 voltage of 1.425-1.45 V
I think you meant core voltage. SOC to 1.45V would kill your CPU in a matter of days.

Your issue is cooling. I find it hard to believe that a 240mm AIO can't cool a 100W CPU. AIOs are supposed to be pretty good at handling bursty temp spikes so I'm not sure what's going on in your case. Maybe there's a contact issue? Fans not spinning fast enough?

In any case, you can simply disable PBO in the UEFI. Way more reliable than software.
 
@The King I had indeed looked at ZenStates, but no luck for Zen 3. CTR allowed me to tweak things for the better, but it leads to system-wide stutters; I cannot figure out why, it might have something to do with the polling.

@rares495 my bad, it is indeed the CPU core voltage, SVI2.
Disabling PBO still results in the high frequency, but at a higher wattage (it goes up to 134 W) and voltage if Core Performance Boost is enabled; two cores hit 5.05 GHz then. The fan curve I have set in Argus:
There is no delay. I first had the fan speed react to a 10-second average, but that caused the temps to go up even more, obviously.

1636995338629.png


It is optimized for noise-acceptable temps :)
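To show why the 10-second average made it worse: the averaged temperature lags behind a boost spike, so the fans ramp up too late. A toy illustration of that lag (plain Python, nothing to do with Argus internals; the numbers are made up):

```python
# Toy example: instantaneous temperature vs. a 10-sample rolling average during a boost spike.
temps = [60] * 10 + [85] * 10 + [60] * 10      # idle -> boost spike -> idle, one sample per second

def rolling_avg(samples, window=10):
    out = []
    for i in range(len(samples)):
        win = samples[max(0, i - window + 1): i + 1]
        out.append(sum(win) / len(win))
    return out

for i, (now, avg) in enumerate(zip(temps, rolling_avg(temps))):
    # The 10 s average only climbs toward 85 °C near the end of the spike,
    # so a fan curve driven by it reacts several seconds too late.
    print(f"t={i:2d}s  instant={now:5.1f}°C  10s-avg={avg:5.1f}°C")
```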
 
@The King I had indeed looked at ZenStates, but no luck for Zen 3. CTR allowed me to tweak things for the better, but it leads to system-wide stutters; I cannot figure out why, it might have something to do with the polling.

@rares495 my bad, it is indeed the CPU core voltage, SVI2.
Disabling PBO still results in the high frequency, but at a higher wattage (it goes up to 134 W) and voltage if Core Performance Boost is enabled. The fan curve I have set in Argus:
There is no delay. I first had the fan speed react to a 10-second average, but that caused the temps to go up even more, obviously.

It is optimized for noise-acceptable temps :)
Do me a favor and add your PC specs: https://www.techpowerup.com/forums/account/specs

Can you do a Cinebench R20 run with auto core performance boost and auto PBO? Then post the package power and temps here.
 
With regards to your issue, is your concern the voltages or the temps?
For temps, you can set a temp limit in PBO that is lower than the stock 90°C, although as mentioned above it may be worth checking your cooler mount first.
I don't know of a good way to directly limit voltage below the AMD spec without disabling PBO and setting it manually.
In either case, I'm not sure this is something to worry about unless you are running your CPU at its limits 24/7.
 
@The King I had indeed looked at ZenStates, but no luck for Zen 3. CTR allowed me to tweak things for the better, but it leads to system-wide stutters; I cannot figure out why, it might have something to do with the polling.

@rares495 my bad, it is indeed the CPU core voltage, SVI2.
Disabling PBO still results in the high frequency, but at a higher wattage (it goes up to 134 W) and voltage if Core Performance Boost is enabled; two cores hit 5.05 GHz then. The fan curve I have set in Argus:
There is no delay. I first had the fan speed react to a 10-second average, but that caused the temps to go up even more, obviously.

View attachment 225272

It is optimized for noise-acceptable temps :)
Just disable PBO. It will boost close to the limit before disabling it for single-core but will clock lower for heavy multithreading loads and will get lower temps too.
 
With regards to your issue, is your concern the voltages or the temps?
For temps, you can set a temp limit in PBO that is lower than the stock 90°C, although as mentioned above it may be worth checking your cooler mount first.
I don't know of a good way to directly limit voltage below the AMD spec without disabling PBO and setting it manually.
In either case, I'm not sure this is something to worry about unless you are running your CPU at its limits 24/7.
My concerns are the stutters that (presumably) arise from the unnecessary boosting in mixed-load programs. I dare not conclude anything from these figures, nor can I say much about the voltages, to be fair. The temps and noise are third-order problems.

@HD64G I think 'Core Performance Boost' is the culprit. Disabling PBO does not have an effect if 'Core Performance Boost' remains on.
@blu3dragon with regards to the cooler: I replaced the thermal paste a week ago.
I know this should be done in CR23, and then with multiple runs to get an average, but this is similar to previous results with these 'stock' settings (as in, how Asus ships the BIOS).

I just noticed Argus is monitoring 'CPU' and not 'CPU CCD1', and the fan curve for the radiators is reacting to 'CPU'. CCD1 temps are much lower than the package (obviously), but according to Argus it tops out at 85, while in the log file it tops out at ~76. Any thoughts on that?

1637014542370.png

graphs log.png

cpuspeed.png


Put the results in a log then graph'd them. @rares495
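(For the curious: the graphs are just the HWiNFO CSV log plotted with a few lines of Python, roughly like the sketch below. The column names are assumptions and depend on how your sensors are labeled.)

```python
# Rough sketch of how the HWiNFO CSV log gets graphed (column names are assumptions).
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("cr20_run.csv", encoding="latin-1", low_memory=False)
temp = pd.to_numeric(df["CPU (Tctl/Tdie) [°C]"], errors="coerce")       # assumed header
power = pd.to_numeric(df["CPU Package Power [W]"], errors="coerce")     # assumed header

fig, ax1 = plt.subplots()
ax1.plot(temp.index, temp, color="tab:red", label="Tctl/Tdie [°C]")
ax1.set_xlabel("sample (one per polling interval)")
ax1.set_ylabel("°C")
ax2 = ax1.twinx()
ax2.plot(power.index, power, color="tab:blue", label="Package Power [W]")
ax2.set_ylabel("W")
fig.legend(loc="upper right")
plt.tight_layout()
plt.show()
```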

@HD64G reran CR20 and disabled the three options that allow PBO to be enabled in the BIOS.
1637015634458.png


1637015528219.png

Voltage is a bit different, not much else?
The PBO Fmax Enhancer is still on Auto but if PBO is turned off, I suspect this does nothing.
 
I just did a quick run for comparison. I have PBO on (EDIT: actually, PBO is off for the run below; these are the stock power limits). Your temps are a little higher than mine, but otherwise it looks pretty similar. If I run a few more loops it will peak closer to 85C.

cinebench_temps.png


The PBO Fmax Enhancer is still on Auto but if PBO is turned off, I suspect this does nothing.
I don't know whether Auto ends up enabled or disabled, but at least for Zen 3, PBO Fmax Enhancer should always be disabled. I have tried enabling it a few times, and although it increases reported clock speeds, it lowers effective clock speeds, resulting in more heat and worse performance.
 
@panini The 5800X is notoriously hot but 140W in CB is insane for a Zen 3 8-core CPU. That's what my 5900X pulls in Cinebench R20 with everything on Auto. Yes, my motherboard is insane and allows for crazy boosting and yes, it's a better bin but it's still 50% more CPU. No wonder your cooler is struggling a bit. We need to get this package power down to 100W or so. If Auto everything isn't the solution, maybe let's try Disabled everything. I really want to see the 5800X run within stock limits.

@blu3dragon Please tell me which cooler and case you own.
 
Best approach: use OBO +0 and max out CO, also set a temp limit; a PPT limit should help too. Something seems off though: on my weak cooler, my 5600X with +200 PBO and -30x4/-29x2 never exceeds 75C at MB limits (84W). For small 4.85 GHz bursts it stops at 70C.
 
@panini The 5800X is notoriously hot but 140W in CB is insane for a Zen 3 8-core CPU. That's what my 5900X pulls in Cinebench R20 with everything on Auto. Yes, my motherboard is insane and allows for crazy boosting and yes, it's a better bin but it's still 50% more CPU. No wonder your cooler is struggling a bit. We need to get this package power down to 100W or so. If Auto everything isn't the solution, maybe let's try Disabled everything. I really want to see the 5800X run within stock limits.

@blu3dragon Please tell me which cooler and case you own.
Disabling all boosting options, the stock 5800X runs at 3.8 GHz max, drawing 70 W. Like I mentioned, this results in a CR20 score of ~4800. It solves the stuttering issues while gaming, but in heavier workloads, like rendering or benchmarking, there is a 15-20% performance loss if everything scales well. Now, 50% less power at full utilisation is impressive, certainly given that the idle wattage is halved (and most PCs are closer to idle wattage than to maximum power draw for most of their lifetime), but I am leaving 'free' performance on the table.

For all of you: I am running this in a 13L enclosure, but I suspect the high temps are mainly due to high voltage and high wattage.
@Taraquin I had a 5600X in this exact setup before and it never hit 70 even when overclocked to its fullest potential. Though, that 5600X was a bronze sample and this 5800X is a golden sample. By OBO do you mean PBO or the override boost allowance? If the latter, I have never had that option on anything higher than +0; I believe I am looking for an option to make that -200, which is nonexistent.

I will now try to see if (un)installing the AMD chipset drivers has an effect on power draw when all boosting options are disabled, as currently the chipset drivers are uninstalled (they made my mouse freeze/glitch out...).

I will also log typical boosting behavior with PBO manually regulated (Curve Optimizer set) under Advanced (Extreme Tweaker on the more high-end Asus motherboards, I believe) for some games, rendering and CAD.

Report incoming
 
Disabling all boosting options, the stock 5800X runs at 3.8 GHz max, drawing 70 W. Like I mentioned, this results in a CR20 score of ~4800. It solves the stuttering issues while gaming, but in heavier workloads, like rendering or benchmarking, there is a 15-20% performance loss if everything scales well. Now, 50% less power at full utilisation is impressive, certainly given that the idle wattage is halved (and most PCs are closer to idle wattage than to maximum power draw for most of their lifetime), but I am leaving 'free' performance on the table.

For all of you: I am running this in a 13L enclosure, but I suspect the high temps are mainly due to high voltage and high wattage.
@Taraquin I had a 5600X in this exact setup before and it never hit 70 even when overclocked to its fullest potential. Though, that 5600X was a bronze sample and this 5800X is a golden sample. By OBO do you mean PBO or the override boost allowance? If the latter, I have never had that option on anything higher than +0; I believe I am looking for an option to make that -200, which is nonexistent.

I will now try to see if (un)installing the AMD chipset drivers has an effect on power draw when all boosting options are disabled, as currently the chipset drivers are uninstalled (they made my mouse freeze/glitch out...).

I will also log typical boosting behavior with PBO manually regulated (Curve Optimizer set) under Advanced (Extreme Tweaker on the more high-end Asus motherboards, I believe) for some games, rendering and CAD.

Report incoming
We need to find a middle ground between the stock 70W and the ridiculous 140W that the CPU pulls. Your motherboard is very weird because normally having everything set to Auto would result in decent performance at decent power, not in pushing the CPU to its max.
 
We need to find a middle ground between the stock 70W and the ridiculous 140W that the CPU pulls. Your motherboard is very weird because normally having everything set to Auto would result in decent performance at decent power, not in pushing the CPU to its max.
I'll short the CMOS jumper to see if it makes a difference compared to loading the default optimized settings, though it shouldn't?
 
My bad! You (OP) just need to change the Auto setting of the power limit and adjust it to a lower one. Methinks that once you limit the CPU to 95 W you will lose 5% performance in multithreaded loads and none in single-threaded, while lowering full-load temps by at least 10C. That happens with my 2600X when going from Auto (128 W) to 95 W. Core Performance Boost needs to be enabled, btw.
 
I'll short the CMOS jumper to see if it makes a difference compared to loading the default optimized settings, though it shouldn't?

I don't think this will help. Your CPU is already running at the stock power limit of 142 W PPT in the 2nd HWiNFO shot you posted. The 5800X runs hot by default since it has the same power limit (142 W PPT) as a 5900X or 5950X, but those CPUs have 2 CCDs to spread the heat out over. You also can't compare the temps of a 5800X to a 5600X with a stock limit of 88 W.

If you want to lower temps, you can simply set a lower temp limit in your PBO settings, or go down the path of lowering PPT, EDC and TDC, which will achieve the same thing for more effort.

However, I don't think any of this should cause or fix stuttering in games. My guess is that there is something else going on there since most games won't run close to the power or temp limits.

@panini The 5800X is notoriously hot but 140W in CB is insane for a Zen 3 8-core CPU. That's what my 5900X pulls in Cinebench R20 with everything on Auto.

@blu3dragon Please tell me which cooler and case you own.

5800x has the same stock power limit as a 5900x, so it will try to pull the same until it hits a temp or voltage limit ;-)

I'm using an Arctic Freezer II 240, in a large case with a bunch of fans. The AIO is at the top of the case exhausting out, so not the most optimal spot, but as long as room temps are reasonable the internal case temp stays at ~25C.
 
I don't think this will help. Your CPU is already running at the stock power limit of 142 W PPT in the 2nd HWiNFO shot you posted. The 5800X runs hot by default since it has the same power limit (142 W PPT) as a 5900X or 5950X, but those CPUs have 2 CCDs to spread the heat out over. You also can't compare the temps of a 5800X to a 5600X with a stock limit of 88 W.

If you want to lower temps, you can simply set a lower temp limit in your PBO settings, or go down the path of lowering PPT, EDC and TDC, which will achieve the same thing for more effort.

However, I don't think any of this should cause or fix stuttering in games. My guess is that there is something else going on there since most games won't run close to the power or temp limits.
I am aware of this; I was responding to Taraquin regarding the CCD/5600X. I have even gone as far as trying a completely fresh Windows install on a different drive to check whether it is software related, but no, CB and games behave the same, even with fan curves set through the BIOS, which made me think it was hardware related. I have since upgraded to the 5800X, swapped out my PSU, RMA'd my GPU and run my PC with and without a GPU riser. CPU behaviour stays the same regardless of these changes since the upgrade, making me think it is the BIOS.
@HD64G will try it right now. I assume this is changed under AMD overclocking PBO and not in the AI tweaker menu (given that you know what that is, I see you have an MSI mobo)?
 
I have since upgraded to the 5800X, swapped out my PSU, RMA'd my GPU and run my PC with and without a GPU riser. CPU behaviour stays the same regardless of these changes since the upgrade, making me think it is the BIOS.
Ouch. Sorry to hear this is causing you so much trouble. Maybe we should look at HWiNFO in the game(s) you are having trouble with? Presumably you have tried the latest BIOS and AMD chipset driver. Have you run any memory stability tests? (BTW, I'm still in the camp of 'the CB behavior looks correct', so it's really the stuttering in games we should debug.)

will try it right now. I assume this is changed under AMD overclocking PBO and not in the AI tweaker menu (given that you know what that is, I see you have an MSI mobo)?
In my ASUS B550-F, I need to set the Asus AI tweaker PBO settings to Auto, and then in the AMD overclocking PBO menu I can set power limits to Manual and enter limits there. If you leave the Asus PBO option Enabled, it will override the settings in the AMD menu.
 
der8auer did some testing with the 5800X; it may be useful to look at the voltages and temps.
 
I am aware of this; I was responding to Taraquin regarding the CCD/5600X. I have even gone as far as trying a completely fresh Windows install on a different drive to check whether it is software related, but no, CB and games behave the same, even with fan curves set through the BIOS, which made me think it was hardware related. I have since upgraded to the 5800X, swapped out my PSU, RMA'd my GPU and run my PC with and without a GPU riser. CPU behaviour stays the same regardless of these changes since the upgrade, making me think it is the BIOS.
@HD64G will try it right now. I assume this is changed under AMD overclocking PBO and not in the AI tweaker menu (given that you know what that is, I see you have an MSI mobo)?
Check the screenshot of my UEFI below in case it helps you find the setting I posted about previously. Configurable power limit makes it very easy for anyone to bring the CPU under control.
 

Attachments

  • MSI_SnapShot_01.jpg
Disabling all boosting options, the stock 5800X runs at 3.8 GHz max, drawing 70 W. Like I mentioned, this results in a CR20 score of ~4800. It solves the stuttering issues while gaming, but in heavier workloads, like rendering or benchmarking, there is a 15-20% performance loss if everything scales well. Now, 50% less power at full utilisation is impressive, certainly given that the idle wattage is halved (and most PCs are closer to idle wattage than to maximum power draw for most of their lifetime), but I am leaving 'free' performance on the table.

For all of you: I am running this in a 13L enclosure, but I suspect the high temps are mainly due to high voltage and high wattage.
@Taraquin I had a 5600X in this exact setup before and it never hit 70 even when overclocked to its fullest potential. Though, that 5600X was a bronze sample and this 5800X is a golden sample. By OBO do you mean PBO or the override boost allowance? If the latter, I have never had that option on anything higher than +0; I believe I am looking for an option to make that -200, which is nonexistent.

I will now try to see if (un)installing the AMD chipset drivers has an effect on power draw when all boosting options are disabled, as currently the chipset drivers are uninstalled (they made my mouse freeze/glitch out...).

I will also log typical boosting behavior with PBO manually regulated (Curve Optimizer set) under Advanced (Extreme Tweaker on the more high-end Asus motherboards, I believe) for some games, rendering and CAD.

Report incoming
What did your 5600X do? Because mine is supposedly a golden sample that will do 4,750 MHz at 1.375 V on an all-core static overclock. No, I don't use it, but I know it can do it.
I honestly have around the same temps you do with AVX on in Prime95 with a 5600X! Up to 77-80C depending on the ambient temperature.
I have PBO on, but only for the 4,850 MHz single-thread boost.
 
What did your 5600X do? Because mine is supposedly a golden sample that will do 4,750 MHz at 1.375 V on an all-core static overclock. No, I don't use it, but I know it can do it.
I honestly have around the same temps you do with AVX on in Prime95 with a 5600X! Up to 77-80C depending on the ambient temperature.
I have PBO on, but only for the 4,850 MHz single-thread boost.
I think it did 4.45 GHz on all cores at around that voltage, stable. I do have a couple of screenshots but cannot find them. It could have been 4.55 GHz as well, though I had it mostly dialed in for temps to remain below 75C even while rendering. PPT was 75 W.

Limiting PPT to 95 W of course did wonders for the temperature, but not for stability per se.
@HD64G I tried multiple things now, time for a little update:
  1. PBO enabled without curve optimizer makes Battlefield V crash. CPU Core reaches 1.496 volts with temps below 78 in-game. CR20 score is ~5700, temps below 75.
  2. PBO enabled with curve optimizer makes Battlefield V crash. CPU Core reaches 1.494 volts. CR20 is ~5600. For both, temps remain below 70 (!).
  3. PBO enabled with manual voltage set to 1.275. Battlefield V does not crash. CR20 is ~4000. For both, temps remain below 70.

I have to say these temps were partly that low because I turned the heating off a couple of hours ago, so ambient is around ~17°C instead of the normal 20-21°C when the heating is on.
CPU core current maxed at 66A for all three of these.
I will do logs tomorrow on a more mixed workload, though I do not know what would be giving useful information.

@The King Thanks for the video, I was not aware it came into existence two days ago.
@blu3dragon will make sure to run some mem-tests as well.
see you guys in 24 hours.
 
I think it did 4.45 GHz on all cores at around that voltage, stable. I do have a couple of screenshots but cannot find them. It could have been 4.55 GHz as well, though I had it mostly dialed in for temps to remain below 75C even while rendering. PPT was 75 W.

Limiting PPT to 95 W of course did wonders for the temperature, but not for stability per se.
@HD64G I tried multiple things now, time for a little update:
  1. PBO enabled without curve optimizer makes Battlefield V crash. CPU Core reaches 1.496 volts with temps below 78 in-game. CR20 score is ~5700, temps below 75.
  2. PBO enabled with curve optimizer makes Battlefield V crash. CPU Core reaches 1.494 volts. CR20 is ~5600. For both, temps remain below 70 (!).
  3. PBO enabled with manual voltage set to 1.275. Battlefield V does not crash. CR20 is ~4000. For both, temps remain below 70.

I have to say these temps were partly that low because I turned the heating off a couple of hours ago, so ambient is around ~17°C instead of the normal 20-21°C when the heating is on.
CPU core current maxed at 66A for all three of these.
I will do logs tomorrow on a more mixed workload, though I do not know what would be giving useful information.

@The King Thanks for the video, I was not aware it came into existence two days ago.
@blu3dragon will make sure to run some mem-tests as well.
see you guys in 24 hours.

Good info. Does BFV crash when PBO is disabled?
I guess when you lower the voltage it is not crashing as the low voltage is preventing the CPU from reaching higher boost clocks. It is unusual that it crashes just from enabling PBO, but it is possible if the curve offsets are not high enough.
The other cause might be FCLK or memory related. What are your memory and FCLK currently set to?

Couple of suggestions:
  1. Run some memory and FCLK stability tests. Quickest way to find issues, in my experience:
    1. Memory: TestMem5 (tm5) with the 1usmus_v3.cfg
    2. FCLK: y-cruncher test #16 N64 - Classic NTT (64-bit) while running HWInfo to monitor for WHEA errors.
  2. Use p95 and a script to test each core individually for stability, and set the core offsets accordingly (a rough sketch of the idea follows after this list).
    1. Quick(ish) test script: https://www.overclock.net/threads/s...script-for-zen-3-curve-offset-tuning.1777112/
    2. More thorough test script: https://www.overclock.net/threads/corecycler-tool-for-testing-curve-optimizer-settings.1777398/
With PBO enabled and/or FCLK tuned, you might need to set a positive offset for some cores.
If your CPU passes the above tests but still crashes in BFV, then I guess you need to look at your GPU or power supply again.
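For reference, the core idea behind those per-core scripts is simple: run a single-worker Prime95 torture test, pin it to one core at a time, and watch for the worker dying or errors in results.txt. A very rough sketch of that idea only (not the actual scripts from the links; the prime95 setup, timings and SMT mapping are placeholders/assumptions):

```python
# Very rough sketch of per-core curve-optimizer testing (the linked scripts are far more thorough).
# Assumes prime95.exe is set up for a single-worker torture test and that SMT is on
# (so logical CPUs 0, 2, 4, ... map to physical cores 0, 1, 2, ...).
import subprocess
import time

import psutil  # third-party: pip install psutil

CORES = range(8)            # 8 physical cores on a 5800X
MINUTES_PER_CORE = 10       # placeholder; longer runs catch more marginal offsets

for core in CORES:
    proc = subprocess.Popen(["prime95.exe", "-t"])        # -t starts the torture test immediately
    psutil.Process(proc.pid).cpu_affinity([core * 2])     # pin the whole process to one core
    print(f"Core {core}: stressing for {MINUTES_PER_CORE} min...")
    time.sleep(MINUTES_PER_CORE * 60)
    proc.terminate()
    proc.wait()
    # In practice you would also parse prime95's results.txt for errors/warnings here
    # and flag the core so you can back off its curve optimizer offset.
```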
 
Good info. Does BFV crash when PBO is disabled?
I guess when you lower the voltage it is not crashing as the low voltage is preventing the CPU from reaching higher boost clocks. It is unusual that it crashes just from enabling PBO, but it is possible if the curve offsets are not high enough.
The other cause might be FCLK or memory related. What are your memory and FCLK currently set to?

Couple of suggestions:
  1. Run some memory and FCLK stability tests. Quickest way to find issues, in my experience:
    1. Memory: TestMem5 (tm5) with the 1usmus_v3.cfg
    2. FCLK: y-cruncher test #16 N64 - Classic NTT (64-bit) while running HWInfo to monitor for WHEA errors.
  2. Use p95 and a script to test each core individually for stability, and set the core offsets accordingly.
    1. Quick(ish) test script: https://www.overclock.net/threads/s...script-for-zen-3-curve-offset-tuning.1777112/
    2. More thorough test script: https://www.overclock.net/threads/corecycler-tool-for-testing-curve-optimizer-settings.1777398/
With PBO enabled and/or FCLK tuned, you might need to set a positive offset for some cores.
If your CPU passes the above tests but still crashes in BFV, then I guess you need to look at your GPU or power supply again.
Long post warning.
BF V does not crash when PBO and Core Performance boost are disabled.

DOCP turned on, with FCLK on Auto
1637169952567.png

These tests below were done with the Curve Optimizer disabled, PBO enabled, 95 W PPT. Stock voltage regulation, I would say.
1. TestMem5 with 1usmus_v3.cfg is stable, 6 cycles done (30 minutes roughly).
2. Y-cruncher passed Classic NTT (64-bit), but the entire PC froze and then rebooted while running Hybrid NTT (the one after Classic NTT 64).
The following part was logged, though I suspect the part where it crashed is not in the graphs.
Upon boot, a new WHEA error was logged
1637173830482.png

I have made a custom event logger for WHEA, so I know to run two cores at a minimal undervolt (not enabled in these tests; after that line, the tests had these two cores at +/-0).
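(The 'event logger' is nothing fancy, roughly this idea: poll the Windows System log for new WHEA-Logger entries via wevtutil. The query and the polling interval below are just my own choices:)

```python
# Rough sketch of the WHEA watcher: poll the Windows System log for new WHEA-Logger events.
import subprocess
import time

QUERY = "*[System[Provider[@Name='Microsoft-Windows-WHEA-Logger']]]"

def latest_whea(count=3):
    result = subprocess.run(
        ["wevtutil", "qe", "System", f"/q:{QUERY}", "/rd:true", "/f:text", f"/c:{count}"],
        capture_output=True, text=True)
    return result.stdout

last = latest_whea()
while True:
    time.sleep(10)
    current = latest_whea()
    if current != last:          # a new WHEA entry appeared (e.g. a corrected machine-check error)
        print(current)
        last = current
```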
Also, this error at the time of crashing:
1637174784270.png

Ryzen Master is not installed, but it was used by CTR before (it comes bundled with it). CTR is not running now, nor did I activate its profiles.
y-cruncher.png

Weird behavior on some of the cores clocking down all the way? I think this is from SFT (Small In-Cache FFT), then Classic NTT, and then it did not save the log file any further. I am 100% certain it finished Classic NTT 64 on that run.



Now, for the next thing, with Curve Optimizer optimized, the following happened:
Y-cruncher did not crash on Classic NTT nor Hybrid NTT.
Therefore, the problem seems voltage related (I think it is safe to state that).
This is supported by the y-cruncher run I monitored in full. Look at Core 0 VID. The log stops where Classic NTT 64 was done.
y-cruncherCurveOptimizer.png


Amidst all of this I find it very weird that Core 0 only seems to have 1 thread (T1) and not two, like the other cores.

The following graphs are from Battlefield V, from boot to joining a match (multiplayer, 64 players, capped at the monitor refresh rate @ 165 fps, Ultra settings), to immediately crashing before loading the map.
It crashed around the 1 minute mark.
bfv2.png

On the second try, it did not crash (it's like the system was warmed up...). For this test and the one after it, I changed the polling rate to 100 ms to monitor stuttering.
BFV3.png


Now onto Prime95:
Only here, two of the cores that were on a -20-step Curve Optimizer offset failed the script from the first link you sent.
Strangely enough, a core that is on a +/-0 step also failed.
In addition to that, it also completely shut down the system after it froze.
I will try lowering the step values tomorrow and rerun the same script.
Prime95 test.png

As with Battlefield and y-cruncher crashing the system, HWiNFO does not show the data from right before the crash.
I will go to bed now, more confused than yesterday.
-
The Windows Memory Diagnostic tool doesn't report any failures.
 
Just want to chip in again regarding stuttering in games.

You can try disabling FreeSync and even Anti-Lag etc.; that has solved many stutter issues for some.
Also, the new AMD drivers have a CPU tuning mode. It won't hurt to make sure your GPU drivers are
not causing issues by doing a clean install and removing all your current profiles.
 