
Buildzoid's 3700x Static Overclock Degrades Processor

Let's not kid ourselves, trolls exist on both sides, although I think the AMD side is more akin to ostriches burying their heads in the sand.

In either case, as 7 nm plays out, it will be interesting to see what happens. It already seems like the writing is on the wall with Intel having issues. We'll get a better picture when NV moves over to 7 nm as well.
I remember back in the day when Intel was developing the Itanium CPU as their 64-bit processor and then AMD came out with their 64-bit architecture and basically ended Intel's Itanium.

I wonder if the same thing could happen again with regard to Nvidia's RTX when AMD introduces its version of Ray-Tracing. It has to be borne in mind that the consoles will be running on AMD GPU hardware in the next generation and games developers will probably adopt that as the standard.

I think we can all agree that RTX has been pretty much a disaster and DLSS is as good as dead.
 
I just found 2 NB coolers..nice ones..in the sink box.
Thought I sold them, guess not! :D

I was like..dang! 2, even!
2 heatpipe+fin coolers.
I know I sent one to somebody free. I know I did.
 
Thanks for the very detailed post, can you expand on the reason why you simply didn't enable XMP settings? Cheers
That's quite simple: as soon as I enabled XMP on the RAM, the system became very unstable, even when I had the same values implemented in Ryzen Master.

The other thing is that, if I configure the BIOS exactly the way the system is configured in Ryzen Master, then Windows loads with just garbage on the screen, and when I reboot it goes into repair mode.

The other thing about using the XMP profile was that any time I made even the slightest change to the configuration in Ryzen Master it required a reboot, whereas with XMP off I can change values on the fly and click on "Apply" and I am done.

The thing is that the actual BIOS is the AGESA, and it is supplied as a binary, so what I am configuring with regard to the CPU in what one would normally think of as the BIOS is basically "GigaByte Master".

After a couple of months of frustration and realising that the Tech Media and Tech YouTubers were full of it, I decided to start all over again.

Through trial and error I found that the system ran most stably when I only configured the motherboard parameters in the BIOS and configured the CPU and the RAM in Ryzen Master (except for overclocking the RAM to 3733 MHz, for which I had to configure the Infinity Fabric to 1866 in the AMD portion of the BIOS).
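For anyone wondering where the 3733/1866 pairing comes from: DDR4 transfers data twice per memory clock, so the memory clock is half the quoted data rate, and 1:1 "coupled" mode ties the Infinity Fabric clock to that memory clock. A minimal sketch of the arithmetic (the extra kit speeds are only illustrative):

```python
# Minimal sketch: DDR4 data rate vs. memory clock (MCLK) vs. Infinity Fabric clock (FCLK).
# DDR transfers twice per clock, so MCLK = data rate / 2; 1:1 ("coupled") mode means FCLK = MCLK.

def fclk_for_1to1(ddr_rate_mt_s: float) -> float:
    """FCLK (MHz) needed for 1:1 operation at a given DDR4 data rate (MT/s)."""
    mclk = ddr_rate_mt_s / 2  # double data rate
    return mclk               # coupled mode: FCLK == MCLK

for kit in (3200, 3600, 3733):
    print(f"DDR4-{kit}: MCLK = {kit / 2:.1f} MHz, FCLK for 1:1 = {fclk_for_1to1(kit):.1f} MHz")
# DDR4-3733 -> MCLK ~1866.5 MHz, which is why the BIOS setting ends up as 1866/1867 MHz.
```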

Since then I have had no problems with the system.
 
What's up with the XMP/D.O.C.P. fail?
Back in the Z170 days that was the way to go!
Set XMP and you got Nick Shih's RAM OCing guide and profiles.
 
I remember back in the day when Intel was developing the Itanium CPU as their 64-bit processor and then AMD came out with their 64-bit architecture and basically ended Intel's Itanium.

I wonder if the same thing could happen again with regard to Nvidia's RTX when AMD introduces its version of Ray-Tracing. It has to be borne in mind that the consoles will be running on AMD GPU hardware in the next generation and games developers will probably adopt that as the standard.

I think we can all agree that RTX has been pretty much a disaster and DLSS is as good as dead.

I don't think so, because the individual nuts and bolts of RTX are abstracted by DXR within DirectX. I think devs may optimize for AMD's implementation of it naturally because of consoles, but NV will continue to shell out money to devs to optimize for it.

I see most of the Intel trolls in most AMD-related threads; I don't bother with Intel threads unless I can help with cooling...

LOL, did you call yourself a troll? As for threadshitting, asshats are born every day, but life goes on.
 
I remember back in the day when Intel was developing the Itanium CPU as their 64-bit processor and then AMD came out with their 64-bit architecture and basically ended Intel's Itanium.

I wonder if the same thing could happen again with regard to Nvidia's RTX when AMD introduces its version of Ray-Tracing. It has to be borne in mind that the consoles will be running on AMD GPU hardware in the next generation and games developers will probably adopt that as the standard.

I think we can all agree that RTX has been pretty much a disaster and DLSS is as good as dead.
In that respect the entire computer industry can thank AMD for pushing 64-bit technology.

With regard to RTX, last I checked AMD, Intel and Nvidia are coordinating their RTX efforts together in cooperation with game devs.
 
In that respect the entire computer industry can thank AMD for pushing 64-bit technology.

64-bit was a necessity and an inevitability. The only question was whose was going to be adopted.
 
In that respect the entire computer industry can thank AMD for pushing 64-bit technology.

With regard to RTX, last I checked AMD, Intel and Nvidia are coordinating their RTX efforts together in cooperation with game devs.
Nvidia doesn't "coordinate" with others. It tries to dictate and is strictly a "my way or the highway" kind of player in the market. DirectX Raytracing is based on Nvidia hardware without any input from AMD.

AMD filed a patent for hybrid Ray-Tracing at the end of 2017 and is in a far better position to impose a de facto Ray-Tracing standard given its monopoly with regard to creating the CPU and GPU in the Micro$haft and $ONY consoles.

I have two 1080 Ti graphics cards (EVGA 1080 Ti FTW3 and an EVGA 1080 Ti SC2) and the current overpriced offerings from both AMD and Nvidia can rot on the shelves as far as I am concerned.

There are very few games available which implement "Ray-Tracing" and even those do so to a very limited extent. Nvidia has effectively priced Ray-Tracing out of the market and it is highly unlikely that AMD will follow Nvidia's lead.
 

Buildzoid did not kill his 3700X; see him set it up optimally here, on an Asus ITX.

Degraded, possibly; dead, no.
 
Nvidia doesn't "coordinate" with others. It tries to dictate and is strictly a "my way or the highway" kind of player in the market. DirectX Raytracing is based on Nvidia hardware without any input from AMD.

AMD filed a patent for hybrid Ray-Tracing at the end of 2017 and is in a far better position to impose a de facto Ray-Tracing standard given its monopoly with regard to creating the CPU and GPU in the Micro$haft and $ONY consoles.

I have two 1080 Ti graphics cards (EVGA 1080 Ti FTW3 and an EVGA 1080 Ti SC2) and the current overpriced offerings from both AMD and Nvidia can rot on the shelves as far as I am concerned.

There are very few games available which implement "Ray-Tracing" and even those do so to a very limited extent. Nvidia has effectively priced Ray-Tracing out of the market and it is highly unlikely that AMD will follow Nvidia's lead.
I fully agree. Wasn't happy with Nvidia's Ray Tracing implementation.
But was going by this article.


64-bit was a necessity and an inevitability. The only question was whose was going to be adopted.
I remember Intel was kind of angry with AMD for coming out with the Athlon 64 and being the first to introduce 64-bit x86 processors. Intel didn't want to move in that direction at that time and did what they could to prevent AMD's technological and innovative advancements. AMD won and Intel lost.

Intel's Itanium 64-bit was full of issues. There's a wealth of articles on the subject, but the main site The Inquirer.net unfortunately shut down. From that site came Fudzilla and SemiAccurate.
 

Buildzoid did not kill his 3700X; see him set it up optimally here, on an Asus ITX.

Degraded, possibly; dead, no.
The problem as I see it with the Ryzen 3000 series - after experimenting first with a 3600X on a GigaByte X470 Gaming 7 WiFi, then on the GigaByte X570 AORUS XTREME, then with a 3950X on the X570 board and with the 3900X my friend lent me also on the X570 board for over six months now - is that AMD has pushed the voltage on these CPUs beyond the limit in an attempt not to fall back in the clockspeed department.

People are stupid, and they see 5 GHz for Intel and 4.2 GHz for AMD and think "5 is bigger than 4 therefore Intel is better".

The first generation of Ryzen was a good series, the one thing that let them down was that all the motherboards sucked.

The second generation of Ryzen was an incremental improvement on the first generation, and there were a few pretty good boards brought out by the motherboard manufacturers because they started to take Ryzen seriously.

The third generation which we currently have was a major step forward, but also a step back.

The improvement in IPC of the third generation was comparatively huge, and the motherboard manufacturers have gotten behind this series in a big way. The step back however was in the realms of clockspeed. As the nodes have shrunk it is more and more difficult to squeeze more clockspeed out of the architecture simply because you have to work with reduced voltages.

Unfortunately to hit the advertised boost frequencies way too much voltage is pumped in - to compensate for the VDroop under load - and you end up with the system on idle drifting up to 1.5 Volts and under load going to 1.37 Volts or above.

This is unacceptable, considering that at these levels of voltage you are not even getting 4.1 GHz running CineBench R20.
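If anyone wants to see for themselves what the core voltage does at idle versus under load, here is a rough logging sketch rather than anything official: it assumes a Linux box with lm-sensors installed and that the board exposes the rail under a label containing "vcore" or "svi2" in `sensors -j` output, which varies from board to board.

```python
#!/usr/bin/env python3
"""Rough sketch: poll the reported core voltage once a second so idle vs. load can be compared.

Assumptions (adjust for your hardware): Linux, lm-sensors installed, and the rail appears in
`sensors -j` under a label containing "vcore" or "svi2" -- sensor names differ per motherboard.
"""
import json
import subprocess
import time

def read_vcore_candidates():
    raw = subprocess.run(["sensors", "-j"], capture_output=True, text=True, check=True).stdout
    readings = {}
    for chip, features in json.loads(raw).items():
        if not isinstance(features, dict):
            continue
        for label, fields in features.items():
            if isinstance(fields, dict) and any(k in label.lower() for k in ("vcore", "svi2")):
                for name, value in fields.items():
                    if name.endswith("_input"):
                        readings[f"{chip}/{label}"] = value
    return readings

if __name__ == "__main__":
    while True:  # run it once at idle and once with an all-core load to see the difference
        print(time.strftime("%H:%M:%S"), read_vcore_candidates())
        time.sleep(1)
```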

Left to its own devices and the grace of AMD's defaults, it is inevitable that a Ryzen 3000 CPU will degrade.

Thankfully they managed to gag that baldy-headed little cretin of a marketdroid Robert Hallock, because he was just a lawsuit waiting to happen for AMD the way he was shooting his mouth off, and especially his videos on configuring the Ryzen 3000 CPU.

I fully agree. Wasn't happy with Nvidia's Ray Tracing implementation.
But was going by this article.


I remember Intel was kind of angry with AMD for coming out with the Athlon 64 and being the first to introduce 64-bit x86 processors. Intel didn't want to move in that direction at that time and did what they could to prevent AMD's technological and innovative advancements. AMD won and Intel lost.

Intel's Itanium 64-bit was full of issues. There's a wealth of articles on the subject, but the main site The Inquirer.net unfortunately shut down. From that site came Fudzilla and SemiAccurate.
GDC has been cancelled, and the article speaks of wanting to plan discussions, not of any discussions having actually taken place in the past.
 
The problem as I see it with the Ryzen 3000 series - after experimenting first with a 3600X on a GigaByte X470 Gaming 7 WiFi, then on the GigaByte X570 AORUS XTREME, then with a 3950X on the X570 board and with the 3900X my friend lent me also on the X570 board for over six months now - is that AMD has pushed the voltage on these CPUs beyond the limit in an attempt not to fall back in the clockspeed department.

People are stupid, and they see 5 GHz for Intel and 4.2 GHz for AMD and think "5 is bigger than 4 therefore Intel is better".

The first generation of Ryzen was a good series, the one thing that let them down was that all the motherboards sucked.

The second generation of Ryzen was an incremental improvement on the first generation, and there were a few pretty good boards brought out by the motherboard manufacturers because they started to take Ryzen seriously.

The third generation which we currently have was a major step forward, but also a step back.

The improvement in IPC of the third generation was comparatively huge, and the motherboard manufacturers have gotten behind this series in a big way. The step back however was in the realms of clockspeed. As the nodes have shrunk it is more and more difficult to squeeze more clockspeed out of the architecture simply because you have to work with reduced voltages.

Unfortunately to hit the advertised boost frequencies way too much voltage is pumped in - to compensate for the VDroop under load - and you end up with the system on idle drifting up to 1.5 Volts and under load going to 1.37 Volts or above.

This is unacceptable, considering that at these levels of voltage you are not even getting 4.1 GHz running CineBench R20.

Left to its own devices and the grace of AMD's defaults, it is inevitable that a Ryzen 3000 CPU will degrade.

Thankfully they managed to gag that baldy-headed little cretin of a marketdroid Robert Hallock, because he was just a lawsuit waiting to happen for AMD the way he was shooting his mouth off, and especially his videos on configuring the Ryzen 3000 CPU.


GDC has been cancelled, and the article speaks of wanting to plan discussions, not of any discussions having actually taken place in the past.
Don't forget, please, that voltage only by itself does not harm/degrade silicon. It's in conjunction with speed, current (A) and temp; it's the combination of the four that can slowly kill the CPU.

That's why the 3000 series is best kept on auto settings, whatever those may be. Only on auto can the silicon FIT controller regulate the operating parameters (speed/boost, voltage, current... and temp) and preserve the silicon.
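For what it's worth, the reason the combination matters is already visible in the textbook first-order CMOS relations (generic approximations, nothing AMD-specific):

```latex
P_{\text{dyn}} \approx \alpha\, C\, V^{2} f
\qquad\Rightarrow\qquad
I \approx \frac{P_{\text{dyn}}}{V} \approx \alpha\, C\, V f
```

Current, and with it heat and electromigration stress, scales with voltage, clock and how much of the chip is actually switching (the activity factor α), which is why a given voltage at idle and the same voltage under an all-core load are very different situations for the silicon.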

1. Cool it down, and it will boost (and volt) more.
2. Play with PBO settings to bring current (A) down, and it will boost more and maybe volt more.
Do both... well, you get the idea...

Some things all should be aware of:



 
The problem as I see it with the Ryzen 3000 series - after experimenting first with a 3600X on a GigaByte X470 Gaming 7 WiFi, then on the GigaByte X570 AORUS XTREME, then with a 3950X on the X570 board and with the 3900X my friend lent me also on the X570 board for over six months now - is that AMD has pushed the voltage on these CPUs beyond the limit in an attempt not to fall back in the clockspeed department.

People are stupid, and they see 5 GHz for Intel and 4.2 GHz for AMD and think "5 is bigger than 4 therefore Intel is better".

The first generation of Ryzen was a good series, the one thing that let them down was that all the motherboards sucked.

The second generation of Ryzen was an incremental improvement on the first generation, and there were a few pretty good boards brought out by the motherboard manufacturers because they started to take Ryzen seriously.

The third generation which we currently have was a major step forward, but also a step back.

The improvement in IPC of the third generation was comparatively huge, and the motherboard manufacturers have gotten behind this series in a big way. The step back however was in the realms of clockspeed. As the nodes have shrunk it is more and more difficult to squeeze more clockspeed out of the architecture simply because you have to work with reduced voltages.

Unfortunately to hit the advertised boost frequencies way too much voltage is pumped in - to compensate for the VDroop under load - and you end up with the system on idle drifting up to 1.5 Volts and under load going to 1.37 Volts or above.

This is unacceptable, considering that at these levels of voltage you are not even getting 4.1 GHz running CineBench R20.

Left to its own devices and the grace of AMD's defaults, it is inevitable that a Ryzen 3000 CPU will degrade.

Thankfully they managed to gag that baldy-headed little cretin of a marketdroid Robert Hallock, because he was just a lawsuit waiting to happen for AMD the way he was shooting his mouth off, and especially his videos on configuring the Ryzen 3000 CPU.


GDC has been cancelled, and the article speaks of wanting to plan discussions, not of any discussions having actually taken place in the past.
Your ideology is tainted by the experience gigabyte provided you, as mine is Asus.
If you have the board set on auto, try it on default; I'll be surprised if the volts are not lower.
Auto is NOT default.
Secondly, I disagree; scientists and engineers tested it. In typical use, unloaded, 2-6 cores could be gated fully off; the chip has some degree of regulation built in, and anyway a core at 1.45 V unloaded won't be pulling much current and will be at low temps.
It is NOT one thing, i.e. volts in isolation, that kills or degrades chips.
Hyper focusing leads to bad theology.
 
Your ideology is tainted by the experience gigabyte provided you, as mine is Asus.
If you have the board set on auto, try it on default; I'll be surprised if the volts are not lower.
Auto is NOT default.
Secondly, I disagree; scientists and engineers tested it. In typical use, unloaded, 2-6 cores could be gated fully off; the chip has some degree of regulation built in, and anyway a core at 1.45 V unloaded won't be pulling much current and will be at low temps.
It is NOT one thing, i.e. volts in isolation, that kills or degrades chips.
Hyper focusing leads to bad theology.
From what I understand

Too high a voltage causes Oxide Breakdown in the CPU.

Too high a current causes Electromigration in the CPU.
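For reference, the usual first-order reliability models behind those two statements are Black's equation for electromigration and a field-plus-temperature model for gate-oxide breakdown; these are generic textbook forms, not figures AMD has published for Zen 2:

```latex
\mathrm{MTTF}_{\mathrm{EM}} = A\, J^{-n} \exp\!\left(\frac{E_a}{kT}\right)
\qquad\qquad
\mathrm{TTF}_{\mathrm{oxide}} \propto \exp\!\left(-\gamma\, E_{\mathrm{ox}}\right) \exp\!\left(\frac{E_a}{kT}\right)
```

Here J is the current density, E_ox the field across the gate oxide (which tracks the supply voltage for a fixed oxide thickness), T the temperature, and A, n, γ and E_a fitted constants; both lifetimes shrink as temperature rises, so voltage, current and temperature have to be judged together.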

Although I do not have an ASUS motherboard, my friend, who loaned me his 3900X does. He has an ASUS ROG Crosshair VIII Hero WiFi.

His results were similar to mine when he followed my guide. He also allowed me to configure his system in Windows so that I knew it was done right.

His temps were lower than on default, but also his benchmark results were a lot higher.

I guess you didn't read the part where I said that I have been experimenting with the Ryzen 3000 series of CPUs for over SIX MONTHS now.

You also didn't read the part of my original post above where I stated that I had to basically unlearn everything I thought I knew and start again at first principles and build up a new body of knowledge built on trial and error.

Nice try at a passive-aggressive ad hominem there, old son, but you scored a clean miss.

I would be deeply grateful if you could point me in the direction of the "scientists and engineers" who have tested it. Could you name one for me? Could you point me in the direction of any paper they have written?

You can't? Well what a surprise!

I have been in the techie game for almost 38 years now, and aside from an AMD 7870K APU system I built as a backup, a third-generation Ryzen is the first AMD processor I have built a system around for my own use.

I am not a stranger to the concept of RTFM, and I approach every new piece of hardware with the attitude, "A computer does not do what I want it to do, it does what I tell it to do"; I am also very much aware of the saying, "Behind every computer error there are at least two human errors, including the error of blaming it on the computer".
 
From what I understand

Too high a voltage causes Oxide Breakdown in the CPU.

Too high a current causes Electromigration in the CPU.

Although I do not have an ASUS motherboard, my friend, who loaned me his 3900X does. He has an ASUS ROG Crosshair VIII Hero WiFi.

His results were similar to mine when he followed my guide. He also allowed me to configure his system in Windows so that I knew it was done right.

His temps were lower than on default, but also his benchmark results were a lot higher.

I guess you didn't read the part where I said that I have been experimenting with the Ryzen 3000 series of CPUs for over SIX MONTHS now.

You also didn't read the part of my original post above where I stated that I had to basically unlearn everything I thought I knew and start again at first principles and build up a new body of knowledge built on trial and error.

Nice try at a passive-aggressive ad hominem there, old son, but you scored a clean miss.

I would be deeply grateful if you could point me in the direction of the "scientists and engineers" who have tested it. Could you name one for me? Could you point me in the direction of any paper they have written?

You can't? Well what a surprise!

I have been in the techie game for almost 38 years now, and aside from an AMD 7870K APU system I built as a backup, a third-generation Ryzen is the first AMD processor I have built a system around for my own use.

I am not a stranger to the concept of RTFM, and I approach every new piece of hardware with the attitude, "A computer does not do what I want it to do, it does what I tell it to do"; I am also very much aware of the saying, "Behind every computer error there are at least two human errors, including the error of blaming it on the computer".
Are you having a laugh? Dr Lisa Su, for a start, though I doubt she tested it. I am not even going to argue, since you know stuff and are now implying pixies designed Ryzen or something.

On voltages, I am not talking 1.5, am I, though I've benched at that on all three generations of Ryzen and on all makers' boards bar ASRock. Six months, and what?

Its designers, engineers and testers spent the last seven years on it, but you know more than them.

Go you.

But do note you're not the only old timer.
 
So you don't have anything specific to contribute then aside from being a member of a cult of personality?

And here was me thinking you could point me in the direction of something enlightening.

My results speak for themselves.
 
So you don't have anything specific to contribute then aside from being a member of a cult of personality?

And here was me thinking you could point me in the direction of something enlightening.

My results speak for themselves.
So does your argument; you're dragging in all your posts, yet I replied to one.

You're arguing some points I might agree with and not realising I only debated your comments on high voltage at idle.

Plus, see the video by Buildzoid; that's how you get the highest clocks while leaving the chip capable of protecting itself. Do that and manually set a max temp, which he doesn't do, and you're good, IMHO.

Opinions can vary.

Though my opinion of you is dropping after you called me a cultist for passing you the name of a doctor who worked on it. You asked a stupid question; I am not obligated to spend hours on Google to protect my ego and PROVE you wrong, though you are. Scientists and engineers worked on and tested this for the last seven years.
 
I don't think so, because the individual nuts and bolts of RTX are abstracted by DXR within DirectX. I think devs may optimize for AMD's implementation of it naturally because of consoles, but NV will continue to shell out money to devs to optimize for it.



LOL, did you call yourself a troll? As for threadshitting, asshats are born every day, but life goes on.

Wrong

Nvidia doesn't "coordinate" with others. It tries to dictate and is strictly a "my way or the highway" kind of player in the market. DirectX Raytracing is based on Nvidia hardware without any input from AMD.

AMD filed a patent for hybrid Ray-Tracing at the end of 2017 and is in a far better position to impose a de facto Ray-Tracing standard given its monopoly with regard to creating the CPU and GPU in the Micro$haft and $ONY consoles.

I have two 1080 Ti graphics cards (EVGA 1080 Ti FTW3 and an EVGA 1080 Ti SC2) and the current overpriced offerings from both AMD and Nvidia can rot on the shelves as far as I am concerned.

There are very few games available which implement "Ray-Tracing" and even those do so to a very limited extent. Nvidia has effectively priced Ray-Tracing out of the market and it is highly unlikely that AMD will follow Nvidia's lead.

They are with Navi gen 1.
 
So does your argument; you're dragging in all your posts, yet I replied to one.

You're arguing some points I might agree with and not realising I only debated your comments on high voltage at idle.

Plus, see the video by Buildzoid; that's how you get the highest clocks while leaving the chip capable of protecting itself. Do that and manually set a max temp, which he doesn't do, and you're good, IMHO.

Opinions can vary.

Though my opinion of you is dropping after you called me a cultist for passing you the name of a doctor who worked on it. You asked a stupid question; I am not obligated to spend hours on Google to protect my ego and PROVE you wrong, though you are. Scientists and engineers worked on and tested this for the last seven years.

I might remind you that it was you who said

"Your ideology is tainted by the experience gigabyte provided you, as mine is Asus. "

You ended the post with:

"Hyper focusing leads to bad theology."

So as far as my reply was concerned, as the Aussies say, "Turn around is fair dinkum".

If you had read my original post you would have noticed that I did NOTHING to compromise the safeguards built into the system to prevent overheating.

Also, if you knew enough about the Ryzen 3000 series, you would know that at the data rate I have set for my RAM and thus the Infinity Fabric (which is 1:1, because I am running 3600 CL16 RAM that I have overclocked to 3733 CL16, or rather a data rate of 1867 MHz), at temperatures above 85 °C the Infinity Fabric becomes unstable and thus the benchmarking program terminates.

The same is also true if you do LN2 overclocking on a Ryzen 3000 CPU where you have to clock the FCLK down to below 1500 because anything higher will cause the system to fail at very low temps.

Thus I don't know which part of your anatomy you pulled that chestnut from.

Even giving it his best shot, Buildzoid could only get a CineBench R20 score of 9,589 after he had worked his "magic", and on his second go at running the benchmark his score deteriorated, whereas I have achieved a CineBench score of 10,170 running exactly the same type of CPU and motherboard (3950X and GigaByte X570 AORUS XTREME) as he has. The thing is, though, that my results don't deteriorate from one run to the next; usually the second run results in a higher score, and the 10,170 was the highest I achieved after doing a number of CineBench R20 runs.
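For scale, the gap between those two scores works out to roughly six per cent:

```latex
\frac{10170 - 9589}{9589} \approx 0.061 \approx 6\%
```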

He was using Corsair Dominator Platinum 3600 CL16 RAM and I use Team Group Edition 3600 CL16 RAM. I know mine uses Samsung B-die and I assume his RAM does as well.

The other thing to consider is that my ambient temperature is normally at 28 - 29 °C whereas Buildzoid did state that it was pretty cold where he was conducting his tests and yet he was still running into thermal problems.

During a CineBench R20 run my CPU draws between 154 and 159 Watts of power and never exceeds 160 Watts, my CPU temp reaches a maximum of 84 °C but again I would have to say that the ambient temperature I just ran it at was 28 °C in my room.

Attached is a screenshot I took of one of my CineBench runs. I don't know whether or not I made one when I got the score of 10,170.
 

Attachment: Capture2.PNG (screenshot of a CineBench R20 run)
I might remind you that it was you who said

"Your ideology is tainted by the experience gigabyte provided you, as mine is Asus. "

You ended the post with:

"Hyper focusing leads to bad theology."

So as far as my reply was concerned, as the Aussies say, "Turn around is fair dinkum".

If you had read my original post you would have noticed that I did NOTHING to compromise the safeguards built into the system to prevent overheating.

Also, if you knew enough about the Ryzen 3000 series, you would know that at the data rate I have set for my RAM and thus the Infinity Fabric (which is 1:1, because I am running 3600 CL16 RAM that I have overclocked to 3733 CL16, or rather a data rate of 1867 MHz), at temperatures above 85 °C the Infinity Fabric becomes unstable and thus the benchmarking program terminates.

The same is also true if you do LN2 overclocking on a Ryzen 3000 CPU where you have to clock the FCLK down to below 1500 because anything higher will cause the system to fail at very low temps.

Thus I don't know which part of your anatomy you pulled that chestnut from.

Even giving it his best shot, Buildzoid could only get a CineBench R20 score of 9,589 after he had worked his "magic", and on his second go at running the benchmark his score deteriorated, whereas I have achieved a CineBench score of 10,170 running exactly the same CPU and motherboard (3950X and GigaByte X570 AORUS XTREME) as he has. The thing is, though, that my results don't deteriorate from one run to the next; usually the second run results in a higher score, and the 10,170 was the highest I achieved after doing a number of CineBench R20 runs.

He was using Corsair Dominator Platinum 3600 CL16 RAM and I use Team Group Edition 3600 CL16 RAM. I know mine uses Samsung B-die and I assume his RAM does as well.

The other thing to consider is that my ambient temperature is normally at 28 - 29 °C whereas Buildzoid did state that it was pretty cold where he was conducting his tests and yet he was still running into thermal problems.

During a CineBench R20 run my CPU draws between 154 and 159 Watts of power and never exceeds 160 Watts, my CPU temp reaches a maximum of 84 °C but again I would have to say that the ambient temperature I just ran it at was 28 °C in my room.

Attached is a screenshot I took of one of my CineBench runs. I don't know whether or not I made one when I got the score of 10,170.
Theology, regarding your opinion of high volts being bad while unloaded: I read your argument the first time. If you must have more of my opinion, software overclocking, in my experience, is less stable than hardware or auto, and crashes can be monstrous to the OS state. I personally can't back it as a go-to default stance, since it always required Ryzen Master loaded and a profile switch.
With mine set up as it is, all cores go to 4.125 at 1.235 volts automatically, fully loaded crunching 24/7, and it sits at 75°. But you know what, that's my rig in its environment; it's not transferable to others. I wouldn't argue your way doesn't work well; I have tried it many times going back to FX, but the end result is the same: long-term instability and issues over time due to OS degradation from crashes and memory pages going missing. My PC is always up and loaded, so stability is important.
 
Well to be fair, overclocking has always been "At Your Own Risk" since the very beginning.

If they tell you your warranty will be void if you do something they warn you not to do, that is essentially on you for thinking you knew better... regardless of whether they could tell or not. Chances are they can nowadays.

I'm not saying don't overclock, but be smart about it if you intend for it to last. I would imagine them to be fragile to a degree, look how tiny the process is. And jamming a bunch of vcore into it probably doesn't help much since it is nearly at its limit anyways.

In my world I call this common sense. On TPU, we debate that common sense whenever someone gets fed a reality that doesn't sit well with them :)
 
Most of the time I will be running my 3950X as a straight 16-core/16-thread with SMT off at 4.4 GHz for playing games or just doing my daily stuff, whereby I have the easy possibility of running it at 16 cores/32 threads at 4.3 GHz with SMT on when I use something like video editing software that profits from more cores.
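If anyone wants to sanity-check which mode the machine actually booted in without going back into the BIOS, comparing physical and logical core counts from the OS does it; a small sketch using the third-party psutil package:

```python
# Quick SMT sanity check from the OS (sketch; needs the third-party psutil package).
import psutil

physical = psutil.cpu_count(logical=False)  # physical cores (can be None on some platforms)
logical = psutil.cpu_count(logical=True)    # hardware threads

if physical and logical:
    state = "on" if logical > physical else "off"
    print(f"{physical} cores / {logical} threads -> SMT {state}")
# A 3950X should report 16/32 with SMT on and 16/16 with SMT off.
```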

It's funny how EVERYBODY has all of a sudden done what I have done AFTER I have told them how to do it.

Over the past six months or so I have not met anyone who has told me the stuff PRIOR to my explaining it to them. Which is basically why I have had to work it all out for myself.

Funny how that works isn't it?

My previous main machine - an i7-990X on an X58 mobo - was up and running 24/7 for 1,900 days before I decommissioned it in favour of my current main machine, an i7-4790K, which has been running 24/7 for over 1,720 days.

Because I am not in need of a new system, I have had plenty of time to experiment to my heart's content with the Ryzen system.

I am considering going gold with it in April on my birthday.

It is nice when there is no rush.
 
Most of the time I will be running my 3950X as a straight 16-core/16-thread with SMT off at 4.4 GHz for playing games or just doing my daily stuff, whereby I have the easy possibility of running it at 16 cores/32 threads at 4.3 GHz with SMT on when I use something like video editing software that profits from more cores.

It's funny how EVERYBODY has all of a sudden done what I have done AFTER I have told them how to do it.

Over the past six months or so I have not met anyone who has told me the stuff PRIOR to my explaining it to them. Which is basically why I have had to work it all out for myself.

Funny how that works isn't it?

My previous main machine - an i7-990X on an X58 mobo - was up and running 24/7 for 1,900 days before I decommissioned it in favour of my current main machine, an i7-4790K, which has been running 24/7 for over 1,720 days.

Because I am not in need of a new system, I have had plenty of time to experiment to my heart's content with the Ryzen system.

I am considering going gold with it in April on my birthday.

It is nice when there is no rush.
I can show you dated benchmarks to prove I had Ryzen clocked adequately from launch, but doing so would, to me, be as lame as just saying I can.

You don't recommend something you think is wrong, i.e. software clocking, so I wouldn't have told you to do it had you asked, new member.
 
Huh, would you look at that: operating something beyond manufacturer specification might break it.

What in the hell is going on here?


A couple of points:
1. Smaller process = less voltage; that has been the case forever. I am not sure why anybody would ever question that.

2. Shit happens; CPUs degrade. I have a 4670K that went from being able to run 4.6 @ 1.35 to only 4.4 after under a year.

3. I am not sure what @Michael Nager is on about, but waving your benchmark scores around doesn't earn you any brownie points around here, and if you think the tongue-in-cheek implication you were going for, that Buildzoid somehow doesn't know what he is doing, was missed, think again; that crap doesn't fly here ("look at me, my score is better, I am right").
Fuck off.


4. Yes, AMD's boost curve is a bit aggressive, but board vendors are partially to blame, because they change the defaults from what AMD specified in their partner manuals to whatever they want in order to sell more boards.

5. Are we going to see Ryzen CPUs dropping dead en masse anytime soon? No, not even close.

Voltage doesn't kill CPUs, current does; current is a function of voltage and load.
No load, no current, no heat, no degradation (except if you are wildly exceeding the insulating capacity of your conductor), like 2 V.

Well someone sounds mildly triggered.

Was it something I said? :)

I delidded my i7-4790K over two years ago and applied liquid metal TG Conductonaut, and it has been running at 4.4 GHz on all cores (except that for two years it ran quite a bit warmer) 24/7 for over 1,720 days (that would be 4.7 years).

The thing is though that if I didn't put in a screenshot of my benchmark then it is sure as dammit that someone would call me a liar.

I am sure that Buildzoid doesn't need anyone to White Knight for him.

The fact of the matter is though that the way I described it in the guide does not void the warranty of the CPU.

Originally when I heard about the 7 nm node from TSMC the spec was for a maximum of 1.3 Volts.

Then the pronouncements about what could be considered "safe" voltage have been very vague with regard to info coming from AMD.

Two things can damage a CPU, too much voltage over time causes Oxide Breakdown in the CPU and too much current causes Electromigration in the CPU, both of which degrade the CPU over time.

There are already reports of the performance of Ryzen 3000 CPUs degrading and they have only been out for six months or so.
 
Well someone sounds mildly triggered.

Was it something I said? :)

I delidded my i7-4790K over two years ago and applied liquid metal TG Conductonaut, and it has been running at 4.4 GHz on all cores (except that for two years it ran quite a bit warmer) 24/7 for over 1,720 days (that would be 4.7 years).

The thing is though that if I didn't put in a screenshot of my benchmark then it is sure as dammit that someone would call me a liar.

I am sure that Buildzoid doesn't need anyone to White Knight for him.

The fact of the matter is though that the way I described it in the guide does not void the warranty of the CPU.

Originally when I heard about the 7 nm node from TSMC the spec was for a maximum of 1.3 Volts.

Then the pronouncements about what could be considered "safe" voltage have been very vague with regard to info coming from AMD.

Two things can damage a CPU, too much voltage over time causes Oxide Breakdown in the CPU and too much current causes Electromigration in the CPU, both of which degrade the CPU over time.

There are already reports of the performance of Ryzen 3000 CPUs degrading and they have only been out for six months or so.
As I said posts ago, there are a few here I could name who have a few Ryzens set up with normal air cooling, no overclock, and they run these systems hard all day every day; we will more than adequately hear about it IF and WHEN it happens.

Originally, when I heard about 7 nm, it was just around the corner; it wasn't.

The 7 nm we have now is not the same as that envisioned years ago; plans adapt, and the results of research further alter design and process.

No one is white knighting for him; he has a presence here and can speak for himself.

But I'd wager most here have tried your way. I didn't say it didn't work; I argued it's too faffy, is not a safe way long term, and is not necessary, IMHO.

And I don't think this drama regarding degradation while out of spec is necessary. You say you're an old-school clocker; all older chips also degraded if you were far enough out of spec, i.e. trying hard enough.
 