
AMD Ryzen Threadripper Delidded - It's EPYC

Yes, here's a good list of CPUs showing which ones were soldered and which were just glued under the IHS ~ https://www.overclock.net/t/305443/ihs-removals-how-to-do-it-should-i-do-it-and-the-facts
Thanks for the link.
There are some 10-year-old CPUs there, and I have an E8400 at home still going strong. A buddy of mine was using a Q8400 up until a year ago without any issues (16 h of gaming every day).
I just wanted to confirm for myself that solder has no issues in the long run, because I'm a facts kinda guy, not an empty-theory kinda guy.
 
This is the reason Intel is so butthurt and goes all out insulting the EPYC CPUs in their slide decks.

Think about it: one slice of silicon to rule every market segment, from server to HEDT, performance, mainstream, all the way down to potato-grade CPUs.
Meanwhile, Intel needs to spend R&D on a new silicon design for each segment.
Intel's cores have scaled from ultrabooks to HEDT for years; nothing new for them in this area.
 
Thanks for the link.
There are some 10-year-old CPUs there, and I have an E8400 at home still going strong. A buddy of mine was using a Q8400 up until a year ago without any issues (16 h of gaming every day).
I just wanted to confirm for myself that solder has no issues in the long run, because I'm a facts kinda guy, not an empty-theory kinda guy.
I don't imagine that being a problem for mainstream CPUs. It's probably the HEDT class, which is pushed harder (i.e. runs warmer), that may exhibit some symptoms after a few years.
 
Intel's cores have scaled from ultrabooks to HEDT for years; nothing new for them in this area.
My point is the physical silicon: AMD simply repurposes the same die as they see fit.
Intel has been using the same uarch across every market segment, yes,
but they need to create brand-new silicon to address each market's demand.
You don't just take a 12-core Xeon, disable cores, and sell it as a G4560.
 
PCWorld says those two additional dies are unprocessed silicon, used as spacers for even pressure and pin contact.
So much for unlocking cores... :(
 
PCWorld says those two additional dies are unprocessed silicon, used as spacers for even pressure and pin contact.
So much for unlocking cores... :(
And people say silicon is expensive.
 
Yes, here's a good list of CPUs showing which ones were soldered and which were just glued under the IHS ~ https://www.overclock.net/t/305443/ihs-removals-how-to-do-it-should-i-do-it-and-the-facts

Ivy was the first mainstream CPU in a long time that wasn't soldered, with SKLX being the first HEDT part to go the Ivy way. Basically, there's nothing out there suggesting that solder goes kaput before TIM does. Having said that, we now have TR & SKLX, which will prove this theory one way or another.

Well, my Q9450 is still going strong and relatively cool (65 °C gaming temp), but with the stream of demanding new games, I think this year is its final year of service.
 
PCWorld says those two additional dies are unprocessed silicon, used as spacers for even pressure and pin contact.
So much for unlocking cores... :(
They would say that though, would they not? Pure BS imho; the dies are binned prior to packaging.
They would use a shim there, or dead chips.
 
My point is the physical silicon: AMD simply repurposes the same die as they see fit.
Intel has been using the same uarch across every market segment, yes,
but they need to create brand-new silicon to address each market's demand.
You don't just take a 12-core Xeon, disable cores, and sell it as a G4560.
Silicon is paid for by the sq mm, so repurposing doesn't work like you think it does.
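
A back-of-the-envelope sketch of why area is what you pay for, using the textbook dies-per-wafer approximation (the wafer price and die sizes below are invented for illustration, and yield loss is ignored): cost per die grows faster than linearly with area, because bigger dies also waste more of the wafer's edge.

```python
import math

# Textbook dies-per-wafer approximation; all prices are made-up
# illustrations, not real foundry numbers, and yield loss is ignored.
def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Usable whole dies on a circular wafer, discounting edge loss."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

WAFER_COST = 7000.0  # hypothetical cost of one 300 mm wafer, USD

for area in (100.0, 200.0, 400.0):  # candidate die sizes, mm^2
    n = dies_per_wafer(300.0, area)
    print(f"{area:>5} mm^2 die -> {n:>3} per wafer -> ${WAFER_COST / n:,.2f} each")
```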
 
And people say silicon is expensive.
Relatively speaking, silicon isn't that expensive; search on eBay and you can find low-quality silicon wafers for $20 American. Processing silicon into microchips isn't horribly expensive either. However, designing the chip and making the mask sets takes many millions, and if the company isn't fabless, it must also acquire the equipment to produce the chips.

The NRE cost of chips is the big expense.
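
Roughly speaking, per-chip cost = NRE / volume + unit cost, which is why reusing one die across segments (AMD's approach here) spreads that NRE so much further. A toy sketch with invented figures:

```python
# Toy NRE amortization; every figure below is assumed, not a real
# industry number.
NRE = 50_000_000.0   # design work + mask set, USD (assumed)
UNIT_COST = 30.0     # wafer share + packaging + test per chip, USD (assumed)

for volume in (100_000, 1_000_000, 10_000_000):
    per_chip = NRE / volume + UNIT_COST
    print(f"{volume:>10,} units -> ${per_chip:>7,.2f} per chip")
```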
 
Well, my Q9450 is still going strong and relatively cool (65 °C gaming temp), but with the stream of demanding new games, I think this year is its final year of service.
This is another one of my points: it's more likely your CPU will become obsolete than that the solder will fail.
 
The problem with solder is that, slowly, the indium gets wicked onto the large patch of gold (used because of how well indium bonds to it), and this creates voids in the solder that then lead to hot spots. This is why I like Intel using paste-based TIM... if it dries out, you can de-lid and replace it, but when that solder wicks away and the CPU just runs hot, you're forced to buy new gear.

This behavior of indium-based solders is properly documented, too, so while some see it as a boon, I see it as a weakness in AMD's products, because, you know, I like science, not hype.

:) How many Intel users will de-lid their CPU? Your argument is pointless.
 
:) How many Intel users will de-lid their CPU? Your argument is pointless.
Not many, probably, but the bulk of those that do are HEDT users.
 
Not many, probably, but the bulk of those that do are HEDT users.
Honestly, do you believe an HEDT user will pay over $1000 for a CPU that earns them their daily bread, and then decide to de-lid it? I'm not entirely convinced by this idea :)
 
Honestly, do you believe an HEDT user will pay over $1000 for a CPU that earns them their daily bread, and then decide to de-lid it? I'm not entirely convinced by this idea :)
So do you think enthusiasts buy Celerons and try to run them at 5GHz? Atoms maybe?
 
So do you think enthusiasts buy Celerons and try to run them at 5GHz? Atoms maybe?
Not at all. I'm just wondering why anyone would even consider delidding their CPU, that's all.
 
The problem with solder is that, slowly, the indium gets wicked onto the large patch of gold (used because of how well indium bonds to it), and this creates voids in the solder that then lead to hot spots. This is why I like Intel using paste-based TIM... if it dries out, you can de-lid and replace it, but when that solder wicks away and the CPU just runs hot, you're forced to buy new gear.

This behavior of indium-based solders is properly documented, too, so while some see it as a boon, I see it as a weakness in AMD's products, because, you know, I like science, not hype.

Or delid it like they just did? :D
 
Not at all. I'm just wondering why anyone would even consider delidding their CPU, that's all.

I wouldn't do it with the silly vice/hammer methods, but with the delidding tools out there for $25 or so, you can delid easily and very safely. And on the Intel side it definitely makes a big difference to replace the TIM. That's what I did.

On the Threadripper, I agree, I wouldn't be too concerned about aging. Since it's got quality solder, you don't have to worry about delidding (which wouldn't be necessary on Intel either if they used decent compound themselves), and it will be obsolete before it needs replacing. If I get 4 or 5 overclocked years out of a CPU, I'm happy to relegate it to live out its days in a secondary PC.
 
Is it only me, or is the IHS made of copper? The last image shows four holes which they drilled for some reason, and the color of the material resembles copper. If so, this is yet another thing done right (along with the soldering). Man, if it turns out the extra cores are unlockable... Intel is in huge trouble :) although I doubt it, as it would cannibalize AMD's EPYC sales.
 
Not at all. I'm just wondering why anyone would even consider delidding their CPU, that's all.
Well, if you're serious about overclocking, you're going to delid and make sure that whatever is under there transfers heat as well as possible.
You'd probably never do it, and I certainly won't, but apparently enough people out there do it that this exists: https://rockitcool.myshopify.com/
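
The reason it pays off is simple series thermal resistance: whatever K/W you shave off the layer under the IHS lowers the die temperature by that amount times the power draw. A rough lumped-model sketch (every value below is an illustrative guess, not a measurement):

```python
# Lumped series thermal model: T_junction = T_ambient + P * (R_tim + R_cooler).
# Every value here is an illustrative assumption, not a measurement.
POWER_W = 180.0    # assumed overclocked package power, watts
T_AMBIENT = 25.0   # room temperature, degrees C
R_COOLER = 0.15    # heatsink + IHS thermal resistance, K/W (assumed)

for label, r_tim in (("good solder / liquid metal", 0.02),
                     ("dried-out or mediocre paste", 0.10)):
    t_junction = T_AMBIENT + POWER_W * (r_tim + R_COOLER)
    print(f"{label}: ~{t_junction:.0f} C at the die")
```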
 
The problem with solder is that, slowly, the indium gets wicked onto the large patch of gold (used because of how well indium bonds to it), and this creates voids in the solder that then lead to hot spots. This is why I like Intel using paste-based TIM... if it dries out, you can de-lid and replace it, but when that solder wicks away and the CPU just runs hot, you're forced to buy new gear.

This behavior of indium-based solders is properly documented, too, so while some see it as a boon, I see it as a weakness in AMD's products, because, you know, I like science, not hype.

I still have a working 2600K. It may happen, but by the time it does, you'll have replaced the Threadripper anyway, so it's an invalid argument/fact to me.
 
I still have a working 2600K. It may happen, but by the time it does, you'll have replaced the Threadripper anyway, so it's an invalid argument/fact to me.
I have an AMD 4400+ that is pasted and still works fine too. I mean, we can say the same things about both options. Personally, I prefer paste.

Not at all. I'm just wondering why anyone would even consider delidding their CPU, that's all.

I feel the way you do, honestly, but we do have quite a few members here that would say "why wouldn't you, the paste is inferior!"

Anyway, it's awesome to see under the heatspreader, so while I may not approve of der8auer at all times, I still gotta say thanks. :p

I wonder how many CPUs in the world go through a complete thermal cycle in their lives.
Every single one of mine, and there are a few members on here who do the LN2 stuff as well as air/water and all that too. Guys like macci and K1ngp1n drew me deep into hardware, and that led to me doing reviews. I just don't post anything about my LN2 adventures because I'm not a competitive person; it's more about seeing how these CPUs work, and since I tend to get quite a few free CPUs, I don't take any losses from doing so. I mean, a decade ago there were many, many of us buying $300 CPUs/GPUs every two weeks with our pay cheques and clocking the crap out of them. It was loads of fun. Then HWBot came, people went into teams, and that whole world became quite closed and elitist.
 
So do you think enthusiasts buy Celerons and try to run them at 5GHz? Atoms maybe?
They used to. You don't have to be rich to be an enthusiast. Remember the old 300 MHz Celeron? No L2 cache... you could OC it to 500 MHz easy. Now they're 10x the GHz and 10x the cores... holy shit!

Edit: Now that I think of it, it was either Celeron or Pentium back then, or the K6 series... And I remember the prices on even the Celerons being too high, for me anyway.
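
For anyone who missed that era: the multiplier on those chips was locked, so the whole overclock came from raising the front-side bus, since core clock = FSB x multiplier. A quick sketch (4.5x matches the 300 MHz Celerons; the bus speeds are typical board options, quoted from memory):

```python
# Old-school overclocking math: core clock = FSB x multiplier.
# The multiplier on the 300 MHz Celerons was locked at 4.5x, so the
# overclock came from raising the FSB (bus speeds below are typical
# board options, quoted from memory).
MULTIPLIER = 4.5

for fsb_mhz in (66.6, 100.0, 112.0):
    print(f"FSB {fsb_mhz:>5.1f} MHz -> {fsb_mhz * MULTIPLIER:.0f} MHz core")
```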
 