Discussion in 'Overclocking & Cooling' started by LagunaX, Apr 22, 2013.
Might be time to invest in a vice grip for the garage for the razorless delidding method...
And why? Was it confirmed they used the crap TIM or whatever?
Why do they lid them anyway? Isn't it most efficient to transfer the heat directly to the heatsink?
I'd want a protective ring like on new gpus, but Intel would then have to make a list of compatible coolers (ones with flat bases) and then specify a curve for all heatsinks to follow so they don't crack the chip.
Don't understand why Intel makes their CPUs in a way that requires us to invalidate our warranty if we want to get the best out of our overclock. We're buying a K-series chip for a reason: we want to overclock. So why make us invalidate the warranty in order to overclock with good temperatures?!
I think they do it so that the chip overheats before it can be damaged. Extreme overclockers will NOT remove the IHS, from what I've seen. This makes sure you can overclock for a long period of time without much fuss. If things get dangerous, the chip protects itself, and the TIM has to be "poor" enough that watercooling users don't kill their chips either.
Then, if you do kill your chip, Intel replaces it for a small fee. The Tuning Plan warranty isn't something you have to buy at purchase; you can buy it after the chip dies. You just need to wait 30 days before making a claim.
So temps really shouldn't be a problem, for anyone. Extreme guys have proven the TIM is good enough, and for the rest, it's what keeps your chips alive.
With extreme cooling the TIM doesn't really matter, because the LN2 will freeze anything too close to the pot through the air alone. Why do you think motherboards end up covered in ice when they're badly ventilated? Either way, the TIM is terrible. Chips also last longer when they're colder; why do you think LN2 guys can push 2V through IB chips without worrying too much about killing them? The colder a chip is, the less voltage you need to use and the more voltage it takes to kill it, so better TIM would always be a plus.
There is quite a difference between the roughly -196C of LN2 and the 10C you drop with the IHS removed. You won't gain a discernible voltage headroom from that minor temperature change versus LN2; it's a terrible comparison.
The lid is merely there to stop you from cracking the core. When Socket A was popular you could buy a shim that protected the chip when installing the heatsink, but it added heat. The one good thing about the shim method was that it kept your core exposed. The lid protects the whole surface of the chip, but covers your core.
...and as Cadaveca mentioned, it takes more time to heat up the lid than just the core alone. I suspect the heat spreader adds a buffer that keeps the temperature from rising too quickly in case there is poor (or no) thermal contact between the CPU and the cooler. A CPU can go from room temperature to 100*C at idle in a matter of seconds without a cooler. The heat has nowhere to go, so it builds up, and since the die and the heat spreader don't weigh much or dissipate much heat by themselves, they heat up very quickly.
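A quick back-of-the-envelope check of that buffer effect. All of the masses, heat capacities, and the idle power figure below are assumed ballpark values, not measurements; the point is just the ratio between the two cases:

```python
# Rough estimate of how fast a CPU heats up with no heat escaping.
# Temperature rise rate = power / thermal mass, where thermal mass
# is mass * specific heat capacity. Ballpark figures, illustration only.

def heatup_rate(power_w, thermal_mass_j_per_k):
    """Degrees C gained per second if no heat is removed."""
    return power_w / thermal_mass_j_per_k

idle_power = 15.0                   # W, assumed idle dissipation
die_thermal_mass = 0.002 * 700.0    # ~2 g silicon * 700 J/(kg*K) = 1.4 J/K
ihs_thermal_mass = 0.020 * 385.0    # ~20 g copper * 385 J/(kg*K) = 7.7 J/K

bare = heatup_rate(idle_power, die_thermal_mass)
lidded = heatup_rate(idle_power, die_thermal_mass + ihs_thermal_mass)
print(f"bare die:  {bare:.1f} C/s")    # ~10.7 C/s
print(f"with IHS:  {lidded:.1f} C/s")  # ~1.6 C/s
```

With those assumed numbers a bare die would cover the 75*C from room temperature to throttling in around seven seconds, while the lid alone buys you several times longer, which lines up with the "matter of seconds" observation above.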
I suspect that it is both, to protect the core and to provide a thermal buffer. Personally if I had a "de-lidable" CPU, I don't think I would do it. In case anything happens to the CPU, I would rather be able to send it back to Intel. I don't have that problem though, and it's another reason why I'm happy with my 3820.
That and having a cover simply protects the chip. Mounting a heatsink is much more reliable with a heatspreader, much larger contact area and less chance of a defective mount. It's that simple.
More chips are likely damaged by improper heatsink installation than any other factor. Unfortunately, Intel didn't do a great job with the TIM or the mounting of the spreader. They don't guarantee overclocking and never have.
Damn, I remember buying a shim for my Barton 2500+ CPU a decade ago. The CPU immediately pegged at 90C or something like that, almost ruining it. :shadedshu That P.O.S. shim then found its way to the dustbin post haste.
In fact, a corner of the die did get slightly chipped over time, yet the CPU works fine to this day.
Shim was a little thick?
Remember the shim dots on each corner around GPU's? Maybe they still have them, I haven't taken a card apart in a while.
Oh, it fitted OK and looked visually perfect. However, it stopped the cooler from putting the proper pressure on the die. Perhaps it was a tiny bit thick, dunno. I wasn't gonna risk damage with another shim to find out, though. It was certainly sold as compatible with that CPU. If I remember correctly, though, AMD never officially sanctioned the use of shims with their CPUs, so I'm not surprised it didn't work.
Delidding gives me -20*C temperatures over previous and makes my really poorly binned silicon-lottery-loser 3570K perform like a typical 3570K. I still need 1.45v just to reach 4.5GHz though. This chip sucks, and if it burns out I'll upgrade to an i7-3770K (and delid) or sidegrade to an i7-2600K, which is more likely to reach a higher overclock because of better binning. I was running the chip at 4.4GHz @ 1.375v (45*C typical load, 85*C Intel Burn Test), but now that it's summer I'm running 4.0GHz @ 1.14v (40*C typical load, 75*C IBT).
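That summer downclock saves more power than the small frequency drop suggests. A rough sketch using the common dynamic-power approximation P ∝ f·V² (an assumption that ignores leakage, so treat the number as indicative only):

```python
# Approximate dynamic CPU power scaling: P is proportional to f * V^2.
# Ignores static/leakage power, so this is a rough sketch only.

def relative_power(f1_ghz, v1, f2_ghz, v2):
    """Power of setting 2 relative to setting 1 (dimensionless ratio)."""
    return (f2_ghz * v2**2) / (f1_ghz * v1**2)

# Settings from the post: 4.4 GHz @ 1.375 V vs 4.0 GHz @ 1.14 V.
ratio = relative_power(4.4, 1.375, 4.0, 1.14)
print(f"summer profile uses ~{ratio:.0%} of the winter profile's power")
```

So a ~9% frequency drop plus the lower voltage cuts dynamic power by roughly a third, which is why the load temps fall by about 10*C.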
If I bought a Haswell I'd delid it as well.
I only had luck delidding with Coollaboratory Liquid Ultra. Other thermal pastes gave me worse than stock performance!
For a decent overclock, just get a 2600K/2700K while you still can. My 2700K did 5.5GHz on a quick test when I bought it and could have gone even higher had I wanted it. Since then, I've run it at a nice easy 4.7GHz with all the power save options turned off.
Yeah, I've felt that Sandy seems to overclock pretty well. 4.5Ghz seems to be a sweet spot for my 3820 but this board can't get it to clock much higher than 4.75Ghz without pumping a ton of voltage through it and even then the best I've been able to squeeze out of it is 4.92Ghz. In all seriousness though, regardless of how a 2700k/2600k/3770k/4770k/3820 clocks, they're all going to be about the same with respect to real world performance. They're all good chips that perform very well. I like my soldered heat spreader on my 3820 though.
Indeed, while the performance differences between the models are there, they seem almost academic in real world use. It would tend to make a noticeable difference in particular apps, rather than in a general sense.
It's really ridiculous that Intel's superior 22nm 3D transistor technology overclocks worse than their old 32nm tech, but it seems Intel just doesn't care about this anymore, hence the inferior TIM on the newer processors.
I just wanna clarify that while my CPU ran just fine at 5.5GHz, it consumed a ton of power and put out a ton of heat, not to mention extra voltage needed to get there. The big Zalman flower cooler I had on it at the time was running flat out too and could barely keep the temperature down. Run any sort of real load on it like Prime95 or whatever and the temperature shot through the roof. I'd have had to use watercooling to stop it overheating with a massive overclock like that, which I'm not prepared to do. Also, I don't wanna kill it quickly with a continual overstress.
Finally, given the crap overclocks on Haswell, if I was building a system now and couldn't get an SB CPU, I'd likely go for the non-K 4770. Depressing for an enthusiast, but it just wouldn't be worth paying extra for a crap overclock.
Not impressive at all. To get any sort of decent overclock out of these we have to delid them and risk killing the chip either right then and there, or cracking the cpu die when installing the heatsink. The risk runs especially high for those of us with coolers like my H70 or other coolers of similar design where we can really crank down on the chip, putting very high pressure on it.
I'd prefer them to go back to the good old solder. It performed excellently with Sandy Bridge, the first-generation i7 and other chips that came before them. I'm an enthusiast, and if someone were to give me a 4770K system I would certainly overclock it; not because I need to, but because I can, and I'm always trying to be faster than I already am. That said, I'm not extreme enough to take razor blades, vices, etc. to my expensive hardware to get more out of it. The method Intel is using now caters to nobody but the most extreme among us. For your general enthusiast and your general user, this crap thermal paste is far inferior to the solder. I think my i7 920 is going to be with me for a long time to come yet.