
Arctic Leaks Bucket List of Socket LGA1150 Processor Model Numbers

btarunr

Editor & Senior Moderator
CPU cooler manufacturer Arctic (aka Arctic Cooling) may have inadvertently leaked a very long list of 4th generation Intel Core processors built for Intel's new socket LGA1150. Longer than any list of Core "Haswell" processors posted so far, the leak includes model numbers of nine Core i7, seventeen Core i5, five Core i3, and two Pentium models. Among the Core i7 models are the already-known i7-4770K flagship chip, the i7-4770S, and a yet-unknown i7-4765T. The Core i5 list is exhaustive, and it appears that Intel wants to leave no price-point unattended; the Core i5-4570K could interest enthusiasts. In comparison to the Core i5 list, the LGA1150 Core i3 list is surprisingly short, indicating Intel is serious about phasing out dual-core chips. The Pentium LGA1150 list is shorter still.

The list of LGA1150 processor models appears to have been leaked in the data-sheet of one of Arctic's coolers, in the section that lists compatible processors. LGA1150 appears to have exactly the same cooler mount-hole spacing as the LGA1155 and LGA1156 sockets, so upgrading your CPU cooler shouldn't need to be on your agenda. Intel's 4th generation Core processor family is based on the company's spanking-new "Haswell" micro-architecture, which promises higher per-core performance and significantly faster integrated graphics than the previous generation. The new chips will be built on Intel's now-mature 22 nm silicon fabrication process, and will begin to roll out in the first half of 2013.



View at TechPowerUp Main Site
 
So no more for LGA1155 ??
 
2 CPU generations a socket seems reasonable. I would prefer three. I bet Intel could have made Haswell 1155 if they had wanted to.
 
At least they need to use the same retention mechanism,
but too bad: new socket = different retention.
 
I bet Intel could have made Haswell 1155 if they had wanted to.

I'm sure they could, but I'm willing to bet that they wouldn't have been able to implement the VRM control changes if they hadn't switched.
 
2 CPU generations a socket seems reasonable. I would prefer three. I bet Intel could have made Haswell 1155 if they had wanted to.

These are not full generations, though. Look at 775; that was great. I guess Intel is on the motherboard manufacturers' payroll.
 
These are not full generations, though. Look at 775; that was great. I guess Intel is on the motherboard manufacturers' payroll.
Back in the 775 days Intel had some real competition; now they don't.
 
Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.

I'm also getting a strong sense of déjà vu, back to the Pentium 4 days and Intel trying to push the clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (e.g. explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing. :(
 
Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.

I'm also getting a strong sense of déjà vu, back to the Pentium 4 days and Intel trying to push the clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (e.g. explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing. :(
I know exactly what you mean. It seems like things are stagnant. There are mostly just architectural changes that are marginally better than the last generation.

Now they have tablets and phones with graphics superior to the Xbox 360 and PS3 (if you're unaware of this, check it out; one good example is the Unreal Engine, but there are a few others).

These devices have much more powerful graphics than laptops with IGPs. I will be doing another build this year, probably my final build, and I will keep it intact to remember the days when we could build our own computer.
 
Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.

I'm also getting a strong sense of déjà vu, back to the Pentium 4 days and Intel trying to push the clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (e.g. explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing. :(

It's because people don't need more speed. You don't need more speed, you said so yourself. The speed race for consumers ended with the Core 2 Duo. Computers are getting faster, but the applications that actually make use of the speed are getting rarer. This is a good thing, because people don't have to spend tons of cash on computers anymore, and we can focus on better performance-per-watt ratios. Why is this depressing? Did you really expect everyone would have a supercomputer in their home, even if that would be ridiculously wasteful?
 
I'm a gamer. It's depressing because gaming technology isn't advancing. With all these idle cores and all this RAM, there should be more than enough hardware to run a pseudo-AI in games, yet it isn't being done. Where is the slew of simulators that we saw in the 1990s, which pushed the bounds of what is possible? Where are the Far Cry titles that attempt to render insane amounts of foliage? Where are all the physics-based games that Ageia promised? Why did gamers kill great innovative titles like Spore over DRM? Most of the innovations are coming from indie developers (e.g. Minecraft), but they don't have the resources to take the game design to the next level. Case in point: look at Conan O'Brien's review of Minecraft.

We're inventing ways to make computers dumber and slower (to the point they're virtualized on "clouds"), not smarter and faster (which created tons of optimism up until about 2008). Someone needs to modify AMD's logo and change it to "Gaming Devolved" and stamp it on the entire industry.


A Core i7 today is the equivalent of a supercomputer 20 years ago.
 
We're inventing ways to make computers dumber and slower (to the point they're virtualized on "clouds"), not smarter and faster (which created tons of optimism up until about 2008).

We're making them "slower" because we don't need them to be faster, as you yourself point out. And the cloud and virtualization are being smarter, not dumber.

A Core i7 today is the equivalent of a supercomputer 20 years ago.

They are, but that is not what I meant.

Also Minecraft is huuuggeellly overrated as a game. I'm not sure I want to call it a game. But that is OP.
 
Wow... I think you guys are missing something:


Sandy Bridge was THAT GOOD!!!

It's skewing your perceptions.
 
Intel is focusing on integrating more video processing power into its processors rather than just creating more powerful processors. I can see the reasoning behind that. Once they get a good architecture, they can trim the fat, shrink it, slap it into the new iPad7, and make way more money than they could by selling us tech-jerks new hardware.
 
If we are going to get faster hardware, the ball really is in the software developers' court to make something that will bring current CPUs to their knees. Otherwise, Intel will be more than happy focusing on increasing performance-per-watt over outright performance.
 
Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.

Yep, I'll likely still be running my 875K until at least the end of 2013. I haven't yet found anything it isn't good enough for, and I don't see anything coming in the next year, either.

I'm also getting a strong sense of déjà vu, back to the Pentium 4 days and Intel trying to push the clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster (e.g. explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing. :(


Back in the early 2000s, software was outpacing hardware. For a lot of what people wanted to do, a single-core Pentium 4 just wasn't enough anymore; consumers were starting to move into digital photo editing and digital movie editing, functions that used to be reserved for high-end multi-processor workstations. Now the reverse seems true: the hardware has outpaced the software, and there isn't really a large consumer demand for faster hardware; the hardware we have, or have had for a while, does everything pretty fast already.
 
I don't see why it is so bad if Intel focuses on performance per watt. I, for one, would love to have the Core i5's performance in a 45-watt desktop package.
 
isn't really a large consumer demand for faster hardware; the hardware we have, or have had for a while, does everything pretty fast already.

Conroe was the end-all solution. A computer from that time, with upgraded RAM, is still more than enough for the majority of consumers. Imagine using a 1997 CPU in 2002.
 
I'm really curious how the new i7-4770 will perform compared to the 3770. If the performance gap is the same as between the 3770 and 2700, I foresee some dark times for Intel...
 
Conroe was the end-all solution. A computer from that time, with upgraded RAM, is still more than enough for the majority of consumers. Imagine using a 1997 CPU in 2002.

True that. And if you take into account that 1366x768 is the most popular resolution, you don't even need a gaming-grade graphics card. About a year ago I built a PC for a relative with a Celeron G540, H61 board, 4GB RAM, and an HD6670 GDDR5, and he couldn't be happier (he was "gaming" on a P4 3.2C + X800XT).
 
All so true... well, almost all. The pad era will either fade away or spawn a pad with the power of a laptop/desktop. Evolve or die, guys... There will always be modders and overclockers, just in new formats...
 