Monday, December 31st 2012

Arctic Leaks Bucket List of Socket LGA1150 Processor Model Numbers

CPU cooler manufacturer Arctic (aka Arctic Cooling) may have inadvertently leaked a very long list of 4th generation Intel Core processors built for the new LGA1150 socket. Longer than any currently posted list of Core "Haswell" processors, the leak includes model numbers of nine Core i7, seventeen Core i5, five Core i3, and two Pentium models. Among the Core i7 models are the already-known i7-4770K flagship chip, the i7-4770S, and a yet-unknown i7-4765T. The Core i5 processor list is exhaustive, and it appears that Intel wants to leave no price point unattended. The Core i5-4570K could interest enthusiasts. In comparison to the Core i5 list, the LGA1150 Core i3 list is surprisingly short, indicating Intel is serious about phasing out dual-core chips. The Pentium LGA1150 list is shorter still.

The list of LGA1150 processor models appears to have been leaked in the data-sheet of one of Arctic's coolers, in the section that lists compatible processors. LGA1150 appears to have exactly the same cooler mount-hole spacing as the LGA1155 and LGA1156 sockets, so upgrading your CPU cooler shouldn't need to be on your agenda. Intel's 4th generation Core processor family is based on the company's new "Haswell" micro-architecture, which promises higher per-core performance and significantly faster integrated graphics than the previous generation. The new chips will be built on Intel's now-mature 22 nm silicon fabrication process and will begin to roll out in the first half of 2013.

Source: Expreview

58 Comments on Arctic Leaks Bucket List of Socket LGA1150 Processor Model Numbers

#1
3870x2
by: Prima.Vera
I'm really curious how the new i7-4770 will perform compared to the 3770. If it's the same performance gap as between the 3770 and 2700, I foresee some dark times for Intel...
Marginally better. Intel releases are like Call of Duty titles now.
#2
ensabrenoir
by: Aquinus
Maybe. I think it really depends on how much less power it would use when running faster, as well as how well AMD does with their processors. I don't think Intel is in store for dark times, though, regardless of how much faster the 4770 is over the 3770.


Honestly I think even Intel sees the writing on the wall... times are a-changing... who wants to be the best vinyl record, cassette tape & CD maker in an iPod world? I think we're gonna see a new focus from Intel.
#3
3870x2
by: TRWOV
True that. And if you take into account that 1366x768 is the most popular resolution, you don't even need a gaming-grade graphics card. About a year ago I built a PC for a relative with a Celeron G540, H61 board, 4GB RAM, HD6670 GDDR5, and he couldn't be happier (he was "gaming" on a P4 3.2C + X800XT).
The X800XT was the highest of the high end; he must have been doing great.
#4
jagd
There are two things you need a more powerful/faster CPU for: gaming and video editing (for a regular home user, of course). Nothing else that needs better comes to mind; any CPU on the market will do for anything outside gaming/video editing for the usual person, I guess (I probably missed some software with high demands).


#5
by: FordGT90Concept
Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.
:(
#6
Jstn7477
Although LGA 1150 is coming a little "too soon," I'm actually happy that Intel released Ivy Bridge processors. I hate to sound like some kind of eco-friendly nutcase, but compare the power consumption of a 45 nm i7 processor with that of a 3770K and it is a huge difference. I haven't gotten my 3770K @ 4.3GHz/1.175v to consume more than 80 watts of power in IntelBurnTest (~60w in crunching, measured with HWMonitor reading the digital VRM interface), whereas my 2600K at 4.3GHz/1.3v took 130w in IntelBurnTest and 80w crunching. My i7-870 seems to dump out even more heat and is slower, so the advances in the process nodes are more significant than the actual performance increases.

Intel has essentially done what the automotive industry has done in recent years: increase efficiency. You wouldn't use an old Chevy 454 big block these days over a Vortec 5300 or a brand-new V6, would you? (unless you enjoy getting 8 MPG)
#7
newtekie1
Semi-Retired Folder
by: jagd
There are two things you need a more powerful/faster CPU for: gaming and video editing (for a regular home user, of course). Nothing else that needs better comes to mind; any CPU on the market will do for anything outside gaming/video editing for the usual person, I guess (I probably missed some software with high demands).
I disagree. For the average consumer, gaming and video editing do not require a faster CPU than his or my several-generations-old i7. Gaming is basically all GPU-limited at this point, so the CPU makes little difference, and most consumer-level video editors run pretty well on any decent quad-core. The only thing a faster CPU helps with is render times, and even then the difference is maybe a minute on a decent-length video.
#8
Steevo
The hardware side is there; we are waiting on the software side to catch up, and on the next set of instructions that will enable the next major breakthroughs in processing. Look at what the extra instruction sets did, and yet how many of them are really in use versus plain basic operation? Software companies are looking to cover as many systems as possible and to remain compatible with consoles and other devices. Once we get past these issues, I imagine the whole need for speed will become more of a need for optimization.
#9
Frick
Fishfaced Nincompoop
by: ensabrenoir

Honestly I think even Intel sees the writing on the wall... times are a-changing... who wants to be the best vinyl record, cassette tape & CD maker in an iPod world? I think we're gonna see a new focus from Intel.
This is OP but audio is a very, VERY bad example, as it's a different beast altogether. The best of those are titanically expensive, and titanically good despite being "old". If you get a chance, listen to a good setup with a Linn LP12 and be prepared to be blown away.

When talking about computers, it's often best to talk about computers. Don't bring in other stuff that has nothing to do with them.
#10
ensabrenoir
by: Frick
This is OP but audio is a very, VERY bad example, as it's a different beast altogether. The best of those are titanically expensive, and titanically good despite being "old". If you get a chance, listen to a good setup with a Linn LP12 and be prepared to be blown away.

When talking about computers, it's often best to talk about computers. Don't bring in other stuff that has nothing to do with them.
1. You miss the point.

2. Seriously? It's just an analogy.
#11
EarthDog
by: Frick
Of course not.
Of course. There may be some low-end chips that sneak out; otherwise, it's toast.

by: de.das.dude
Yeah, Intel usually doesn't change sockets like this. Oh wait, they do :roll:
Yeah, two gens just aren't enough these days...

by: james888
Two CPU generations per socket seems reasonable. I would prefer three. I bet Intel could have made Haswell 1155 if they had wanted to.
+1
#12
Cortex
by: Frick
We're making them "slower" because we don't need them to be faster, as you yourself point out. And the cloud and virtualization are getting smarter, not dumber.
Think about this when you need to render something (video or raytracing), or play Crysis 3...

And virtualization is a violation of the user's privacy, so it is getting dumber from the user's point of view. It just means that the hardware and software are owned by a corporation (which could be evil, what do you know; if you ask me, even the US government is "evil", and most governments are), and information is controlled by them.

Edit: I have the logo of maybe the largest cloud computing vendor as my avatar. Oops. :D
#13
Frick
Fishfaced Nincompoop
by: ensabrenoir
1. You miss the point.

2. Seriously? It's just an analogy.
Maybe so, but it was a bad analogy. A lot of analogies are stupid.

by: Cortex
Think about this when you need to render something (video or raytracing), or play Crysis 3...

And virtualization is a violation of the user's privacy, so it is getting dumber from the user's point of view. It just means that the hardware and software are owned by a corporation (which could be evil, what do you know; if you ask me, even the US government is "evil", and most governments are), and information is controlled by them.
How many people render stuff, and do you need anything more than an i5 to play Crysis 3 properly? I have no idea, but I coldly assume it'll play just fine on an i3 or anything from AMD. Power users and people who need workstations are a different lot from most end consumers, which is what I was talking about.

And you're confusing virtualization with cloud stuff. And the rest is good ol' American paranoia, which is kinda silly imo.
#14
Delta6326
I can't wait for these to come out! I can finally upgrade from my old Core 2 Quad! Now I just need to buckle down, study, pass my test, and get a job.

I will gladly take an i7-4770K and an AMD 8970/NVIDIA 780...
#15
ensabrenoir
by: Frick
Maybe so, but it was a bad analogy. A lot of analogies are stupid.



How many people render stuff, and do you need anything more than an i5 to play Crysis 3 properly? I have no idea, but I coldly assume it'll play just fine on an i3 or anything from AMD. Power users and people who need workstations are a different lot from most end consumers, which is what I was talking about.

And you're confusing virtualization with cloud stuff. And the rest is good ol' American paranoia, which is kinda silly imo.
:laugh: You think quite highly of yourself, don't you... thankfully, not everyone and/or everything conforms to your realm of reasoning... but that's what makes us all special... carry on!
#16
EarthDog
I understood exactly what the analogy was... LOL!
#17
Jurassic1024
by: FordGT90Concept
Why is it that all I think about while looking at this is that my first-generation i7 920 is more than satisfactory? Probably because it is.

I'm also getting a strong sense of déjà vu back to the Pentium 4 era, with Intel trying to push clocks as high as they could reasonably go.

Then I get this knot in my stomach that computers are getting slower, not faster, because consumers aren't demanding faster ones (e.g. the explosion of tablet and smartphone sales). Then again, developers aren't really pushing for faster hardware like they did in the 1990s and early 2000s.

Maybe it's because AMD is on a crash course and Intel has nothing better to do?

So depressing. :(
None of that is actually true. In fact, reading it again, it sounds a LOT like you're describing AMD's CPU division...

Higher clock speeds year after year? Show me. Oh right, you can't, because [Haswell] clocks are the same as Ivy Bridge's.

Computers getting slower? Can you say GPU acceleration, impressive GPU drivers from AMD this year, Boost clocks from both NVIDIA and AMD (+ new SKUs), Hyper-Threading, super cheap RAM, and <$1/GB SSDs?

Stay off the crack and out of the comment sections.
#18
Frick
Fishfaced Nincompoop
by: ensabrenoir
:laugh: You think quite highly of yourself, don't you... thankfully, not everyone and/or everything conforms to your realm of reasoning... but that's what makes us all special... carry on!
So what is wrong with my reasoning? Seriously, if there is anything actually wrong with it, I'll correct it. For realz.

And I still maintain it was a bad analogy, and that most of them are bad.
#19
EarthDog
LOL, what makes a bad analogy bad is that it's not understood and isn't relevant. At least for me, I could easily pick up on it.
#20
FordGT90Concept
"I go fast!1!11!1!"
by: Jurassic1024
Higher clock speeds year after year? Show me. Oh right, you can't, because [Haswell] clocks are the same as Ivy Bridge's.
Core i7 980X = 3.33 GHz
Core i7 3970X = 3.5 GHz

There are no six-core Ivy Bridge-based processors out yet.


by: Jurassic1024
Computers getting slower? Can you say GPU acceleration, impressive GPU drivers from AMD this year, Boost clocks from both NVIDIA and AMD (+ new SKUs), Hyper-Threading, super cheap RAM, and <$1/GB SSDs?
The average of all computing devices is getting slower. Desktops are losing market share as weak ultrabooks, tablets, and phones take over, devices more suited to playing Angry Birds than Crysis.
#21
Major_A
by: FordGT90Concept
I'm a gamer. It's depressing because gaming technology isn't advancing. With all these idle cores and RAM
That's no lie. Very few games take advantage of quad-core CPUs, and even fewer run 64-bit natively to use more than a couple of gigs of RAM. I remember the rumor of the original Athlon FX working on reverse hyper-threading. I still wish that would come to fruition. I'd rather run a game on a 16 GHz single-core machine than have the game barely tax two of my four cores.
#22
FordGT90Concept
"I go fast!1!11!1!"
Reverse hyper-threading was the rumor for Bulldozer. When it was found to be false, enthusiasm for it plummeted.
#23
HumanSmoke
by: 3870x2
[quote="james888, post: 2812654"]2 cpu generations a socket seem reasonable. I would prefer three. I bet intel could of made haswell 1155 if they had wanted to.
These are not full generations though. Look at 775, that was great. I guess Intel is on the motherboard manufacturers payroll.[/quote]I wouldn't say its an Intel-only trait. FM1 lasted how many generations ?

I also seem to remember that mobo makers sold a reasonable number of 990FX/X boards leading up to Bulldozer's launch, largely off the back of some AMD guerrilla marketing. What huge advance do the 900 chipsets offer that the 800s don't?
#24
chinmi
Finally!! Time to upgrade from my old 1156 i5 750!!! :toast:

probably gonna get that unlocked 4770 version :)

- or -

Should I wait for the next-generation processor & socket (after 1150)??
#25
jihadjoe
by: Frick
Conroe was the end-all solution. A computer from that time, with upgraded RAM, is still more than enough for the majority of consumers. Imagine using a 1997 CPU in 2002...
Very true. I'm still on my B3 Q6600 from 2007. I've upgraded to an SSD, added some RAM, and changed the GPU to a modern one, but after a mild OC to 3 GHz I've never felt it to be lacking on the CPU side.

Intel's problem really is that they've built something so good it's hard to offer a truly tangible upgrade on the performance side, since most games very quickly become GPU-limited anyway. That's probably one of the reasons they've been focusing on improving power efficiency instead.

Until the next big game (or other program) comes out and brings the best CPU to its knees, there's no compelling reason to push for outright performance rather than the far more reasonable performance-per-watt.