Monday, August 3rd 2015

Intel Core "Skylake" Retail Boxes Surprisingly Colorful

The retail packaging of Intel's 6th generation Core "Skylake" processors in the LGA1151 package will be surprisingly colorful, and a throwback to the pre-Pentium 4 era, according to spy-shots of the retail boxes of the upcoming Core i7-6700K and Core i5-6600K. What's even more surprising is that the packages of the i7-6700K and i5-6600K, which feature unlocked multipliers that make them primed for overclocking, do not include stock cooling solutions. Their retail packages resemble those of Intel's Core i7 HEDT processors: in the box, you'll find just the processor, its case badge, and basic documentation.

Both the Core i7-6700K and Core i5-6600K feature the same integrated graphics SKU, the HD Graphics 530, and both feature integrated memory controllers that support both DDR3L and DDR4 memory types. The Core i5 predictably lacks HyperThreading and features only 6 MB of L3 cache, while the Core i7 features HyperThreading and the full 8 MB present on the chip. The "Skylake" silicon is built on the 14 nm process.

37 Comments on Intel Core "Skylake" Retail Boxes Surprisingly Colorful

#26
FordGT90Concept
"I go fast!1!11!1!"
Ugh! I want pricing and availability for 6700 and 6600. I don't care what the box art looks like; it'll just get thrown in my case box and forgotten until I have to pull the processor out for some reason.
#27
johnspack
Here For Good!
Wow, long ways off for me. And more cores is better, especially when you run vms all day. Finally getting a 1st gen hex core.... only 10 more years until I can afford one of this gen!
#28
DeNeDe
Waiting for Skylake with Iris Pro integrated graphics.. that HD 530 is... bullshit.
#29
techy1
....I'd rather take MOAR COARS than MOAR iGPU (and don't give me that BS that many people wait to spend their cash to get the newest iGPU - cuz people that use iGPUs would not see any MOAR GRAPIX difference between Skylake's and Sandy's iGPU... they usually play Angry Birds-type games on their 1080p and watch FHD YouTube vlogs - Sandy could do it perfectly years ago)
#30
Caring1
peche said:
correct ... gay scheme on i7's ....
Not that there is anything wrong with that ;)
#31
Frick
Fishfaced Nincompoop
Sony Xperia S said:
I think you confuse yourself and the others.

Just ask people whether they want their CPUs with 8 real cores instead of whatever they are forced to buy, and you will see.

The results from the poll will be pro more cores. Never use people's lack of knowledge and ignorance to spoil technological advancement.

You need more cores to give customers a better experience. The only question is whether you want to or not.
You did use the word mainstream in your first post. Power/professional user != mainstream. Technically speaking as well, as others have pointed out, there's zero need for it, and I think we agree it's pretty dumb that people are getting things they have zero use for.

About the AMD picture, so true, they were late.
#32
DeNeDe
techy1 said:
....I'd rather take MOAR COARS than MOAR iGPU (and don't give me that BS that many people wait to spend their cash to get the newest iGPU - cuz people that use iGPUs would not see any MOAR GRAPIX difference between Skylake's and Sandy's iGPU... they usually play Angry Birds-type games on their 1080p and watch FHD YouTube vlogs - Sandy could do it perfectly years ago)
well.. I don't use a dedicated GPU and I don't play Angry Birds.. I watch 4K videos and just finished Assassin's Creed Black Flag on my iGPU.. I was getting 35-40 fps at full HD, medium details.. so mneh...
#33
Aquinus
Resident Wat-man
Sony Xperia S said:
I think you confuse yourself and the others.

Just ask people whether they want their CPUs with 8 real cores instead of whatever they are forced to buy, and you will see.

The results from the poll will be pro more cores. Never use people's lack of knowledge and ignorance to spoil technological advancement.

You need more cores to give customers a better experience. The only question is whether you want to or not.
Here are the problems with your logic, because there is more than one.
  1. You're running under the assumption that more cores = better performance. This is simply not true. As a software developer who writes multi-threaded applications, I can tell you that there is a limit to what more cores can do, and it varies from task to task. Not every task can be made to run in parallel and benefit from it. I can use 20 threads, but that will do me no good if every thread blocks so often that I'm not using more than one core's worth of resources at once, while adding the overhead of being multi-threaded in the first place.
  2. Intel isn't putting more cores on their mainstream platform because there is no need for it (see #1). They would rather dedicate more die space to the iGPU so they can take over yet another market from AMD. When push comes to shove, most consumers won't care about a dedicated GPU, because Intel will actually have something half decent at the rate they're progressing, even when using bastardized x86 cores for GPU compute, which is fricken hysterical IMHO.
  3. AMD tried adding more cores and it got them nowhere fast (see #1), but their APUs did deliver in the sense of what the average consumer wants (see #2).
Please explain to me where I'm wrong here.
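[Point 1 above is essentially Amdahl's law. As an illustrative aside (not from the original post), a minimal Python sketch of how quickly the returns on extra cores diminish once part of the work is serial:]

```python
# Amdahl's law: speedup is capped by the serial fraction of a task,
# so extra cores quickly stop paying off once threads spend their
# time blocking on each other instead of computing.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup for a task whose `parallel_fraction`
    can run concurrently across `cores` cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A task that is 80% parallelizable (an optimistic figure for
# typical desktop software):
for n in (2, 4, 8, 16):
    print(f"{n:2d} cores -> {amdahl_speedup(0.8, n):.2f}x")
# Even with infinitely many cores, the speedup can never
# exceed 1 / (1 - 0.8) = 5x.
```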
#34
64K
Waiting on cadaveca's Skylake review, and other sites' too, but I suspect Skylake won't make enough of a difference over my Ivy Bridge to warrant a new build. I will probably end up waiting for 10 nm Cannonlake. The new architecture on 10 nm that comes after Cannonlake should be the end of the line for silicon with Intel. New materials will be needed to go to 7 nm and beyond.

Aquinus said:
Here are the problems with your logic, because there is more than one.
  1. You're running under the assumption that more cores = better performance. This is simply not true. As a software developer who writes multi-threaded applications, I can tell you that there is a limit to what more cores can do, and it varies from task to task. Not every task can be made to run in parallel and benefit from it. I can use 20 threads, but that will do me no good if every thread blocks so often that I'm not using more than one core's worth of resources at once, while adding the overhead of being multi-threaded in the first place.
  2. Intel isn't putting more cores on their mainstream platform because there is no need for it (see #1). They would rather dedicate more die space to the iGPU so they can take over yet another market from AMD. When push comes to shove, most consumers won't care about a dedicated GPU, because Intel will actually have something half decent at the rate they're progressing, even when using bastardized x86 cores for GPU compute, which is fricken hysterical IMHO.
  3. AMD tried adding more cores and it got them nowhere fast (see #1), but their APUs did deliver in the sense of what the average consumer wants (see #2).
Please explain to me where I'm wrong here.
She just got back from a mandatory 3 month vacation for disrupting Nvidia and Intel threads with nonsense.
#35
Aquinus
Resident Wat-man
64K said:
She just got back from a mandatory 3 month vacation for disrupting Nvidia and Intel threads with nonsense.
Huh, that seems to explain why I haven't needed to argue with people as often as I normally do. :p
In all seriousness, Intel is bolstering their iGPU performance for a very good reason. Intel already makes fast CPUs and has nothing more to prove on that front. If Intel's iGPUs start competing with mid-range cards, it's going to take a bite out of not only AMD but NVIDIA as well. Hate to say it, but this is where the sales are. People like me who buy a 390 are not your average consumers. We need to remember that: TPU tends to have power users, not standard run-of-the-mill consumers. It's all about market share.
#36
techy1
DeNeDe said:
well.. I don't use a dedicated GPU and I don't play Angry Birds.. I watch 4K videos and just finished Assassin's Creed Black Flag on my iGPU.. I was getting 35-40 fps at full HD, medium details.. so mneh...
...and? Will you upgrade to another CPU (and motherboard) just for iGPU performance gains?
#37
Sihastru
Aquinus said:
Please explain to me where I'm wrong here.
You went so far over their heads... They only understand things with the letter "X" somewhere in them.

But in all honesty, Iris was the great equalizer, while Iris Pro is just the icing on the cake. Using 128 MB of "GPU" cache as CPU L4 cache... I want to see those benchmarks.

And with the indie revolution happening (some say it's just about to end; I disagree), helped in great part by the disruptive mobile market, the iGPU is here to stay. All it needs to do is give us yesterday's performance, but in 4K.