Tuesday, May 12th 2015

95W TDP of "Skylake" Chips Explained by Intel's Big Graphics Push

Intel's Core "Skylake" processor lineup, built on the company's swanky new 14 nanometer fab process, turned heads with the rather high 95W TDP of quad-core parts such as the Core i7-6700K and Core i5-6600K, even though their 22 nm predecessors, such as the i7-4770K and the i5-4670K, carry a lower 84W TDP. A new leaked slide explains the higher figure: Intel is going all-out with its integrated graphics implementation on Core "Skylake" chips, including onboard graphics that leverage eDRAM caches. The company is promising as much as 50% higher integrated graphics performance over "Haswell."

Although the chips carry a high rated TDP, overall energy efficiency tells a different story. SoCs based on "Skylake" will draw as much as 60% less power than "Haswell" based ones, translating into 35% longer HD video playback on portable devices running these chips. Intel's graphics performance push is driven by an almost sudden surge in display resolutions, with standards such as 4K (3840 x 2160) entering the mainstream, and 5K (5120 x 2880) entering the enthusiast segment. Intel's design goal is to supply the market with a graphics solution that makes the two resolutions functional for desktop use and video playback, if not gaming.
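To put those resolutions in perspective, here is the back-of-the-envelope pixel arithmetic (ours, not from the leaked slide): a 4K desktop pushes four times the pixels of 1080p, and 5K more than seven times.

\begin{aligned}
1920 \times 1080 &= 2{,}073{,}600 \text{ px} \\
3840 \times 2160 &= 8{,}294{,}400 \text{ px} = 4 \times 1080\text{p} \\
5120 \times 2880 &= 14{,}745{,}600 \text{ px} \approx 7.1 \times 1080\text{p}
\end{aligned}

Even for plain desktop composition and video playback, that is a steep jump in fill-rate and memory-bandwidth demand, which is presumably where the eDRAM cache earns its keep.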
Source: AnandTech Forums

72 Comments on 95W TDP of "Skylake" Chips Explained by Intel's Big Graphics Push

#1
btarunr
Editor & Senior Moderator
This also means that on systems with discrete graphics (power to the IGP gated), the effective power draw of the chips will be lower than Haswell's.
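(Side note, not from the post above: on Linux you can actually observe that gating through sysfs. A minimal C++ sketch, assuming the iGPU is exposed as card0 and the kernel publishes the standard PCI runtime power-management attribute:)

#include <fstream>
#include <iostream>
#include <string>

int main() {
    // Standard PCI runtime-PM attribute; "suspended" means the device
    // (assumed here to be the iGPU behind card0) is currently power-gated.
    std::ifstream f("/sys/class/drm/card0/device/power/runtime_status");
    std::string status;
    if (f >> status)
        std::cout << "iGPU runtime status: " << status << '\n'; // "active" or "suspended"
    else
        std::cerr << "runtime-PM attribute not available\n";
    return 0;
}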
#2
NC37
Had AMD not made APUs and forced Intel to get off their butts, 4K would have arrived with Intel pissing their pants, scrambling to make graphics that don't suck.

So literally: thank you, AMD, for making this day happen. Now you'd better deliver with Zen, or you'll have no way to flaunt your APUs in front of Intel anymore.
#3
dj-electric
I won't give too much credit to AMD for this, as Intel has been integrating GPUs into its systems for a long time: on motherboards, and on CPUs ever since 2009. They are not late to the game in any way.

Granted, the power of those GPUs wasn't that good up until the HD 2500, but with what I presume will be R7 250-like performance on top Skylake CPUs, the simple assumption that "all APUs have better graphics horsepower than any Intel CPU" is going to change. A lot.
#4
NC37
Yeah, Intel did it for a while, but their GPUs sucked until AMD started putting Radeons with their crap CPUs and calling them APUs. Intel has been the laughingstock of graphics for a long time. Heck, their answer to discrete GPUs was to link a bunch of x86 CPUs together and call it a GPU. I know some people are upset that product was canceled, because they all talked about how it would revolutionize how games were made. Me, I'm glad it was canned, because Intel had no idea how to make GPUs. They showed off the tech for years; then, when they finally started getting serious, they realized any performance edge it had was long gone.

Competition is good, and when you've got AMD gloating about having discrete-level graphics on a CPU, it's like a little monkey poking a gorilla with a tiny poop stick. It might not notice right away, but eventually it's gonna turn around and swat it.

Before APUs, the last time I remember Intel being proud of a new GPU was back with the GMA 950, and that was a laughingstock when it launched. Since APUs, though, they've actually been putting forth effort. I was honestly impressed that the HD 4000 in my laptop could actually run some things. I was expecting it to be utter crap compared to the 660M in the same machine, and well, it is, but it was better than expected.

They can't rival AMD or nVidia on the high end, but it's good to see them finally putting effort into graphics. Just imagine if Intel got real serious about graphics and decided to enter the GPU war. They've got the R&D, fabs, and all the brains to do it.
#5
PLAfiller
NC37: Just imagine if Intel got real serious about graphics and decided to enter the GPU war. They've got the R&D, fabs, and all the brains to do it.
Spot on. I think, capacity-wise, they have the biggest infrastructure of any GPU manufacturer right now. I still like the looks of Larrabee :))
#6
Wark0
This doesn't explain the 95W TDP: that TDP applies only to the K-series parts, and the K-series doesn't have GT4e.
#7
Joss
How many people buy a high-end i5/i7 and use the integrated graphics?
One family of APUs and another of pure CPUs would make more sense.
#8
Debat0r
I really think Intel isn't doing this graphics thing all that well. Sure, they got good graphics with the new Iris series, but they only put them in high-end chips, which is the market that almost exclusively uses dedicated GPUs. Are they really that stupid? To challenge AMD's APU might, they need to push out a Pentium or i3 with this kind of GPU, not an i7...
#9
tehehe
The biggest problem as far as Intel GPUs go is terrible driver support. Maybe DX12/Vulkan will change that, assuming most games will be written against these APIs.
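(For illustration, roughly what that looks like from the application side once such drivers ship - a minimal C++ sketch against the Vulkan 1.0 C API, assuming a working loader and headers; the device-type check is ours, not from any shipping game:)

#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Create a bare instance so we can ask the loader what GPUs exist.
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo ci{};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "No Vulkan driver available\n");
        return 1;
    }

    // Enumerate adapters; an Intel iGPU reports itself as INTEGRATED_GPU.
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> devices(count);
    vkEnumeratePhysicalDevices(instance, &count, devices.data());

    for (VkPhysicalDevice dev : devices) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(dev, &props);
        const char* type =
            props.deviceType == VK_PHYSICAL_DEVICE_TYPE_INTEGRATED_GPU ? "integrated" :
            props.deviceType == VK_PHYSICAL_DEVICE_TYPE_DISCRETE_GPU   ? "discrete"   :
                                                                         "other";
        std::printf("%s (%s), driver exposes Vulkan %u.%u.%u\n",
                    props.deviceName, type,
                    VK_VERSION_MAJOR(props.apiVersion),
                    VK_VERSION_MINOR(props.apiVersion),
                    VK_VERSION_PATCH(props.apiVersion));
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}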
#10
PLAfiller
Debat0r: I really think Intel isn't doing this graphics thing all that well. Sure, they got good graphics with the new Iris series, but they only put them in high-end chips, which is the market that almost exclusively uses dedicated GPUs. Are they really that stupid?
I don't think they are, because you're still going to pay for the whole chip, and they collect their toll for the iGPU regardless of whether you use it or not. So if you want a high end CPU from Intel, you pay for the iGPU as well, it seems.
#11
RejZoR
Intel has had GPUs for ages, but they were absolute garbage until AMD forced them to do something. They are still rather rubbish, but at least they've improved significantly.
#12
techy1
I wish Intel would stop doing iGPUs... cuz I do not need them... I will always choose a mid-to-top-range discrete GPU (and even low-end discrete GPUs are far more capable than an iGPU), so the iGPU is a waste of my money, time, and TDP. Intel is like: "Oh look, a brand new iGPU, +50% over the previous gen, almost 4K capable - we did this all for you, buddy; no matter that you do not and will not use the iGPU, our team worked two years (and in that time made only a 0-5% CPU gain) to give you this experience you will never use or need - lucky you, now pay extra for our useful work." I would choose, for that extra $ and extra TDP, +1% CPU power over +50% iGPU... but there is no such option (never was and never will be - and that sucks).
#13
Aquinus
Resident Wat-man
lZKoce: I don't think they are, because you're still going to pay for the whole chip, and they collect their toll for the iGPU regardless of whether you use it or not. So if you want a high end mainstream CPU from Intel, you pay for the iGPU as well, it seems.
I updated your comment. As I've said in the past, skt1150 is Intel's mainstream platform, not a HEDT platform like skt2011-3. As a result, like APUs, there is an expectation of graphics on the chip. So while Intel might have "fast" or "high end" CPUs for the socket, it doesn't change the fact that it's still a mainstream consumer platform, much as AMD's APU lineup is.
NC37: Heck, their answer to discrete GPUs was to link a bunch of x86 CPUs together and call it a GPU.
That's because only nVidia and AMD have the rights to that shader technology. Intel was forced to use x86-like cores to build a GPU because that's what it had access to.
Debat0r: I really think Intel isn't doing this graphics thing all that well. Sure, they got good graphics with the new Iris series, but they only put them in high-end chips, which is the market that almost exclusively uses dedicated GPUs. Are they really that stupid? To challenge AMD's APU might, they need to push out a Pentium or i3 with this kind of GPU, not an i7...
I do? My laptop has an Iris Pro in it, and it will do everything from work to 4K video. I have to say Iris Pro is a lot better than all the other iGPUs Intel has conjured up.
lZKoce: Spot on. I think, capacity-wise, they have the biggest infrastructure of any GPU manufacturer right now. I still like the looks of Larrabee :))
Intel can't compete in that market unless they build an entirely new GPU architecture from scratch. Once again, Intel doesn't own the rights to make shaders, whereas AMD and nVidia do. That alone complicates matters.
techy1: I wish Intel would stop doing iGPUs... cuz I do not need them... I will always choose a mid-to-top-range discrete GPU (and even low-end discrete GPUs are far more capable than an iGPU), so the iGPU is a waste of my money, time, and TDP. Intel is like: "Oh look, a brand new iGPU, +50% over the previous gen, almost 4K capable - we did this all for you, buddy; no matter that you do not and will not use the iGPU, our team worked two years (and in that time made only a 0-5% CPU gain) to give you this experience you will never use or need - lucky you, now pay extra for our useful work." I would choose, for that extra $ and extra TDP, +1% CPU power over +50% iGPU... but there is no such option (never was and never will be - and that sucks).
Most people don't use computers like people at TPU do. Intel has a market to satisfy, and most of it wants iGPUs. Intel doesn't really give two shits what we want out of a mainstream platform. If you told that to Intel, they would probably laugh at you and tell you to go skt2011-3 if you're so bent out of shape about it. Simple fact: most consumers don't need a discrete graphics card, at least not right off the bat, if they ever will. Mainstream platform means mainstream features; if you don't like it, then don't buy it. Also, Intel's IPC has increased by more than 1%, and squeezing performance out of a core that they've already squeezed a lot out of is a tedious task. Maybe you should go help Intel design a new one... :slap:

Also, Iris Pro can drive 4K video on a 4K display. I know because I've done it, and it works great. Is it good for gaming? Not really. But is it good for everything except gaming, or maybe even a bit of light gaming? Sure. Just remember, Intel makes most of its money off businesses, not individual consumers, so it only makes sense that their products reflect the market share, and a huge chunk of the market uses iGPUs or has no use for discrete graphics.
#14
Yellow&Nerdy?
I don't get why Intel even includes an iGPU on the unlocked models. I'm pretty sure no one buys an unlocked processor and doesn't get a dedicated GPU.
#15
RejZoR
Because it's cheaper for them to churn out all the same CPUs than to maintain two different designs, one with an iGPU and one without.
#16
techy1
Aquinus: Most people don't use computers like people at TPU do. Intel has a market to satisfy, and most of it wants iGPUs. Intel doesn't really give two shits what we want out of a mainstream platform. If you told that to Intel, they would probably laugh at you and tell you to go skt2011-3 if you're so bent out of shape about it. Simple fact: most consumers don't need a discrete graphics card, at least not right off the bat, if they ever will. Mainstream platform means mainstream features; if you don't like it, then don't buy it. Also, Intel's IPC has increased by more than 1%, and squeezing performance out of a core that they've already squeezed a lot out of is a tedious task. Maybe you should go help Intel design a new one... :slap:

Also, Iris Pro can drive 4K video on a 4K display. I know because I've done it, and it works great. Is it good for gaming? Not really. But is it good for everything except gaming, or maybe even a bit of light gaming? Sure. Just remember, Intel makes most of its money off businesses, not individual consumers, so it only makes sense that their products reflect the market share, and a huge chunk of the market uses iGPUs or has no use for discrete graphics.
Sure thing - i3 or Pentium buyers do not even know what the letters GPU mean, and slogans like "+50% moaar... 4K" are good enough to make them buy... but what about the i7 and "K" series? Nobody in the mainstream needs or understands those benefits, so why do those series need "+50% moaar 4K" slogans? Those clients (who need and/or understand an "i7") should only be pissed off by reading something like that. (I believe 4K video is fine for that new iGPU... I was positively surprised back in the day that my netbook with a Sandy Bridge i3 and no discrete card could run 1080p... but then again, the keywords there are "netbook", "i3", "no discrete", and not "i7-K", "Z170", "$200+ GPU".)
#17
Lionheart
I'm all for better Intel GPUs, but like some of you guys mentioned, it kinda feels like a waste when most PC enthusiasts will be using a dedicated GPU. Then again, I can see why, too: Intel wants to compete with and be above AMD on all levels... :nutkick:
#18
Aquinus
Resident Wat-man
Lionheart: kinda feels like a waste when most PC enthusiasts will be using a dedicated GPU
There is this thing called skt2011-3. Don't invest in a mainstream platform if you don't like the features; it's really that simple. Intel just does the "one size fits all" thing for skt1150, and for the vast majority of people that's fine.
techy1: Sure thing - i3 or Pentium buyers do not even know what the letters GPU mean, and slogans like "+50% moaar... 4K" are good enough to make them buy... but what about the i7 and "K" series? Nobody in the mainstream needs or understands those benefits, so why do those series need "+50% moaar 4K" slogans? Those clients (who need and/or understand an "i7") should only be pissed off by reading something like that. (I believe 4K video is fine for that new iGPU... I was positively surprised back in the day that my netbook with a Sandy Bridge i3 and no discrete card could run 1080p... but then again, the keywords there are "netbook", "i3", "no discrete", and not "i7-K", "Z170", "$200+ GPU".)
Then by your logic, no one on a mainstream platform cares about overclocking either. The problem is that that statement is false. An i7 is a performance-model CPU for a mainstream platform. No one ever said anything about the i7 being a mainstream CPU, but it doesn't change the fact that this socket is intended to reach some of the lowest price points (by Intel's standards) as well as the highest; however, it always needs to cater to the lowest, because that's where the bulk of the sales are, not with enthusiasts wanting stuff just a certain way. If you're really that bent out of shape, then don't buy the damn socket and just go with Intel's HEDT platform instead.

Another way of putting it: enthusiasts are a very small niche, and most "enthusiasts" don't even want to spend much money anyway, so you'd be stuck with a mainstream platform by virtue of your budget. So no, this is Intel making a one-size-fits-all. If you're not happy with it, you have an option; it's called Haswell-E. I felt a similar way when I upgraded three years ago, hence why I have a 3820 and not a 2600K.
#19
Prima.Vera
Why the fk I need a strong GPU on an i7 K processor is beyond my comprehension! Usually people buying i7s are buying them for gaming and multimedia. I only need a very basic GPU and that's it. HUGE WASTE of transistors, and therefore also a big-arse TDP. Seriously, sometimes I think those managers at Intel are worse than monkeys.
:laugh::toast::D
#20
Aquinus
Resident Wat-man
Prima.Vera: Why the fk I need a strong GPU on an i7 K processor is beyond my comprehension! Usually people buying i7s are buying them for gaming and multimedia. I only need a very basic GPU and that's it. HUGE WASTE of transistors, and therefore also a big-arse TDP. Seriously, sometimes I think those managers at Intel are worse than monkeys.
...because it's a mainstream platform; skt1156/1155/1150 are all mainstream platforms with a full lineup of CPUs from entry level to performance. Cool your jets and calm down. Maybe you should go work for them and design a new CPU, if your shit don't stink. If you're really that bent out of shape about the platform, then get a HEDT platform and stop complaining.
#21
TheinsanegamerN
techy1: I wish Intel would stop doing iGPUs... cuz I do not need them... I will always choose a mid-to-top-range discrete GPU (and even low-end discrete GPUs are far more capable than an iGPU), so the iGPU is a waste of my money, time, and TDP. Intel is like: "Oh look, a brand new iGPU, +50% over the previous gen, almost 4K capable - we did this all for you, buddy; no matter that you do not and will not use the iGPU, our team worked two years (and in that time made only a 0-5% CPU gain) to give you this experience you will never use or need - lucky you, now pay extra for our useful work." I would choose, for that extra $ and extra TDP, +1% CPU power over +50% iGPU... but there is no such option (never was and never will be - and that sucks).
Go buy a six-core i7 and stop complaining. Intel already made what you want. Yeah, it costs way more, but guess what? A tiny market demands high prices. Sockets 1366 and 2011 exist for people like you.
#22
Yellow&Nerdy?
RejZoR: Because it's cheaper for them to churn out all the same CPUs than to maintain two different designs, one with an iGPU and one without.
No doubt that is the case and the reason why Intel is doing it this way, but I wonder if they could possibly disable the iGPU on the unlocked models.
#23
ensabrenoir
.....Silly enthusiasts (me included)..... business class and mainstream pay the bills. In the all-in-one desktop, laptop, and New Mobile World Order, an i7 with a strong iGPU is killer, because it's all about efficiency/battery life/small form factor/price. Adding discrete GPUs adds cost, heat, size, and power draw.... all minuses to the common professional. Successful Business 101..... know your market/customers (revenue source) and supply a product they need/like and will use, at a price they're willing to pay. And say what you will about Intel's prices...... their continuing success still speaks louder.
#24
Lionheart
Aquinus: There is this thing called skt2011-3. Don't invest in a mainstream platform if you don't like the features; it's really that simple. Intel just does the "one size fits all" thing for skt1150, and for the vast majority of people that's fine.
Lol, there's also this thing called money. Why would I invest in a more expensive platform that doesn't benefit me in gaming, other than more cores that don't get utilized (in before the DX12 BS) and more PCI-E bandwidth for SLI & CrossFire? Yeah, makes total sense... o_O

When did I say I don't like the features of Intel's integrated GPUs? I said it "kinda feels like a waste when most PC enthusiasts will be using a dedicated GPU", and then that I can see why, too: Intel wants to compete with and be above AMD on all levels. See! I said I understand why they're doing it. :slap:
#25
techy1
Anyhow - decent discussion. But I highly doubt that even a Pentium/i3 client buys a PC without a crappy discrete card... because he has zero choice there... if he walks into a PC shop (uninformed as he is), the sales guy will sell him something like a "GT 610 with 4GB DDR3" (or something along those lines). Why? Because he can (the client won't understand anyway), and why not take an extra $50? The same goes for online stores: a bunch of those low-cost entry-level PCs are assembled for completely unaware people, and those builds are not put together to be cost efficient - there is almost always some crappy, few-generations-old discrete GPU packed in. So I do not believe that "if the mainstream asks for an iGPU, the mainstream gets it". But what pisses me off most is that when new CPUs come out, the new iGPU gets most of the attention, and CPU improvements get less and less attention - and, guess what, less and less improvement. Because there is no need: everything tech people talk about is 1) how huge the improvement in the iGPU department will be this generation, and 2) how much the mainstream needs it, and how good it is that Intel got their iGPU to this high level.