Friday, July 24th 2015

Skylake iGPU Gets Performance Leap, Incremental Upgrade for CPU Performance

With its 6th generation Core "Skylake" processors, Intel is throwing everything it has into increasing the performance of its integrated graphics. This is necessitated not by some newfound urge to compete with entry-level discrete GPUs from NVIDIA or AMD, but by a rather sudden increase in display resolutions after nearly a decade of stagnation. Notebook and tablet designers want to cram in higher-resolution displays, such as WQHD (2560 x 1440), 4K (3840 x 2160), and beyond, and are finding it impossible to drive them without discrete graphics. This is what Intel is likely after. A side effect of this effort is that the iGPU will finally be capable of playing some games at 720p or 900p resolutions, with moderate eye-candy. Games such as League of Legends should be fairly playable even at 1080p. Intel claims that its 9th generation integrated graphics will offer over a 50% performance increase over the previous generation.

Moving on to the CPU, the performance increase is a predictable 10-20% in single- and multi-threaded performance over "Broadwell." This is roughly similar to how "Haswell" bettered "Ivy Bridge," and how "Sandy Bridge" bettered "Lynnfield." On some of its "Skylake-U" ultraportable variants, Intel will provide native platform support for much of the modern I/O used by today's tablets and notebooks, which until now required third-party controllers, and which competing semi-custom SoCs offer natively, such as eMMC 5.0, SDIO 3.0, SATA 6 Gb/s, PCIe gen 3.0, and USB 3.0. Communications are also improved, with 2x2 802.11ac, Bluetooth 4.1, and WiDi 6.0.
Source: FanlessTech

100 Comments on Skylake iGPU Gets Performance Leap, Incremental Upgrade for CPU Performance

#51
Petey Plane
Brother DrakeThough I understand why Intel has put so much into onboard graphics, with smartphones, laptops and office workstations making up such a large part of the market, I wish that they would make a line of CPUs for people who use discrete graphics cards as well, OR(!) work with the graphics card makers to create an architecture that shares graphics workload between the CPU and the external graphics card when one is available. That may be too much to ask, but why not make a line of CPUs that are just CPUs? I feel that Intel is kind of ignoring the (very large) gamer market.
The i7 Extreme X and K chips are for that. Also, economies of scale are why they don't make 1151 socketed chips without iGPUs, since they all share the same basic architecture. It doesn't make a lot of sense for Intel to fab a separate chip architecture without an iGPU, when it can just as easily build one die, bin the higher-performing chips, and leave it up to the end user to decide whether they want to use the iGPU or not. If they were to build a line of 1151 CPUs without iGPUs, they would cost significantly more, since the economies of scale would not benefit the much smaller production line.

As far as sharing resources goes, that is really a driver issue, and one that Microsoft says will be addressed with DX12.
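To make that concrete: under DX12's explicit multi-adapter model, the application itself enumerates every GPU in the system and can create a device on each one. Below is a minimal sketch, assuming a Windows 10 SDK with the DXGI 1.4 headers; the listing logic is illustrative, not code from any shipping engine:

```cpp
// List every GPU that DXGI can see. On a Skylake desktop with a discrete
// card this reports both the Intel iGPU and the dGPU -- the two adapters a
// DX12 title could split work across under explicit multi-adapter.
#include <windows.h>
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory4 *factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory4), (void **)&factory)))
        return 1;

    IDXGIAdapter1 *adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // Skip software adapters such as the WARP rasterizer.
        if (!(desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE))
            wprintf(L"Adapter %u: %s (%llu MB dedicated VRAM)\n", i,
                    desc.Description,
                    (unsigned long long)(desc.DedicatedVideoMemory >> 20));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

On a Skylake box with a discrete card, this would report both the Intel iGPU and the dGPU; whether an engine actually splits work between them is then up to the developer, not the driver.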
Posted on Reply
#52
RejZoR
Ferrum MasterI can also say those are the users that don't pay! Take your facts from a realistic point of view. So around 4 million peeps actually use the iGPU? Are you kidding? Out of the whole CPU batch?
Question is, how many of those users are actually using discrete graphics cards? You can't claim market share if a GPU is attached to every CPU whether you use it or not.
Posted on Reply
#53
Petey Plane
RejZoRQuestion is, how many of those users are actually using discrete graphics cards? You can't claim market share if a GPU is attached to every CPU whether you use it or not.
The Steam Hardware Survey records your primary display device, so even if you have an iGPU, it will record the discrete GPU that is outputting video. I think the 20% number is accurate. I'm sure there are a lot of people in less developed areas of the world playing LoL, DOTA and CS 1.6 at 720p with the settings turned down, easily doable with even an older Intel iGPU.
Posted on Reply
#54
tabascosauz
Brother DrakeThough I understand why Intel has put so much into onboard graphics, with smartphones, laptops and office workstations making up such a large part of the market, I wish that they would make a line of CPUs for people who use discrete graphics cards as well, OR(!) work with the graphics card makers to create an architecture that shares graphics workload between the CPU and the external graphics card when one is available. That may be too much to ask, but why not make a line of CPUs that are just CPUs? I feel that Intel is kind of ignoring the (very large) gamer market.
The problem with this is that making a line of CPUs without iGPUs requires a lot of resources, because the current 2C / 4C dies for Haswell don't just go into one product. And no, the i3s and Pentiums aren't handicapped 4C dies; they are reportedly physically smaller dies with just 2 cores. Very large gamer market? Are they ignoring us gamers? We still use their CPUs, and even the lack of overclockability in Haswell isn't a huge problem for gamers.

And you gotta keep in mind that Intel's High-End Desktop platforms are made just for that, and a large part of the "gamer market" doesn't even game on desktops, let alone gaming desktops or custom-built desktops.
Petey PlaneThe i7 Extreme X and K chips are for that. Also, economies of scale are why they don't make 1151 socketed chips without iGPUs, since they all share the same basic architecture. It doesn't make a lot of sense for Intel to fab a separate chip architecture without an iGPU, when it can just as easily build one die, bin the higher-performing chips, and leave it up to the end user to decide whether they want to use the iGPU or not. If they were to build a line of 1151 CPUs without iGPUs, they would cost significantly more, since the economies of scale would not benefit the much smaller production line.

As far as sharing resources goes, that is really a driver issue, and one that Microsoft says will be addressed with DX12.
However, it IS viable to have a "gamer" line of CPUs with the iGPU disabled, without having to spend as many resources as a new CPU die would need. Intel has proved this for 3 generations so far with the E3-12x0 SKUs that have their iGPUs disabled. Intel clearly doesn't think gaming is very important, though: as of E3 v4 that trend has been broken, and the E3s were never marketed for gaming anyway. If Intel offers a similarly handicapped unlocked SKU in the future, it could be interesting, though this would have a separate, potentially detrimental impact on its existing i5-xxxxK and i7-xxxxK SKUs.
Posted on Reply
#55
ensabrenoir
Enthusiast
........sometimes we just don't get it. The PC landscape has changed. Intel knows what it's doing, and they are doing it at an insanely fast rate......by themselves.......I would have sworn that they would've had to have bought someone to do it.
Posted on Reply
#56
Petey Plane
tabascosauzThe problem with this is that making a line of CPUs without iGPUs requires a lot of resources, because the current 2C / 4C dies for Haswell don't just go into one product. And no, the i3s and Pentiums aren't handicapped 4C dies; they are reportedly physically smaller dies with just 2 cores. Very large gamer market? Are they ignoring us gamers? We still use their CPUs, and even the lack of overclockability in Haswell isn't a huge problem for gamers.

And you gotta keep in mind that Intel's High-End Desktop platforms are made just for that, and a large part of the "gamer market" doesn't even game on desktops, let alone gaming desktops or custom-built desktops.



However, it IS viable to have a "gamer" line of CPUs with the iGPU disabled, without having to spend as many resources as a new CPU die would need. Intel has proved this for 3 generations so far with the E3-12x0 SKUs that have their iGPUs disabled. Intel clearly doesn't think gaming is very important, though: as of E3 v4 that trend has been broken, and the E3s were never marketed for gaming anyway. If Intel offers a similarly handicapped unlocked SKU in the future, it could be interesting, though this would have a separate, potentially detrimental impact on its existing i5-xxxxK and i7-xxxxK SKUs.
I get what you're saying, but knowing modern marketing, they'd probably charge more for the chip with the iGPU disabled.
Posted on Reply
#57
Brother Drake
If Intel is pushing the graphics architecture, then it stands to reason that some processors will fail testing on that part of the die. Those processors can have the graphics disabled and be sold as a 'P' series, as Intel did with Sandy Bridge. They (should) cost less and use less power than fully functional processors. I also think Intel needs to remember that the customers who have always driven the advancement of top-end CPUs are gamers and graphics specialists, and that market is the most profitable on a price-per-unit basis.
Posted on Reply
#58
alwayssts
Ferrum MasterAn update to a thing I do not use... an iGPU...

SCREW it... Sandy Bridge FTW...
While I hear ya on the IGP, and on people crapping on the enthusiast improvements in Skylake in general...I dunno, man. Even though people seem to agree it's generally not worth it at this point for discrete-GPU gaming and/or general tasks...I really want off this old boat. It's not broken, but the wood paneling is starting to fade.

I want M.2/SATA Express capability. I want to actually be able to RAID my MX100s (remember, many of us Sandy users can't use RAID 0 + TRIM, at least not without a hell of a lot of trouble). Hell, I want PCIe 3.0.

I also really would like added tangible performance for video decoding stuff, etc. (50%?).

Many people seem to take the approach that upgrading from a 4c to a 4c will long be a fool's errand, and the only way to go is up (i.e. to a 5820K)....which is fair....but I, for one, think Skylake is going to get there just as well. It may be through IPC rather than cores, and overclocked vs. overclocked (I think at stock the 5820K will justify its ~10% higher price), but that's okay.

While I'm not quite on board with the NUC future, I am on board with the mini-ITX/microATX present. Just like with Sandy Bridge, I could see building a fairly small performance box out of Skylake; perhaps one that can more-or-less rival a similarly priced build on X99. Whenever I think of simply building the latter now, I feel compelled to check the BTU rating of my air conditioner...I just don't think it would work (for me).

IOW, it may be a side-step for us enthusiasts, sure....but it's a side-step that, coupled with all the improvements over the past few years, finally pushes it over the edge, imho.

Do I fault anyone that goes 5820K+? No.

How about waiting for Zen (which may compete, or be slightly better overall, while perhaps offering a better price for an 8-core part)? What about Skylake-E, which will probably be the actual big step forward?

All of those are respectable opinions.

To me though, I need a new platform, preferably with the best combo of longevity/newest features, with a worthwhile upgrade on the CPU which preferably won't use a ridiculous amount of power overclocked.

I think that's Skylake.
Posted on Reply
#59
tabascosauz
Plus, HD Graphics has its uses. Quick Sync is a nice thing to have when recording games, and when you need to test the odd thing or two sans graphics card, Intel's graphics will do nicely.

Very few of us are extreme overclockers, and the presence of an iGPU is not the end of the world. If Intel sat back and kept HD 2500-level performance for 5 generations, whiners would complain about how it's just making money off helpless consumers and not doing anything for them. Now that Intel is improving its iGPUs, the same whiners complain about how Intel GPUs suck and how the Future is Fusion.
Posted on Reply
#60
RejZoR
Skylake-E might be interesting. But that part is coming out sometime in 2016, which is still faaaaar away...
Posted on Reply
#61
GreiverBlade
tabascosauzPlus, HD Graphics has its uses. Quick Sync is a nice thing to have when recording games, and when you need to test the odd thing or two sans graphics card, Intel's graphics will do nicely.
Shadowplay? DVR? ... I've used both (OK, Shadowplay is better ...) but HD Graphics has its use only when it's alone ... which is rarely the case in an i5/i7 machine (unless it's a laptop), and you don't game that often on a laptop or on HD Graphics (well, I would not spit on a beefier HD Graphics, indeed ... but for mobile CPUs ... maybe HTPC, though AMD is still a better option there ~ I am quite happy about the HD Graphics 5500 vs. the HD Graphics 4400 on my laptop, though)

As for testing things when your main GPU goes KIA/MIA ... well, I always have a spare GT730/HD5450, but for a mITX or even microATX build it is indeed useful (HTPC again ... or a LAN box at most)
Posted on Reply
#62
RichF
tabascosauzBulldozer was 100% AMD's problem, and 100% AMD's downfall. Between the Pentium 4-esque pipeline problem, supremely slow cache, and the attempt to stay 'upgradable' by not moving to an FCH, AMD still tried to distort the facts and make it sound as if Bulldozer, with its 8 'cores', was better than SB. When you are fighting an uphill battle with the X6 1100T being all that you have to offer, something like Bulldozer is probably going to hurt you more than staying conventional. Intel had a backup plan for Prescott in the Pentium III lineage that eventually ended up as Yonah and Core 2. AMD had nothing.

You could say that AMD had no resources to devote to a backup plan because Intel had bribed the OEMs all those years ago. Whatever the excuse might be, the burden of Bulldozer rested squarely on an already-weak AMD's shoulders, while Intel had a nice surprise in Sandy Bridge supremacy.

In the corporate world, exactly how much space is devoted to conscience and morality? When Intel was fumbling with the monstrosity that was Prescott, what did you expect them to do? Pull an AMD and place all their hopes on Prescott?
The FX chips are not that bad. If they were on 22nm, they could be clocked even higher. The main problem is that they were released before the rest of the tech was ready for them (Windows thread management, DX12, etc.), and now that the tech has almost caught up, they're still stuck on 32nm.

CMT is a sound strategy, but not when DX11 and Windows are poor at multithreading.
Posted on Reply
#63
Caring1
Brother Drakewhy not make a line of CPUs that are just CPUs? I feel that Intel is kind of ignoring the (very large) gamer market.
They do, they're called XEON!
Posted on Reply
#64
RejZoR
Xeons are for workstations, not gaming rigs. Unless you're really loaded, in which case sure, go with a Xeon...
Posted on Reply
#65
Nordic
RejZoRQuestion is, how many of those users are actually using discrete graphics cards? You can't claim market share if a GPU is attached to every CPU whether you use it or not.
Looks like about 20% use Intel GPUs for gaming.
store.steampowered.com/hwsurvey
RejZoRXeons are for workstations, not gaming rigs. Unless you're really loaded, in which case sure, go with a Xeon...
Actually, Xeons are pretty cheap. You can get the equivalent of an i7 for the price of an i5 with a Xeon. That is, if you don't mind not being able to overclock.
Posted on Reply
#66
RejZoR
I don't think there is any need. With my i7 920, I can feel in games like Killing Floor 2 that at its stock clock of 2.66 GHz it lacks some grunt. Overclocked to just 3.2 GHz, it already feels a million times smoother, even in later waves with tons of enemies. At 4 GHz and 4.2 GHz, it's silky smooth no matter what.

So, theoretically, if Skylake is, let's say, 50% more efficient per MHz and is factory-clocked at 4.2 GHz, it should be fine even at stock speeds for a very long time.
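To put rough numbers on that (the 50% per-clock figure is the assumption above, not a measurement), the stock-vs-stock estimate works out to

$$\frac{\text{perf}_{\text{Skylake}}}{\text{perf}_{\text{920, stock}}} \approx 1.5 \times \frac{4.2\ \text{GHz}}{2.66\ \text{GHz}} \approx 2.4$$

i.e. roughly 2.4x a stock i7 920 per thread, and still about 1.5x a 920 overclocked to the same 4.2 GHz, where only the per-clock gain remains.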

I wonder if Skylake-E will feature more cores, and what they will use the eDRAM for: the CPU part, or only as an aid for the iGPU. I'm kinda aiming at that for 2016. If it can provide me with all the compute power I need for 6 more years, like the i7 920 did, that would be amazing.
Posted on Reply
#67
LAN_deRf_HA
All the 920 champions have quite the rose-tinted view of the first-gen i7s. They didn't clock well, ran hot and hungry when pushed past 4 GHz, and weren't any faster than Yorkfield for gaming. They were stomped on as soon as Sandy showed up. I don't know of anyone who jumped to Sandy from Nehalem and regretted it. And now it's been replaced 6 times over, each time with a performance boost, and you people are still bemoaning the lack of advancement? Get over it. These mental gymnastics used to justify keeping a CPU from 2008 are absurd.
Posted on Reply
#68
RejZoR
They don't clock well?! From 2.66 GHz to 4.2 GHz is "not clocking well"!? Your mental gymnastics are broken...

If I gave you one system with an overclocked i7 920 and one with Skylake, you wouldn't be able to tell the difference. I don't give a shit about synthetic benchmarks and the few examples where you'd actually notice it. I'm often encoding videos and compressing data with 7-Zip, and this thing hammers through workloads like a champ with 8 threads. And in games, maybe you gain 2 fps using the latest CPU. Totally worth replacing the entire platform and spending 1k € on it... by your logic.
Posted on Reply
#69
hat
Enthusiast
Have to agree with RejZoR. 2.66 > 4 is not clocking well? That's pretty damn good if you ask me. Haswell clocks like shit compared to the original i7. I'm sure the lower-end chips could clock well, IF we were allowed...
Posted on Reply
#70
Thefumigator
RejZoRThey don't clock well?! From 2.66 GHz to 4.2 GHz is "not clocking well"!? Your mental gymnastics are broken...

If I gave you one system with an overclocked i7 920 and one with Skylake, you wouldn't be able to tell the difference. I don't give a shit about synthetic benchmarks and the few examples where you'd actually notice it. I'm often encoding videos and compressing data with 7-Zip, and this thing hammers through workloads like a champ with 8 threads. And in games, maybe you gain 2 fps using the latest CPU. Totally worth replacing the entire platform and spending 1k € on it... by your logic.
The i7 920 is one of the nicest CPUs around; however, it heats up quite a lot.
In my case, I own a less powerful CPU, an FX 8320, since 2012, and I still can't think of upgrading; the thing is a beast.
Posted on Reply
#71
RejZoR
It's not that problematic, to be honest. I had it in a TT LANBOX, cooled by a Thermalright low-profile cooler crammed below the PSU. Granted, I couldn't OC it to 4.2 GHz that way, but it was manageable at 3.2 GHz. With an AIO, it feels superb even at the lowest fan RPM, and that's what I use for ALL workloads. And let's not forget, it's the same basic configuration as Skylake (same cache size, same number of cores, etc.), just made on a much larger process node. It's 6 years old, after all... It's normal that it's "hot" compared to current processors.
Posted on Reply
#72
R-T-B
Caring1They do, they're called XEON!
More like HEDT
LAN_deRf_HAAll the 920 champions have quite the rose-tinted view of the first-gen i7s. They didn't clock well, ran hot and hungry when pushed past 4 GHz, and weren't any faster than Yorkfield for gaming. They were stomped on as soon as Sandy showed up. I don't know of anyone who jumped to Sandy from Nehalem and regretted it. And now it's been replaced 6 times over, each time with a performance boost, and you people are still bemoaning the lack of advancement? Get over it. These mental gymnastics used to justify keeping a CPU from 2008 are absurd.
I bought a Haswell-E coming from Nehalem and regretted it. Start counting, man. Nehalem also clocked insanely well... Haswell-E and its entire 20% (from Nehalem) IPC increase? Look at my overclock. Do the math. Not worth it.
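The math being referred to is simple to write down (the ~20% IPC figure is the poster's own estimate): the per-thread gain from such an upgrade is the IPC uplift scaled by the clock ratio,

$$\frac{\text{perf}_{\text{new}}}{\text{perf}_{\text{old}}} \approx 1.2 \times \frac{f_{\text{new}}}{f_{\text{old}}}$$

so if the old chip overclocks about as high as the new one, the whole platform swap nets only the ~20% IPC.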
Posted on Reply
#73
1Kurgan1
The Knife in your Back
I'm happy I bought my 5820K instead of waiting for these, after seeing news like this.
Posted on Reply
#74
GreiverBlade
R-T-BI bought a Haswell-E coming from Nehalem and regretted it. Start counting, man. Nehalem also clocked insanely well... Haswell-E and its entire 20% (from Nehalem) IPC increase? Look at my overclock. Do the math. Not worth it.
I am happy I have a 4690K instead of the 920 I had ... and 16 GB of 2400 MHz memory instead of 12 GB of 1600 (OK, from 3.5 to 4.5 is not the same as 2.66 to 4.0 ... after all, 340 MHz is a huge gap, HT is a totally useful thing, and triple channel gives a HUGE edge over dual channel in day-to-day use), and it cost a bit less than the initial investment (I suspect it's the cost of 2011-v3 that makes you feel that way ... with a 4790K you could have a different feeling for a fraction of the cost involved).
Face it, the modern equivalent of 1366 is 1150/1151, not 2011/2011-v3 (unless you have a 990X). You have the right to be nostalgic, but being realistic is more important.

Just in case ... even at 20% more IPC (unless that's a Haswell-E-only improvement), a 4790K has a 4.0 base clock; if my 920 effectively went from 2.66 to 4.0, a 4790K does that at stock, can clock higher, and needs less power ... technically my 4690K is faster than my 920 at their respective OCs and needs less power too (NO HT INVOLVED, THANKS; keep it realistic for my use case ... I don't edit video or use heavily threaded software, just gaming, day-to-day browsing, and the occasional benchmark) ... end words: better, and worth the change (plus, as I said in another thread, reselling my 1366 setup got me back 30% of the 1150 setup investment, and I didn't even go 2011-v3 :laugh: )

Well, as for the IGP ... I am happy I have an iGPU in my CPU in case my discrete card craps out (not that my GPU would ever do that ... I've only ever needed it once ... for re-flashing a 6950)
Posted on Reply
#75
R-T-B
GreiverBladeYou have the right to be nostalgic, but being realistic is more important.
It's not nostalgia. It's that the money to performance ratio just isn't there. Another video card would've served me far better.

Mind you, I was going from a 990X i7, not quite the same as the 920... same damn core, though, minus the AES instructions.
Posted on Reply