Friday, November 4th 2016

Intel Core i5-7600K Tested, Negligible IPC Gains

Ahead of its launch, a Core i5-7600K processor (not ES) made its way to Chinese tech publication PCOnline, who wasted no time in putting it through their test-bench, taking advantage of the next-gen CPU support BIOS updates put out by several socket LGA1151 motherboard manufacturers. Based on the 14 nm "Kaby Lake" silicon, the i5-7600K succeeds the current i5-6600K, and could be positioned around the $250 price-point in Intel's product-stack. The quad-core chip features clock speeds of 3.80 GHz, with 4.20 GHz max Turbo Boost frequency, and 6 MB of L3 cache. Like all its predecessors, it lacks HyperThreading.

In its review of the Core i5-7600K, PCOnline found that the chip is about 9-10% faster than the i5-6600K, but that is mostly down to its higher out-of-the-box clock speeds (3.80/4.20 GHz vs. 3.50/3.90 GHz for the i5-6600K). Clock-for-clock, the i5-7600K is only about 1% faster, indicating that the "Kaby Lake" architecture offers only negligible IPC (instructions per clock) gains over "Skylake." Power draw appears to be about the same as the i5-6600K's, so there do appear to be some fab process-level improvements, given that the chip sustains higher clock speeds without a proportionate increase in power draw. Most of the innovation appears to be centered on the integrated graphics, which is slightly faster and gains a few new features. Find more performance figures in the PCOnline review linked below.
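A quick back-of-the-envelope way to reproduce the clock-for-clock comparison (not PCOnline's actual methodology) is to factor the stock-clock difference out of the overall speedup; the minimal Python sketch below assumes a ~9.5% overall gain, the midpoint of the reported 9-10% range.

# Rough sketch: remove the base-clock advantage from the overall speedup to estimate
# the clock-for-clock (IPC) gain. The 9.5% input is an assumed midpoint of the 9-10%
# reported above; actual reviews lock both chips to equal clocks instead.
overall_speedup = 1.095          # i5-7600K vs. i5-6600K out of the box (assumed)
clock_ratio = 3.80 / 3.50        # base clocks, ~8.6% advantage for the i5-7600K
ipc_gain = overall_speedup / clock_ratio - 1
print(f"Estimated clock-for-clock gain: {ipc_gain:.1%}")   # roughly 1%, in line with the review
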
Sources: PCOnline.com.cn, WCCFTech

116 Comments on Intel Core i5-7600K Tested, Negligible IPC Gains

#76
newtekie1
Semi-Retired Folder
TheinsanegamerN: You are technically correct, as MIPS had a 64-bit CPU in 1991. Alpha (1992), SPARC (1995) and IBM (1995) all beat Intel (2001) and AMD (2003) to the punch. However, the AMD implementation from 2003 is the one that dominates Windows servers today, and a large number of servers outside of the specialized ones run AMD64.


However, in the DESKTOP space, where 99% of consumers exist, AMD beat Intel to the punch by three years: the Athlon 64 came out in 2003, Core 2 came out in 2006.
Exactly, AMD was not the first to market with a 64-bit processor. They were just the first to market with a 64-bit desktop processor. Also, AMD didn't beat Intel by three years; they beat them by about four months. AMD's Sledgehammer core was the first to support x86-64 on the desktop platform, and it came out in late Sept. 2003. Intel's Prescott core was their first desktop processor to support x86-64, and it came out Feb. 1, 2004. Not that it really mattered, since there wasn't a mainstream 64-bit desktop operating system available until 2005, when XP 64-bit was released. And XP 64-bit was just a rushed job of stripping down Server 2003 that wasn't really that good; in fact, it was pretty terrible. We didn't see a decent desktop 64-bit OS until Vista in late 2006. Yes, Vista was actually better than something...

Of course, AMD was so intent on beating Intel to market with a 64-bit desktop processor that they had to leave other features out of the processor, like dual-channel memory... which just killed their performance. It wasn't until Socket 939 was released that we got to see what AMD's desktop 64-bit processors were really capable of, and when AMD started to lay a beating on Intel in performance.
Posted on Reply
#77
EarthDog
MxPhenom 216: Intel hasn't claimed any increase in IPC though.
I know.. that was my point. :)
Posted on Reply
#78
TheinsanegamerN
newtekie1: Exactly, AMD was not the first to market with a 64-bit processor. They were just the first to market with a 64-bit desktop processor. Also, AMD didn't beat Intel by three years; they beat them by about four months. AMD's Sledgehammer core was the first to support x86-64 on the desktop platform, and it came out in late Sept. 2003. Intel's Prescott core was their first desktop processor to support x86-64, and it came out Feb. 1, 2004. Not that it really mattered, since there wasn't a mainstream 64-bit desktop operating system available until 2005, when XP 64-bit was released. And XP 64-bit was just a rushed job of stripping down Server 2003 that wasn't really that good; in fact, it was pretty terrible. We didn't see a decent desktop 64-bit OS until Vista in late 2006. Yes, Vista was actually better than something...

Of course, AMD was so intent on beating Intel to market with a 64-bit desktop processor that they had to leave other features out of the processor, like dual-channel memory... which just killed their performance. It wasn't until Socket 939 was released that we got to see what AMD's desktop 64-bit processors were really capable of, and when AMD started to lay a beating on Intel in performance.
Good point about the P4s. Forgot Intel added AMD64 to them.

The lack of dual-channel memory, however, only affected the older Socket 754 CPUs; Socket 939 and 940 chips, including the 2003 "Sledgehammer" Athlon 64 (FX-51), had dual-channel memory. The Athlon 64 3200+, which was released at the same time, was limited to single-channel memory on Socket 754.
Posted on Reply
#79
P4-630
newtekie1: Of course, AMD was so intent on beating Intel to market with a 64-bit desktop processor
That's what I meant.
Posted on Reply
#80
mcraygsx
efikkan: If you have a quad-core Sandy Bridge (or newer) at ~3.5 GHz, there is absolutely no reason to upgrade to a new quad-core, unless your hardware is so old that it suffers from stability issues.

The only reason to upgrade is to get more cores, and then you'll need to go for the E-platform (Broadwell-E).


Most hardware gradually gets unstable with age; symptoms usually occur when the hardware is older than 4-5 years, and many computers are replaced before they are 8 years old.

With the flat-lining of desktop computer performance over the past decade, consumers usually have longer upgrade cycles for desktops than in the 90s, when people upgraded because the new machine was much better.

The primary revenue source for Intel is professional customers, and that business is still booming.
I don't believe it's true that hardware magically becomes UNSTABLE with age. We have an older Pentium II (Slot 1) still running just like the first day we had it. It has been running almost non-stop for many years now. The same can be said for the RIVA TNT2 it still uses.

But hardware can and does degrade when we push it outside the manufacturer's specifications.
Posted on Reply
#81
newtekie1
Semi-Retired Folder
TheinsanegamerN: Good point about the P4s. Forgot Intel added AMD64 to them.

The lack of dual-channel memory, however, only affected the older Socket 754 CPUs; Socket 939 and 940 chips, including the 2003 "Sledgehammer" Athlon 64 (FX-51), had dual-channel memory. The Athlon 64 3200+, which was released at the same time, was limited to single-channel memory on Socket 754.
Yes, but I don't consider the Socket 940 chips to be desktop processors. They were just relabeled Opterons. They required server motherboards and used a socket intended for the server market. And I believe they even required ECC memory. Sure, there were a few 940 boards that were marketed more towards the desktop, just like there always are, but in the end they were still server motherboards. Socket 754 was the desktop socket. It wasn't until Socket 939 came out in June 2004 that the desktop platform got dual-channel memory. Their rush to market with the crippled 754 was a big reason they didn't get as far ahead of Intel as they could have.
Posted on Reply
#82
efikkan
mcraygsx: I don't believe it's true that hardware magically becomes UNSTABLE with age. We have an older Pentium II (Slot 1) still running just like the first day we had it. It has been running almost non-stop for many years now. The same can be said for the RIVA TNT2 it still uses.

But hardware can and does degrade when we push it outside the manufacturer's specifications.
It's a well-known fact that many electronic components degrade over time, including all silicon chips, capacitors, solder joints, etc. If you keep 1000 computers for 30 years, more and more of them will slowly fail, while some of them will continue to work for a long time.

Heat, overclocking, humidity, etc. may shorten the lifetime even more.
Posted on Reply
#83
newtekie1
Semi-Retired Folder
efikkan: It's a well-known fact that many electronic components degrade over time, including all silicon chips, capacitors, solder joints, etc. If you keep 1000 computers for 30 years, more and more of them will slowly fail, while some of them will continue to work for a long time.

Heat, overclocking, humidity, etc. may shorten the lifetime even more.
Degradation does happen over time due to electromigration. But if left at stock settings, processors will last long beyond their usefulness. That is why they can be overclocked so far: the manufacturers purposely leave a pretty big amount of headroom below what the processor is actually capable of, in order to make sure it doesn't die prematurely.

The same can't be said about some of the other components in a system, though. Cheap electrolytic capacitors, for example, can sometimes start to degrade within a year. I've seen computers that were only a few years old with caps that tested way out of spec even though they looked fine (though some didn't look so fine). Those computers were unstable for sure, and recapping the motherboard fixed them.
Posted on Reply
#84
ratirt
newtekie1: Exactly, AMD was not the first to market with a 64-bit processor. They were just the first to market with a 64-bit desktop processor. Also, AMD didn't beat Intel by three years; they beat them by about four months. AMD's Sledgehammer core was the first to support x86-64 on the desktop platform, and it came out in late Sept. 2003. Intel's Prescott core was their first desktop processor to support x86-64, and it came out Feb. 1, 2004. Not that it really mattered, since there wasn't a mainstream 64-bit desktop operating system available until 2005, when XP 64-bit was released. And XP 64-bit was just a rushed job of stripping down Server 2003 that wasn't really that good; in fact, it was pretty terrible. We didn't see a decent desktop 64-bit OS until Vista in late 2006. Yes, Vista was actually better than something...

Of course, AMD was so intent on beating Intel to market with a 64-bit desktop processor that they had to leave other features out of the processor, like dual-channel memory... which just killed their performance. It wasn't until Socket 939 was released that we got to see what AMD's desktop 64-bit processors were really capable of, and when AMD started to lay a beating on Intel in performance.
You are missing the point, at least for me. AMD wasn't the first to TRY to create 64-bit instructions for the CPU, that's for sure; nonetheless, AMD was the first to succeed on both server and desktop when everybody else failed. What's important is that it WAS FULLY AMD'S INNOVATION, created by them and not taken from anyone else, unlike what Intel did with Prescott. AMD is more innovative, and the products it has released prove it. And it was three years: we're not talking about when Intel took the AMD64 instructions and released a competing processor, but about when each company created something valuable for the CPU industry. For me, Intel's contributions there were the SIMD instructions and MMX, which have some value.
mcraygsx: I don't believe it's true that hardware magically becomes UNSTABLE with age. We have an older Pentium II (Slot 1) still running just like the first day we had it. It has been running almost non-stop for many years now. The same can be said for the RIVA TNT2 it still uses.

But hardware can and does degrade when we push it outside the manufacturer's specifications.
Pentium II :) RIVA TNT2. Nice times :) Can't believe you still keep those and work on them :)
I still have an old Athlon XP (Thoroughbred) somewhere, plus an Intel 6600 quad-core, a Core 2 Duo E8200 and an i7-860 (I think) :D With mobo and everything. I bet they still work :)
Posted on Reply
#85
ratirt
alucasa: So do I. So, that's 2 out of ... well, you know.
Not sure about the Kaby Lake i7s, but the Skylake ones are not more energy efficient. The 3770K sits in the 77 W TDP range, and that can also be lowered with a manual voltage tweak, while the i7-6700K and even the i5-6600K are both 91 W TDP parts. Where do you see that improvement? Because I don't see it.
Posted on Reply
#86
EarthDog
First, there are IPC gains between Ivy and Skylake (5%?), and there is also a 500 MHz clock-speed difference between the two. I'd be willing to bet it draws around or less than the 77 W part when it's run at the same speed, and it's still faster clock for clock. ;)

Also, both chips can lower their voltage at stock speeds.
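To illustrate why matching clocks and dropping the voltage narrows the power gap, here is a first-order CMOS dynamic-power sketch in Python (power scales roughly with frequency times voltage squared). The voltage figures are placeholder assumptions rather than measured values, and TDP is a spec ceiling rather than actual draw, so treat the result as a ballpark only.

# First-order CMOS dynamic-power scaling: P is roughly proportional to f * V^2.
# Leakage is ignored, TDP is a design ceiling (not measured draw), and the voltages
# below are placeholder assumptions for illustration only.
tdp_stock_w = 91.0     # i7-6700K spec TDP
f_stock_ghz = 4.0      # i7-6700K base clock
f_down_ghz  = 3.5      # downclocked to the i7-3770K's base clock
v_stock     = 1.20     # assumed stock core voltage (placeholder)
v_low       = 1.05     # assumed undervolt at 3.5 GHz (placeholder)
scale = (f_down_ghz / f_stock_ghz) * (v_low / v_stock) ** 2
print(f"Estimated draw at 3.5 GHz undervolted: ~{tdp_stock_w * scale:.0f} W vs. the 3770K's 77 W TDP")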
Posted on Reply
#87
ratirt
EarthDog: First, there are IPC gains between Ivy and Skylake (5%?), and there is also a 500 MHz clock-speed difference between the two. I'd be willing to bet it draws around or less than the 77 W part when it's run at the same speed, and it's still faster clock for clock. ;)

Also, both chips can lower their voltage at stock speeds.
Well, I don't know about you, but if 5% is a huge improvement, that's your opinion. Saying that the new Skylake CPUs use less power is not right. IPC is better, yeah, it is. You stated 5% earlier? You're not even sure if it's 5% or lower. The other thing is TDP: both the i7 and i5 Skylake have a TDP of 91 W, while the i7-3770K has only 77 W. The i5 Skylake and the i7-3770K have the same clock speeds, and the 6700 boosts 300 MHz higher. I didn't measure this; the information is based on Intel's web page.


The spec table shows otherwise, and that goes hand in hand with generating more heat.
Posted on Reply
#88
EarthDog
Never said 5% was a huge improvement, bud... just putting it out there that IPC needs to be considered as a variable, same as the 500 MHz base clock-speed difference. Again, lower the Skylake to 3.5 GHz along with its voltage and you are likely a hell of a lot closer to matching Ivy Bridge's TDP than you are at stock, if not right there, AND it's (negligibly) faster. ;)

Here man... I googled some results for you regarding IPC where you can validate my UNDERESTIMATE of its IPC performance. ;)
www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/9

Cliff's:
Ivy Bridge to Haswell: Average ~11.2% Up
Haswell to Broadwell: Average ~3.3% Up
Broadwell to Skylake (DDR4): Average ~2.7% Up
So closer to 15%, not 5% as I read that link. ;)
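As a quick sanity check on that cumulative figure, the per-generation averages quoted above can be compounded multiplicatively rather than simply added; a minimal Python sketch:

# Compound the per-generation IPC averages quoted above (multiplicative, not additive).
gains = [0.112, 0.033, 0.027]    # Ivy->Haswell, Haswell->Broadwell, Broadwell->Skylake (DDR4)
total = 1.0
for g in gains:
    total *= 1.0 + g
print(f"Cumulative Ivy Bridge -> Skylake IPC gain: ~{total - 1:.0%}")   # about 18%, well clear of 5%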
Posted on Reply
#89
mrthanhnguyen
When will we see a big jump from Intel? My 3770K is still doing its job perfectly, and the 7700K would maybe give me a 20% boost in certain games only.
Posted on Reply
#90
64K
mrthanhnguyen: When will we see a big jump from Intel? My 3770K is still doing its job perfectly, and the 7700K would maybe give me a 20% boost in certain games only.
Possibly with 10 nm Cannonlake we will see a decent improvement, but I'm skeptical of a big jump in performance over Skylake/Kaby Lake. The last I read, Cannonlake could come in Q3 or Q4 2017, so it's a while yet, and it could possibly get delayed again.

IMO we aren't going to see a lot more gains until Intel moves away from silicon. Necessity is the mother of invention, so I guess it comes down to whether Intel believes there really need to be significantly faster CPUs than we already have for the average user, to justify the costs of R&D.
Posted on Reply
#91
ERazer
damn, I guess ima hold onto my 2600k :ohwell:
Posted on Reply
#92
EarthDog
ERazer: damn, I guess ima hold onto my 2600k :ohwell:
Just curious how much performance increase you need...?

You also have a 1080, which runs into a glass ceiling from the CPU in some titles.
Posted on Reply
#93
ERazer
EarthDog: Just curious how much performance increase you need...?

You also have a 1080, which runs into a glass ceiling from the CPU in some titles.
Actually, the 1080 is getting bottlenecked a little bit, and PCI-E 2.0 isn't helping.

Minimal IPC gain, but I guess it's time to upgrade for the newer features, i.e. PCI-E 3.0 and M.2. I'd probably gain 7-12 FPS, and that's the dilemma: is dropping $500-600 for such a small gain worth it?
Posted on Reply
#94
EarthDog
You are losing ~1% on PCIe; then, depending on the title/game/settings, it could be 0 or it could be 20 FPS. Whether it's worth it is obviously up to you, but I personally wouldn't like buying such an expensive card only to have it capped by my system. Now, are your settings WAY WAY WAY more than playable? Of course, but in the back of my feeble head... I wouldn't run a 1080 on a 2600K without considering an upgrade in the near term. :)
Posted on Reply
#95
ERazer
EarthDog: You are losing ~1% on PCIe; then, depending on the title/game/settings, it could be 0 or it could be 20 FPS. Whether it's worth it is obviously up to you, but I personally wouldn't like buying such an expensive card only to have it capped by my system. Now, are your settings WAY WAY WAY more than playable? Of course, but in the back of my feeble head... I wouldn't run a 1080 on a 2600K without considering an upgrade in the near term. :)
With the current games that I play I get a steady 90-140 FPS (non-AAA games), which is fine for now. Guess I'll make my decision when TPU has run some benches.

That 2600K though, the most worth-it CPU I ever bought! Nah, Intel is just freaking slacking.
Posted on Reply
#96
efikkan
ERazer: damn, I guess ima hold onto my 2600k :ohwell:
The CPU code of games usually scales pretty badly (suffering from branch mispredictions and cache misses), and most of the improvements since Sandy Bridge wouldn't yield any significant improvement in rendering performance. So for strictly gaming performance you're more or less left with just clock frequency scaling.

Other workloads, such as photo and video editing, might scale better on newer CPUs.
Posted on Reply
#97
simlariver
ERazer: Actually, the 1080 is getting bottlenecked a little bit, and PCI-E 2.0 isn't helping.

Minimal IPC gain, but I guess it's time to upgrade for the newer features, i.e. PCI-E 3.0 and M.2. I'd probably gain 7-12 FPS, and that's the dilemma: is dropping $500-600 for such a small gain worth it?
I'm in a similar situation, although I only have a 480 ...
Laptops are a completely different thing.

The performance gain from Kaby Lake + DDR4 isn't really worth the money right now if I'm only looking at games/general performance. I'm more interested in the newer tech, but I feel like that tech (USB 3.1 Gen 2, NVMe, Thunderbolt, etc.) isn't mainstream enough yet to really be an investment for the future.
How many PCIe NVMe drives are on the market right now? It's pretty limited.
What about USB Type-C?
How many mobos support alternate modes of USB 3.1 Type-C, like HDMI? (No wait, scratch that, I hate dongles.)
Posted on Reply
#98
ratirt
efikkan: The CPU code of games usually scales pretty badly (suffering from branch mispredictions and cache misses), and most of the improvements since Sandy Bridge wouldn't yield any significant improvement in rendering performance. So for strictly gaming performance you're more or less left with just clock frequency scaling.

Other workloads, such as photo and video editing, might scale better on newer CPUs.
Maybe it's because the CPUs Intel now releases are just simple die shrinks, getting naturally faster from the shrink rather than from any real improvements. I bet if Intel had shrunk Ivy it would have performed similarly to Skylake and used way less power. The only thing that's been added is the AVX 2.0 instruction set, and that's all.
Posted on Reply
#99
efikkan
ratirt: Maybe it's because the CPUs Intel now releases are just simple die shrinks, getting naturally faster from the shrink rather than from any real improvements. I bet if Intel had shrunk Ivy it would have performed similarly to Skylake and used way less power. The only thing that's been added is the AVX 2.0 instruction set, and that's all.
Kaby Lake is not a die shrink.
Haswell and Skylake were new architectures over Sandy Bridge; both featured larger, improved prefetchers and other enhancements that offer limited IPC gains, highly dependent on workload.
AVX2 was introduced with Haswell.
Posted on Reply
#100
Blueberries
What a disappointment; I was expecting at least a 5% increase in IPC, if not 7-8%. The HEVC improvements are great, but I can stream 4K@60 FPS with an i3-6320, so I'm not jumping out of my seat for KL.
Posted on Reply