Thursday, August 3rd 2017

Intel Coffee Lake CPUs Will Require New Motherboards

A motherboard maker's official Twitter feed has just confirmed what many of us already suspected: Intel's upcoming Coffee Lake architecture, which promises the first major change in Intel's line-up strategy, won't be compatible with existing motherboards and chipsets. Granted, companies' official Twitter feeds are sometimes prone to mistakes or miscommunication on the part of whoever runs the account, but so far the company hasn't walked back the original post.

This makes slightly more sense than previous occasions when Intel declined to support a new generation of processors on an existing chipset. Still, it confirms that Intel users are once again left without an upgrade path for the top-of-the-line parts they may already own. If you purchased an i7-7700K and were expecting to upgrade to an Intel six-core next round, you'll have to rethink that strategy, and your budget, to include a new motherboard with a new chipset (expectedly, Z370).
Source: via Tom's Hardware

48 Comments on Intel Coffee Lake CPUs Will Require New Motherboards

#26
ensabrenoir
Many people truly just want the best regardless of who makes it. For the longest time that has been Intel, but they've been caught sleeping, so AMD is on fire right now. And Intel doesn't have a true answer yet, so it looks like they're just dumping anything and everything on the market just to stay relevant. No one's gonna go for that.
Posted on Reply
#27
80-watt Hamster
If the rumors of core counts greater than four on mainstream chips are true, it could be that more pins are required than are available on 1151. Or it's just Intel being Intel and sticking with their two-generation-per-socket strategy. It's lame either way; I was looking forward to replacing my 6600K with a Coffee Lake chip. Looks like I should be watching 7700K pricing instead.
Posted on Reply
#28
Bansaku
There is a reason I have not upgraded since 2012, and just when I think it might be worth it, Intel lines its pockets again by forcing consumers to buy a whole new motherboard with its new CPU. Yeah, methinks it's time to go full team RED soon....
Posted on Reply
#29
HopelesslyFaithful
R0H1TRight, does that include this?


Rubbish six-core 1600 trails the fastest mainstream CPU by a whopping (min) 8 fps, clock for clock; I bet 4K is even closer!
Yes, and if you play at 120 Hz ULMB, that Ryzen is unplayable trash. 138/114 fps can be painful in fast-paced FPSs, and that's the OC'd 7700K.

Frame drops in ULMB or VR are extremely annoying and game-ruining. 1% is a lot of damn frames to be below 120 Hz; 0.1% is a far better metric.
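For anyone who hasn't dug into how those numbers get made, 1% and 0.1% lows usually come straight out of a frame-time log. Here's a minimal sketch of one common way to compute them (the frame times and the helper function here are made up for illustration; reviewers' actual scripts vary):

```python
# Rough sketch: 1% / 0.1% "low FPS" from a frame-time log (hypothetical data).
def low_fps(frame_times_ms, fraction):
    # Average the slowest `fraction` of frames and express it as FPS.
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst) * fraction))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Mostly 120 fps frames, a few 80 fps dips, and one 33 ms hitch.
frame_times_ms = [8.3] * 990 + [12.5] * 9 + [33.3]
print(round(low_fps(frame_times_ms, 0.01), 1))    # 1% low: averaged across the worst 10 frames
print(round(low_fps(frame_times_ms, 0.001), 1))   # 0.1% low: dominated by the single hitch
```

The 0.1% figure collapses to the worst hitch, which is exactly why it tracks the stutters you actually feel at 120 Hz better than the 1% number does.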

The new Intel HEDT would also be awful. I'm expecting to get the 8700K for my main gaming rig. If only I had the money to get Threadripper for my server too :/

Also, what RAM are they using? A lot of games are highly affected by RAM, I guess, according to a Eurogamer article I just read.

I rock the golden 3200 14-14-14-32/34(?).

Never bothered to test the difference; I just buy quality.
Posted on Reply
#30
CiTay
Another thing about Z370: It's basically a refreshed Z270 chipset, according to some new slides.

pics.computerbase.de/7/9/4/0/0/2-1080.2481911201.jpg

"KBL Refresh PCH"

The proper 300-series chipsets, aka "CNL PCH" (Cannon Lake Platform Controller Hub), will come way after the first wave of mobos with the Z370, and will exceed the Z370's features in certain ways.

pics.computerbase.de/7/9/4/0/0/3-1080.3116651924.jpg

So things will start with a Z370 chipset very similar to the Z270, but mandatory for the new CPUs. The real new chipsets will come in early 2018 and will have a few improvements over the supposed flagship chipset.
Posted on Reply
#31
OSdevr
HimymCZeThat's the main issue: Coffee Lake will still be in the LGA1151 socket, and from a HW POV everything on the MB and socket is the same... This just looks like a greedy PUSH for customers to upgrade.
AssimilatorYou have no idea how power delivery works, do you?
Mind explaining, Assimilator? Are you referring to the increased power requirements of some of the Coffee Lake chips, or to chip and board layout? Interposers have power planes and decoupling capacitors on them; power can be delivered to any part of the chip with just a via connecting to the proper plane.
Posted on Reply
#32
DeathtoGnomes
HopelesslyFaithfulYes, and if you play at 120 Hz ULMB, that Ryzen is unplayable trash. 138/114 fps can be painful in fast-paced FPSs, and that's the OC'd 7700K.

Frame drops in ULMB or VR are extremely annoying and game-ruining. 1% is a lot of damn frames to be below 120 Hz; 0.1% is a far better metric.

The new Intel HEDT would also be awful. I'm expecting to get the 8700K for my main gaming rig. If only I had the money to get Threadripper for my server too :/

Also, what RAM are they using? A lot of games are highly affected by RAM, I guess, according to a Eurogamer article I just read.

I rock the golden 3200 14-14-14-32/34(?).

Never bothered to test the difference; I just buy quality.
The current crop of games splits between those that rely on single-core performance and those that use multiple cores better, and future games will still perform according to that difference. As long as Intel tries to keep the crown for single-core gaming, people will have to compromise if they do a high-core AMD build and want to play a single-core-bound game. I don't think the new Intel HEDT will be as awful as you think; I'd rather see performance close or even similar and have it boil down to a price war.

VR will definitely benefit from cores, and it can't afford to drop frames, not even 0.1%.
Posted on Reply
#33
HopelesslyFaithful
DeathtoGnomesThe current crop of games splits between those that rely on single-core performance and those that use multiple cores better, and future games will still perform according to that difference. As long as Intel tries to keep the crown for single-core gaming, people will have to compromise if they do a high-core AMD build and want to play a single-core-bound game. I don't think the new Intel HEDT will be as awful as you think; I'd rather see performance close or even similar and have it boil down to a price war.

VR will definitely benefit from cores, and it can't afford to drop frames, not even 0.1%.
A 6-core at 5 GHz will trash any HEDT system in gaming, period. (A 4-core at 5 GHz still beats all HEDTs except in niche areas.) The slide above shows that, and it's not even my source material.

Also, sure, future games will use more threads, but that's 1, 2, 3, or 6 years from now; people have been saying this for the last 5 years and yet we keep saying it. Most games that are not AAA are still single-threaded, and that doesn't count the 10,000 older games that are still limited to 1 or 2 threads.
Posted on Reply
#34
phanbuey
HopelesslyFaithfulA 6-core at 5 GHz will trash any HEDT system in gaming, period. (A 4-core at 5 GHz still beats all HEDTs except in niche areas.) The slide above shows that, and it's not even my source material.

Also, sure, future games will use more threads, but that's 1, 2, 3, or 6 years from now; people have been saying this for the last 5 years and yet we keep saying it. Most games that are not AAA are still single-threaded, and that doesn't count the 10,000 older games that are still limited to 1 or 2 threads.
This is all true ^ the 8700K will be a killer chip.

It's too bad Intel decided not to allow an in-place upgrade.
Posted on Reply
#35
DeathtoGnomes
HopelesslyFaithfulA 6-core at 5 GHz will trash any HEDT system in gaming, period. (A 4-core at 5 GHz still beats all HEDTs except in niche areas.) The slide above shows that, and it's not even my source material.

Also, sure, future games will use more threads, but that's 1, 2, 3, or 6 years from now; people have been saying this for the last 5 years and yet we keep saying it. Most games that are not AAA are still single-threaded, and that doesn't count the 10,000 older games that are still limited to 1 or 2 threads.
Multi-core gaming is here; it's still a bit early, and we might not see the full effects for a while because it takes so damn long to develop a game that takes advantage of it. That's also why we don't see more new games using multi-core: there is a rush to poop games out right and left, and devs just don't have the time. There are a couple of MMOs that have already switched to multi-core support; Rift comes to mind. Not sure of any others that made such a jump.


I don't think any HEDT will beat a non-HEDT in gaming; that seems like common knowledge. I didn't think Intel could push 5 GHz on all 6 cores, but I see varying stories.
Posted on Reply
#36
silapakorn
Doesn't bug me at all since I'm planning to change my MB anyway.
Posted on Reply
#37
thesmokingman
HopelesslyFaithfulA 6-core at 5 GHz will trash any HEDT system in gaming, period. (A 4-core at 5 GHz still beats all HEDTs except in niche areas.) The slide above shows that, and it's not even my source material.

Also, sure, future games will use more threads, but that's 1, 2, 3, or 6 years from now; people have been saying this for the last 5 years and yet we keep saying it. Most games that are not AAA are still single-threaded, and that doesn't count the 10,000 older games that are still limited to 1 or 2 threads.
This reads like the last battle cry for low-core CPUs, whilst the industry gears up to unleash 16-core parts onto the market. lol
Posted on Reply
#38
Psinet
I guess Intel has made my decision for me: if I need a new motherboard because I can't get a CPU with more than four cores on Z270, then I'll just jump ship to AMD while I'm at it.

If this IS true, this is the end for me and Intel. LGA1151 V2? WTF is that? It is Intel profiteering from what was a loyal gaming fan-base, is what it is.

AMD could not have asked for a better incentive from Intel if they blackmailed them.

I am sure Intel has a great excuse lined up - and I am sure that this is just them testing the waters before they finally decide we are dumb enough to wear it and make it official.
Posted on Reply
#39
HopelesslyFaithful
This should be required reading for everyone, because for some reason, 35 years later, people still don't understand this, even though it has been known since 1982! This is, again, why I don't use my server as my main rig: it is substantially slower in day-to-day tasks and I can feel it.

jlelliotton.blogspot.com/p/the-economic-value-of-rapid-response.html
DeathtoGnomesMulti-core gaming is here; it's still a bit early, and we might not see the full effects for a while because it takes so damn long to develop a game that takes advantage of it. That's also why we don't see more new games using multi-core: there is a rush to poop games out right and left, and devs just don't have the time. There are a couple of MMOs that have already switched to multi-core support; Rift comes to mind. Not sure of any others that made such a jump.


I don't think any HEDT will beat a non-HEDT in gaming; that seems like common knowledge. I didn't think Intel could push 5 GHz on all 6 cores, but I see varying stories.
War Thunder is still single-thread limited, and so is PlanetSide 2. In PS2 it is 100% impossible to get near 120 Hz even on a 4.8 GHz 6700K. War Thunder still has many sags and dips due to CPU limits even on my 6700K at 4.8 GHz. The fact is, if it isn't a AAA game, it is single-threaded or has some shoddy 1.5-core threading (main thread with network and other stuff on a second core). NS2 is single-thread limited too, like any older game. I'll take a fast single-thread rig any day of the year. Most indie games I know are single-threaded too. Again, if it isn't AAA, I would be shocked if it properly supports 4 cores, let alone 8+.

If I only played AAA games, sure, I would grab an HEDT, but regular day-to-day computing and 99% of all games are single-threaded, and many old games are still horribly single-thread limited, which is why I have the system I have.

This comes from someone who has always had one of the fastest single-thread systems available and an HEDT server next to him to compare, and who constantly runs monitoring software to watch for single-thread limits; I can barely think of a few programs I use that are actually threaded. I would love it if this were not a fact, but it is, hence why I pay out of my ass for a Silicon Lottery chip with freakishly expensive RAM (3200 14-14-14-34). This is newer RAM, which is the better option: approximately the same latency and way better frequencies.

You can get 5.2 GHz 7700Ks, and 14nm++ is supposed to be better, so a 4-core should see another 400 MHz according to historical trends and Intel statements; a 6-core should therefore reach 5 GHz+ at the top end even with 2 extra cores.
thesmokingmanThis reads like the last battle cry for low-core CPUs, whilst the industry gears up to unleash 16-core parts onto the market. lol
Hardly; you just don't understand how most programs work and how much is still single-thread dependent. Compare my computer vs a Ryzen in browsing and mine beats the crap out of it. Compare my system in OCR, PDF, Office, and any non-threaded program (aka nearly everything).

Most of Windows 7 is single-threaded too. Win10 has made some small improvements in threading how windows (screens like Explorer) load, but much still isn't.

I would love to get an 8700K for my main rig and a 16-core Threadripper for my server, because I could use the extra cores for zipping and ripping, plus the extra PCIe lanes. But I can't afford the upgrade TT

This is my 4.8 GHz 4790K Kraken score with 2400 MHz 10-12-12-31.
This is my 4.8 GHz 6700K Kraken score.
This is Silicon Lottery's 4.8 GHz 6700K Kraken score with 3866 15-19-19-39.

The difference is due to RAM; we each win in different areas because I have lower latency but he has higher bandwidth.

This is Silicon Lottery's 5.4 GHz 6700K score o_O

He has a 5.5-5.7 GHz phase-change Kaby now TT I want one.

If you care, these are my 2 rigs below:

Desktop
4.8GHz 6700K
G.Skill 32GB DDR4 3200 14-14-14-34 2T
ASRock OC Formula
512GB 950 PRO
2x1TB MX200 RAID 0
980 Ti @ 1500 MHz
Core X9
10GbE

Server/WS
4.2-4.4GHz 1650v3
64GB DDR4 ECC RDIMM
ASRock X99 WS
LSI 9211
Intel SAS Expander
14x3TB
8x6TB
480GB SSD
Predator X 360
Norco 4224
10GbE
Posted on Reply
#40
[XC] Oj101
I can confirm that CL won't work on current chipsets :) (or :( I guess). It's been known for a while - all existing leaks/screenshots are either on Z370 or fake.
Posted on Reply
#41
DeathtoGnomes
HopelesslyFaithfulThis should be required reading for everyone, because for some reason, 35 years later, people still don't understand this, even though it has been known since 1982! This is, again, why I don't use my server as my main rig: it is substantially slower in day-to-day tasks and I can feel it.

jlelliotton.blogspot.com/p/the-economic-value-of-rapid-response.html

Most indie games I know are single-threaded too. Again, if it isn't AAA, I would be shocked if it properly supports 4 cores, let alone 8+.

If I only played AAA games, sure, I would grab an HEDT, but regular day-to-day computing and 99% of all games are single-threaded, and many old games are still horribly single-thread limited, which is why I have the system I have.
Was there a specific point in the blog you wanted to convey to everyone, or are you just trying to tell people that if they don't know what a mainframe is, they are stupid? Most people can't afford, or wouldn't use, separate server/gaming rigs (not including consoles), so I won't waste my time discussing that point.

Note: The fact is 90% (not 99%) of indie games are shit, usually made by solo programmers who don't have the expertise to fully realize a game before coding begins, but who try anyway, without knowing what makes a game good. Given the current amount of indie games that have over-flooded the market, I'd say, yeah, that 99% number reflects "all games" using single threads, so it isn't too far off.
Posted on Reply
#42
HopelesslyFaithful
DeathtoGnomesWas there a specific point in the blog you wanted to convey to everyone, or are you just trying to tell people that if they don't know what a mainframe is, they are stupid? Most people can't afford, or wouldn't use, separate server/gaming rigs (not including consoles), so I won't waste my time discussing that point.

Note: The fact is 90% (not 99%) of indie games are shit, usually made by solo programmers who don't have the expertise to fully realize a game before coding begins, but who try anyway, without knowing what makes a game good. Given the current amount of indie games that have over-flooded the market, I'd say, yeah, that 99% number reflects "all games" using single threads, so it isn't too far off.
I was referring to indie as in anything not from a major developer, but most major-dev games are still single-thread limited except the big AAA games... basically anything that isn't what these places review.

The study was showing user productivity in relation to response times. It has nothing to do with mainframes.

Windows 7's fade animation is 250 ms and it feels painfully slow. I turn off the fade animation and windows open in 30-50 ms. That is a night-and-day difference in responsiveness. Those speeds are below what IBM even studied... granted, 300 ms was huge back then, so they never bothered to try lower... and this was 35 years ago.

Your actions per minute skyrocket the faster your system is, as the IBM study showed. It appears to be an exponential increase.

300 ms is not the limit: removing the Windows 7 fade animation allowed 30-50 ms responses when opening and closing windows.

This applies to anything: web browsing, the OS, anything that you click and do.

This is why I use the system I have and remove BS animations; they actually hurt your ability to complete tasks.

I also set my Android phone to run 0.5x animation times, so it's twice as responsive, and it's soooooo much better.

Web browsing on my 7500U vs my 1650 v3 vs my 6700K is hugely different; I really loathe my 7500U and still get annoyed on my server because of how much slower pages load and how much less responsive it is.

It is like CRT vs LCD or ULMB LCD vs LCD. Once you see it...feel it....notice it....you never go back.
Broad Applicability
The studies described up to this point involved scientists, engineers, and programmers. A test conducted with administrative professionals indicates that the same benefits can be realized with sub-second response time in data base applications. Component forecasters at IBM's Poughkeepsie facility make frequent reference to an online data base when estimating requirements for electronic parts. The work involves the maintenance of part inventories, bills of materials, and timetables of production and delivery, all tasks similar to those handled by production planners in many organizations.
Five component forecasters were provided subsecond response time for a half-day experiment during which their transaction rate productivity was measured. In their normal working environment they had a system response time of five or more seconds and an average individual productivity rate of 99 transactions per hour. During the test they worked at an average of 336 transactions per hour, a productivity increase of 339%.
So something like SAP is a prime example of this. If you work for a company and their SAP has slower than 300 ms response times, they are pissing a ton of money away in user productivity.

This applies to anything, but SAP is a great example of how this works and how annoying it is.



You can see that expert users, or people who know exactly what they are doing, are hurt most by this delay, but even novices are badly hurt by it.
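If you want to sanity-check the figures in that excerpt, the back-of-envelope is trivial. A quick sketch using only the two numbers quoted above (nothing else assumed):

```python
# Back-of-envelope from the IBM figures quoted in the excerpt above.
slow = 99    # transactions/hour at ~5 s system response time
fast = 336   # transactions/hour at sub-second response time

print(round(3600 / slow, 1))   # ~36.4 s of user + system time per transaction
print(round(3600 / fast, 1))   # ~10.7 s per transaction once response is sub-second
print(round(fast / slow, 2))   # ~3.39x throughput, the quoted "339%" figure
```

Notice that far more than the ~5 seconds of system response time disappears from each transaction; the rest is presumably the user staying in the flow of the task instead of losing their train of thought.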
Posted on Reply
#44
phanbuey
I wonder if they will do a Coffee Lake-X variant compatible with X299.
Posted on Reply
#45
HopelesslyFaithful
DeathtoGnomesThanks for taking the time to explain that.
No problem... only like the 100th time across forums lol.
phanbueyI wonder if they will do a Coffee Lake-X variant compatible with X299.
That is a good question. Maybe a 44-lane, quad-channel part :D I know... I can wish.
Posted on Reply
#46
80-watt Hamster
HopelesslyFaithfulYou can see that expert users, or people who know exactly what they are doing, are hurt most by this delay, but even novices are badly hurt by it.
That's really interesting. My job isn't dependent on action throughput for productivity, but I did notice that it took me quite a bit longer to edit a spreadsheet when we upgraded to Office 2016/365, particularly on an older laptop where the additional animation introduced a perceptible lag to many input actions. It's much better with those animations turned off.
Posted on Reply
#47
HopelesslyFaithful
80-watt HamsterThat's really interesting. My job isn't dependent on action throughput for productivity, but I did notice that it took me quite a bit longer to edit a spreadsheet when we upgraded to Office 2016/365, particularly on an older laptop where the additional animation introduced a perceptible lag to many input actions. It's much better with those animations turned off.
This applies to anything, even Windows 7 windows opening and closing, and web browsing. Now, different tasks will have different user response delays, so it's not a 100% scale. But if you log how many actions (clicks, moves, or whatever) you take per hour, you can get a feeling for how much of an improvement might come from reducing system delay*.

The study above showed something like 1-2 seconds saved per action going from 600 ms to 300 ms, so you could probably save another 1-2 seconds per action by getting 100 ms responses, which is why I removed Windows fade/transition animations and set my phone to 0.5x animation timing. I would set it to 0 animations, but certain tasks require animations. Ever use Tinder or Bumble for swiping? 0 animations results in "did I swipe left or right? I have no idea, the picture just vanished!" roflmao

*System delay: this counts mouse input lag (sampling rate, input lag, and more), OS delay, display input lag, display Hz, and so on.

Going from a 125 Hz mouse to a 1000 Hz mouse saves you 7 ms.
Going to 120 Hz vs 60 Hz may save you 8 ms, give or take.
A monitor with less input lag can save 10-50 ms.
Removing Windows 7 animations can save 50-220 ms (no idea about Win 10... I only use it on my laptop, and the amount of pointless, time-wasting animations is rage-inducing).
Getting a 5.2 GHz CPU vs a 4 GHz CPU can easily save 25% of your time in certain tasks (this is largely thanks to shoddy programming; the OCR on my Fujitsu document scanner is single-threaded. #!$&!#$&#$& Why!?!?!)
And the list goes on with simple optimizations that are not horribly expensive but can add up if you go to the extreme like me :D
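To put rough numbers on the whole chain, the savings stack up quickly. A toy sum using the same ballpark figures as the list above (none of these are measurements, just the rough values already mentioned):

```python
# Toy latency budget built from the rough figures in the list above (not measured values).
savings_ms = {
    "mouse polling 125 Hz -> 1000 Hz": 1000 / 125 - 1000 / 1000,   # ~7 ms
    "display refresh 60 Hz -> 120 Hz": 1000 / 60 - 1000 / 120,     # ~8 ms
    "lower-input-lag monitor": 30,                                  # middle of the 10-50 ms range
    "Windows 7 animations disabled": 200,                           # within the 50-220 ms range
}
for item, ms in savings_ms.items():
    print(f"{item}: ~{ms:.0f} ms")
print(f"total: ~{sum(savings_ms.values()):.0f} ms shaved off every single interaction")
```

A quarter of a second per click doesn't sound like much until you multiply it by the hundreds of interactions per hour in the IBM numbers above.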

The difference between my desktop and my laptop is horrifyingly annoying. :banghead::roll:
Posted on Reply
#48
TheOne
Here's hoping this is a misunderstanding or a decision that can be changed.
Posted on Reply