Thursday, August 3rd 2017
Intel Coffee Lake CPUs Will Require New Motherboards
A motherboard maker's official Twitter feed has just confirmed what we all had an inkling about already: Intel's upcoming Coffee Lake architecture, which promises the first major change in Intel's line-up strategy, won't be compatible with existing motherboards and chipsets. Granted, a company's official Twitter feed can sometimes suffer from errors or miscommunication on the part of the account operator, but as of this writing, the company hasn't walked back the original posting.
This does make slightly more sense than previous occasions when Intel declined to support a new generation of its processors on an existing chipset. However, it confirms that Intel users are once again left without an upgrade path for the top-of-the-line Intel solutions they may have already acquired. If you purchased an i7-7700K and were expecting to upgrade to an Intel six-core next round, you'll have to rethink that strategy, and your budget, to include a new motherboard with a new chipset (expected to be Z370).
Source:
via Tom's Hardware
48 Comments on Intel Coffee Lake CPUs Will Require New Motherboards
Frame drops in ULMB or VR are extremely annoying and game-ruining. 1% is a lot of damn frames to be below 120 Hz; 0.1% is a far better metric.
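For reference, the 1% and 0.1% "low" figures discussed here are usually computed from per-frame render times: take the slowest 1% (or 0.1%) of frames and report their average as an FPS figure. A minimal sketch in Python (the frame-time data below is made up for illustration):

```python
def percentile_low_fps(frame_times_ms, percent):
    """Average FPS over the worst `percent` of frames (e.g. 1.0 or 0.1)."""
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * percent / 100))   # how many frames to average
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 1000 frames at ~8.3 ms (~120 FPS) with a handful of hitches mixed in
frames = [8.3] * 990 + [25.0] * 5 + [40.0] * 5
print(round(percentile_low_fps(frames, 1.0), 1))  # 1% low
print(round(percentile_low_fps(frames, 0.1), 1))  # 0.1% low
```

Even though the average here is still close to 120 FPS, the lows expose the hitches, which is exactly why they matter for ULMB and VR.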
The new Intel HEDT would also be awful for this. I am waiting for the 8700K for my main gaming rig. If only I had the money to get a Threadripper for my server too :/
Also, what RAM are they using? A lot of games are highly affected by RAM, according to a Eurogamer piece I just read.
I rock the golden 3200 14-14-14-32/34?
Never bothered to test the difference; I just buy quality.
pics.computerbase.de/7/9/4/0/0/2-1080.2481911201.jpg
"KBL Refresh PCH"
The proper 300-series chipsets, aka "CNL PCH" (Cannonlake Platform Controller Hub) will come way after the first wave of mobos with the Z370, and will exceed the Z370's features in certain ways.
pics.computerbase.de/7/9/4/0/0/3-1080.3116651924.jpg
So things will start with a Z370 chipset very similar to the Z270, but mandatory for the new CPUs. The real new chipsets will come in early 2018 and will have a few improvements over the supposed flagship chipset.
VR will definitely benefit from more cores and can't afford to drop frames, not even 0.1%.
Also, sure, future games will use more threads, but that's 1, 2, 3, or 6 years from now; people have been saying this for the last 5 years, and yet we keep saying it. Most games that are not AAA are still single-threaded, and that doesn't count the 10,000 older games that are still limited to 1 or 2 threads.
It's too bad Intel decided not to allow in-place upgrades.
I don't think any HEDT will beat any non-HEDT in gaming; that seems like common knowledge. Didn't think Intel could push 5 GHz on all 6 cores; I see varying stories.
If this IS true, this is the end for me and Intel. LGA1151 V2? WTF is that? It is Intel profiteering from what was a loyal gaming fan-base, is what it is.
AMD could not have asked for a better incentive from Intel if they blackmailed them.
I am sure Intel has a great excuse lined up - and I am sure that this is just them testing the waters before they finally decide we are dumb enough to wear it and make it official.
jlelliotton.blogspot.com/p/the-economic-value-of-rapid-response.html War Thunder is still single-thread limited, and so is Planetside 2. PS2 is 100% impossible to get near 120 Hz, even on a 4.8 GHz 6700K. War Thunder still has many sags and dips due to CPU limits, even on my 6700K at 4.8 GHz. The fact is, if it isn't an AAA game, it's single-threaded or has some shoddy 1.5-core threading (main thread, with networking and other stuff on a second core). NS2 is single-thread limited too, like any older game. I'll take a fast single-thread rig any day of the year. Most indie games I know are single-threaded too. Again, if it isn't AAA, I would be shocked if it properly supports 4 cores, let alone 8+.
If I only played AAA, sure, I would grab an HEDT, but regular day-to-day computing and 99% of all games are single-threaded, and many old games are still horribly single-thread limited, which is why I have the system I have.
This comes from someone who has always had one of the fastest single-thread systems available, with an HEDT server next to him to compare, and who constantly runs monitoring software to watch for single-thread limits; I can barely think of a few programs I use that are actually threaded. I would love it if this were not a fact, but it is, hence why I pay out of my ass for an SL chip with freakishly expensive RAM (3200 14-14-14-34). This is newer RAM, which is the better option: approximately the same latency and way better frequencies.
You can get 5.2 GHz 7700Ks, and 14nm++ is supposed to be better, so 4-core should see another 400 MHz according to historical trends and Intel statements; the 6-core should therefore reach 5 GHz+ at the top end, even with 2 extra cores. Hardly; you just don't understand how most programs work and how much is still single-thread dependent. Compare my computer vs a Ryzen in browsing and mine beats the crap out of it. Compare my system in OCR, PDF, office, and any non-threaded program (aka nearly everything).
Most of Windows 7 is single-threaded too. Win10 has made some small improvements in threading the OS, in how windows (screens like Explorer) load, but much still isn't.
I would love to get an 8700K for my main rig and a 16-core Threadripper for the server, because I could use the extra cores for zipping and ripping, plus the extra PCIe lanes. But I can't afford the upgrade TT
This is my 4.8 GHz 4790K Kraken score with 2400 MHz 10-12-12-31.
This is my 4.8 GHz 6700K Kraken score.
This is Silicon Lottery's 4.8 GHz 6700K Kraken score with 3866 15-19-19-39.
The difference is due to RAM. We each win different areas because I have lower latency but he has higher BW.
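For what it's worth, the latency side of that RAM comparison can be estimated from the CAS latency and transfer rate: first-word latency in ns is roughly CL cycles at half the transfer rate, i.e. CL × 2000 / (MT/s). A quick sketch using the kits mentioned in this thread:

```python
def first_word_latency_ns(cas_latency, transfer_rate_mts):
    """Approximate first-word latency: CL clock cycles, where the memory
    clock runs at half the DDR transfer rate (hence the factor of 2000)."""
    return cas_latency * 2000.0 / transfer_rate_mts

# Kits mentioned in the thread
print(round(first_word_latency_ns(10, 2400), 2))  # DDR4-2400 CL10
print(round(first_word_latency_ns(14, 3200), 2))  # DDR4-3200 CL14
print(round(first_word_latency_ns(15, 3866), 2))  # DDR4-3866 CL15
```

All three land within about 1 ns of each other, which fits the "approximately same latency, way better frequencies" observation: the faster kits trade slightly different absolute latency for much higher bandwidth.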
This is Silicon Lottery's 5.4 GHz 6700K score o_O
He has a 5.5-5.7 GHz phase-change Kaby now TT I want one.
If you care, these are my 2 rigs below.
Note: The fact is 90% (not 99%) of indie games are shit, usually from solo programmers who don't have the expertise but try anyway, without fully planning the game before coding begins or knowing what makes it good. Given the current flood of indie games on the market, I'd say, yeah, that 99% number for "all games" using a single thread isn't too far off.
The study was showing user productivity in regards to response times. It has nothing to do with mainframes.
Windows 7's fade animation is 250 ms, and it feels painfully slow. With the fade animation turned off, windows load in 30-50 ms. That is a night-and-day difference in responsiveness. Those speeds are below what IBM even studied... granted, 300 ms was huge, so they never bothered to try lower... and this was 35 years ago.
Your actions per minute skyrocket the faster your system is, as the IBM study showed. It appears to be an exponential increase.
300 ms is not the limit: removing the Windows 7 fade animation allowed 30-50 ms responses when opening and closing windows.
This applies to anything: web browsing, the OS, anything that you click and do.
This is why I use the system I have and remove BS animations, because they actually hurt your ability to complete tasks.
I also set my Android phone to run at 0.5x animation times, so it's twice as responsive, and it's soooooo much better.
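For anyone wanting to try the same Android tweak without digging through menus: the scales in Developer options map to three global settings, which can also be set over adb (assuming a connected device with USB debugging enabled). A sketch of the equivalent commands:

```shell
# Same toggles as Developer options -> Window/Transition/Animator scale.
# 0.5 halves all animation durations; 0 disables them entirely.
adb shell settings put global window_animation_scale 0.5
adb shell settings put global transition_animation_scale 0.5
adb shell settings put global animator_duration_scale 0.5
```

These are device settings, so they persist across reboots until changed back (1.0 restores the defaults).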
Web browsing on my 7500U vs my 1650v3 vs my 6700K is hugely different; I really loathe my 7500U, and I still get annoyed on my server because of how much slower pages load and how much less responsive it is.
It is like CRT vs LCD, or ULMB LCD vs LCD: once you see it... feel it... notice it... you never go back. SAP is a prime example of this. If you work for a company and their SAP has response times slower than 300 ms, they are pissing a ton of money away in user productivity.
This applies to anything, but SAP is a great example of how this works and how annoying it is.
Expert users, or people who know exactly what they are doing, are hurt most by this delay, but even novices are badly hurt by it.
The study above showed roughly 1-2 seconds saved per action going from 600 ms to 300 ms, so you could probably save another 1-2 seconds per action by getting down to 100 ms responses, which is why I removed the Windows fade/transition animations and turned my phone to 0.5x animation timing. I would set it to 0 animations, but certain tasks require them. Ever used Tinder or Bumble for swiping with 0 animations? Did I swipe left or right? I have no idea; the picture just vanished! roflmao
*System delay: this counts mouse input lag (polling rate, input lag, and more), OS delay, display input lag, display Hz, and so on.
Going from a 125 Hz mouse to a 1000 Hz mouse saves you 7 ms.
Going to 120 Hz vs 60 Hz may save you 8 ms, give or take.
A monitor with less input lag can save 10-50 ms.
Removing Windows 7 animations can save 50-220 ms (no idea about Win 10... I only use it on my laptop, and the amount of pointless time-wasting animations is rage-inducing).
Getting a 5.2 GHz CPU vs a 4 GHz CPU can easily save 25% of your time in certain tasks. (This is largely thanks to shoddy programming; the OCR on my Fujitsu document scanner is single-threaded. #!$&!#$&#$& Why!?!?!)
And the list goes on, from simple optimizations that are not horribly expensive but can add up if you go to the extreme like me :D
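The mouse and refresh-rate numbers in the list above fall out of simple interval arithmetic: the worst-case wait for the next poll or refresh is one full period, so raising the rate shrinks that window. A quick check in Python:

```python
def interval_ms(hz):
    """Worst-case wait (one full period) for the next poll/refresh at a given rate."""
    return 1000.0 / hz

mouse_saving = interval_ms(125) - interval_ms(1000)  # 8 ms -> 1 ms polling window
refresh_saving = interval_ms(60) - interval_ms(120)  # 16.7 ms -> 8.3 ms refresh window
print(round(mouse_saving, 1), "ms from the mouse,", round(refresh_saving, 1), "ms from the display")
```

That matches the roughly 7 ms mouse figure and the "8 ms give or take" display figure quoted above; the actual perceived saving depends on where in the interval your input lands, so these are worst-case bounds.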
The difference from my desktop vs laptop is horrifyingly annoying. :banghead::roll: