
Intel to Debut its Core "Skylake" Processors at Gamescom 2015

Hey guys, I'm thinking of upgrading from the 2600K to this. Will I see any significant improvements? I'm planning to do SLI for my rig; would it be more viable to wait for the X series instead? Thanks!
 
You can expect about a 5% to 10% increase in game FPS going from the 3770K to the 6770K, under best-case scenarios. At resolutions of 1440p and above, the difference will be lower, more like 2% to 3%. Some large-scale multiplayer games, like Battlefield 4 (and therefore Star Wars Battlefront and the eventual Battlefield 5) and PlanetSide, do benefit from a more powerful CPU, but it's still going to be less than a 10% difference at resolutions above 1080p.
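To put those percentages in absolute terms, here is a quick sketch of what a given CPU uplift means for frame rate; the baseline FPS figures are hypothetical, purely for illustration:

```python
# Rough illustration of what a percentage CPU uplift means in absolute FPS.
# The baseline figures below are made-up examples, not benchmark results.
def projected_fps(baseline_fps, uplift_pct):
    """Return the expected FPS after a given percentage uplift."""
    return baseline_fps * (1 + uplift_pct / 100)

# Hypothetical 1080p baseline of 100 FPS with a best-case 10% uplift:
print(round(projected_fps(100, 10), 1))  # 110.0
# At 1440p the same upgrade might only yield ~3%:
print(round(projected_fps(100, 3), 1))   # 103.0
```

In other words, the higher the resolution, the more GPU-bound the game, and the smaller the absolute gain from the CPU swap.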

Yeah, we can all blame AMD for the slow advancement in CPUs. It's really a joke, actually, given how minuscule the improvement is. And this is not going to get better either: after Skylake, AMD will still be 5 generations behind, given how pathetic their chipsets and "APUs" are. This makes me really mad ;(
 
Hey guys, I'm thinking of upgrading from the 2600K to this. Will I see any significant improvements? I'm planning to do SLI for my rig; would it be more viable to wait for the X series instead? Thanks!

If your primary interest is gaming, you will see an FPS improvement of about 10% in games from the new CPU and DDR4 RAM at any resolution above 1080p. I have a 2500K, and I'm going to wait one more generation myself, for Cannonlake, the 10 nm die shrink of Skylake.

As far as SLI, you don't need an X platform (or whatever they call the follow-up to LGA 2011-v3) if you are only using 2 GPUs. Two PCIe 3.0 x8 links will be more than enough: there is virtually no difference between a card running at x16 and one running at x8. Unless you're running a 3-way 4K setup, even modern GPUs can't saturate a full x16 PCIe 3.0 link. Save your money on the CPU and mobo and buy the best GPUs your budget can afford. You only need the X platform if you are thinking about 3 or more GPUs.
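To put numbers on the x8 vs. x16 point: usable per-lane throughput follows from the transfer rate and the line encoding, so the link bandwidth is easy to estimate. A quick sketch using the published PCIe generation parameters:

```python
# Approximate usable PCIe bandwidth per direction, in GB/s.
# Each generation is (transfer rate in GT/s per lane, encoding efficiency).
PCIE_GENS = {
    "2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

def bandwidth_gbps(gen, lanes):
    """Usable bandwidth in GB/s for a link of the given generation and width."""
    rate, eff = PCIE_GENS[gen]
    # GT/s is one bit per transfer per lane: rate * eff -> Gbit/s, / 8 -> GB/s
    return rate * eff * lanes / 8

print(round(bandwidth_gbps("3.0", 8), 2))   # 7.88 GB/s for a x8 link
print(round(bandwidth_gbps("3.0", 16), 2))  # 15.75 GB/s for a x16 link
```

Since a single GPU of that era rarely pushes more than a few GB/s over the bus in games, dropping from x16 to x8 at PCIe 3.0 still leaves plenty of headroom, which is why the benchmark difference is near zero.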
 
With all the AMD drama.

REBRAND!
Skylake is a whole new architecture.


We should have 8 cores / 16 threads as cheap mainstream by now. But because of zero competition, we won't get that any time soon...
I'm really curious what the difference will be between my i7 3770K and this i7 6770K in gaming performance... REALLY curious!
I can't wait to see the difference going from an i5 2500K at stock (I'm OC'd at 4.5 GHz at the moment, but I'll run stock for comparison's sake) to the i5 6500K.
 
I find it kinda funny how they went full circle with the naming scheme... They are basically back to SB...
 
Actually there are six chipsets for socket 1151.

And it's been like that for the past two generations.
I liked it very much when the 90 series debuted and there were only 2 chipsets (if you exclude X99), because there were only two choices and that meant less thinking and planning. After this series comes out there are going to be so many products on the market, and that is what I do not like. Take a look at this (https://www.asus.com/motherboards/) and select the Z97 chipset: you will find two pages of motherboards, each with a different configuration. Now add to that more companies and more chipsets and you get a nightmare when buying a motherboard.


BTW, this doesn't save Intel money; it costs them more to offer these options.
Actually it does save Intel money, because some chipsets, especially X99, are a lot costlier. But if we exclude the overpriced X99, then I can see how Intel could save money by producing only one chipset.
 
Actually it does save Intel money, because some chipsets, especially X99, are a lot costlier. But if we exclude the overpriced X99, then I can see how Intel could save money by producing only one chipset.
Why are you even bringing X99 up in this argument? It's the most irrelevant chipset when talking about Intel giving too many options.

H81 is a good budget option, H87/97 is a good midrange option, and Z87/97 is good for enthusiasts. X99 is on a completely different platform, so it should not even be considered by anyone who does not need 8 cores or just wants to have a 2011 system... What's more, X99 boards use a completely different layout; there are many more connections to the chipset...
 
H81 is a good budget option, H87/97 is a good midrange option, and Z87/97 is good for enthusiasts.
But it would be a lot better if only Zxxx were available, with all the features the cheaper ones have; buying a motherboard would be easier.

Why are you even bringing X99 up in this argument? It's the most irrelevant chipset when talking about Intel giving too many options.
You are right, I shouldn't have brought it up since it was not part of the argument, but it does show that too many chipsets per socket/platform can be troubling (6 chipsets for 1151) and one per socket/platform can simplify things (the X-type chipsets).
 
Q and B chipsets are rarely used outside of business AIOs.

Remember Ivy Bridge? That had two Z chipsets... I'd say it is much less confusing now than it was back then.

And what about back on 775, when NVIDIA also made chipsets...
 
I wonder if this will be the time to upgrade my ivy bridge
 
But it would be a lot better if only Zxxx were available, with all the features the cheaper ones have; buying a motherboard would be easier.
Wow, are you serious? Encourage people who don't care about overclocking or getting all the features they offer to buy those $140+ chipsets?

And actually the opposite happens: from what I have seen, most people only know about the Z chipsets and the higher-end K series from Intel. People who are on the fence about jumping to Intel bring this argument: "idk, an Intel rig costs too much", "you have to spend a lot on a motherboard", etc. And that's because they don't know about CHEAPER options like B85, H81, H87/97, or CPU options like the i5-4440, i5-4460, i5-4560 that aren't all about overclocking but still deliver great performance over AMD.

Chipsets like B85 are getting known of late because of the Pentium G3258 buzz that has the eye of every budget gamer; it can be overclocked even on those cheap chipsets.
 
Why 8 cores? Just to have bigger numbers like AMD... which mean nothing... What do you do that needs 8 cores?

4 fast cores are faster than 6 or 8 slow cores... moar GHz!

Game performance is maybe 5 FPS more.

You want 6 cores? Get a 5960X, or get a Xeon and get 12 cores.

Because competition would have pushed core counts higher and thus leveraged Microsoft; likely we'd have seen something like Windows 10 much earlier. Intel has stagnated to the point that other industries are now pressing Microsoft for better multi-core awareness. Gaming is certainly one of them, given that all the consoles are 8-core machines, with extremely weak cores that desperately need multithreading to even function.

Still, why stop at 6 cores? Why not make 8 the optimal setup for Windows 10? The answer is likely Intel. I suspect we'll see more 6-core CPUs from Intel outside the server/workstation market. They've already done it within the last couple of years. I wouldn't be surprised if eventually i7s were all 6-core and up. Then you'd have i5s being quads and i3s being duals. So 6/12, 4/8, 2/4. If preliminary reports are correct and AMD CPUs become viable again thanks to the changes in Win 10, it would be a natural shift for Intel. They'd have to, because the only area where AMD CPUs could beat Intel was multithreading, and perhaps Win 10 pronounces that even more. Thus 8-core optimization would have hurt Intel worse. Especially with Zen coming.

AMD can wave a ton of cores in Intel's face but if the OS can't utilize them properly, it'll just be more of the same and Intel will win.
 
FX is not going to get any less unviable than it already is...
 
Yeah, we can all blame AMD for the slow advancement in CPUs. It's really a joke, actually, given how minuscule the improvement is. And this is not going to get better either: after Skylake, AMD will still be 5 generations behind, given how pathetic their chipsets and "APUs" are. This makes me really mad ;(

We can also blame Sun, SGI, HP, IBM PowerPC (Apple), and DEC... all the other CPU vendors that failed.

Let's face it, 10-15 years ago Intel held 1 or 2 cards. They had huge competition in the server market and in high-end specialized workstations. Even AMD went 64-bit before Intel...

Now Intel holds a full deck. They have no competition.
They control server, workstation, and gaming.

Back then CPU advancements were huge; now they're tiny. But everyone seems to be like "OMG, new revision, have to upgrade everything." It doesn't help that developers are lazy as well and won't program any games that only 30% of the PC market, or people with high-end machines, can run.

Let's face it, PCs aren't as sexy as they used to be. Programmers are lazy and everyone wants to make as much $$$ as possible. Without any competition, Intel will continue to slowly trickle-feed the technology.
 
Wow, are you serious? Encourage people who don't care about overclocking or getting all the features they offer to buy those $140+ chipsets?
First of all, you can get a new motherboard with the Z97 chipset for less than $100 (http://www.newegg.com/Product/Produ...scription=z97&bop=And&Order=PRICE&PageSize=30), and secondly, in my case where there would be only one chipset for 1151 CPUs, I did not mean that it would be as expensive as the Zxxx chipsets but more along the lines of the Hxxx and Bxxx chipsets.
 
I wouldn't OC any quad core on those VRMs...
 
But there is no need to spend even $100 if there are $50-$80 options that can be matched to people's wallets and needs.

and secondly, in my case where there would be only one chipset for 1151 CPUs, I did not mean that it would be as expensive as the Zxxx chipsets but more along the lines of the Hxxx and Bxxx chipsets.
Well, that's exactly what they are for: Hxx is cheaper, no overclocking but the same features as Z; Bxx is cheaper, no overclocking and fewer features than Z.
 
Back then CPU advancements were huge; now they're tiny. But everyone seems to be like "OMG, new revision, have to upgrade everything." It doesn't help that developers are lazy as well and won't program any games that only 30% of the PC market, or people with high-end machines, can run.

Let's face it, PCs aren't as sexy as they used to be. Programmers are lazy and everyone wants to make as much $$$ as possible. Without any competition, Intel will continue to slowly trickle-feed the technology.

*It doesn't help that developers are clever as well and won't program any games that only 30% of the PC market, or people with high-end machines, can run.
 
The integrated memory controller of "Skylake" CPUs support both DDR3 and DDR4 memory standards, and should prove to be a transition point between the two.
Great... it would hurt to lose a "still plenty" 2x8 GB DDR3-2400 C10 kit... waiting on the review of the CPU is now the main idea, though I don't think Skylake will be a worthy update over a Haswell DC i5-4690K... (what do they expect? a +10% increase?)
And DDR4... is not really a "BIG" improvement...

Oh well, after that little post: no upgrade before the line after next from Intel (or maybe Zen from AMD... who knows...)
 
I find it kinda funny how they went full circle with the naming scheme... They are basically back to SB...

You mean Core 2 / Conroe?

Back then CPU advancements were huge; now they're tiny.

Completely untrue, because today's CPUs do far more with far less power than the CPUs of yesteryear. Example: the Athlon 64 3200+ is a single-core design that consumes 89 W and was released in 2003. The Celeron J1800 is a dual-core that consumes 10 W and was released in 2013. The Celeron outperforms the Athlon while consuming far less power; in other words, it took 10 years to cut per-chip power consumption by roughly a factor of nine (and nearly a factor of eighteen per core). That's pretty damn impressive any way you look at it.
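Working that comparison through explicitly, using the TDP, core-count, and release-year figures quoted above:

```python
# Per-chip and per-core power comparison of the two CPUs quoted above.
athlon_64_3200 = {"tdp_w": 89, "cores": 1, "year": 2003}
celeron_j1800 = {"tdp_w": 10, "cores": 2, "year": 2013}

# Whole-chip improvement: 89 W down to 10 W.
chip_factor = athlon_64_3200["tdp_w"] / celeron_j1800["tdp_w"]
# Per-core improvement: 89 W/core down to 5 W/core.
per_core_factor = (athlon_64_3200["tdp_w"] / athlon_64_3200["cores"]) / (
    celeron_j1800["tdp_w"] / celeron_j1800["cores"]
)
years = celeron_j1800["year"] - athlon_64_3200["year"]

print(f"{chip_factor:.1f}x less power per chip over {years} years")  # 8.9x over 10 years
print(f"{per_core_factor:.1f}x less power per core")                 # 17.8x
```

So the gains are real even before accounting for the Celeron also being faster per clock; the progress just shifted from raw clock speed to efficiency.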
 
I hear Elon Musk is considering starting a new company that makes high-end GPUs and CPUs, which would certainly throw a wrench in the works... but he's probably too busy reinventing several other industries. That headline would scare the crap out of Intel, NVIDIA, and AMD, considering his past efforts at disruption. Just attaching his name to the new company would guarantee massive capital investment. C'mon, Iron Man, we know you're into computers and gaming, how 'bout it? Just as a hobby?
 
I hear Elon Musk is considering starting a new company that makes high-end GPUs and CPUs, which would certainly throw a wrench in the works... but he's probably too busy reinventing several other industries. That headline would scare the crap out of Intel, NVIDIA, and AMD, considering his past efforts at disruption. Just attaching his name to the new company would guarantee massive capital investment. C'mon, Iron Man, we know you're into computers and gaming, how 'bout it? Just as a hobby?

Nope, Musk cannot create CPUs or GPUs, at least not ones that could ever compete with Intel, AMD, and NVIDIA, no matter how much money he throws at it. You see, there's this thing called IPR and patents. The modern x86 CPU and PC GPU are the result of an IPR cross-licensing clusterfvck between Intel, NVIDIA, and AMD, which leaves no room for additional players. At best, Musk could create chips that compete with Samsung and Qualcomm in the mobile device space.
 
Even AMD went 64-bit before Intel...

That's not entirely true. Intel released a 64-bit CPU a bit before AMD: Intel developed and released the Itanium line in 2001, which was a completely 64-bit CPU. The problem was that it didn't run x86 programs written for 32-bit CPUs. So in 2003 AMD released the Athlon 64s, which used the AMD64 instruction set and were backwards compatible with 32-bit programs. If AMD had embraced a straight 64-bit-only approach, all programs would be 64-bit by now, but we've slowly migrated over because AMD offered a "better" solution for the transition from x86 to x64.
 
That's not entirely true. Intel released a 64-bit CPU a bit before AMD: Intel developed and released the Itanium line in 2001, which was a completely 64-bit CPU. The problem was that it didn't run x86 programs written for 32-bit CPUs. So in 2003 AMD released the Athlon 64s, which used the AMD64 instruction set and were backwards compatible with 32-bit programs. If AMD had embraced a straight 64-bit-only approach, all programs would be 64-bit by now, but we've slowly migrated over because AMD offered a "better" solution for the transition from x86 to x64.
OK, let's say the first "normal" consumer (and affordable) 64-bit CPU was from AMD... (and would a pure 64-bit transition, with no legacy support, have been good? I don't think so :D, so thanks, AMD ;) )
 