
Intel Core i7 "Broadwell-E" HEDT Chips Arrive in 2016

It is a fact of life. I have a smart phone and I don't use 95% of its functionality.

If you are using only 5% of the functionality of your smartphone, then you are most probably quite stupid :D and you neither deserve to own that phone nor need it. :D

AMD's Zen is planned for a 2016 release.

Now, with the good news that GloFo will have access to more of IBM's resources and new process technologies, I am very optimistic that these 2016 babies from AMD will cause quite a lot of problems for the competition and will stir up the market nicely.
 
SB users save money, again lol

The jig is up. Shrinking and x86 are done.
 
If you are using only 5% of the functionality of your smartphone, then you are most probably quite stupid :D and you neither deserve to own that phone nor need it. :D

wow dude, really? :shadedshu:




I just need a new GPU and more RAM
 
SB users save money, again lol

The jig is up. Shrinking and x86 are done.


SB (Sandy Bridge?). If so, then I agree; 3 years on, I still have not seen any reason to replace my 4.6GHz 2500K.
 
SB (Sandy Bridge?). If so, then I agree; 3 years on, I still have not seen any reason to replace my 4.6GHz 2500K.
X99 is a high-end multi-core platform, primarily meant for users who need the threads and the added performance capabilities. Most users have no need for current chips at all if they have anything SB or later. But the power savings are really nice, and Haswell does offer a significant performance increase over SB, so if you do renders or similar workloads, and time and power consumption matter, then SB and SB-E are very much obsolete.
 
o_O This is true HEDT here. To release anything for this platform any earlier would be unwise. After 2 years, most enthusiasts will be far more likely to snap up any upgrade. Then, a year after that, release a new platform/socket: Profitable Business 101. It's not like anything new will suddenly appear that these machines can't handle.
 
X99 is a high-end multi-core platform, primarily meant for users who need the threads and the added performance capabilities. Most users have no need for current chips at all if they have anything SB or later. But the power savings are really nice, and Haswell does offer a significant performance increase over SB, so if you do renders or similar workloads, and time and power consumption matter, then SB and SB-E are very much obsolete.

Right, I understand what the X99 platform is for. But my computer is mainly for gaming (and the internets), as I have a feeling most of this site's users' computers are. Because of that, there is still no reason to upgrade from my 2500K. I just find it strange when people foam at the mouth to upgrade their X79s and such, just for gaming, when there are no titles that benefit from an increase from 6 to 8 CPU cores, and in general the Z97 platforms will run games faster, even with 3-way SLI. But I guess there will always be a need to brag about the size of your e-peen.
 
Right, I understand what the X99 platform is for. But my computer is mainly for gaming (and the internets), as I have a feeling most of this site's users' computers are. Because of that, there is still no reason to upgrade from my 2500K. I just find it strange when people foam at the mouth to upgrade their X79s and such, just for gaming, when there are no titles that benefit from an increase from 6 to 8 CPU cores, and in general the Z97 platforms will run games faster, even with 3-way SLI. But I guess there will always be a need to brag about the size of your e-peen.
You are very right. But the only reason is a lack of software advancement that most users would benefit from. That's not to say there is a need for software advancement, per se, but if there were something out there that everyone needed the added performance for, or if the lowered power consumption proved to truly save money, then things might be different.

Large-scale deployments benefit purely from the performance increase and power savings, which together, make the upgrade to faster chips a savings, rather than an expense.

Haswell-based designs also make tablets and such much more attractive performance-wise, from that added power savings, too.
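To make the "savings rather than an expense" point concrete, here is a minimal payback sketch of the kind a large deployment might run; the wattage delta, electricity price, and upgrade cost below are placeholder assumptions, not figures from this thread.

```python
# Hypothetical back-of-the-envelope check: does a lower-power chip pay for
# itself in electricity savings over its service life? Every number below is
# an illustrative guess, not a figure from this thread.

def payback_years(upgrade_cost_usd, watts_saved, hours_per_day=24.0,
                  price_per_kwh=0.12):
    """Years until the electricity saved covers the upgrade cost."""
    kwh_saved_per_year = watts_saved / 1000.0 * hours_per_day * 365
    savings_per_year = kwh_saved_per_year * price_per_kwh
    return upgrade_cost_usd / savings_per_year

# Example: a node drawing 60 W less under constant load, against a $400
# per-node upgrade cost.
print(f"Payback: {payback_years(400, 60):.1f} years")
```

At a desktop gaming duty cycle the payback stretches far beyond the hardware's useful life, which is exactly why the savings only materialize at scale or under constant load.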
 
140w TDP and people bitch about AMD's 125w 8-core FX processors. Must be Intel's fab process ain't working too well if a 14nm process requires 140w for a 6-8 core CPU.
 
140w TDP and people bitch about AMD's 125w 8-core FX processors. Must be Intel's fab process ain't working too well if a 14nm process requires 140w for a 6-8 core CPU.
It depends on what level of performance this chip can deliver with its 140W... I bet it's a bit more than AMD can do with its 125W, don't you think? 8)
 
140w TDP and people bitch about AMD's 125w 8-core FX processors. Must be Intel's fab process ain't working too well if a 14nm process requires 140w for a 6-8 core CPU.
Again, it's a 140W thermal design power rating, not power consumption.

These two CPUs are aimed at different markets. I think the extra heat (TDP) is justified.
 
If you are using only 5% of the functionality of your smartphone, then you are most probably quite stupid :D and you neither deserve to own that phone nor need it. :D
You have no right to say if I deserve it or not, nor to call me stupid. You are correct that I do not need it, but it was free and I like nice things for free.

Just like I don't need my smartphone, I don't need Broadwell-E. Nonetheless, I will continue to drool over it and would be a happy owner of one. I would also put it to 100% use 24/7 for WCG!
 
It depends on what level of performance this chip can deliver with its 140W... I bet it's a bit more than AMD can do with its 125W, don't you think? 8)

That's not the point. A lot of fanbois have been bashing AMD for their 125w CPUs on 32nm even though the 8-core CPUs are also just 125w. For Intel to release a 14nm node CPU that has a 140w TDP is absurd for a 6-8 core model. If their process was even remotely close to being proper they should be in the 95w range on 14nm.
 
That's not the point. A lot of fanbois have been bashing AMD for their 125w CPUs on 32nm even though the 8-core CPUs are also just 125w. For Intel to release a 14nm node CPU that has a 140w TDP is absurd for a 6-8 core model. If their process was even remotely close to being proper they should be in the 95w range on 14nm.
Without knowing performance or clockspeeds, your point is rather... unimportant. 140W TDP means that they can force the same and more current into a smaller process, and actually indicates the opposite of what you're trying to say.

With what you are saying, the clock speed and performance would have to be equal, and that's not likely. There's also this wonderful thing called cache size and speed, which makes AMD look so poor on the TDP/performance ratio.
 
This only proves how worthwhile the investment in X99 is.
It will be the top platform for at least 15 months, and then Broadwell-E will still support the X99 chipset and socket 2011-3.
I would like to see the Rampage V Extreme Full Black right before New Year, or at least some pictures of it.
 
That's not the point. A lot of fanbois have been bashing AMD for their 125w CPUs on 32nm even though the 8-core CPUs are also just 125w. For Intel to release a 14nm node CPU that has a 140w TDP is absurd for a 6-8 core model. If their process was even remotely close to being proper they should be in the 95w range on 14nm.
Very few people, even enthusiasts, bash a company because its products consume too much power. They bash products when they consume too much power for a given performance level, especially when their competitors can provide the same performance using less power.

This paradigm works both ways; it's not an anti-AMD bias. You forget that Intel's single core Pentium 4 Prescott processors even today still have a reputation for being hot and power hungry, yet they were only 95W-130W TDP CPUs. Modern HEDT processors consume more power than the Prescott Pentium 4 and yet very few people complain about power consumption of these HEDT CPUs because the performance is there to match.
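Since the complaint is about power for a given performance level rather than raw wattage, here is a tiny sketch of the perf-per-watt arithmetic being described; the scores and wattages are invented placeholders, not benchmark results for any chip named here.

```python
# Hypothetical perf-per-watt comparison. Scores are arbitrary benchmark units
# and the wattages stand in for measured package power; none of these are
# real results for any of the chips discussed above.

chips = {
    "Chip A (140 W class)": {"score": 1000, "watts": 140},
    "Chip B (125 W class)": {"score": 600, "watts": 125},
}

for name, c in chips.items():
    print(f"{name}: {c['score'] / c['watts']:.2f} points per watt")
```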
 
Well I hope the top-tier chip is actually a native octa-core and not a gimped 12 core Xeon.

Does it really matter? Haswell-E/EP comes in three die configurations:
6-8, 10-12, and 14-18 cores.

The 5960X could be a fully enabled 8-core, or a cut-down 12-core. The lower core-count CPUs are a way of managing yields. For the same reason, a 12-core Xeon could be a fully enabled part, or a cut-down of the 18-core die.
 
Those fucking assholes. Meh.

If they were good and smart, they would launch software capable of fully utilising the 18-core CPUs, flood the market with them, and we would already be happy gaming at Ultra HD resolutions and beyond...

But hey, no, it's a fact of life that we share the planet with bastards.
 
Does it really matter? Haswell-E/EP comes in three die configurations:
6-8, 10-12, and 14-18 cores.

The 5960X could be a fully enabled 8-core, or a cut-down 12-core. The lower core-count CPUs are a way of managing yields. For the same reason, a 12-core Xeon could be a fully enabled part, or a cut-down of the 18-core die.
It actually does matter, since the different dies have different memory controller and ring bus configurations. A 12-core die cut down to 8 cores would have more memory bandwidth but higher cache latency than the native 8-core die, due to having double the number of memory controllers but a more complicated ring bus with a snoop filter.

That said, Intel also knows that different die configurations have different performance characteristics, and has shown no evidence of selling different die configurations under the same model number. They'll always pick the smallest die possible, even if yields are low, because once yields improve you would have to either release a new SKU or gimp chips that would otherwise be fully functional.
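For anyone wondering why cut-down dies "manage yields", here is a minimal sketch using the classic Poisson defect-density yield model; the die areas and defect density are invented illustrative numbers, and real binning decisions involve far more than random defects.

```python
import math

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Fraction of dies with zero random defects (simple Poisson yield model)."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

D0 = 0.002           # defects per mm^2 -- illustrative only
small_die = 350.0    # mm^2, stand-in for the smaller (6-8 core) die
large_die = 660.0    # mm^2, stand-in for the largest (14-18 core) die

print(f"Small die fully good: {poisson_yield(small_die, D0):.1%}")
print(f"Large die fully good: {poisson_yield(large_die, D0):.1%}")
# Large dies that fail the fully-good bar can often still be sold with some
# cores fused off, which is the harvesting described in the posts above.
```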
 
I never look at power consumption because I always run single cards...
OK, 1000W in gaming is not normal, but I'm talking about the power consumption of premium Radeon and GeForce graphics cards over the last few years, and I never complained.
It's bad when AMD delivers half of Intel's performance in everyday situations for twice the power consumption, and offers similar performance only if an application uses all cores and the AMD CPU is overclocked while the Intel one is not. I think an overclocked AMD CPU needs more power than GTX 970 SLI... That's not a normal situation, because single-core performance is almost half of Intel's.
It wastes power, and these days that matters even more, but AMD gets special treatment because of anti-monopoly concerns and nobody wants a situation where Intel has no competition.
It would be interesting if someone compared the power needed for reference GTX 980 SLI in gaming vs. an FX-8350 at 4.8-5.0GHz in Prime95, or an overclocked FX-9590.
 
Those fucking assholes. Meh.

If they were good and smart, they would launch software capable of fully utilising the 18-core CPUs, flood the market with them, and we would already be happy gaming at Ultra HD resolutions and beyond...

But hey, no, it's a fact of life that we share the planet with bastards.

lolwut o_O

Calm the duck down.
 
That's not the point. A lot of fanbois have been bashing AMD for their 125w CPUs on 32nm even though the 8-core CPUs are also just 125w. For Intel to release a 14nm node CPU that has a 140w TDP is absurd for a 6-8 core model. If their process was even remotely close to being proper they should be in the 95w range on 14nm.

Ummm, SB-E 6C/12T CPUs had 2.27 billion transistors and the 8150 had what, 1.2B? Socket 2011 and 2011-3 CPUs consume a lot of power because they tend to have twice as many transistors (or more) as an 8-core AMD CPU.

AM3+ CPUs also don't have the PCI-E root complex on the CPU; that's more power. SB-E has 4 memory channels where AMD has 2; that's more die space and power. AMD processors are also made on an SOI process, which has traditionally produced more leakage than Intel's HKMG process for any given manufacturing node.

With respect to transistors:
[attached chart of transistor counts]
 
I'm tired of fans of both Intel and AMD arguing over stupid and petty things. If you look from SB to Haswell, there's almost no reason to upgrade: almost insignificant increases in performance, which were mirrored by the enthusiast-level boards. The difference between them is in the surrounding components. DDR4 hasn't really changed anything, but finally having SATA III everywhere has made quite a difference.

Likewise, a 140 watt TDP isn't unreasonable. Neither AMD nor Intel can fight the laws of physics, so jamming more components into a design will invariably increase thermal dissipation needs. As nodes get smaller and smaller this is compounded, because more switching occurs over a smaller area. Yes, voltages may decrease with shrinks. By the same token, increasing component count more than compensates for that reduction.
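That physics argument boils down to the CMOS dynamic power relation P ≈ N·α·C·V²·f: a shrink trims per-transistor capacitance and voltage somewhat, but doubling the transistor count and raising clocks can more than cancel it out. A minimal sketch with invented ratios:

```python
# Relative CMOS dynamic power: P ~ N * alpha * C * V^2 * f. All values are
# unitless ratios against an older, smaller baseline chip (baseline = 1.0)
# and are invented purely for illustration.

def relative_power(n_transistors, cap_per_transistor, voltage, freq,
                   activity=1.0):
    return n_transistors * activity * cap_per_transistor * voltage ** 2 * freq

old = relative_power(1.0, 1.00, 1.00, 1.0)   # baseline design
new = relative_power(2.0, 0.70, 0.95, 1.1)   # 2x transistors on a shrink

print(f"New design: ~{new / old:.2f}x the dynamic power of the baseline")
```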


As far as AMD goes, they've announced that they aren't competing with Intel for the HEDT market. While they might have a position to defend, they've publicly acquiesced to Intel. Their best silicon is dedicated to APUs, so we haven't seen anything genuinely good come from AMD in years. Thuban was the last real competitor, and it only existed because it could OC to the moon and back. Releasing an 8-thread processor at a 125 watt TDP and a 12-thread processor at a 140 watt TDP isn't really unreasonable. With Intel throwing everything onto the CPU, that's very reasonable heat generation.


In short, Broadwell-E is treading water. It's a minor incremental improvement, and changing the low-end offering to something with the full PCI-E complex unlocked makes no sense. Intel would have to juggle the top-end offerings to compete with it, and the high-end consumer offering would have huge competition. If the H/P/Z-based PCH offerings don't have DDR4 pricing to differentiate themselves, Intel will just be trading high-end sales there for low-end enthusiast boards. I highly doubt Intel is that stupid.

At any rate, Broadwell-E is shaping up to be a joke. Like IB-E, there's no compelling reason to pay for a minor upgrade. Like SB-E to Haswell-E, the performance increase isn't likely to justify the large initial cost. Broadwell-E isn't making me want to give up on SB-E, despite any shortcomings I've experienced. Minor performance gains across multiple generations mean there's no compelling reason to give Intel any of my money.
 