
Is Intel going to deliver a processor/chipset worth waiting for?

The quote below prodded me back into considering the always uncertain decision to await the next great thing.

I think people should wait for this later this year, considering the CEO is "betting the entire company on it." The 2nm node is going to change a lot of performance charts, I suspect; this is a tock-times-two moment later this year, I have a feeling, and 1.8nm shortly after will be a tick, but a good tick.

Betting he can pass his massive bonus and still-inflating domestic foundry costs onto th.... The world isn't getting any cooler, and I doubt Intel processors and mobos with their chipsets are going to be anything short of stratospherically priced. Not much worth discussing to that end. ;)

The question, once again: will Intel deliver a processor and chipset worth waiting for?
 
Of course they will; just perhaps not this year.
 
Obviously they will; writing off a company of Intel's caliber as completely incapable is silly. If AMD managed to pull through some really bad times when they were releasing straight trash in the CPU market, then Intel surely will bounce back.
Hell, it's not like their current offerings are awful. They might be somewhat inferior to AMD's in some regards, but they're still solid. We are a far cry from the absolute slaughter that was Bulldozer vs. Sandy/Ivy.
 

Fair, but I have thought since even before Covid that it was all going to come crashing down, yet it never does. Money makes the world go round. Mobos will arrive, CPUs will arrive, prices might rise a little, maybe $299 instead of $229 for a really decent midrange mobo, but is $70 the end of the world? Nah. We will keep truckin' on.
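For what it's worth, the back-of-the-envelope on that hypothetical board-price bump (the $229 and $299 are just the example figures above, not real SKU prices):

```python
# Hypothetical midrange board prices from the example above.
old_price = 229
new_price = 299

increase = new_price - old_price   # absolute bump in dollars
pct = 100 * increase / old_price   # relative bump

print(f"${increase} more, a {pct:.1f}% increase")  # $70 more, a 30.6% increase
```

A ~30% jump sounds worse than a flat $70, which is probably why nobody panics over it.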

And I think there is a difference between the year-over-year 12600K vs. 13600K vs. 14600K upgrades and what is coming; what is coming has been in the works for a very, very long time. Going to be some nice gains for both AMD and Intel, I think.
 
Of course they will; just perhaps not this year.

That is the meat of it. I'm sorely regretting holding off until 14th Gen with some expectation that it would offer reasonable improvements in some desirable manner.

Obviously they will; writing off a company of Intel's caliber as completely incapable is silly. If AMD managed to pull through some really bad times when they were releasing straight trash in the CPU market, then Intel surely will bounce back.
Hell, it's not like their current offerings are awful. They might be somewhat inferior to AMD's in some regards, but they're still solid. We are a far cry from the absolute slaughter that was Bulldozer vs. Sandy/Ivy.

Though unlikely, businesses are made up of divisions that fall or see new birth. For instance, they could abandon ongoing GPU development. A delayed consumer timeline, in order to meet commercial and fiscal requirements, is certainly the nearer possibility.

Fair, but I have thought since even before Covid that it was all going to come crashing down, yet it never does. Money makes the world go round. Mobos will arrive, CPUs will arrive, prices might rise a little, maybe $299 instead of $229 for a really decent midrange mobo, but is $70 the end of the world? Nah. We will keep truckin' on.

Being upfront with cynicism, as it pairs with reality, was intended to push away at least some of the vitriolic team chatter.

There is overwhelming doubt that any decline in pricing will appear. Fair to say everyone can agree on that.

And I think there is a difference between the year-over-year 12600K vs. 13600K vs. 14600K upgrades and what is coming; what is coming has been in the works for a very, very long time. Going to be some nice gains for both AMD and Intel, I think.

Though I agreed in large part with the sentiment quoted in the OP, I'm a bit confused by this most recent statement. Gains across the PC industry are trending heavily towards being tough to assess. What we will see next is a tough question, even leaning slightly towards optimism.
 
It will come... but I think it might be a while. Intel is also cashing in on the AI trend; until that subsides, we might not see that Sandy Bridge surprise again.
 
This happens every single year. "The next thing is the best thing just you wait and see!"

Then it isn't. Then, instead of taking the reality check at face value, the chorus begins anew: "No, but the NEXT thing will be the best thing!"

We are running into the limits of physics. Intel is obfuscating the truth with their new process naming scheme by pretending that sweeping improvements are happening, but the truth is that we've been essentially stuck at a fixed scale for a few years now. A few tweaks to how things are printed at the lowest end of the scale to achieve higher precision across all layers have happened, and each of those tweaks is being treated as a new node. This is not like 20 years ago when the physical size of features was changing dramatically.
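To put rough numbers on the "fixed scale" point: the pitch figures below are approximations pulled from public process reporting, not official specs, so treat them strictly as illustrative. The trend they show is the real argument: the marketing name shrinks far faster than the silicon does.

```python
# Marketing node name (nm) vs. approximate minimum metal pitch (nm).
# Figures are rough, from public reporting -- illustrative only.
nodes = {
    "90nm": (90, 220),
    "45nm": (45, 160),
    "22nm": (22, 80),
    "7nm":  (7, 40),
    "3nm":  (3, 23),
}

for name, (marketing, pitch) in nodes.items():
    # The ratio of real pitch to the marketing number keeps growing,
    # i.e., the name has decoupled from any physical dimension.
    print(f"{name:>5}: metal pitch ~{pitch}nm, {pitch / marketing:.1f}x the name")
```

Around the 90nm era the name still tracked a real feature size; by "7nm" and "3nm" it is purely a density-class label.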
 

If a 2nm node doesn't have really good improvements over the current 7nm node Intel uses (I think), then the industry is going to die overnight and the stock will become a penny stock, lol.
 
And I think there is a difference between the year-over-year 12600K vs. 13600K vs. 14600K upgrades and what is coming; what is coming has been in the works for a very, very long time. Going to be some nice gains for both AMD and Intel, I think.

It's safe to disregard 14th Gen entirely if you're looking for a generational leap. The 14th Gen chips are not a new generation or kind of processor; they're completely unworthy of that title, and the whole thing is set up like a scam. It's just Raptor Lake, with absolutely no physical changes despite the Refresh designation. The internal model, processor revision, and stepping are completely identical; Intel simply ships certain SKUs with more aggressive clocks, and introduced the black-sheep configuration in the i7-14700K: a 3-cluster (12 E-core) configuration had always been possible, but wasn't released in 13th Gen so that a large gap remained between the Core i7 and Core i9 lineups in MT.
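One way to see the "same silicon" claim for yourself, at least on Linux, is to compare the family/model/stepping the two chips report. A minimal sketch; the two text blocks below are illustrative /proc/cpuinfo excerpts, and the Raptor Lake-S model number 183 (0xB7) is from public CPUID listings, so verify against real hardware:

```python
def parse_cpuid(text):
    """Pull CPU family / model / stepping out of /proc/cpuinfo-style text."""
    info = {}
    for line in text.splitlines():
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip(), value.strip()
        if key in ("cpu family", "model", "stepping"):
            info[key] = int(value)
    return info

# Illustrative excerpts -- check your own /proc/cpuinfo on real hardware.
i5_13600k = """\
model name : 13th Gen Intel(R) Core(TM) i5-13600K
cpu family : 6
model      : 183
stepping   : 1
"""

i5_14600k = """\
model name : Intel(R) Core(TM) i5-14600K
cpu family : 6
model      : 183
stepping   : 1
"""

# Same family, model, and stepping: the OS sees the same Raptor Lake die.
print(parse_cpuid(i5_13600k) == parse_cpuid(i5_14600k))  # True
```

Only the "model name" marketing string differs; everything the silicon actually identifies itself by is unchanged.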

This happens every single year. "The next thing is the best thing just you wait and see!"

Sane evaluation, I agree. I think processors will begin to advance at a much slower pace, and while I feel GPUs haven't quite hit that limitation yet (largely thanks to Nvidia's brilliant engineering), I think that post-Blackwell, they just might. RDNA 3 has utterly failed to pull its weight relative to RDNA 2 across all segments and it shows on a card like the 7900 XTX.

We'll see. Ada did bring some amazing improvements over Ampere if you're strictly talking on an ASIC-by-ASIC basis (e.g., the full GA102 as seen in the 3090 Ti versus a hypothetical full AD102, which NVIDIA still hasn't shipped in any segment to date). But yields have clearly been a challenge, and costs have gotten so high that all gaming-grade SKUs have been pushed upmarket with far lower-class silicon (like the 4060 using the very smallest AD107 chip) or use heavily cut-down silicon, like the RTX 4090 and its lowest-end AD102 configuration. Even in the professional and high-margin AI segments, we haven't received fully enabled chips in most of the stack, and they're completely absent at the high end.
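The "heavily cut-down" point is easy to quantify from SM counts. The figures below are from NVIDIA's published specs as I recall them, so worth double-checking:

```python
# (chip / card, SMs enabled in the flagship gaming card, SMs in the full die)
flagships = [
    ("GA102 / RTX 3090 Ti", 84, 84),   # Ampere flagship shipped the full die
    ("AD102 / RTX 4090",   128, 144),  # Ada flagship ships a cut-down die
]

for name, enabled, full in flagships:
    print(f"{name}: {enabled}/{full} SMs = {100 * enabled / full:.1f}% enabled")
```

The 3090 Ti was a fully enabled GA102, while the 4090 leaves roughly 11% of AD102 dark; that gap is the yield/cost story in one number.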

The question, once again: will Intel deliver a processor and chipset worth waiting for?

That boils down to FOMO. I think at this late stage, spending big bucks on a tricked-out, enthusiast-grade LGA 1700 system is probably not a good idea. Arrow Lake and the LGA 1851 platform will come within a year; this is no secret. However, if you get a decent deal and are building at any level other than flagship, I don't see why you'd wait. Just buy your hardware and be happy.

Same goes for GPUs. If you are buying at the upper midrange, like an RTX 4070 Ti SUPER at most, go ahead and be happy. But I consider it not worth dropping $2K on a 4090, since the 5090 will come before the year is out. New and improved product lines are coming, and it's rare for Intel to release a fluke like the 14th Gen. Those chips exist only to appease shareholders; they aren't new products, and minimal effort has been made to conceal that fact. They're just marketed as new so investors don't pull out over a failure to deliver.
 
That is the meat of it. I'm sorely regretting holding off until 14th Gen with some expectation that it would offer reasonable improvements in some desirable manner.
Why the regret?
 
This happens every single year. "The next thing is the best thing just you wait and see!"
There is a case to be made for some advancement coming at the cost of holding back easily incorporated elements across the most recent gens. The historical displacement of Super-designated Nvidia GPUs by the next release should by rights have a processor capable of pairing with it. A processor moving from a 300W+ OC to sub-150W would be one tactic for establishing parity; likewise a drastic reduction in energy wasted as heat.

LOL, my problem is holding out too long, because I know how many next big things need to be cycled through before you reach the moment a new plateau forms. Had I chosen better within 8th Gen, things might not look as dire for gaming purposes. The 8400 and H370 otherwise remain a great low-heat, low-power pairing that suggests longevity up there with i5-2400 builds.

Why the regret?

There was at least a small chance LGA 1700 would see one more product cycle bringing a sizable movement above 13th Gen. I didn't expect it, but was quietly watching for it to develop.

Overripe fruit and mud weren't in demand before the true specs were verified. No time was wasted once they were.
 
If a 2nm node doesn't have really good improvements over the current 7nm node Intel uses (I think)

Don't drink the marketing kool-aid. 20A isn't 2nm. It's a rebrand of what Intel used to call 5nm on their roadmap, and that 5nm wasn't actually 5nm but is instead a target density using new RibbonFET transistors. The 20A branding is Intel trying to spotlight that their 5nm RibbonFET transistor layout with PowerVia will be equivalent to the competition's "2nm" nodes, which are also not 2nm but various tweaks of 5nm EUV now called 4nm.
 
Magic 8 Ball says keep dreaming.
 

 
As Intel said themselves, it will be a while, as AMD's momentum is too strong to compete with properly. Maybe 3-5 years from now they might have something that's competitive? And I don't mean just on outright speed, but heat and energy efficiency as well, not a blazing, melting waste of sand that needs 50% more cores to compete.
 
Arrow Lake is said to ditch Hyper-Threading and still provide the usual gen-to-gen improvement of 5%. 2nm probably means a lot less power and lower voltage compared to 7nm; worth the wait.
 
I don't think anyone knows the answer to the question, but generally speaking, I never recommend waiting for the next thing. If you need a PC now, you need it now. Otherwise, you could be waiting forever. Not to mention there's teething issues with any new platform. Many people don't have the patience for such things.
 
Well, let's see if Intel can slap a boatload of cache on their next-gen CPUs ;)
 
Well, let's see if Intel can slap a boatload of cache on their next-gen CPUs ;)
In absolute numbers, they already do. Every E-core cluster packs its own cache, inflating the numbers the higher the tier.
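Rough numbers for the Raptor Lake case, assuming the commonly cited figures of 2 MB of L2 per P-core and 4 MB of shared L2 per four-wide E-core cluster (verify against Intel's spec pages):

```python
def raptor_lake_l2_mb(p_cores, e_cores, mb_per_p=2, mb_per_cluster=4):
    """Total L2 in MB: per-P-core slices plus one shared slice per E-core cluster."""
    clusters = e_cores // 4  # Raptor Lake E-cores ship in clusters of four
    return p_cores * mb_per_p + clusters * mb_per_cluster

# Climbing the tier adds clusters, which inflates the headline cache number.
print(raptor_lake_l2_mb(6, 8))    # i5-13600K:  20 MB
print(raptor_lake_l2_mb(8, 16))   # i9-13900K:  32 MB
```

So over a third of the i9's L2 total lives in the E-core clusters, which is exactly the "inflated by tier" effect described above.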
 
I don't think anyone knows the answer to the question, but generally speaking, I never recommend waiting for the next thing. If you need a PC now, you need it now. Otherwise, you could be waiting forever. Not to mention there's teething issues with any new platform. Many people don't have the patience for such things.

This is also how I buy my hardware: when/if I need something, I go ahead and buy it and call it a day.
I mean, sure, I might wait a little if I know exactly what is coming and when, and it's 100% good for me (like in the case of my 12100F in February 2022), but otherwise, nope, I will just buy what's on the market at the time of my need.
 
Probably not. You can use the existing ones with a lower power target.
If it can do 5.5 GHz at the same power, that is still 1 GHz more; pretty good.
I'm still not convinced that Intel has a working 20A, or they'll simply pull a Bartlett Lake 12-core and call it a day.
The best part is if the "F" CPUs arrive without a GPU tile, so we don't overpay for something useless that just sits there.
 
And these CPUs will be without Hyper-Threading, so those P-cores must be good ones...
 
We'll see. Ada did bring some amazing improvements over Ampere if you're strictly talking on an ASIC-by-ASIC basis (e.g., the full GA102 as seen in the 3090 Ti versus a hypothetical full AD102, which NVIDIA still hasn't shipped in any segment to date). But yields have clearly been a challenge, and costs have gotten so high that all gaming-grade SKUs have been pushed upmarket with far lower-class silicon (like the 4060 using the very smallest AD107 chip) or use heavily cut-down silicon, like the RTX 4090 and its lowest-end AD102 configuration. Even in the professional and high-margin AI segments, we haven't received fully enabled chips in most of the stack, and they're completely absent at the high end.
Hey. Hey. Let's be reasonable. The 4090 is no longer the most bottom-of-the-barrel silicon. NV has cut it down even further for the RTX 5000, RTX 5880, and 4090D. So, you know, bright side and all.
 
I don't think anyone knows the answer to the question, but generally speaking, I never recommend waiting for the next thing. If you need a PC now, you need it now. Otherwise, you could be waiting forever. Not to mention there's teething issues with any new platform. Many people don't have the patience for such things.
Waiting forever happens to be very budget-friendly. As much as I wanted a 7950X, I suspect waiting for the 9950X will be well worth it in terms of refined second-gen AM5 boards and CPUs.
 