
Is Intel going to deliver a processor/chipset worth waiting for?

As Intel said themselves, it will be a while, as AMD's momentum is too strong to compete with properly. Maybe 3-5 years from now they might have something that's competitive? And I don't mean just on outright speed, but heat and energy efficiency as well, not a blazing, melting waste of sand that needs 50% more cores to compete.
You've read too many horror stories about Intel.
 
Waiting forever happens to be very budget friendly. As much as I wanted a 7950X, I suspect waiting for the 9950X will be well worth it in terms of refined 2nd-gen AM5 boards and CPUs.
If you can afford to wait because your current system is good enough for your needs, then sure, I agree. Upgrading with every generation is a waste of money anyway.
 
Hi,
Intel has been releasing "best chips" every 6 or so months, trolling AMD's X3D performance.
So yeah, waiting is always better, because right now Intel only adds more power to catch up with AMD, which is not the best approach, but in Intel's case it's all they've got atm lol
 
Hi,
Intel has been releasing "best chips" every 6 or so months, trolling AMD's X3D performance.
So yeah, waiting is always better, because right now Intel only adds more power to catch up with AMD, which is not the best approach, but in Intel's case it's all they've got atm lol
Didn't Intel have power efficiency on their side at 35 W or something like that with the 14900K?
 
The quote below prodded me back into considering the always uncertain decision to await the next great thing.
This is a never-ending story. There will ALWAYS be "a" next great thing just around the corner. If every technology in computers today were on the exact same "next great thing" track, this would be easy to sort out. But it does not work that way, especially when there are competing technologies too.

GPUs, CPUs, RAM, USB, HDMI, DP, PCIe, BT, wifi, storage, Ethernet, [fill in the blank] are all hardware (and protocol) technologies that are advancing at their own pace on their own tracks. Yeah, they might be tracks running in parallel, but separate tracks just the same.

For this reason, I say if you "need" a new computer today, and have the resources, buy today. If you can wait, then wait and in the meantime, define your needs, build up your budget, do your homework, then pull the trigger when ready. But stop waiting for what's just around the corner because the day after you pull that trigger, something new and improved will pop up just around the corner again.

BTW, the same is true for audio reproduction technologies and the latest and greatest in TV technologies too.
 
You've read too many horror stories about Intel.
Likely just able to read Intel's default power consumption.
They are pretty horrible though, hehe

A user playing with PL limits is not what the chips do by default, so it should not be an argument for some sort of claim of efficiency.
But then again, that's all Intel people can argue; they've got bollocks otherwise :laugh:
 
Well, for laptops, Core Ultra is the first Intel platform I'd recommend since Zen 2. As for desktops... they might skip Meteor Lake and the Intel 4 node altogether and go straight for Lunar on 3 nm.
 
Normally you should see no more than 125 W after a second, and the chip does exactly that. But aren't motherboard makers setting their long- and short-duration limits to 4096 W by default just to win the market?
A user playing with PL limits is not what the chips do by default, so it should not be an argument for some sort of claim of efficiency.
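
Worth noting: you can check what limits the board actually programmed instead of guessing. A minimal sketch, assuming a Linux box exposing the standard intel-rapl powercap interface (zone numbering can vary per machine):

```python
# Minimal sketch: read the package power limits (PL1/PL2) the board/BIOS
# actually programmed, via the Linux intel-rapl powercap interface.
# Assumes /sys/class/powercap is present and intel-rapl:0 is the package zone.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")  # package-0 zone on most systems

def read_watts(name: str) -> float:
    return int((RAPL / name).read_text()) / 1_000_000  # microwatts -> watts

pl1 = read_watts("constraint_0_power_limit_uw")  # long-duration limit (PL1)
pl2 = read_watts("constraint_1_power_limit_uw")  # short-duration limit (PL2)
tau = int((RAPL / "constraint_0_time_window_us").read_text()) / 1_000_000

print(f"PL1 = {pl1:.0f} W sustained, PL2 = {pl2:.0f} W for up to {tau:.1f} s")
# A board shipping "unlimited" defaults will report roughly 4096 W here.
```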
 
The question once again is will Intel deliver a processor and chipset worth waiting for?
Depends on your definition of "worth waiting for"....

If the usual paltry, minuscule 2-5% YoY performance increases meet your definition of that, then yeah, go ahead & wait....

But as already mentioned, if you need a new rig NOW (or an updated one), and that will meet your requirements, then buy it/build it NOW :)

Just remember, the "next big thing" is always just around the corner, until it isn't, and the "next, next big thing" comes along, hehehe :D
 
It's safe to disregard 14th Gen entirely if you're looking for a generational leap. The 14th Gen chips are not a new generation or kind of processor; they're completely unworthy of that title, and the whole thing is set up like a scam. It's just Raptor Lake, with absolutely no physical changes despite the Refresh designation. The internal model, processor revision and stepping are completely identical; Intel simply ships certain SKUs with more aggressive clocks and introduced the black-sheep configuration in the i7-14700K. A 3-cluster (12 E-core) configuration had always been possible but wasn't released in 13th Gen, to make sure there was a large gap between the Core i7 and Core i9 lineups in MT.


<snip>

Sorry, this same thing was said about 13th gen.

So yes, there are 3 official generations (12-14) on the LGA 1700 socket, all released in 2 years (Nov 2021 - Oct 2023). All 3 of them are still out there for sale, and yes, there were diminishing gains with 14th gen. Who really thought there wouldn't be? We weren't even supposed to have a 14th gen RL refresh.

There's a pretty big difference between 12th and 14th gen. Like in the chart below: that's a 38.1% difference in AV1 encoding from AL to the RL refresh. Most benchmarks outside of games show a 20% gain.

And that, with a full node disadvantage vs TSMC.

AMD is going to lose that advantage they are getting from TSMC in 2025. We'll see how that works out.

[Attached chart: AV1 encoding comparison, 12th vs 14th Gen]
 
Normally you should see no more than 125 W after a second, and the chip does exactly that. But aren't motherboard makers setting their long- and short-duration limits to 4096 W by default just to win the market?
Hi,
No, they still have the short-duration timers on Auto in the BIOS; it takes a user to alter those timers.
If they didn't, temps would be way worse.

Pretty much why people have been saying limiting PLs is the answer to getting the highest-efficiency chip in the world, lol, but they need to send that message to Intel, because they sure don't do it by default; that too takes user alteration.
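
For anyone who'd rather apply those limits in software than dig through the BIOS: a minimal sketch, assuming root on Linux and the same intel-rapl powercap path as in the earlier snippet (the value is written in microwatts and resets on reboot):

```python
# Minimal sketch: cap PL1 so the chip sustains a chosen wattage instead of the
# board's "auto" default. Assumes root and the standard powercap layout.
from pathlib import Path

LIMIT_W = 125  # the Intel-spec long-duration limit mentioned above
pl1_file = Path("/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw")
pl1_file.write_text(str(LIMIT_W * 1_000_000))  # microwatts
print(f"PL1 capped at {LIMIT_W} W (until reboot or a BIOS reset)")
```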
 
I was hoping. I'm OK waiting for now with my 12700K. I can always make a cheap(ish) jump to 14th gen at some point, but I don't really want to switch to AMD, as my PC is just so granite stable; I don't want to risk losing that.
 
The quote below prodded me back into considering the always uncertain decision to await the next great thing.



Betting he can pass his massive bonus and still-inflating domestic foundry costs onto th.... The world isn't getting any cooler, and I doubt Intel processors and mobos with their chipsets are going to be anything short of stratospherically priced. Not much worth discussing to that end. ;)

The question once again is will Intel deliver a processor and chipset worth waiting for?
Depends on how long you're willing to wait with your current performance. I just upgraded from a 2600K to a 12700K after 12 and a half years. But the PC also isn't my primary gaming platform anymore, AND I got a pretty solid deal on the CPU. Don't expect Intel to suddenly come out of the doors swinging with a massive jump in performance that doesn't also come with a massive jump in cost. That speculation has been around for over a decade, and yet time after time we keep seeing incremental percentages that, year over year, don't even justify upgrading unless you absolutely have to have the newest and greatest thing or have tons of disposable income for your PC hobby.
 
Sorry, this same thing was said about 13th gen.

So yes, there are 3 official generations (12-14) on the LGA 1700 socket, all released in 2 years (Nov 2021 - Oct 2023). All 3 of them are still out there for sale, and yes, there were diminishing gains with 14th gen. Who really thought there wouldn't be? We weren't even supposed to have a 14th gen RL refresh.

There's a pretty big difference between 12th and 14th gen. Like in the chart below: that's a 38.1% difference in AV1 encoding from AL to the RL refresh. Most benchmarks outside of games show a 20% gain.

And that, with a full node disadvantage vs TSMC.

AMD is going to lose that advantage they are getting from TSMC in 2025. We'll see how that works out.

[Attached chart: AV1 encoding comparison, 12th vs 14th Gen]

You clearly misunderstood. 12th Gen is a different type of core (Alder Lake) with a different P-core architecture. However, 13th and 14th gen are bit-for-bit physically identical processors, with zero changes or improvements between them. The sole exception is the configuration with 3 E-core clusters sold as the 14700K, which was possible in 13th Gen but never made commercially available. They have the same capabilities, characteristics, internal model number, revision, etc.
 
Probably not. You can use the existing ones with a lower power target.
Normally you should see no more than 125 W after a second, and the chip does exactly that. But aren't motherboard makers setting their long- and short-duration limits to 4096 W by default just to win the market?

Very narrow band of realistic use cases between full snort and well trained.

To the best of my knowledge, the 4095.9 W default has been present on a large number of mobos going back some time. It is on my current one. The change appears to have been implementing it where the impact is considerable and easily overlooked by those more enthused than enthusiast.

Again, to the best of my knowledge, this bump was usually accomplished through the shipped BIOS, promoting good feelings about a new purchase.

I was hoping. I'm OK waiting for now with my 12700K. I can always make a cheap(ish) jump to 14th gen at some point, but I don't really want to switch to AMD, as my PC is just so granite stable; I don't want to risk losing that.

For just a moment I switched store pages to look at X3D, having mixed and rinsed between the two options (and a few years dabbling in Mac) numerous times. The interest just isn't there right now to deal with the tradeoffs you'd accept impacting granite-stable operation.

Depends on how long you're willing to wait with your current performance. I just upgraded from a 2600K to a 12700K after 12 and a half years. But the PC also isn't my primary gaming platform anymore, AND I got a pretty solid deal on the CPU. Don't expect Intel to suddenly come out of the doors swinging with a massive jump in performance that doesn't also come with a massive jump in cost.

I expect this build to make it from its current 7 years to around that point, for broader use minus gaming.

It seems I'm not the only one here with an eye on L(cache), or on bogus cores that get turned off for gaming anyway. 2nd to 8th to 14th may be the progression I maintain, despite the other noise.

BTW, the same is true for audio reproduction technologies and the latest and greatest in TV technologies too.

B/W TV had incredible picture clarity along with the ability to portray three-dimensional scenes. Few remember, or can be bothered to notice, how slow color TV has been to attain this. Or whether it has.

I'm willing to give partial credit on audio reproduction. Modern electronics paired with old analog equipment goes places digital is beating a very slow path towards. As with the A/V example preceding.
 
This happens every single year. "The next thing is the best thing just you wait and see!"
It does, but it rarely holds true.

Just look at history (Intel mainstream only).

Any reason to wait instead of buying the current model now?

2010: Yes, wait for Sandy Bridge next year
2011: No
2012: No
2013: No
2014: No, unless you really wanted the unique 5775C
2015: N/A
2016: Yes, wait for 8700K next year
2017: No (this one I'm the least certain about: the 9900K)
2018: N/A
2019: No
2020: Yes, wait for Alder Lake next year (Not that other 2021 thingy lol)
2021: No
2022: No

No doesn't mean bad CPUs here; most of them were alright, even if not that exciting or worth waiting a whole year for.

N/A means Intel didn't launch anything new the following year.
 
Depends on how long you're willing to wait with your current performance. I just upgraded from a 2600K to a 12700K after 12 and a half years. But the PC also isn't my primary gaming platform anymore, AND I got a pretty solid deal on the CPU. Don't expect Intel to suddenly come out of the doors swinging with a massive jump in performance that doesn't also come with a massive jump in cost. That speculation has been around for over a decade, and yet time after time we keep seeing incremental percentages that, year over year, don't even justify upgrading unless you absolutely have to have the newest and greatest thing or have tons of disposable income for your PC hobby.
That's a very subjective opinion, and one I don't share. The difference between a 9700K and your 12700K is massive, and that's only 3 gens. Fierce competition has escalated the "arms race". The difference between that 9th and 12th gen i7 is much larger than your 2600K to, say, a 7700K. Zen hit and everything changed for the better. 9th and 11th gen were misfires overall, but LGA1700 righted the ship. Performance improvement has been a bit pedestrian on the latest platform, but there has been improvement. On AMD's side, performance has skyrocketed since the FX days, and even since OG Zen.

Besides, it is not about us DIYers; it's about prebuilts. And OEMs love to have a new shiny to market every year. The average shopper doesn't know much beyond "bigger number better" and "new better than old".

The topic of the thread was answered already, so count me as +1 for that. As to whether upcoming CPUs like Arrow Lake will be an impressive improvement? They'd better be, if Intel wants some of the lost DIY market share back. But prebuilts will ensure their success regardless. Intel has dominated there forever, and I don't see that changing anytime soon.
 
Very narrow band of realistic use cases between full snort and well trained.

To the best of my knowledge, the 4095.9 W default has been present on a large number of mobos going back some time. It is on my current one. The change appears to have been implementing it where the impact is considerable and easily overlooked by those more enthused than enthusiast.

Again, to the best of my knowledge, this bump was usually accomplished through the shipped BIOS, promoting good feelings about a new purchase.



For just a moment I switched store pages to look at X3D, having mixed and rinsed between the two options (and a few years dabbling in Mac) numerous times. The interest just isn't there right now to deal with the tradeoffs you'd accept impacting granite-stable operation.



I expect this build to make it from its current 7 years to around that point, for broader use minus gaming.

It seems I'm not the only one here with an eye on L(cache), or on bogus cores that get turned off for gaming anyway. 2nd to 8th to 14th may be the progression I maintain, despite the other noise.



B/W TV had incredible picture clarity along with the ability to portray three-dimensional scenes. Few remember, or can be bothered to notice, how slow color TV has been to attain this. Or whether it has.

I'm willing to give partial credit on audio reproduction. Modern electronics paired with old analog equipment goes places digital is beating a very slow path towards. As with the A/V example preceding.

It really is granite stable. I've never had a BSOD since I built it (truly, on my kids' life); it just works, with no problems at all.
 
2016: Yes, wait for 8700K next year

This one is most interesting because in 2016 we were still waiting for the 7700K. The 5775C and 6700K launched within only a few months of each other in mid-2015 (May and August respectively), but the 7700K didn't release until January 2017, with the 8700K coming out in October after a very awkward period when everyone knew the 7700K wasn't the biggest thing Intel had, but nobody knew whether Intel planned to release it the same year or make everyone wait while they continued to sell a mildly overclocked 6700K as the "flagship". Ryzen put a lot of hustle in Intel's step to release the 8700K the same year.
 
No.
AMD isn't either.

This is fundamentally a stupid question. I remember absolutely loving the leaks for the 3930K and the X79 chipset: 8 SATA III connections, plenty of power, and the first 6-core chip that felt substantial. Sandy Bridge was a monster, and the 3930K would offer insane performance to go with Windows 7 actually being a 64-bit OS that didn't require jumping through hoops to get working (though I still love you, XP).

What we got was a thermal hog, a chipset that gave up on SATA III to have a bunch of SATA II, and hyperthreading that sometimes simply crapped the bed. It was rapidly outpaced by newer connectivity, and my anecdotal experience was that it was insanely picky about memory. Not Zen 1/2 levels, but enough that it didn't like any more than a very mild overclock... and some units wouldn't even allow XMP to work.

That was the moment I learned two things. Don't buy prosumer, because it isn't any more future-resistant than anything else. Don't assume slides are true until data is provided by the people selling you stuff. In the last year I've gone AM4 and a 5700X, despite the presence of newer chips. I need 60%, the older generation gives me 70%, the newer generation gives me 80%, and the halo product offers 100%. Consider me happy at 40% of the cost and 117% of the performance, rather than 100% of the cost and 167% of the performance. A quick sketch of that math is below.
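
Here's that value math made explicit; a minimal sketch in Python, where "need" is the 60% baseline and the cost/performance figures are the rounded percentages above (performance is measured against the halo product):

```python
# Sketch of the cost/performance reasoning above. Performance is relative to
# the halo product (= 100%); "need" is the poster's 60% baseline.
NEED = 0.60

options = {
    "older gen (the 5700X buy)": {"cost": 0.40, "perf": 0.70},
    "halo product":              {"cost": 1.00, "perf": 1.00},
}

for name, o in options.items():
    vs_need = o["perf"] / NEED  # 0.70 / 0.60 -> ~117% of the required performance
    print(f"{name}: {o['cost']:.0%} of the cost for {vs_need:.0%} of the needed performance")
```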
 
That's a very subjective opinion, and one I don't share. The difference between a 9700K and your 12700K is massive, and that's only 3 gens. Fierce competition has escalated the "arms race". The difference between that 9th and 12th gen i7 is much larger than your 2600K to, say, a 7700K. Zen hit and everything changed for the better. 9th and 11th gen were misfires overall, but LGA1700 righted the ship. Performance improvement has been a bit pedestrian on the latest platform, but there has been improvement. On AMD's side, performance has skyrocketed since the FX days, and even since OG Zen.

Besides, it is not about us DIYers; it's about prebuilts. And OEMs love to have a new shiny to market every year. The average shopper doesn't know much beyond "bigger number better" and "new better than old".

The topic of the thread was answered already, so count me as +1 for that. As to whether upcoming CPUs like Arrow Lake will be an impressive improvement? They'd better be, if Intel wants some of the lost DIY market share back. But prebuilts will ensure their success regardless. Intel has dominated there forever, and I don't see that changing anytime soon.

If someone bought a prebuilt that has been as stable as my rig, I am pretty sure they would be very happy, and it would be a very good advert for the hardware. I have always said reliability/stability is much more important than power. What is the point of having the fastest-performing hardware if it is unreliable and crashes all the time?
 
Likely just able to read Intel's default power consumption.
They are pretty horrible though, hehe

A user playing with PL limits is not what the chips do by default, so it should not be an argument for some sort of claim of efficiency.
But then again, that's all Intel people can argue; they've got bollocks otherwise :laugh:
0.2 kWh/day, according to the wattmeter. The whole system, without the monitor.
Because I don't "play" Cinebench daily, I can firmly state that this system is more efficient than any AMD in the tasks I use it for. Even in multi-core, the 13500 limited to 88 W (the maximum consumption of an AMD with a 65 W TDP) gets the same score as a 7800X3D; without the power limit it gets 21k, a score impossible for any Ryzen 7 to reach.

The nonsense about consumption is based only on tests carried out under extreme conditions.

[Attached screenshots: wattmeter readings]
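
The comparison being made there is points per watt. A minimal sketch of the arithmetic: only the 88 W cap and the 21k uncapped score come from the post above; the other figures are hypothetical placeholders, not measurements:

```python
# Points-per-watt sketch for the efficiency claim above. The 88 W cap and the
# 21k uncapped score are from the post; the other numbers are hypothetical
# placeholders, not measured data.
def points_per_watt(score: float, package_watts: float) -> float:
    return score / package_watts

capped_score   = 17_000  # hypothetical: "same score as a 7800X3D" (no figure given)
uncapped_power = 150     # hypothetical package draw with the limit removed

print(f"capped at 88 W: {points_per_watt(capped_score, 88):.0f} pts/W")
print(f"uncapped:       {points_per_watt(21_000, uncapped_power):.0f} pts/W")
```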
 
The quote below prodded me back into considering the always uncertain decision to await the next great thing.



Betting he can pass his massive bonus and still-inflating domestic foundry costs onto th.... The world isn't getting any cooler, and I doubt Intel processors and mobos with their chipsets are going to be anything short of stratospherically priced. Not much worth discussing to that end. ;)

The question once again is will Intel deliver a processor and chipset worth waiting for?
Wait for a processor and chipset? I just use my PC for 6 years, then decide whether to upgrade or maintain the status quo.
 
0.2 kWh/day, according to the wattmeter. The whole system, without the monitor.
Because I don't "play" Cinebench daily, I can firmly state that this system is more efficient than any AMD in the tasks I use it for. Even in multi-core, the 13500 limited to 88 W (the maximum consumption of an AMD with a 65 W TDP) gets the same score as a 7800X3D; without the power limit it gets 21k, a score impossible for any Ryzen 7 to reach.

The nonsense about consumption is based only on tests carried out under extreme conditions.

[Attached screenshots: wattmeter readings]
For normal use and gaming, Intel machines use pretty low amounts of power. Browsing, my rig uses 38 W.
 