
Is Intel going to deliver a processor/chipset worth waiting for?

You clearly misunderstood. 12th Gen is a different type of core - Alder Lake, with a different P-core architecture. However, 13th and 14th Gen are, bit for bit, physically identical processors with zero changes or improvements between them. The sole exception is the configuration with 3 E-core clusters sold as the 14700K - which was possible on 13th Gen but never made commercially available. They have the same capabilities, characteristics, internal model number, revision, etc.
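One way to check a claim like this yourself on Linux is to compare the family/model/stepping fields that `/proc/cpuinfo` reports for a 13th- and a 14th-gen part. A minimal parsing sketch; the sample text below uses illustrative values, not readings from real hardware:

```python
def cpu_identity(cpuinfo_text):
    """Pull family/model/stepping out of /proc/cpuinfo-style text (first core only)."""
    fields = {}
    for line in cpuinfo_text.splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            fields.setdefault(key.strip(), value.strip())
    return (fields.get("cpu family"), fields.get("model"), fields.get("stepping"))

# Illustrative sample; on a real machine use: open("/proc/cpuinfo").read()
sample = """\
vendor_id   : GenuineIntel
cpu family  : 6
model       : 183
stepping    : 1
"""
print(cpu_identity(sample))  # ('6', '183', '1')
```

If two chips report the same triplet, the silicon identifies itself identically, which is the point being made above.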

I didn't misunderstand anything. Raptor Lake and Raptor Lake Refresh are obviously the same cores; the main difference between RPL and ADL was the cache in the first place. Nothing significantly changed from ADL to the new RPL-S.

What you probably don't know is that similar things can be said of Zen 2 vs Zen 3.
 
For normal use and gaming, Intel machines use pretty low amounts of power. Browsing, my rig uses 38 W.
Here's a test of reading this topic: from the first post to the last, all read carefully. It took 5 minutes 30 seconds.
Idle, the processor consumes 1-1.2 W.
While scrolling, it jumps up to 2.4 W.
Moving to the next page brought a peak of 10 W.
If I spend an hour reading (forums, history, news, etc.), the processor consumes less than 2 W.
Another hour on YouTube at 1080p, and the same processor averages 5.5 W.

It should be noted that this consumption also includes the iGPU. The consumption for reading a forum while playing YouTube in parallel is about the same as the idle power draw of an entry-level dedicated video card.
It is absolutely ridiculous to talk about processor power consumption now, when it offers computing power far beyond the needs of a home user. As if everyone needs a Ferrari for the streets of Calcutta.
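For anyone who wants to reproduce readings like these on Linux, package power can be derived from the RAPL energy counter the kernel exposes under `/sys/class/powercap`. The counter is cumulative microjoules, so two samples plus a wraparound check give average watts. A minimal sketch; the sysfs path is the standard intel-rapl package-0 location, but availability depends on the platform, and the wrap range should really be read from `max_energy_range_uj`:

```python
import time

RAPL = "/sys/class/powercap/intel-rapl:0"  # package-0 domain on most Intel systems

def read_uj(path=RAPL + "/energy_uj"):
    """Read the cumulative package energy counter, in microjoules."""
    with open(path) as f:
        return int(f.read())

def watts(e0_uj, e1_uj, dt_s, max_range_uj=2**32):
    """Average package power between two cumulative energy samples."""
    delta = e1_uj - e0_uj
    if delta < 0:              # counter wrapped around its max range
        delta += max_range_uj  # real range comes from max_energy_range_uj
    return delta / dt_s / 1e6  # microjoules per second -> watts

# Live usage (needs RAPL support):
#   e0 = read_uj(); time.sleep(1.0); e1 = read_uj()
#   print(f"package power: {watts(e0, e1, 1.0):.2f} W")
```

Note that this is package power only; it does not capture VRM losses or the rest of the board.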

[Attachment: read forum.jpg]
 
Personally, as someone who actually uses the upgradability of platforms, Intel's history makes them a non-option.
And to answer OP's query directly: no, they won't (not unless you run 'professional' applications).
Because the majority voted "just for gaming", I have some information that may affect you emotionally: a 3070 Ti loses at most 5% when paired with the old i5-10500 instead of the 14700KF. That maximum is reached only in games that run at hundreds of fps. In heavy AAA titles, the losses are up to 2%.
The eyes can't see losses this small, and the brain raises no suspicion.
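The point about small fps losses can be put in numbers: what matters perceptually is the change in frame time, and a few percent off a very high frame rate is only a fraction of a millisecond. A quick sketch using the 5% and 2% figures quoted above:

```python
def frametime_ms(fps):
    """Time per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

def loss_in_ms(fps, loss_pct):
    """Extra frame time caused by losing `loss_pct` percent of `fps`."""
    slower_fps = fps * (1 - loss_pct / 100)
    return frametime_ms(slower_fps) - frametime_ms(fps)

print(f"{loss_in_ms(300, 5):.3f} ms")  # 5% off 300 fps: ~0.175 ms per frame
print(f"{loss_in_ms(60, 2):.3f} ms")   # 2% off 60 fps:  ~0.340 ms per frame
```

Either way, the per-frame difference is well under half a millisecond, which supports the "eyes can't see it" argument.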

All the discussion is based on reviewers' extreme tests, and a lot of noise is made by those who are light years away from an RTX 4090, fighting heroically in games with old video cards or below the x070 class.
Best of the best, says the owner of a 5800X3D.
Best of the best, says the 7800X3D owner who gave up the 5800X3D.
WTF dude! Best of the best for only one year?
 
I didn't misunderstand anything. Raptor Lake and Raptor Lake Refresh are obviously the same cores; the main difference between RPL and ADL was the cache in the first place. Nothing significantly changed from ADL to the new RPL-S.

What you probably don't know is that similar things can be said of Zen 2 vs Zen 3.

Both Raptor Lake and Zen 3 brought very significant changes over their predecessors, though. Raptor Lake has an improved P-core with higher IPC and larger caches, while Zen 3 entirely reworked the internal hierarchy and CCX arrangement. These changes may sound minor, but they are very much impactful.

13>14... didn't bring anything at all
 
Similar things can be said of Zen 2 vs Zen 3.
No. Zen 3 was a major architectural leap where AMD unified the CCD, making all 8 cores share the same L3 cache. On Zen 2, you had 4 cores sharing half the cache and the other 4 sharing the other half. That's why the 3300X was much faster than the 3100. Both were 4-core chips, but the 3300X had one CCX disabled, leaving it with 4 cores with access to all 16 MB of cache, while the 3100 had half of each of the two CCXs disabled, leaving 2 cores with access to 8 MB of L3 and the other 2 cores with access to the other 8 MB.
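This split is visible in the cache topology the kernel reports: on Linux, each core's `/sys/devices/system/cpu/cpu<N>/cache/index3/shared_cpu_list` names the CPUs it shares L3 with. A small grouping sketch; the two sample dicts mimic a 3100-style split versus a 3300X-style unified CCX and are illustrations, not real readings:

```python
def l3_groups(shared_lists):
    """Group CPU ids by the 'shared_cpu_list' string of their L3 cache."""
    groups = {}
    for cpu, shared in sorted(shared_lists.items()):
        groups.setdefault(shared, []).append(cpu)
    return groups

# Split L3 (two 2-core CCXs, 8 MB each), as described for the 3100:
split = {0: "0-1", 1: "0-1", 2: "2-3", 3: "2-3"}
# Unified L3 (one 4-core CCX, all 16 MB), as described for the 3300X:
unified = {0: "0-3", 1: "0-3", 2: "0-3", 3: "0-3"}

print(l3_groups(split))    # {'0-1': [0, 1], '2-3': [2, 3]} -> two L3 domains
print(l3_groups(unified))  # {'0-3': [0, 1, 2, 3]}          -> one L3 domain
```

Two L3 domains means cross-CCX traffic for half the cores, which is exactly why the 3100 trailed the 3300X.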
 
Worth is subjective.

There are some things that could be improved on, but in general I'm pretty happy about the bang I got for my buck with my 12700K. At the time of purchase, the A620/B650 boards available at the price I got my Z690 board at were inferior in terms of I/O. The AMD CPU at the same price point at the time, the Ryzen 7600X, was not competitive in multithreaded performance.
 
It does, but it rarely holds true.

Just look at history (Intel mainstream only).

Any reason to wait instead of buying the current model now?

2010: Yes, wait for Sandy Bridge next year
2011: No
2012: No
2013: No
2014: No, unless you really wanted the unique 5775C
2015: N/A
2016: Yes, wait for 8700K next year
2017: No (this one I'm the least certain about: 9900K)
2018: N/A
2019: No
2020: Yes, wait for Alder Lake next year (Not that other 2021 thingy lol)
2021: No
2022: No

No doesn't mean bad CPUs here; most of them were alright, even if not that exciting or worth waiting a whole year for.

N/A means Intel didn't launch anything new the following year.
I was gonna say something along these lines. 'Just buy it now because you need it' really isn't great advice.

There are lots of meh/shitty/mediocre/interchangeable generations of hardware, and there are paradigm shifts in hardware. Even minor shifts matter that way. When we moved from quad cores to the 6c12t-and-up Intels, for example, mentioned above in the shape of the 8700K. Or when Zen basically killed the HEDT segment on the spot.

Another one would be the first X3D.

But then another aspect in all of this would also be price. Buying a new thing that's great at launch is generally not the most cost-effective choice.
Ideally, you wait until just after a new big thing has lost its initial shine and becomes the new norm, and then you buy at a more reasonable price in line with its performance. That's how you get the longest, most durable benefit out of it, and top-end performance alongside. Generally, that performance will last you long enough to ride that hardware until the next big thing comes along. If you zoom out like that, all the stuff in between is really just minor revisions of that good idea, until that fruit's gone and something new comes along.

Yearly or bi-gen upgrades are, and have most of the time been, pretty cost-ineffective. Also, you can't really say buying a certain segment is better or worse, but it's generally a better idea to buy more powerful hardware rather than 'just enough', as the latter is penny wise, pound foolish in general, and another way to end up unable to buy into the optimal time frame/hardware generation because 'you need something now'.
 
That solves it: DDR6 and a more powerful P-core in meaningful numbers aren't happening before 2027. Arrow Lake is a pure 6-core locked CPU with low frequency, like a regular 14400, or probably the lack of HT compensated by 20% higher clocks, and frankly I don't know who that is for. I mean, it certainly does the job with 14 threads, but it doesn't sound like a full-scale launch kind of thing. Where are the something-7 and -9?
 
I hope so, as I dislike the AM5 IHS almost as much as I dislike E cores.
 
What a dumb question, sure it will. My bet is the next generation of Intel processors will win:

[Image: editorschoice.gif]


No matter what :P
 
That solves it: DDR6 and a more powerful P-core in meaningful numbers aren't happening before 2027. Arrow Lake is a pure 6-core locked CPU with low frequency, like a regular 14400, or probably the lack of HT compensated by 20% higher clocks, and frankly I don't know who that is for. I mean, it certainly does the job with 14 threads, but it doesn't sound like a full-scale launch kind of thing. Where are the something-7 and -9?
Intel releases a new product range or a refreshed product range of its mainstream desktop and mobile processors every year.

Coming later this year/early 2025 are Series 2 Ultra (Arrow Lake) desktop, Series 2 Ultra (Arrow Lake) mobile to replace the current HX processors, and Series 3 Ultra (Lunar Lake) sub-28 W TDP mobile to replace Meteor Lake.

The Ultra tag means that the CPU has AI processing capabilities.

Within this large range of CPUs, some fit the 6 P-core + 8 E-core configuration, but there are many other configurations. So there is no question of Series 2 Ultra being confined to whatever chip a leaker claims to have seen. The other issue is the no-hyperthreading claim. One leaker says they saw a 24-thread Arrow Lake, and this was 8P + 16E without hyperthreading. One of the current mobile HX chips is the Raptor Lake Core i7-14650HX, which is 8P + 8E with hyperthreading, making 24 threads. The chip concerned could have been the Series 2 Ultra mobile replacement for this.
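The thread-count ambiguity is easy to see with simple arithmetic: on these parts only P-cores have SMT, so two different core configurations land on the same 24-thread total. A quick sketch:

```python
def thread_count(p_cores, e_cores, smt=True):
    """Total threads: P-cores contribute 2 each when SMT is on, E-cores always 1."""
    return p_cores * (2 if smt else 1) + e_cores

# i7-14650HX-style part: 8P with HT + 8E
print(thread_count(8, 8, smt=True))    # 24
# Leaked configuration read as Arrow Lake: 8P without HT + 16E
print(thread_count(8, 16, smt=False))  # 24
```

A 24-thread sighting alone therefore can't distinguish the two layouts, which is the point being made here.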

Intel lately have started calling their Series 1 Meteor Lakes Series 100 Ultras. It's possible that Arrow Lake desktop/mobile might be called Series 200 Ultras and Lunar Lake Series 300 Ultras.
 
I was gonna say something along these lines. 'Just buy it now because you need it' really isn't great advice.

There are lots of meh/shitty/mediocre/interchangeable generations of hardware, and there are paradigm shifts in hardware. Even minor shifts matter that way. When we moved from quad cores to the 6c12t-and-up Intels, for example, mentioned above in the shape of the 8700K. Or when Zen basically killed the HEDT segment on the spot.

Another one would be the first X3D.

But then another aspect in all of this would also be price. Buying a new thing that's great at launch is generally not the most cost-effective choice.
Ideally, you wait until just after a new big thing has lost its initial shine and becomes the new norm, and then you buy at a more reasonable price in line with its performance. That's how you get the longest, most durable benefit out of it, and top-end performance alongside. Generally, that performance will last you long enough to ride that hardware until the next big thing comes along. If you zoom out like that, all the stuff in between is really just minor revisions of that good idea, until that fruit's gone and something new comes along.

Yearly or bi-gen upgrades are, and have most of the time been, pretty cost-ineffective. Also, you can't really say buying a certain segment is better or worse, but it's generally a better idea to buy more powerful hardware rather than 'just enough', as the latter is penny wise, pound foolish in general, and another way to end up unable to buy into the optimal time frame/hardware generation because 'you need something now'.
The only logical flaw I find with that argument is that the next big thing doesn't invalidate your purchase. I bought a 7700K back in the day, then the 8700K dropped 3 months later. I surely would have waited if I'd known what was going on, but even then, I kept the 7700 for 5 more years and enjoyed the heck out of it. It was still a fast CPU, after all. No new release makes your PC suddenly worse. ;)

I hope so, as I dislike the AM5 IHS almost as much as I dislike E cores.
I find nothing wrong with the AM5 IHS. Just slap a big cooler on it, and you're fine. Imo, the chiplet design is a bigger contributor to any heat issue anyway.
 
I find nothing wrong with the AM5 IHS. Just slap a big cooler on it, and you're fine.
90°C is fine, heard that before. The 7700K was fine as well, when delidded. If I have to rely on 3rd-party stuff like CPU brackets to get proper contact, delidding to change the TIM, a custom IHS to get better temps while preserving compatibility, or just running bare die again with 3rd-party brackets, then no, nothing is fine about those.
But let's not go off topic; I'm done here.
 
90°C is fine, heard that before. The 7700K was fine as well, when delidded. If I have to rely on 3rd-party stuff like CPU brackets to get proper contact, delidding to change the TIM, a custom IHS to get better temps while preserving compatibility, or just running bare die again with 3rd-party brackets, then no, nothing is fine about those.
But let's not go off topic; I'm done here.
I still think that's mainly due to the offset chiplet design and the extremely small lithography, and only in very small part to the thick IHS. As CPUs get denser and denser, it's only going to get worse.
That's all from me too.
 
The only logical flaw I find with that argument is that the next big thing doesn't invalidate your purchase. I bought a 7700K back in the day, then the 8700K dropped 3 months later. I surely would have waited if I'd known what was going on, but even then, I kept the 7700 for 5 more years and enjoyed the heck out of it. It was still a fast CPU, after all. No new release makes your PC suddenly worse. ;)

And here I am still gaming on that very same 8700K in 2024, and still not CPU necked quite enough to warrant an upgrade.

So there is a definite gap there, undeniably; waiting those 3 months would have been a much better choice. A 7700K really lacks core count.
 
And here I am still gaming on that very same 8700K in 2024, and still not CPU necked quite enough to warrant an upgrade.
Exactly my point! You could have waited a bit more for the 9900K with two more cores, but as long as you're happy, who cares? :)

So there is a definite gap there, undeniably; waiting those 3 months would have been a much better choice. A 7700K really lacks core count.
If I knew the 8700K was coming, I would have waited. But like I said, I was happy with the 7700 for years to come, so "in the end, it doesn't even matter." :rockout:
 
90°C is fine, heard that before. The 7700K was fine as well, when delidded. If I have to rely on 3rd-party stuff like CPU brackets to get proper contact, delidding to change the TIM, a custom IHS to get better temps while preserving compatibility, or just running bare die again with 3rd-party brackets, then no, nothing is fine about those.
But let's not go off topic; I'm done here.

Very much on topic and also very much a realistic concern I share. Cooling 12th - 14th gen and beyond is highly relevant.

Most of your list can be (partially) blamed on behaviors exhibited on sites like this. Some people just couldn't let go of increasingly invasive OC procedures and adapt to software tuning or another direction.
And here I am still gaming on that very same 8700K in 2024, and still not CPU necked quite enough to warrant an upgrade.

I remember looking at the official Z300/Q300 charts depicting the jump in PCIe lanes. Fairly sure I opted out of an 8700K because of RAM and GPU inflationary pricing.

Cache and VRM upgrades are a much higher priority now as I start looking seriously at the best-case processor/chipset combination, to pair with a 3080.


[Highly edited to remove flourish of profanity towards 5xxx 70% claims and Space Lynx. :laugh:]
 
No. Zen 3 was a major architectural leap where AMD unified the CCD, making all 8 cores share the same L3 cache. On Zen 2, you had 4 cores sharing half the cache and the other 4 sharing the other half. That's why the 3300X was much faster than the 3100. Both were 4-core chips, but the 3300X had one CCX disabled, leaving it with 4 cores with access to all 16 MB of cache, while the 3100 had half of each of the two CCXs disabled, leaving 2 cores with access to 8 MB of L3 and the other 2 cores with access to the other 8 MB.

The reality is in your own answer here. The Zen 2 3300X was just like a 4-core Zen 3. The main thing that changed performance in both ADL → RPL and Zen 2 → Zen 3 was the cache. They are the same microarchitecture; mucking with the cache doesn't change the pipelines or instruction width.

I've seen this marketing hype many times, where a 'new generation' CPU came out but was really the same as the old one with more/faster cache, claimed to be a redesign. HP did the same thing with some of their PA-RISC chip releases, and I'd bet IBM did it with POWER. Heck, carmakers do it all the time: new sheet metal, same frame and powertrain.

Now, I'm not saying it doesn't change performance, but it's not a new microarchitecture. Fact is, Zen 2 was starved by its cache, and Alder Lake had latency issues with its cache. These are all just evolutionary moves. For that matter - surprise - Zen 4 is as well.
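The "starved by its cache" effect is easy to demonstrate: a pointer chase over a working set that outgrows each cache level gets progressively slower per access. A rough sketch; CPython interpreter overhead dominates the absolute numbers, and the 16-bytes-per-slot size estimate is an assumption, so treat this purely as a relative illustration:

```python
import random
import time

def single_cycle(n):
    """Sattolo's algorithm: a random permutation that is one full cycle."""
    perm = list(range(n))
    for i in range(n - 1, 0, -1):
        j = random.randrange(i)   # j < i guarantees a single cycle
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def chase_ns(size_bytes, iters=200_000):
    """Approximate ns per dependent access over a given working-set size."""
    n = max(2, size_bytes // 16)  # rough bytes-per-slot estimate for a Python list
    perm = single_cycle(n)
    idx = 0
    start = time.perf_counter()
    for _ in range(iters):
        idx = perm[idx]           # each load depends on the previous one
    return (time.perf_counter() - start) / iters * 1e9

# Example sweep, roughly L1 -> L2 -> L3 -> RAM sized working sets:
#   for kib in (32, 512, 8192, 131072):
#       print(f"{kib:>7} KiB: {chase_ns(kib * 1024):6.1f} ns/access")
```

The single-cycle permutation matters: a plain shuffle could contain short cycles, which would quietly shrink the effective working set.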

But hey, don't believe me.

Believe Intel:

 
Most of your list can be (partially) blamed on behaviors exhibited on sites like this. Some people just couldn't let go of increasingly invasive OC procedures and adapt to software tuning or another direction.
Let me be clear: personally, I don't have a problem with all that. I'm enough of an enthusiast to take everything apart and improve what I can, and I can cope with any risk involved in the process. I'm not afraid to try new things. A couple of weeks ago I used LM for the first time, and on the most expensive piece of hardware I've ever had. And it was easier than applying the damn MX-6, if you don't count the 4 layers of nail polish for the SMDs.
The thing is, I like my PCs cool and quiet; I like having headroom to OC and tweak the hell out of my hardware. Saying "90°C is fine" just because you failed to make it better at the factory makes me wanna puke. It's the same shit Intel was doing with their TIM, and everyone was fine with it because it just works, just like the mostly useless E-cores do.
 
Just wait for the Supers... :D
 
Personally I'm only interested in the i7's.
 