
Some Intel Nova Lake CPUs Rumored to Challenge AMD's 3D V-Cache in Desktop Gaming

This thread escalated quickly with the red team reacting... lol (just the thread title alone will make any AMD fanboi react).
 
Reminds me of the 2015 i7-5775C with its 128MB of L4. That was 4C/128MB; by now it should have grown to about 1.5GB to scale with the number of cores.
 
You think it's good that Intel is going to segment its CPUs with V-Cache because you think segmenting CPUs with V-Cache is terrible? WUT?
I have always disagreed with the way AMD did their earlier 3D CPUs. I do not like the idea of a gaming-only CPU that's mediocre at everything else. A regular CPU can play games no problem lol... put the cache on all of them.
 
How did Intel get so bad? 3D V-Cache is already 3 years old and it's absolutely dominant in gaming. Why is it going to take until Nova Lake (late 2026/early 2027)?
X3D was invented by AMD, not Intel. It's already patented, and Intel won't be able to copy AMD's idea. That's why Intel wants to create its own version of "X3D" with a larger L3 cache.
I'm seeing why Intel is losing market share so fast.
Total misconception. Intel still holds the majority of the market share over AMD; it has only lost ground on recent profits.
As long as they keep E-cores, Intel stays a sucker
Like we said before, E-cores aren't bad at all for work and business processes; it's only gamers who don't care about them.
Intel needs to get out of the chip design business. The CPU market is crowded and the GPU market is the only lucrative space currently.

Intel is sitting on a gold mine with its fabs, but all that capacity is wasted making shitty Intel CPUs that fewer and fewer people want.
Let's be honest here, Intel should've gone fabless a long time ago, when it got stuck on 14nm for about 7 years. It could afford to stand still while it had little to no competition, which then arrived way too late.
The US government and other corporations want Intel to keep manufacturing chips in its own fabs, because TSMC already dominates, with countless companies like Apple, AMD, Nvidia, etc. on board, and even Google recently joined them after Samsung's fabs had issues with chips overheating. Do you think Intel would easily go bankrupt? I don't think so.
Late 2026/2027?! LOL by then AMD will have moved on to 4D vCache
Nah, L4 cache doesn't exist in current designs.
I really hate this type of wishful marketing BS.
BS haters here; I don't want to bother convincing this guy because he does NOT understand the marketing business.
This thread escalated quickly with the red team reacting... lol (just the thread title alone will make any AMD fanboi react).
Every PC community has too many AMD fanboys roasting Intel and Nvidia everywhere, throwing out nonsense and garbage comments to defend AMD out of brand loyalty. Nobody does technology perfectly; education is a must.
 
I personally have no problem running Intel stuff. Looking forward to it actually.
 
As an AMD user who has had the opportunity to use both the Ultra Series and X3D, if this is true I will FOR SURE be going back to Intel, even if the numbers still slightly favor whatever AMD has out by then.

I am a firm believer that, even while slower in games than the X3D, the fact that they got rid of Hyperthreading in the Ultra Series aided a lot in overall stability. I did not experience a single stutter until I moved from my 285K to the 9950X3D. It felt more solid and snappier in Windows, and I did not have to worry about setting CPPC in the BIOS, making sure my cores are parking correctly, or making sure games use the right cores. It's just not something you can set and forget, because once you do, a stutter here or there or a loss in FPS will remind you. I will also never trust AMD to have solid drivers, based on my collection of Radeon cards from the 4850 to the 7900XTX and now these stutterfest X3D CPUs (there's a dictionary online of things you can try to fix this, but they usually miss). I would trade 200FPS off my total 700FPS in Valorant (for example) just to never experience a stutter again.
 
As an AMD user who has had the opportunity to use both the Ultra Series and X3D, if this is true I will FOR SURE be going back to Intel, even if the numbers still slightly favor whatever AMD has out by then.

I am a firm believer that, even while slower in games than the X3D, the fact that they got rid of Hyperthreading in the Ultra Series aided a lot in overall stability. I did not experience a single stutter until I moved from my 285K to the 9950X3D. It felt more solid and snappier in Windows, and I did not have to worry about setting CPPC in the BIOS, making sure my cores are parking correctly, or making sure games use the right cores. It's just not something you can set and forget, because once you do, a stutter here or there or a loss in FPS will remind you. I will also never trust AMD to have solid drivers, based on my collection of Radeon cards from the 4850 to the 7900XTX and now these stutterfest X3D CPUs (there's a dictionary online of things you can try to fix this, but they usually miss). I would trade 200FPS off my total 700FPS in Valorant (for example) just to never experience a stutter again.
Some genuine questions, because I don't have any recent big.LITTLE Intel CPU.
Is core parking not working? Is the stutter from crossing over CCDs? Is Intel's Thread Director just light-years better at preventing games from hitting E-cores?
 
Some genuine questions, because I don't have any recent big.LITTLE Intel CPU.
Is core parking not working? Is the stutter from crossing over CCDs? Is Intel's Thread Director just light-years better at preventing games from hitting E-cores?

Well, it could be related to the X3D cache, so Intel's first iteration of their own "X3D" may actually suffer from stuttering problems too.

Core parking works, but you need to make sure it's actually working by enabling Game Mode, setting the power profile to Balanced, etc. Non-enthusiasts wouldn't know that they've been playing games on the wrong CCD lmao. Regarding stuttering, nobody knows where it's coming from; there's just a bunch of band-aid fixes online, but I am betting it's due to a combination of AMD's amazing drivers, X3D cache, and Hyperthreading.
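If anyone wants to sanity-check this themselves rather than guess, here's a rough Python sketch (needs psutil; the CCD-to-logical-CPU mapping below is just an assumption for a 9950X3D-style part, so check your own topology with HWiNFO or lscpu first) that compares the average load on the assumed V-Cache CCD against the other one while a game is running:

```python
# Rough sanity check (pip install psutil).
# Assumption: logical CPUs 0-15 belong to the V-Cache CCD on this
# hypothetical 9950X3D-style layout. That mapping is NOT guaranteed,
# so verify your own topology before trusting the output.
import psutil

VCACHE_CPUS = range(0, 16)   # hypothetical mapping, adjust for your chip

def ccd_load(samples=5):
    """Average per-CCD utilisation over a few one-second samples."""
    vcache, other = [], []
    for _ in range(samples):
        per_cpu = psutil.cpu_percent(interval=1, percpu=True)
        vcache.append(sum(per_cpu[i] for i in VCACHE_CPUS) / len(VCACHE_CPUS))
        rest = [u for i, u in enumerate(per_cpu) if i not in VCACHE_CPUS]
        other.append(sum(rest) / len(rest))
    return sum(vcache) / samples, sum(other) / samples

if __name__ == "__main__":
    v, o = ccd_load()
    print(f"assumed V-Cache CCD avg load: {v:.0f}%  |  other CCD: {o:.0f}%")
    # If the "other" CCD is doing most of the work mid-game, core parking
    # / Game Mode probably isn't steering threads where you think it is.
```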
 
I have always disagreed with the way AMD did their earlier 3D CPUs. I do not like the idea of a gaming-only CPU that's mediocre at everything else. A regular CPU can play games no problem lol... put the cache on all of them.
But Intel isn't going to put this on everything either, you know this just as well as I do.

It's a waste of money to put it on low-end chips, to begin with.
 
Well, it could be related to the X3D cache, so Intel's first iteration of their own "X3D" may actually suffer from stuttering problems too.

Core parking works, but you need to make sure it's actually working by enabling Game Mode, setting the power profile to Balanced, etc. Non-enthusiasts wouldn't know that they've been playing games on the wrong CCD lmao. Regarding stuttering, nobody knows where it's coming from; there's just a bunch of band-aid fixes online, but I am betting it's due to a combination of AMD's amazing drivers, X3D cache, and Hyperthreading.
I often wondered if stuttering issues had to do with DDR5. I don't recall stuttering being a problem with AM4 X3D, but none of those were dual-CCD on that platform either. DDR5 has on-die ECC, and there is no telling if you're running into issues because there is no reporting: no way to tell how frequently a chip is correcting, or whether it's getting worse over time from degradation, incompatibility, or an additional defect.
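For what it's worth, the correction DDR5 does on-die never gets reported to the host, so there's genuinely nothing to read for that part. The only thing you can check is whatever the memory controller itself reports, which on Linux (if the platform exposes it at all; most consumer boards won't) lives under the EDAC sysfs tree. A minimal sketch of where you'd look:

```python
# Linux-only sketch: dump whatever corrected/uncorrected memory error
# counts the EDAC subsystem exposes. This does NOT see DDR5's internal
# on-die correction (that never leaves the DIMM), and on a typical
# consumer board the edac directory may simply not exist.
from pathlib import Path

EDAC = Path("/sys/devices/system/edac/mc")

if not EDAC.exists():
    print("No EDAC memory controllers exposed on this system.")
else:
    for mc in sorted(EDAC.glob("mc[0-9]*")):
        ce = (mc / "ce_count").read_text().strip()   # corrected errors
        ue = (mc / "ue_count").read_text().strip()   # uncorrected errors
        print(f"{mc.name}: corrected={ce}  uncorrected={ue}")
```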
 
Intel being Intel, I'd expect them to only put 3D cache on the high-end Core 7 and Core 9, and charge a massive premium for it.
I've never had any stuttering issues on my AMD system though; sounds like a stability issue.
 
Good, I think V-Cache is a terrible way to segment CPUs, always have.
I hope Intel gives it to AMD over this one.
The article says that only a few SKUs would get such extra cache, so Intel would be doing the same.
Besides, the entire line-up does not need extra cache. Those SKUs are specifically for the gaming crowd.
 
X3D was invented by AMD, not Intel. It's already patented, and Intel won't be able to copy AMD's idea. That's why Intel wants to create its own version of "X3D" with a larger L3 cache.
The X3D patent covers layering L3 cache above or below the compute die.

Intel just had to increase the L3 cache size, especially with Meteor Lake, where they have the packaging capability to simply create an L3 tile and add it to the mix. Hell, some Arrow Lake SKUs have blank "filler" tiles next to the compute tile to keep the package rectangular, yet they top out at 36MB of L3 cache against AMD's 128MB. Intel have the manufacturing capability and flexibility, as well as the spare physical space, to add L3 cache to their products without touching AMD's patent. They just didn't.
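For anyone curious what their own chip actually reports, the per-level cache sizes are easy to read out; a small sketch using Linux sysfs paths (on Windows, something like `wmic cpu get L3CacheSize` is the rough equivalent):

```python
# Reads the cache hierarchy as cpu0 sees it (Linux sysfs). On an X3D
# part the L3 line should show the big pool shared across one CCD.
from pathlib import Path

cache_dir = Path("/sys/devices/system/cpu/cpu0/cache")
for idx in sorted(cache_dir.glob("index*")):
    level = (idx / "level").read_text().strip()
    ctype = (idx / "type").read_text().strip()
    size = (idx / "size").read_text().strip()
    shared = (idx / "shared_cpu_list").read_text().strip()
    print(f"L{level} {ctype:<12} {size:>8}  shared by CPUs {shared}")
```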
 
AMD also has the advantage of being able to fab the X3D chiplet in bulk, and the same chiplet can be used for either Epyc or desktop Ryzen. Meanwhile, Intel has to make a specific fab allocation for those chips.
The article says that the Xeon Clearwater Forest also has this technology.
 
Can't they just leave these P, E, and LP-E cores alone? :(
 
AMD should counter by releasing X3D as standard on both regular Zen and Zen-Compact chiplets. Then every AMD CPU could technically be a gaming CPU in a pinch (or run other specialized workloads that also enjoy 3D Cache).
This is not going to happen, as it is not needed by all users. A diverse portfolio is the way to go.

I don't think the US government will allow Intel to spin off its fabs; they got way too much money from the CHIPS Act to try and dump that. The whole point is to have on-shore fabs, and TSMC has already stated that their highest technology will be made in Taiwan only.
This is changing now. An N2 fab has been approved for construction in the US, and future nodes will come soon after they are deployed in Taiwan.

Shame they aren't pushing Nova Lake to Q4 2025 and are getting an Arrow Lake refresh instead.
Nova Lake was never planned for 2025, and an Arrow Lake refresh is not certain at the moment.
 
I am a firm believer that, even while slower in games than the X3D,
Personally, not by that much. AMD X3D just wins on maximum FPS; the lows are a very different story based on what I tested (I have the 9950X3D as well).
yet they top out at 36MB of L3 cache against AMD's 128MB.
Intel's L0 and L1 are faster than AMD's; that's why RPL can go toe to toe with the 9950X3D, and ARL, when tweaked, can also get there now.
 
Intel's L0 and L1 are faster than AMD's; that's why RPL can go toe to toe with the 9950X3D, and ARL, when tweaked, can also get there now.
I don't really care how we get there, as long as we get there.

If Intel can compete in gaming loads again, then it's good for everyone. I'm just looking at the dumpster fire that is the Arrow Lake gaming benchmarks, where they're often slower than 13th gen, and in a few popular titles their average FPS is worse than AMD's minimum FPS. Not a good look for Intel and gaming.

As long as they keep E-cores, Intel stays a sucker
I like E-cores, just not for gaming, for the same reason I love laptops with Zen c cores.

Since Alder Lake, Intel have had 2P8E U-series processors that make for very capable productivity laptops, where the workloads either need burst single-threaded performance or maximum multi-threaded performance per Watt. A cluster of 4 E-cores running at ~2.5GHz does way more rendering per Watt, for example, than a single P-core running at 4GHz+. That same die area could instead have been a 4P0E processor, and we all know how quad cores suck at multi-threaded workloads these days ;)
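If it helps, here's the shape of that argument as a toy calculation. Every number in it is made up purely for illustration (the voltages, relative IPC and capacitance are not measured E-core/P-core data); it just shows why dynamic power scaling roughly as C·V²·f, with V climbing alongside f, makes four slow cores beat one fast core on throughput per Watt:

```python
# Back-of-envelope only -- the voltages, IPC ratio and capacitance are
# made-up illustrative numbers, not measured E-core/P-core data.
# Dynamic power scales roughly with C * V^2 * f, and V has to climb
# with f, so throughput-per-watt favours more, slower cores.

def dyn_power(freq_ghz, volts, cap=1.0):
    return cap * volts**2 * freq_ghz          # arbitrary units

# one big core pushed to 4.0 GHz at a (hypothetical) 1.25 V
p_core = {"throughput": 1.0 * 4.0,            # rel_IPC * GHz
          "power": dyn_power(4.0, 1.25)}

# four small cores at 2.5 GHz and 0.85 V, each with ~70% of the IPC
e_cluster = {"throughput": 4 * 0.7 * 2.5,
             "power": 4 * dyn_power(2.5, 0.85, cap=0.6)}

for name, c in (("1x P-core", p_core), ("4x E-core", e_cluster)):
    print(f"{name}: throughput={c['throughput']:.1f}  "
          f"power={c['power']:.1f}  perf/W={c['throughput']/c['power']:.2f}")
```

With these made-up inputs the E-core cluster ends up at roughly 2.5x the throughput per Watt, which is the whole point of spending that die area on a cluster instead of another big core.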
 
How did Intel get so bad? 3D V-Cache is already 3 years old and it's absolutely dominant in gaming. Why is it going to take until Nova Lake (late 2026/early 2027)? Even if Intel missed the boat on getting it into Raptor Lake, they should have been throwing cache at Meteor Lake at a minimum, and it's a motherflippin' disgrace that Arrow Lake didn't have this.

I'm seeing why Intel is losing market share so fast. They're too far behind, and their agility in the face of competition is on par with an ocean-faring supertanker. Let's be very generous and very optimistic and say that this rumour is accurate AND that the gaming variants with extra cache launch on time in the first wave of Nova Lake releases. If both of those things are true, Intel might have an answer to AMD's X3D range a mere four years late.

To put that in perspective, four years ago Intel had just launched their back-ported Rocket Lake onto 2015's 14nm process, and the Radeon 6900XT was the most recent flagship GPU launch.
Maybe they needed deep architectural changes in order for the additional cache to truly boost gaming workloads.
I think their approach with the previous architectures was more general-purpose.
Efficiency and reliability aside, the 14900K was very good in gaming as well.
 
The article says that the Xeon Clearwater Forest also has this technology.
As far as I know, Xeons don't share the same compute die used for the consumer products, where the P-cores and E-cores are on the same monolithic die, so they still need a specific wafer allocation. AMD is using the same CPU die for Epyc and its consumer platform (besides the APUs). The way AMD is using chiplets gives it more flexibility compared to Intel.
 
Corporate customers want them. Gaming is not the end-all be-all; it's not even a productive use of a CPU, it's just masturbation. When it comes to corporate laptops and mini desktops, which is the majority of stuff sold, people want Intel. Even for a lot of servers and workstations, customers want Intel.

Intel + Intel or Nvidia graphics is the default out there and what corporations buy for Windows. If they are going to move off Intel for laptops and mini desktops, it's going to be when Windows on ARM is good enough, as more and more applications are just done in a browser now. It's not dirty tricks here either.

Stop thinking like a gamer. We are in the same situation as in the Athlon 64 days. It dominated in servers and dual-socket workstations when it came to corporations, but for desktops it was all Pentium 4 and for laptops it was all Pentium M. Except this time around, Intel's current products are actually only worse than AMD's when it comes to gaming, not for actual work. And again, gaming doesn't really count as work any more than masturbation does.

When it comes to the corporate world, AMD is in servers and workstations; you won't see it in other areas. Even then, it will not be with an AMD GPU. And if the system is really important, it's not going to be using x86 at all; you're talking stuff from IBM. And at the server level we often aren't talking Windows at all, but Linux or Unix.
How’s this for sounding like a gamer:

High-volume mass production requires factories to remain at peak capacity, or the profitability of the factory goes into the red. Even a small drop in production can lead to a factory closing. This is where Intel finds itself. Instead of 10 million chips per year (made-up number), they are only making 7.5 million chips (made-up number). Even though revenue is still many billions, profits are negative.

You can only last like that for a few quarters. Intel has been losing money every quarter for the better part of a year now. All the lucrative government, corporate, and server contracts will not be enough to get Intel's fabs back to peak capacity in such a crowded chip market.

So you tell me which is better: TSMC, which is close to breaking $30B per quarter, or Intel, which is close to dropping below $10B per quarter? I'll give you a hint (whispers in ear): the higher number is better.
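And since the numbers above are admittedly made up anyway, here is the same argument as a toy model (all figures hypothetical): with big fixed costs, a 25% drop in volume is enough to swing a fab from profit to loss even though revenue is still in the billions.

```python
# Toy model only -- all figures are made up, just like the post's were.
# It shows how a modest drop in fab utilisation flips a profit into a
# loss because the fixed costs don't drop with volume.

FIXED_COSTS = 9_000_000_000     # hypothetical: fab depreciation, R&D, staff ($/yr)
VARIABLE_COST = 150             # hypothetical: wafer/test/package cost per chip ($)
PRICE = 1_200                   # hypothetical: average selling price per chip ($)

def annual_profit(chips_sold):
    revenue = chips_sold * PRICE
    costs = FIXED_COSTS + chips_sold * VARIABLE_COST
    return revenue - costs

for volume in (10_000_000, 7_500_000):
    p = annual_profit(volume)
    print(f"{volume/1e6:.1f}M chips: profit = ${p/1e9:+.2f}B")
```

With these toy inputs, 10M chips a year lands around +$1.5B and 7.5M around -$1.1B, which is the whole "peak capacity or bust" point.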
 
Good, I think V-Cache is a terrible way to segment CPUs, always have.

I hope Intel gives it to AMD over this one.

Me too. I switched to AMD from Intel as I was disappointed they were lacking in gaming. I hope they do get back in the game (no pun intended); I will happily jump ship again, as I favour neither camp and just want the best bang for my squids. I did really like the 12700K, so hopefully Intel can pull their socks up.
 
The article says that only a few SKUs would get such extra cache, so Intel would be doing the same.
Besides, the entire line-up does not need extra cache. Those SKUs are specifically for the gaming crowd.
But Intel isn't going to put this on everything either, you know this just as well as I do.

It's a waste of money to put it on low-end chips, to begin with.

I didn't mean the entire line-up... I guess I should have been a little more specific.

To me, the ones that count are mid-range and higher; the low end means nothing to me.
 