
Intel Core Ultra Arrow Lake Preview

My biggest takeaway from the whole slide deck: Intel used the word ‘PAR’ almost as much as ‘AI’.

I guess Intel’s marketing department has a few golf fans.

Edit: Also, I'm enjoying reading everyone's interpretation of that crazy slide - from impossible linear performance increases with power, to the same performance at each power level providing bonus winter heat. Lol!
 
This slide looks good:

View attachment 366967

Intel more energy efficient than AMD? Honestly hard to believe.

Speaking about energy savings, I wonder why they did not post the values for each of these games:

View attachment 366968
Especially considering that AMD hits a ceiling at about 150 W and barely scales above it. Furthermore, at lower power levels the curve should be getting close to vertical.
It's quite well known that Intel scales a lot further with power, but the slides seem to imply that the 14900K and 9950X have a similar power/performance curve, which is just wrong.
 
Which? And what are you targeting?

Pretty much all of them can be CPU-limited in parts currently. Although part of it is that I'm targeting 1440p 120+.

But most recently, The First Descendant can be pretty CPU-limited in parts.

There are also older games like The Witcher 3 NG that crawl to 50-60 fps in parts; I'd love to see that improved.

I hit CPU limits in FF16 in parts as well.

1440p UW, btw, just for reference.

So almost everything I'm currently playing.

But honestly, stuff not moving forward is disappointing regardless, and having essentially the same CPU gaming performance for the last few years - and on top of that waiting another 18-24 months for a meaningful uplift - sucks, assuming it even comes.

I'm confused why people are ok with no progress though.
 
But I don't even know what it means. Three equal height bars with different numbers above them compared to an ON PAR baseline. WTF!
AMD's first party benchmark claims for Zen 5 = impossible to replicate
Intel's first party benchmark claims for Arrow Lake = impossible to understand

Can't wait to see the next stage in this emerging competition.
 
Pretty much all of them can be CPU-limited in parts currently. Although part of it is that I'm targeting 1440p 120+.

I'm confused why people are ok with no progress though.
Fair enough, and interesting, though yeah, it was already known the 4090 was pretty much a tie with all the best CPUs of its time in terms of relative perf in gaming; I remember it hitting CPU bottlenecks even in the review results.

As for being confused; I don't think people are ok with no progress, I think people just don't need much of it. Let's face it, even your use case still speaks of framerates way higher than the bottom line required for 'playable'. Witcher 3 NG... which does some of its graphics on the CPU if I'm correct... is a pretty edge case thing. Isn't the First Descendant online?

Also, I think people have been getting one reality check after another wrt 'performance increases'. Most notably with the latest Intel fiasco; every time, we see excessive power figures causing trouble. So if the architectural bounds aren't explored further, what progress is there to make? We're seeing Intel struggle. We're seeing AMD struggle. Apparently there's not much fruit left, and a lot has been picked in the last decade, too. Nothing's gonna keep giving forever.
 
Interesting how the tables have turned. When Ryzen launched, it had better MT performance and worse gaming performance.

Also, no word on platform longevity. Guess Arrow Lake's customers will be throwing their MB away after one or, best case, two generations.

Next.
Hey, if you believe the marketing lies, this current Intel socket has lasted THREE generations!
Just don't look too closely at the differences between 13th and 14th gen in case you become all cynical like I am ;)
 
Which? And what are you targeting?
At 100 fps in Cyberpunk at 4K DLSS (no FG) I'm dropping to 60% GPU usage with full saturation on the P-cores and some on the E-cores - about 70% CPU usage.

Same thing in Remnant 2 - 90 FPS mins, 120-ish FPS avg - and a bunch of other single-player titles. BG3 is another one, but high FPS matters much less in that one.

Basically, at 80 FPS CPU-bound the game feels kind of bad.

Kind of a waste to have a 4K 240Hz monitor and be running at 100 FPS with the GPU sitting at 60%.
 
Fair enough, and interesting, though yeah, it was already known the 4090 was pretty much a tie with all the best CPUs of its time in terms of relative perf in gaming; I remember it hitting CPU bottlenecks even in the review results.

As for being confused; I don't think people are ok with no progress, I think people just don't need much of it. Let's face it, even your use case still speaks of framerates way higher than the bottom line required for 'playable'. Witcher 3 NG... which does some of its graphics on the CPU if I'm correct... is a pretty edge case thing. Isn't the First Descendant online?

Also, I think people have been getting one reality check after another wrt 'performance increases'. Most notably with the latest Intel fiasco; every time, we see excessive power figures causing trouble. So if the architectural bounds aren't explored further, what progress is there to make? We're seeing Intel struggle. We're seeing AMD struggle. Apparently there's not much fruit left, and a lot has been picked in the last decade, too. Nothing's gonna keep giving forever.

I'm still concerned with the lack of progress. We were getting decent gains from Zen+ all the way up to Zen 4 on the gaming side, mostly double digits, and even Alder Lake was a nice bump over Skylake++++++/Rocket Lake, and now we've got Zen5% and ARL negative 5%.....

Even if not everyone can take advantage of it, more is still better, and while I wasn't thinking this would be worlds better, 10-15% would have been nice. It's never good when something new comes out and it can't even beat its competition (7800X3D) and its predecessor (14900K/13900K) at gaming....

I mean, like with everything, we have to wait for reviews, but it isn't looking good for anyone that was hoping for something better than what we've had for a while now.

At 100 fps in Cyberpunk at 4K DLSS (no FG) I'm dropping to 60% GPU usage with full saturation on the P-cores and some on the E-cores - about 70% CPU usage.

Same thing in Remnant 2 - 90 FPS mins, 120-ish FPS avg - and a bunch of other single-player titles. BG3 is another one, but high FPS matters much less in that one.

Basically, at 80 FPS CPU-bound the game feels kind of bad.

Kind of a waste to have a 4K 240Hz monitor and be running at 100 FPS with the GPU sitting at 60%.

So much for a gaming monster #rip gaming performance.
 
At 100 fps in Cyberpunk at 4K DLSS (no FG) I'm dropping to 60% GPU usage with full saturation on the P-cores and some on the E-cores - about 70% CPU usage.

Same thing in Remnant 2 - 90 FPS mins, 120-ish FPS avg - and a bunch of other single-player titles. BG3 is another one, but high FPS matters much less in that one.

Basically, at 80 FPS CPU-bound the game feels kind of bad.

Kind of a waste to have a 4K 240Hz monitor and be running at 100 FPS with the GPU sitting at 60%.
You're well above 60 FPS. High refresh was always a hard target, and this isn't ever going to change. Games aren't optimized around it. This was true in 2000, in 2010, and it's still true in 2024.

240hz monitors... yeah. The industry has placed some nice carrots to chase for us gamers, eh. Nobody selling a monitor is promising you 240 FPS though. Commerce and its funny tricks :) Reflect on your situation now, you've been diving head first into top end hardware and the target to reach is always moving further away from you. And for what? Minute differences.

You cannot escape the mainstream performance targets. When it moves forward, we move forward. Everything else is wishful thinking. It's an economic impact on the gaming performance 'market', kind of. After all, if all games run at 60 fps just fine, less optimization is needed, so that budget is spent elsewhere or pocketed altogether. Consoles don't run that 240 FPS target though, and neither does the overwhelming majority of gaming systems. It makes zero business sense to optimize for it; so what does hardware do for you there? It tries to brute force it. GL with that.
 
240hz monitors... yeah. The industry has placed some nice carrots to chase for us gamers, eh. Nobody selling a monitor is promising you 240 FPS though. Commerce and its funny tricks :) Reflect on your situation now, you've been diving head first into top end hardware and the target to reach is always moving further away from you. And for what? Minute differences.
Yeah, but the dive makes me unreasonably happy, as do those moments when you hit that 200 FPS+ average in a game at 4K and it's awesome.

The point is, the CPU is holding back the GPU much more than during the 3xxx series, where you had the 5800X and 10900K, then the 12 series and 5800X3D. By contrast, 2024 CPUs are kind of meh. GPUs have been gaining 40%+ per generation, while with the exception of X3D on the CPU side we're sitting at +15-20% last gen, and now +5%, or in Intel's case -1%.

When the 5 series comes out, the gap between high-end GPU and high-end CPU is going to be even bigger; no wonder NVIDIA is going to sandbag the 5080.
 
You're well above 60 FPS. High refresh was always a hard target, and this isn't ever going to change. Games aren't optimized around it. This was true in 2000, in 2010, and it's still true in 2024.

240hz monitors... yeah. The industry has placed some nice carrots to chase for us gamers, eh. Nobody selling a monitor is promising you 240 FPS though. Commerce and its funny tricks :) Reflect on your situation now, you've been diving head first into top end hardware and the target to reach is always moving further away from you. And for what? Minute differences.

You cannot escape the mainstream performance targets. When it moves forward, we move forward. Everything else is wishful thinking. It's an economic impact on the gaming performance 'market', kind of. After all, if all games run at 60 fps just fine, less optimization is needed, so that budget is spent elsewhere or pocketed altogether. Consoles don't run that FPS though, and neither does the overwhelming majority of gaming systems.

I care less about the actual performance numbers, regardless of whether that means 60-120-240 etc., and more that each generation brings a meaningful uplift, especially when real generations are now 2-ish years apart. Now we're in a scenario where it could be nearly half a decade before we get a meaningful gaming uplift over Zen 4 X3D and Raptor Lake, which still sucks regardless of who can or can't take advantage of it.

Yeah, but the dive makes me unreasonably happy, as do those moments when you hit that 200 FPS+ average in a game at 4K and it's awesome.

The point is, the CPU is holding back the GPU much more than during the 3xxx series, where you had the 5800X and 10900K, then the 12 series and 5800X3D. By contrast, 2024 CPUs are, well, kind of meh. GPUs have been gaining 40%+ per generation, while with the exception of X3D we're sitting at +20% last gen, and now +3%.

When the 5 series comes out it's probably going to be even worse.

I agree, the lack of progress is the most concerning part; what numbers they actually hit, not so much.
 
At 100 fps in Cyberpunk at 4K DLSS (no FG) I'm dropping to 60% GPU usage with full saturation on the P-cores and some on the E-cores - about 70% CPU usage.

Same thing in Remnant 2 - 90 FPS mins, 120-ish FPS avg - and a bunch of other single-player titles. BG3 is another one, but high FPS matters much less in that one.

Basically, at 80 FPS CPU-bound the game feels kind of bad.

Kind of a waste to have a 4K 240Hz monitor and be running at 100 FPS with the GPU sitting at 60%.
I agree with your overall point, as well as @oxrufiioxo 's. I don't game much these days, but when I do the games tend towards CPU bottlenecks, strategy and so forth.

The Cyberpunk numbers seem off, though. You have a faster CPU than I do, and a much faster GPU. And you're running much faster memory. If I remove GPU bottlenecks, my FPS is usually well in excess of 100. A number of benchmarks appear to agree. Could this be another one of those fabled scheduling issues? I don't use Windows anymore, but Process Lasso seems very helpful to people who do.

I care less about the actual performance numbers, regardless of whether that means 60-120-240 etc., and more that each generation brings a meaningful uplift, especially when real generations are now 2-ish years apart. Now we're in a scenario where it could be nearly half a decade before we get a meaningful gaming uplift over Zen 4 X3D and Raptor Lake, which still sucks regardless of who can or can't take advantage of it.
Yeah, it's disappointing in principle when tech stagnates. It's also disappointing in the long term. I don't actually care about whether Zen 5 or Arrow Lake is a worthwhile upgrade for me, today, but stagnation today usually means less in the way of worthwhile upgrades years down the line, to say nothing of value-for-money.
 
Bad naming, and the gaming performance is the same as the previous gen. Nobody cares about this power consumption; gamers care about FPS, that's all that matters. Hope AMD will focus on that.
Probably this new generation was also affected by that Vmin issue - they patched it and cut performance. Intel, no thank you!
 
I agree with your overall point, as well as @oxrufiioxo 's. I don't game much these days, but when I do the games tend towards CPU bottlenecks, strategy and so forth.

The Cyberpunk numbers seem off, though. You have a faster CPU than I do, and a much faster GPU. And you're running much faster memory. If I remove GPU bottlenecks, my FPS is usually well in excess of 100. A number of benchmarks appear to agree. Could this be another one of those fabled scheduling issues? I don't use Windows anymore, but Process Lasso seems very helpful to people who do.
I'm cherry-picking the zones with a lot of pedestrians (as you run around the game there's 3-4 of them), so this is the worst case; also, I have my crowd density set to high, which is primarily what causes it.

My 1% lows in those zones are in the high 80s with averages in the 100s (NVIDIA performance monitor); when I walk away from that zone it goes to 168 FPS avg and 130 FPS min. I just did a clean install of 24H2, so I don't think it's a scheduling issue. If I ran an average benchmark it would look better than that, but your gaming experience is usually tied to the worst-case scenario.
 
I agree with your overall point, as well as @oxrufiioxo 's. I don't game much these days, but when I do the games tend towards CPU bottlenecks, strategy and so forth.

The Cyberpunk numbers seem off, though. You have a faster CPU than I do, and a much faster GPU. And you're running much faster memory. If I remove GPU bottlenecks, my FPS is usually well in excess of 100. A number of benchmarks appear to agree. Could this be another one of those fabled scheduling issues? I don't use Windows anymore, but Process Lasso seems very helpful to people who do.

I haven't observed exactly what he has, but I've hit limits at 1440p UW in parts.


I think most of us are talking about the worst parts of games, not the most performant areas. It's all about a more consistent experience, moving those 0.1% and 1% lows up, to the point that I don't even care about averages anymore; it's all about the 1% lows to me.

People fixate too much on averages and I guess are OK with random drops in framerate due to CPU performance... I am not, and I try to eliminate them as much as possible. Maybe I'm just more sensitive to frametimes than most.
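
For anyone curious how those numbers are actually derived: capture tools log per-frame render times and then look at the slowest 1% (or 0.1%) of frames. Exact definitions vary between tools - some average that worst slice, some just take the single frame time at the percentile - but a rough Python sketch of the "average the worst 1%" flavour, with made-up frametimes (not from any real capture), looks like this:

# Rough illustration only - not the code CapFrameX, PresentMon or the NVIDIA overlay actually run.
frametimes_ms = [6.9, 7.1, 7.0, 14.2, 7.3, 6.8, 22.5, 7.0, 7.2, 6.9]  # pretend per-frame times in ms

def percentile_low_fps(times_ms, pct):
    """Average FPS of the slowest `pct` percent of frames."""
    worst = sorted(times_ms, reverse=True)        # slowest frames first
    n = max(1, round(len(worst) * pct / 100))     # size of the worst slice
    return 1000.0 / (sum(worst[:n]) / n)          # ms per frame -> FPS

print(f"avg FPS:    {1000.0 * len(frametimes_ms) / sum(frametimes_ms):.1f}")
print(f"1% low FPS: {percentile_low_fps(frametimes_ms, 1):.1f}")

A couple of big frame spikes barely move the average but tank the 1% low, which is exactly why the lows track how a game feels better than the average does.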


And just to be clear, any statements I make are based on my own use case alone; everyone has to look at the numbers for themselves and decide how they feel about them... People love Zen5%, good for them, and I'm sure people like what they see here. I for one don't belong to either of those camps.
 
Thanks for the write-up W1zz.
Can you even get a good canned bench for SM2 and CS2? I mean... it's all about assets on screen and in-game logic. And isn't CS2 still a big pile of doodoo?
Yes CS2 is still doodoo lol. They added DLSS and it doesn't do much at all. It's still fun for an occasional playthrough.

I'd think you could make a standardized benchmark for both of those. For CS2 you could just have a premade world save file that you load each time and benchmark off that. Hopefully a fairly busy world, as the performance really starts chugging over 100k population.

Space Marine 2 might be harder. What about going to the same spot in an operation?

I'm sure W1zz will figure it out, and even if SM2 and CS2 aren't included we'll still have a clear picture of how the processors compare in games.
 
Here's a review test request: Please run the Ultra 5 at 65W so we can get a better feel for how it compares to the appropriate AMD parts as well as how a future non-K part will do.
 
I haven't observed exactly what he has, but I've hit limits at 1440p UW in parts.


I think most of us are talking about the worst parts of games, not the most performant areas. It's all about a more consistent experience, moving those 0.1% and 1% lows up, to the point that I don't even care about averages anymore; it's all about the 1% lows to me.

People fixate too much on averages and I guess are OK with random drops in framerate due to CPU performance... I am not, and I try to eliminate them as much as possible. Maybe I'm just more sensitive to frametimes than most.


And just to be clear, any statements I make are based on my own use case alone; everyone has to look at the numbers for themselves and decide how they feel about them... People love Zen5%, good for them, and I'm sure people like what they see here. I for one don't belong to either of those camps.
Absolutely agree. I locked the game to 80 fps for that very reason when I played it. Great experience, buttery smooth. I think my 0.1% lows were at like 76. But I also have a middling GPU and a 1440p monitor. Sure, real world performance won't match review benchmarks, but phanbuey's reporting an enormous disparity. The reviews I linked were from a guy who doesn't use canned benchmarks; he isn't cherry picking performant areas--he may not find the absolute worst-case scenario, but if he records a 1% low of 188 FPS on a 13700k, I'd expect drops below 100 to be exceedingly rare.

But this is off-topic. It just felt like phanbuey's specific problem might be fixable without a CPU upgrade.
 
Interesting how the tables have turned. When Ryzen launched, it had better MT performance and worse gaming performance.

Also, no word on platform longevity. Guess Arrow Lake's customers will be throwing their MB away after one or, best case, two generations.

Next.

Actually they didn't have worse gaming performance, it was neutral almost across the board while having higher MT performance and better efficiency. The hypocrisy is real when it comes to Intel fans.

It’ll be a one and done platform if I had to guess. Rumor mill seems to point at that too.
 
Welp, I guess Arrow Lake is going to be like Zen 5 with them focusing on efficiency, although to be fair Intel especially needs to improve their multi-threaded efficiency since they have been getting dropkicked in that area by AMD. Still going to wait for reviews anyways, but to be honest I am not expecting anything special. Waiting patiently for the 9000X3D chips now to see if they bring back good gaming improvements, otherwise for this gen both Intel and AMD are mediocre to be honest.

I was hoping they would at least mention something about Bartlett Lake since I am curious to see if they're going to try to backport the Lion Cove cores back into LGA1700.
Actually they didn't have worse gaming performance, it was neutral almost across the board while having higher MT performance and better efficiency. The hypocrisy is real when it comes to Intel fans.

It'll be a one and done platform if I had to guess. Rumor mill seems to point at that too.
People were mostly angry due to the misleading marketing overhyping Zen 5 with performance gains that turned out to be false, yeah, but sure, AMD good, Intel bad. Being a diehard fanboy for any company in general is cringe, quit it please.
 
The worst year for CPUs in 10 years? 9800X3D, save us from this misery.

Intel managed to do Rocket Lake all over again. 4 x P-core laptops and regressions in performance all over again.

Welcome to 11th gen.

Welp, I guess Arrow Lake is going to be like Zen 5 with them focusing on efficiency, although to be fair Intel especially needs to improve their multi-threaded efficiency since they have been getting dropkicked in that area by AMD. Still going to wait for reviews anyways, but to be honest I am not expecting anything special. Waiting patiently for the 9000X3D chips now to see if they bring back good gaming improvements, otherwise for this gen both Intel and AMD are mediocre to be honest.

I was hoping they would at least mention something about Bartlett Lake since I am curious to see if they're going to try to backport the Lion Cove cores back into LGA1700.

People were mostly angry due to the misleading marketing overhyping Zen 5 with performance gains that turned out to be false, yeah, but sure, AMD good, Intel bad. Being a diehard fanboy for any company in general is cringe, quit it please.

AMD had +2 percent game performance and Intel has -5. Really. Bad year. Nobody is being a fanboy except you here.
 
I'm confused why people are ok with no progress though.

If they're content with no progress, they've completely lost their sense of direction.

Even if we're GPU limited nowadays, that's no excuse to slack off. If anything, it's the perfect time to future-proof for all the wild and CPU-hungry mechanics coming our way. Tech should keep moving forward with faster and more efficient CPUs, giving devs the tools to create better, richer and more complex games. Who knows, maybe we'll finally get those insane game mechanics or next-level physics that just weren't possible before... you know, more ways to break stuff and make it look real, etc.

The 9800X3D sounds GRRREAT but can't be the answer because it's way too expensive and most gamers won’t go for it. We need 'progress' across the board with better gaming CPUs at all price points. That'll drive improvements with developers having the incentive to push those untouched heavier boundaries.
 
I cannot wait for this. Give me that ASRock OCF, 265K, and TeamGroup Xtreem 48 GB DDR5-8200.
 
Looks like the socket still has a similar questionable design choice as seen previously.
Really disappointing that Intel didn't even completely fix the socket bending issue despite changing the socket.
Actually they didn't have worse gaming performance, it was neutral almost across the board while having higher MT performance and better efficiency. The hypocrisy is real when it comes to Intel fans.

It’ll be a one and done platform if I had to guess. Rumor mill seems to point at that too.
If there's only going to be one gen for Z890 then that's pretty bad, especially given how expensive boards are. I know Intel fans love to dismiss socket longevity, but not even getting a CPU refresh is really bad.
 
Welp, I guess Arrow Lake is going to be like Zen 5 with them focusing on efficiency, although to be fair Intel especially needs to improve their multi-threaded efficiency since they have been getting dropkicked in that area by AMD. Still going to wait for reviews anyways, but to be honest I am not expecting anything special. Waiting patiently for the 9000X3D chips now to see if they bring back good gaming improvements, otherwise for this gen both Intel and AMD are mediocre to be honest.

I was hoping they would at least mention something about Bartlett Lake since I am curious to see if they're going to try to backport the Lion Cove cores back into LGA1700.

People were mostly angry due to the misleading marketing overhyping Zen 5 with performance gains that turned out to be false, yeah, but sure, AMD good, Intel bad. Being a diehard fanboy for any company in general is cringe, quit it please.

Me quit it? Sorry for pointing out the massive double standards people generally have on TPU when it comes to Intel. I don't care to promote or fall on the sword for any company. The only "cringe" thing is the obvious stark difference in the general trend of comments between the two releases.
 