
Intel Core Ultra 7 265K

Maybe the 240F will be interesting at a low price point. And popular. People will buy it just to sit and wait for the next 3xx gen.

It doesn't stand a chance against the 12400F and the low cost of DDR4, and a bad memory controller doesn't help. The novelty of the new socket isn't the deciding factor.

There is actually an ASRock ITX board called NOVA, a clear nod to Nova Lake. The 300-generation E-core monster may have been scrapped; the 400s are next.
 
DDR5 prices will drop, and if the 240F ends up costing the same, it's obviously going to be the better choice, but we have to see the 240F first. I'm not sure Intel will release a $120 CPU for a new socket.
 
Bro, have they tested with Win 24H2 enabled and VBS off? Because Ryzen would destroy Intel even more... Where are the branch-prediction-patched OS results, mate?
 
24H2 is the red bar in the attached chart.
 
This is not good; AMD has little reason to keep raising the bar. We've officially switched places compared to when Intel was so far ahead of AMD that it basically just re-released chips.
 
I was looking at this processor. I'm not sure what the price is in your country, but here the processor alone is $570 plus 13% tax, plus a new motherboard; I'll need one anyway, but they aren't cheap. Anyway, I'm a bit disappointed. I've seen some brutal reviews, and perhaps Intel did jump the gun with the release. But to be honest, what kills it for me is the possibility of the 800 series being dead on arrival, with Intel announcing no successor to Arrow Lake while announcing Nova Lake without committing it to LGA 1851. Too much money to simply toss at a possible one-hit wonder while Intel tries to get its act together.
 
I am disappointed with the gaming performance. I had been waiting for this generation to finally replace my 10700K, and now I realize I could have gotten a mid-tier 13600K long ago and wouldn’t have lost anything compared to the Core Ultra. It’s a shame.
 
Thanks for the 1440p graphs. That is what I am gaming at and it looks like I don't need a 7800x3d for that :)
Now if only Intel would discount these new procs a lot I could be tempted.
 
The leap is so small for both Ryzen and Intel that this gen is skippable.
Depends on your use case. Also, most people don't upgrade every generation.

The 285K is almost 100 times as fast as my Pentium 4 2.0 GHz, sometimes 1000 times, and so what?

Right now the new Core Ultra looks like an unfinished product, with advantages in very limited scenarios and a ton of disadvantages, like needing a new motherboard and the overpriced 285 and 245 models.

At least the 240F won't bend or overheat like the 13th and 14th gen did.
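As an aside, a claim like "100x faster over two decades" can be sanity-checked with compound-growth arithmetic. A minimal sketch, assuming a release gap of roughly 23 years (2001 to 2024), which is my own estimate, not a figure from this thread:

```python
import math

def annual_improvement(speedup: float, years: float) -> float:
    """Average compound yearly gain implied by a total speedup over a span of years."""
    return speedup ** (1.0 / years) - 1.0

# Pentium 4 2.0 GHz (~2001) to Core Ultra 285K (2024): roughly 23 years.
YEARS = 23
for speedup in (100, 1000):
    rate = annual_improvement(speedup, YEARS)
    print(f"{speedup}x over {YEARS} years implies ~{rate:.1%} per year")
```

So 100x works out to a bit over 20% average yearly improvement; 1000x to roughly a third more performance every year, which only ever happened in specific workloads.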
 

It seems that roadmap commitments and market pressures are, as always, prioritized at the expense of product stability, and we end up with releases that perform inconsistently. Eventually, when these problems are fixed, the damage is already done through lasting reviews and early-adopter feedback. I respect the challenges Intel is facing with the hybrid architecture, but it's simply not good enough.
 
All we can do is hope that the OS and game/software devs will eventually turn this inevitable shift to our advantage. I also feel Intel's mainstream heterogeneous nose-dive might have been either poorly timed or a necessary leap to kick-start the inevitable future. "BIG changes in little increments", or just go "BIG.little"; time will tell!

The only motivation for the big leap was to economically challenge the core counts AMD was able to produce, nothing else. AMD brought large core counts to market very economically with chiplets; Intel's EMIB wasn't ready, and it wasn't economically viable to produce a monolithic high-core-count CPU for the desktop market. Even in servers, AMD is beating Intel hard on cores per dollar. That's the reason.

There can be efficiency advantages in using different core types, but they only really matter for mobile platforms like laptops. Servers are busy all the time, so they don't need to worry about core type; they just want more cores. And efficiency doesn't matter in a desktop, not in the way heterogeneous computing helps with, anyway.
 

All the better! I remember being stuck on Intel's quad-core processors for years, with the 7700K being my last. It took Intel a while to embrace competitive 6-8 core desktop offerings, and credit where it's due to AMD for pushing them. In other words, I'm fully on board with companies striving to outdo each other to make things happen; that's the norm. The same applies to higher-core-count prosumer models, with plenty of applications tuned to exploit more parallel processing power.

It's no surprise that Intel had to respond with its own innovations, bringing hybrid architectures to the desktop to stay competitive. I'm an optimist, and I like seeing this level of competition where both companies are constantly pushing boundaries; unfortunately for Intel, those boundaries are hitting back, but here's hoping for better optimisation going forward. Intel's transition to big.LITTLE was ambitious, but it does show promise.
 
How come at 1440p in Hogwarts Legacy every CPU is above 200 FPS? This game is very CPU-demanding, and it should be below 100 FPS for most of these processors.
 
I think Intel is in big doodoo with this CPU, and basically the whole series. That's by far the worst CPU line-up of the past 10 years...
 
There should be a refresh of this one, which might be a bit faster. Perhaps we'll see a jump like Alder Lake to Raptor Lake when that happens.

Sometimes it takes one step backwards to move two steps forward.
 

That's odd; I usually check all the individual gaming benchmarks and didn't pick up on it. I guess I'm too busy looking at the GPU rankings in the charts. 200+ FPS does look out of place for this game. My nephew's running a 5600X at 1080p, and we had to dial down the settings to hit 60+ FPS.

@W1zzard
 
Arrow Lake is not impressive. A needed step, sure, but it does nothing new, nothing that pre-existing chips don't already do. AMD crushed it in efficiency and in most areas of performance when compared against AMD's halo products, while Intel's older chips offer much better performance per dollar, if not as good efficiency. Not trashing Intel; outside of 14th gen's issues, I would rather buy them at their current prices if AMD didn't have such good, efficient products.
 

I don't know what you're hinting at here, so let me address the possible scenarios, because the second one is really bad.
If you mean Zen 5: that's outright wrong, the architecture is wildly different. It's mediocre, but anyway.
If you're thinking about the thing no one seems to talk about, then hell yes:

re-releasing more pointless Zen 3 SKUs, and re-releasing laptop chips under misleading names, which is a very anti-consumer practice.

What is a Ryzen 5 7235HS, or what about a Ryzen 5 7520U? The desktop Ryzen 5700: you'd think that's the same chip as the 5800X in a worse bin, but no, AMD snuck one in; it's a mobile CPU with half the L3, so it performs close to Zen 2 in games. Laptops with the 5500U (Zen 2), 5600U (Zen 3), 5700U (Zen 2), 5800U (Zen 3)...

So to unsuspecting customers, the Ryzen brand is a minefield of re-releases and inconsistent naming and performance, and you can end up with a four-year-old CPU architecture in the latest lineup.
 
My biggest concerns are Zen 4 beating Zen 5 and Raptor Lake beating Arrow Lake in like a third of the tests. This year's new CPUs are an absolute disaster. There is no way I would willingly upgrade to a CPU that is essentially just a sidegrade, sacrificing performance in some areas for others.
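On "beating in like a third of the tests": reviews typically summarize per-test results with a geometric mean, which can mask individual regressions, so it's worth counting both. A minimal sketch with hypothetical relative scores (the numbers below are made up for illustration, not taken from any review):

```python
import math

def geomean(xs):
    """Geometric mean, the usual way per-test relative scores are summarized."""
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

# Hypothetical per-test scores of a new CPU relative to its predecessor
# (1.0 = tie, >1.0 = new chip wins). Purely illustrative numbers.
relative = [1.10, 1.08, 0.96, 1.15, 0.93, 1.05, 0.97, 1.12, 1.02]

regressions = sum(1 for r in relative if r < 1.0)
print(f"geomean uplift: {geomean(relative):.3f}")
print(f"tests where the old chip wins: {regressions}/{len(relative)}")
```

A chip can show a small positive geomean uplift overall while still losing a third of the individual tests, which is exactly the sidegrade pattern described above.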
 

CPU and GPU performance in Hogwarts is highly dependent on location and time of day. I'm curious what different location(s) in the game each benchmarker uses because they'll likely test very differently.
 

This is true. I'm not sure if the game's seen meaningful improvements over time, but when testing after launch, performance tanked heavily in densely populated areas and visually complex scenes.
 

Generally I'll see CPU-dependent performance drops in the castle room outside the library with the mermaid sculpture, and of course in Hogsmeade, but recently I found one that is horrible, probably a game bug. In the little outside courtyard just off the aforementioned room outside the library, if you run an anticlockwise loop around the grass, the FPS will tank as you approach the big tree to keep to its right. Really badly. A strafe run while looking slightly to your left is the worst. I run at 60 FPS with VSync for smoothness, and the FPS will drop to the mid-20s; I've even seen it hit the teens. It recovers back to 60 FPS after a few seconds. You can keep running loops in the courtyard, and the next time you reach the same spot you get the identical problem again: ~20 FPS, which then recovers. This seems to be worst in Winter (later in the game). I'm now in Spring and it seems a little better, but it still crashes into the 30s.

The GPU is barely being used, and the CPU seems unbothered as well, so I'm not sure this is even a CPU issue. A bug or poor optimization, I guess. I don't remember the problem in Summer/Autumn, but in general FPS consistency in that area is pretty bad (a better CPU like the 5800X3D makes a real difference); this is another layer of performance problem on top. AMD and Nvidia GPUs behave similarly, as do Intel and AMD CPUs.
 

A magical game - blossoms in the summer/spring and hibernates in the winter lol

Over a year since release, and most likely still deeply rooted optimisation problems. I don't even recall whether it's a console-oriented port that struggles on PC, but with this kind of performance tax it can't be anything short of a badly configured port. It doesn't affect me personally; I'm more into shooters and strategy games. But I bet it's annoying AF for wizardry enthusiasts.
 
For me, power consumption is actually the main disappointment... A very advanced TSMC node, and they're still so-so on power consumption? That does not paint a good picture at all...
 