
Intel's Core Ultra 7 265K and 265KF CPUs Dip Below $250

It isn't, but are we (they) pretending that it is not the case?
Because it suits their "argument". Publicly refute it with facts so noobs don't get misled, then move on and don't feed the trolls or go off topic. Otherwise the mods will get pissed and people new to the thread will get bogged down with pages of irrelevant details.
 
[from a gamer's perspective]

What's the lifespan on LGA1851 and Z890? I get that the CPU is inexpensive, but after the refresh launch this year, it's done.
No one is expecting the refresh to radically change the performance landscape. Intel 265KF owners should not spend $400 to upgrade to a 365KF or whatever.
So what you buy now is what you'll keep for years to come.

Example from Micro Center:
Besides Z890 with PCIe5 and saving $180... what advantage is there for a gamer to pick Intel?
The DDR5-6000 CL36 is going to further constrain the 265KF.

[attachment: Micro Center deal screenshot]
From a gamer's perspective, a LONG time. You can still play modern games on Coffee Lake 6-core CPUs from 2017 without issue. If you were to buy a 265K, it would be perfectly fine for games at minimum through 2030, and I'd say likely through 2035. And that's assuming you're using flagship or high-tier hardware. If you're using 4070-class GPUs, it'll go even longer. I can still get acceptable performance from less demanding modern titles on my Ivy Bridge i5.

This is the reason people say that platform lifespan doesn't matter. It's cool to upgrade systems, from an enthusiast perspective, but it's not actually necessary.

With a card you'll never use to play at that resolution. Like, seriously, if you think 120 FPS 1% lows are embarrassing, how do you actually play games? I mean, my 4090 gets nowhere near 120 FPS 1% lows in the majority of games I'm playing. Should I throw it out the window? What the hell, man.
Context, dude. Sure, that 20% difference now doesn't matter much, but how about 5 years from now? 7? 10?

That difference could mean an extra year or two of acceptable use out of a platform. Look how poorly Phenom II aged against Nehalem or Sandy Bridge.
 
Context, dude. Sure, that 20% difference now doesn't matter much, but how about 5 years from now? 7? 10?

That difference could mean an extra year or two of acceptable use out of a platform. Look how poorly Phenom II aged against Nehalem or Sandy Bridge.
The future-proof argument makes sense if we are talking about similarly priced CPUs. The 9800X3D is currently $220/€220 more expensive than the 265K, and in 5-7-10 years a $150 CPU will be faster than the 9800X3D. You are not getting your money's worth if you are not using that performance TODAY. Which you CAN do if you are playing some specific games like a couple of racing sims or MSFS and such, but other than that it's a waste of money IMO.
 
The future-proof argument makes sense if we are talking about similarly priced CPUs. The 9800X3D is currently $220/€220 more expensive than the 265K, and in 5-7-10 years a $150 CPU will be faster than the 9800X3D. You are not getting your money's worth if you are not using that performance TODAY. Which you CAN do if you are playing some specific games like a couple of racing sims or MSFS and such, but other than that it's a waste of money IMO.
That depends. IMO the money isn't wasted if you get a longer lifespan out of the hardware. $220 for 2 more years of acceptable performance, better 1% lows in cache-sensitive games, etc. may be worth it to many. If you play those racing sims, or MSFS, or strategy games (SupCom, Sins of a Solar Empire, etc.) frequently, that benefit may very well be worth it.

In 5-7-10 years a $150 CPU could very well be faster, but if my CPU is still fast enough, I don't have to spend that $150 for that performance.
 
The future-proof argument makes sense if we are talking about similarly priced CPUs. The 9800X3D is currently $220/€220 more expensive than the 265K, and in 5-7-10 years a $150 CPU will be faster than the 9800X3D. You are not getting your money's worth if you are not using that performance TODAY. Which you CAN do if you are playing some specific games like a couple of racing sims or MSFS and such, but other than that it's a waste of money IMO.
It's also important to reiterate that GPUs seemingly stopped scaling as much as they used to, especially in the categories that the majority of people actually buy. That's why even W1zz explicitly says that looking at 720p results with a 5090 is just academic. Sure, if you have unlimited money and run a 5090, the 9800X3D is worth it, but those people ain't keeping hardware for 10 years anyway. And those that do, well, they don't buy 5090s. For them, it's not inconceivable that they will only get 5090-level performance in 8 years or so, when the mainstream cards get there. I mean, Turing was 2018, and only NOW do we get the 5060 Ti and 9060 XT, which are faster than the 2080 Ti at… let's call them "affordable-ish" price points. So you get a 9800X3D if you are also getting a 5080/4090/5090 and will upgrade once something new comes out anyway, or you are essentially blowing money on something that won't be relevant for you for years, for dubious longevity. It's a "meh" proposition. I would absolutely not advise anyone to pay significantly more for a 9800X3D versus a 265K if they will run mainstream GPUs, which most do. It's insanity.
 
That's awesome! The 7700X bundle is much cheaper, though, and the 9600X/7600X bundles are insanely cheap as well; I might still go AM5 if I were building another PC.
 
It's also important to reiterate that GPUs seemingly stopped scaling as much as they used to, especially in the categories that the majority of people actually buy. That's why even W1zz explicitly says that looking at 720p results with a 5090 is just academic. Sure, if you have unlimited money and run a 5090, the 9800X3D is worth it, but those people ain't keeping hardware for 10 years anyway. And those that do, well, they don't buy 5090s. For them, it's not inconceivable that they will only get 5090-level performance in 8 years or so, when the mainstream cards get there. I mean, Turing was 2018, and only NOW do we get the 5060 Ti and 9060 XT, which are faster than the 2080 Ti at… let's call them "affordable-ish" price points. So you get a 9800X3D if you are also getting a 5080/4090/5090 and will upgrade once something new comes out anyway, or you are essentially blowing money on something that won't be relevant for you for years, for dubious longevity. It's a "meh" proposition. I would absolutely not advise anyone to pay significantly more for a 9800X3D versus a 265K if they will run mainstream GPUs, which most do. It's insanity.
Yep, unless you play very specific games, or run very high refresh rates, 240 Hz or above.

Even the 265K on "bad" ARL gets 120 FPS minimum 1% lows, when paired with slow (for ARL) RAM and with 200S Boost off.
 
Example from Micro Center:
Besides Z890 with PCIe5 and saving $180... what advantage is there for a gamer to pick Intel?
The DDR5-6000 CL36 is going to further constrain the 265KF.
Well, for gamers, Intel competes on price now. And they are quite competitive, indeed. Other than that, much depends on the GPU. If it's high-end and 4K, the CPU will not matter much at Ultra settings. Things also depend on the selection of games. My favourite is MSFS, so the 9800X3D is my preferred CPU for gaming.

It's pointless to pit two photographs of deals against each other without knowing what a buyer would actually do with them, as those two CPUs bring different flavours in different workloads and use cases. I don't mind paying more for an X3D system due to my preferred game, the CPU's snappiness in Office applications, and other reasons such as platform longevity. But my choice may not be as suitable for somebody else who cares about other things and uses their PC differently.
 
From a gamer's perspective, a LONG time. You can still play modern games on Coffee Lake 6-core CPUs from 2017 without issue. If you were to buy a 265K, it would be perfectly fine for games at minimum through 2030, and I'd say likely through 2035. And that's assuming you're using flagship or high-tier hardware. If you're using 4070-class GPUs, it'll go even longer. I can still get acceptable performance from less demanding modern titles on my Ivy Bridge i5.

This is the reason people say that platform lifespan doesn't matter. It's cool to upgrade systems, from an enthusiast perspective, but it's not actually necessary.


Context, dude. Sure, that 20% difference now doesn't matter much, but how about 5 years from now? 7? 10?

That difference could mean an extra year or two of acceptable use out of a platform. Look how poorly Phenom II aged against Nehalem or Sandy Bridge.

Do you even understand time?

Let me frame this. LGA 2011 came out in 2011. When it came out, the premium-tier hardware had no M.2. It had a couple of SATA III ports. It had mostly SATA II ports. It had 8 RAM slots... of DDR3 memory. It was the absolute tippy-top of the consumer market, because beyond that you were in performance computing, the equivalent to modern HPC. That entire platform had a $600 CPU, a $300 motherboard (in the age of $100 motherboards), had to have either a huge air cooler or water cooling... etc. It was about $1500 before you added any GPU or anything else, noting that populating those 8 memory slots meant basically zero chance at overclocking.

In 2021 what could I buy? The Ryzen 5000 series. A 5800X would have set me back $449... assuming I didn't choose the budget champion, the 5600X... the former being 2 physical cores more than the 2011 offering, and the latter being the same count, but both were less finicky at running threaded applications. You tack on a board, twice the memory of the 2011 platform in 4 sticks instead of 8, and note that you suddenly have 2 M.2 SSDs, and the experience is night and day.



My point here, for this thread, is that pretending you want to have a CPU and platform for more than about 5 years is silly. That's 2-3 generations of hardware... and it's even more of an issue when you consider that even if AM4's longevity was assumed, it went from rocky meh in the 1000 series to solid in the 5000 series in that time. Intel is offering these chips at a huge discount... but the back-of-hand math lets me know that they are only cutting margin and not cutting painfully. If the MSRP is $400 and the margin is 50%, they make $200 per chip sold. A $100 cut is 25% of the MSRP, so the margin drops from 50% to 25%, meaning their chips are only making them $100 now. That's... a lot less action than I thought I'd see from a company as truly slow to action as Intel, given their brand image has been tarnished by a company they didn't even view as a competitor a decade ago when their current plans were birthed.
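
Putting that back-of-hand math into a quick Python sketch (the $400 MSRP, 50% margin, and $100 cut are this post's assumptions, not actual Intel figures):

```python
# Back-of-hand margin math, assuming a fixed per-chip cost.
msrp = 400.0                        # assumed original price, $
margin = 0.50                       # assumed original gross margin
cut = 100.0                         # assumed price cut, $

cost = msrp * (1 - margin)          # implied cost per chip: $200
profit_before = msrp - cost         # $200 per chip
profit_after = (msrp - cut) - cost  # $100 per chip, i.e. 25% of MSRP

print(f"profit before: ${profit_before:.0f}, after: ${profit_after:.0f}")
```

Cutting margin, not cutting painfully, in other words.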
 
A refresh is not a new architecture. Zen 6 is going to have more IPC and more cores, not just a 100 MHz speed bump.
Zen 6%?

I have the same frustration with customers buying Nvidia over AMD.
I really, really have no understanding of why you would be frustrated about what other people buy. I don’t think we even live in the same country, and yet you take it personally - to the point it frustrates you - that I have an RTX 4090 and a 13900K.

That’s just strange.
 
Yeah, dunno why they did that. I mean, the 9070 XT is a good GPU, but if it's the same price as a 5070 Ti, then the 5070 Ti is the better buy. The 9070 XT is roughly $1080 CAD here; the 5070 Ti is like $1300 CAD. It would probably be cheaper to go on vacation to China and get one.

The sad part is, it's cheaper for me to drive to the States, get certain parts, and drive back home. But I think they would tax me at the border now. Not sure.
The 9070 XT is cheaper at brick-and-mortar stores. Memory Express has an XFX SWIFT in stock for $890.
 
Sure, in something like an SFF office PC, but even the crappiest ATX prebuilt will come with at least a 450 W power supply, which is more than enough for an X3D and a low-end GPU.

[attachment: prebuilt PC listing]


A lot of people will have ended up with something like this due to COVID and crypto.
What's the point of pairing an X3D with a low-end GPU? If you just want to waste money, I suggest booze.

The embarrassing thing is an 8-core outperforming a CPU with nearly double the cores, and yeah, I'm sure some people would be playing at 1080p, as it's the most-used monitor resolution.

The comparison is flagship CPUs, but I get it, AMD is chopped liver and no one ever uses the inflation excuse: a $329 CPU in 2017 would be $435 today (sketch below).
But all that needs to be said is AMD CPUs are getting record sales and Ryzen CPUs are the top sellers on retail sites; Intel CPU prices wouldn't be dropping if they weren't desperate for sales.
What CPU are you running?
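
A quick sketch of that inflation adjustment (the implied ~32% factor is in line with cumulative US CPI since 2017, but the exact figure is my own back-of-envelope, not something from this thread):

```python
# Implied cumulative inflation from the two price points above.
price_2017 = 329.0   # launch price cited for 2017
price_today = 435.0  # inflation-adjusted figure cited for today

print(f"implied cumulative inflation: {price_today / price_2017 - 1:.1%}")
# -> 32.2%
```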

Intel CPU prices wouldn't be dropping if they weren't desperate for sales
I seem to recall AMD dropping Zen 5 prices three weeks after launch.

I guess they were desperate for sales.
 
With a card you'll never use to play at that resolution. Like, seriously, if you think 120 FPS 1% lows are embarrassing, how do you actually play games? I mean, my 4090 gets nowhere near 120 FPS 1% lows in the majority of games I'm playing. Should I throw it out the window? What the hell, man.
Oh, Jesus Christ! This isn't a hard thing to understand.

The 9800X3D has nearly 20% higher 1% lows than the Intel Core Ultra 9 285K, and to compound this even further, the Core Ultra 9 is nearly 21% more expensive than the 9800X3D. So the cost per frame, with regard to 1% lows, is at least 30-40% higher than the 9800X3D's. That is a huge gap in price-per-frame performance when talking about 1% lows.

For those that have 144 Hz monitors and up, the 1% lows generally have to be at 144 FPS or higher. So the 1% lows of the 9800X3D are very attractive relative to the Core Ultra 9 285K, and on top of that, buyers are saving basically $100, or 21% of the purchase price, by going with the 9800X3D.
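
To make that price-per-frame claim concrete, here's a minimal sketch using only the two ratios above; the absolute price and FPS numbers are illustrative placeholders, not measured values:

```python
# Cost per 1%-low frame, from the ~21% price gap and ~20% 1%-low gap above.
price_x3d = 480.0              # placeholder price for the 9800X3D, $
price_285k = price_x3d * 1.21  # 285K is ~21% more expensive (per the post)

lows_285k = 100.0              # placeholder 1% lows for the 285K, FPS
lows_x3d = lows_285k * 1.20    # 9800X3D's 1% lows are ~20% higher (per the post)

cpf_x3d = price_x3d / lows_x3d     # $ per 1%-low frame, 9800X3D
cpf_285k = price_285k / lows_285k  # $ per 1%-low frame, 285K

print(f"285K costs {cpf_285k / cpf_x3d - 1:.0%} more per 1%-low frame")
# -> 45%, in the same ballpark as the 30-40%+ figure above
```

The two percentages multiply (1.21 × 1.20 ≈ 1.45), which is why the combined gap lands above either number alone.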
 
Oh, Jesus Christ! This isn't a hard thing to understand.

The 9800X3D has nearly 20% higher 1% lows than the Intel Core Ultra 9 285K, and to compound this even further, the Core Ultra 9 is nearly 21% more expensive than the 9800X3D. So the cost per frame, with regard to 1% lows, is at least 30-40% higher than the 9800X3D's. That is a huge gap in price-per-frame performance when talking about 1% lows.

For those that have 144 Hz monitors and up, the 1% lows generally have to be at 144 FPS or higher. So the 1% lows of the 9800X3D are very attractive relative to the Core Ultra 9 285K, and on top of that, buyers are saving basically $100, or 21% of the purchase price, by going with the 9800X3D.
At some point - I hope in the not-so-distant future - people might figure out that if someone is interested in a CPU like the 285K, they are not interested in an 8-core chip like the 9800X3D. The actual competition is the 9950X and the 9950X3D.
 
$200, $300, whatever, they can't change the fact that the 265K/285K are like 2 generations behind X3D chips in gaming performance, and most people doing home PC builds care about that. If they came closer, it would be a good buy, but I still see Intel trailing in some other benchmarks, like Photoshop. It's nice that Intel made a big leap forward in efficiency, but it's still behind.
 
$200, $300, whatever, they can't change the fact that the 265K/285K are like 2 generations behind X3D chips in gaming performance, and most people doing home PC builds care about that. If they came closer, it would be a good buy, but I still see Intel trailing in some other benchmarks, like Photoshop. It's nice that Intel made a big leap forward in efficiency, but it's still behind.
Question: let's say someone games at 4K, high settings... what's the advantage of an X3D chip?
 
Question: let's say someone games at 4K, high settings... what's the advantage of an X3D chip?
You should look at some benchmarks; there are plenty of games that are still CPU-limited due to heavy physics calculations, so nothing you do with the GPU will raise that minimum FPS.
 
SMT increases power consumption so much on RPL that it's downright scary. I actually turned it off some time ago and haven't looked back. It's possible some of the massive improvements in ARL's overall power usage come from the fact that it doesn't have SMT support at all. The extra few threads aren't worth the 40 °C increase I get in load temps.
In games it's worth having it off, both in terms of performance and power draw, but in MT workloads HT on will always be faster than HT off at iso-power.
 
I upgraded to a 265K recently from an Intel 9700K and the difference is huge!
Very happy with the performance, games run really smooth, and it eats through my complex 4K video editing projects!
I usually upgrade every 5 years and think this platform will more than cover it :)
 
Good price for the 265K. Bad for users that Intel brings out a new socket every time.

This is why I like AMD's approach to the platform. You can buy a B650 board and still be able to use it with upcoming generations of CPUs. The same was true with AM4: you could run a Ryzen 59xx on a five-year-old B350 board.

Why am I mentioning this? Reusing the mobo and buying only a CPU significantly lowers the required investment over the years.

AMD is going to support AM5 at least till 2028. Officially they said 2027+, so that's 2027 and some number of years after.
 
At some point - I hope in the not-so-distant future - people might figure out that if someone is interested in a CPU like the 285K, they are not interested in an 8-core chip like the 9800X3D. The actual competition is the 9950X and the 9950X3D.
It should be true, but it isn't. People wanting the highest framerate always target hardware that will offer them that highest framerate, even when that hardware is in fact too much and too expensive and generally ridiculously bad value for what it offers in gaming. That's been true forever. 20 years ago, people would go and buy an Intel Extreme or an AMD FX for gaming. Later, an i7 for gaming. They were upgrading the whole Intel platform to go from one i7 to the next i7 that would offer them +5% performance. Or they would go and buy the top Bulldozer model (which was bad anyway) to gain whatever they could. Or a 6-core Phenom for gaming. Same with graphics cards, but OK, there it made more sense. Not always, though. Not when people were buying Titan cards to play games. But those Titan cards were offering that extra few frames that were so necessary to run games smoothly, or that's what they were saying.

So yes, people will buy a 285K for games, strictly for games, because that 285K is the top Intel CPU on the gaming charts. They might use its compute capabilities occasionally here and there, but in fact they bought it for games. People going AMD can avoid going for the top models, like the 9950X or X3D, because they have the 9800X3D, which even tops those top models in gaming. They don't care that it is just an 8-core model. They don't care that the 285K will demolish the 9800X3D in productivity. They want the best in gaming, and the 9800X3D is at the same time the cheapest way to get the best framerate, and in some cases the only way to get it.

In gaming, the comparison for the 285K will always be with the 9800X3D, even the 7800X3D, and soon the 9600X3D. And until Intel creates something new and revolutionary, AMD will keep winning. If Intel does create something new and revolutionary, and AMD needs a 16-core X3D CPU to even come close to that new Intel model, I am pretty sure core count wouldn't matter. Only framerate will matter.
 
Did a parts list recently with a $259.99 265K, which was an excellent deal even at $20 more than this new offer.

At $239.99 these are absolute steals and nothing comes close. Let's be real here: it's an 8P+12E current-gen chip, and you can get a mobo with a PCIe x16 GPU slot + an x4 M.2 for $150. CPU/mobo/RAM for sub-$500, which is around what you're paying for a current-gen 8P X3D CPU alone. Sure, the X3D is a bit faster in most games, but that assumes you a) have a 4080 or better, and b) aren't using 4K, but rather 1080p/1440p high refresh (and by high refresh I mean 240 Hz+; all of these current-gen chips from either vendor can easily hold 120 FPS+).

Really struggling to see the argument for anything other than a 265K in the average $1-2k PC build at this point. Sure, a 9600X will be around the same price, but it's still just 6 P-cores with no E-cores (meaning it's basically good for just games; for workstation stuff the 265K blows the 9600X out of the water), and I'm not sure the "future proof" AM5 platform argument is relevant when there's just one more gen around the corner with Zen 6; it seems like ARL is getting a refresh too anyway.

It's also interesting to note that many new games are listing eight-core CPUs in the minimum recommended specs, albeit typically something like a 9700K or a 2700X, so a 9600X is still a better gaming CPU than either of those, but it's still a marked shift from the old "6-core i5" minimum.
The 265K/KF officially supporting DDR5-6400 is another compelling point if the "workstation stuff" you mention is memory-bandwidth-bound.
That said, the 12 E-cores on it are pretty much useless to me and I'd wind up disabling them, and the lack of AVX-512 support on Arrow Lake is still a deal-breaker for me. Either way, it's good to see strong arguments for modern Intel builds; the more competitive it gets, the better.
 
It should be true, but it isn't. People wanting the highest framerate always target hardware that will offer them that highest framerate, even when that hardware is in fact too much and too expensive and generally ridiculously bad value for what it offers in gaming. That's been true forever. 20 years ago, people would go and buy an Intel Extreme or an AMD FX for gaming. Later, an i7 for gaming. They were upgrading the whole Intel platform to go from one i7 to the next i7 that would offer them +5% performance. Or they would go and buy the top Bulldozer model (which was bad anyway) to gain whatever they could. Or a 6-core Phenom for gaming. Same with graphics cards, but OK, there it made more sense. Not always, though. Not when people were buying Titan cards to play games. But those Titan cards were offering that extra few frames that were so necessary to run games smoothly, or that's what they were saying.

So yes, people will buy a 285K for games, strictly for games, because that 285K is the top Intel CPU on the gaming charts. They might use its compute capabilities occasionally here and there, but in fact they bought it for games. People going AMD can avoid going for the top models, like the 9950X or X3D, because they have the 9800X3D, which even tops those top models in gaming. They don't care that it is just an 8-core model. They don't care that the 285K will demolish the 9800X3D in productivity. They want the best in gaming, and the 9800X3D is at the same time the cheapest way to get the best framerate, and in some cases the only way to get it.

In gaming, the comparison for the 285K will always be with the 9800X3D, even the 7800X3D, and soon the 9600X3D. And until Intel creates something new and revolutionary, AMD will keep winning. If Intel does create something new and revolutionary, and AMD needs a 16-core X3D CPU to even come close to that new Intel model, I am pretty sure core count wouldn't matter. Only framerate will matter.
The point is you can't make a "value" argument, because nobody who cares about value will ever consider a 285K (or a 9950X, 7950X, 9950X3D, etc.) strictly for gaming. Even the 9800X3D is bad in terms of value. So the argument "the 9800X3D is a better value than the 285K for gaming" completely misses the target audience; they don't care. Nobody on a budget will be looking at i9 or R9 CPUs unless they want to use them for more than games. It's a ridiculous comparison. Also, the 14900K is the fastest Intel gaming chip, and it's currently at $399.

GPUs are completely different. GPUs DO actually scale. A 5080 is more than twice as fast as a 5060. The 9800X3D is ~20% faster than a 7600 only as long as you are rocking a $2k GPU and playing at low resolutions.

The 265K/KF officially supporting DDR5-6400 is another compelling point if the "workstation stuff" you mention is memory-bandwidth-bound.
That said, the 12 E-cores on it are pretty much useless to me and I'd wind up disabling them, and the lack of AVX-512 support on Arrow Lake is still a deal-breaker for me. Either way, it's good to see strong arguments for modern Intel builds; the more competitive it gets, the better.
If you are going to disable the E-cores, then there is zero reason to go for a 265K. Get a 9700X.
 
If you are going to disable the E-cores, then there is zero reason to go for a 265K. Get a 9700X.
Sorry, I don't think I was clear enough. I have no plans to get a 265K; I was just stating that the lack of AVX-512 on it is what prevents me from considering it. A 9700X is definitely the clear choice.
 
Sorry, I don't think I was clear enough. I have no plans to get a 265K; I was just stating that the lack of AVX-512 on it is what prevents me from considering it. A 9700X is definitely the clear choice.
Why would the lack of AVX-512 prevent you? It's still faster than the 9700X in AVX workloads due to the extra cores. Stockfish and y-cruncher are AVX-heavy workloads, and the 265K still wipes the floor with the 9700X.

[attachment: Stockfish benchmark chart]
 