
Intel's Core Ultra 7 265K and 265KF CPUs Dip Below $250

Oh noes the Intel diehards trying to recover something along the lines of sunk cost fallacy :shadedshu:
 
There are plenty of low-end GPUs that would make a decent upgrade over Vega onboard graphics. An RTX 5060 is only 150 W, and a GTX 1650 is a massive upgrade over Vega that doesn't need any external power.


A 5700X3D is £200? A decent AM5 motherboard is £150 and DDR5 is another £80, and that's before you even buy a CPU.
A 5060 still requires external PCIe power, and a low-end system like that likely doesn't have a PSU that supports it.

So adding the 150 W of a 5060, plus say another 100 W for a CPU (because the 3000G etc. were so low power), a PSU upgrade is a very real possibility.
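To put that in back-of-the-envelope terms, here's a minimal sketch (Python), using the ~150 W GPU and ~100 W CPU figures from above plus an assumed 50 W for the rest of the system and an assumed 80% PSU loading margin; illustrative only, not measured numbers:

```python
# Rough PSU headroom estimate for dropping a discrete GPU into an old prebuilt.
# The 150 W GPU and ~100 W CPU figures are the ballpark numbers from the post
# above; the 50 W "rest of system" and the 80% loading margin are assumptions.

def psu_headroom(psu_watts, gpu_w=150, cpu_w=100, rest_w=50, margin=0.8):
    """Watts left over if we only want to load the PSU to `margin` of its rating."""
    estimated_draw = gpu_w + cpu_w + rest_w   # rough sustained system draw
    usable_budget = psu_watts * margin        # keep headroom for spikes and ageing
    return usable_budget - estimated_draw

for psu in (300, 450):                        # small office PSU vs. basic ATX prebuilt
    print(f"{psu} W PSU -> headroom {psu_headroom(psu):+.0f} W")
# 300 W PSU -> headroom -60 W  (upgrade very likely needed)
# 450 W PSU -> headroom +60 W  (probably fine for a low-end card)
```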
 
Sorry, EU pricing.
I'm not sure why you would think AM5 has more expensive motherboards?
For DIY, AM5 is a more popular platform that's constantly reported to be the king for gaming (only true for ~$500 X3D chips), so the motherboards haven't generally been getting anywhere near the nice discounts equivalent Intel boards have got. Just look at any of the recent TPU mobo reviews, or use PCPartpicker if you want to check for yourself.

Also I just checked EU pricing and the 9700X was still on average more expensive than the 265K, so I find it hard to believe the 7800X3D is that much cheaper than elsewhere.
 
A 5060 still requires external PCIe power, and a low-end system like that likely doesn't have a PSU that supports it.

So adding the 150 W of a 5060, plus say another 100 W for a CPU (because the 3000G etc. were so low power), a PSU upgrade is a very real possibility.
Sure, in something like an SFF office PC, but even the crappiest ATX prebuilt PC will come with a 450 W minimum power supply, which is more than enough for an X3D and a low-end GPU.

[attached image]



A lot of people will have ended up with something like this due to COVID and crypto.
 
Not true with AM4. I bought an X470 motherboard for 90 euros in August of 2020 (well, two X470s, but let's focus on one of them). So the socket was already 4 years old, like AM5 today. I started with a cheapo Athlon 3000G for about 45 euros. Sold that, moved to a second-hand 2600X for about 50 euros from a friend. When the R5 5500 dropped to 90 euros, I bought that one. When things changed and I wanted a good integrated GPU, I bought a 4600G that was selling for 80 euros and replaced that 5500. If I wanted to go the opposite direction, I could have replaced that 5500 with a 5950X or a 5800X3D. My only "problem" would have been the "limitation" of PCIe 3.0 instead of a board supporting PCIe 4.0 or 5.0. So, about 9 years after the introduction of AM4 and almost 5 years after I bought it, the platform could still offer me the option to triple my performance in productivity or lift the gaming performance of my platform to a completely different level. Especially in that last case, the money saved could also give me the option to upgrade my GPU.
Someone buying into AM5 today, to use a 9800X3D or a 9950X (3D or not) from day one, can enjoy another upgrade cycle that will offer them a significant uplift in performance, through a better architecture or an increased number of cores. Someone using a more middle-class option, like a 7600X or a 7700X, can double or more the performance of their system by just going for a higher-end CPU in 4-5 years, for probably half the cost of changing platform. Or, if there is a use for a second system, buy an APU and keep using that AM5 system with the new APU for even more years.
If I were to buy an AM4 system in late '20 I woulda gone for, at the very least, a 5600X. Now, in '25, I'd still have no reason to upgrade. By '30 or so, when the 5600X becomes truly problematic for what I'm doing, anything AM4 will also be questionable at best.
Or the same 10-year cycle but with Ryzen 1600. Say, '27... I can go for a 5700X3D/5800X3D and get a significant uplift but at the same time low end tech of '27 will offer me PCI-e 5.0 instead of 3.0; properly matured DDR5 instead of slow DDR4; higher count of fast USB ports; possibly multiple USB-C ports; a lot higher single thread performance; probably higher multi-thread performance. And all that for not that much money. I wouldn't mind shelling out a couple Gs anyway because I was saving quid for a decade after all.

Replacing stuff for research purposes is one thing, and I'm all for it, but upgrades as in actual upgrades should come when your system is no longer capable, and you should ideally buy the best you can at that point, not just slot a higher-end CPU into your ageing motherboard.
 
Pat was the worst CEO in Intel's history. Hopefully they scrap any and all plans he might have had.
Pat's focus on manufacturing was correct. Intel lost its crown and AMD became the best option in CPUs because AMD gained the upper hand in manufacturing, with the help of TSMC. Pat was also correct to re-enter the GPU market and try to create products competing with AMD and Nvidia. Intel is dead without manufacturing and will definitely be at AMD's and Nvidia's mercy without discrete GPUs.

Pat was fired because focusing on manufacturing was far more expensive than they had probably predicted; 20A not being as good as they wished may also have played a significant role; GPUs are still far behind the competition and probably not profitable; the 14900K fiasco could have been his call; and Arrow Lake being worse than 14th gen should probably have been expected, considering Intel engineers had to use TSMC nodes for the first time, but it was probably also an excuse to throw him out.
I believe Pat turned the Intel ship in the right direction, but he probably couldn't handle a company of over 100K employees. Maybe his only mistake was the people he chose to have around him.
 
It's a more popular platform that's constantly reported to be the king for gaming (only true for ~$500 X3D chips), so the motherboards haven't generally been getting anywhere near the nice discounts equivalent Intel boards have got. Just look at any of the recent TPU mobo reviews, or use PCPartpicker if you want to check for yourself.
I just took a quick glance at local motherboard pricing and honestly it's a bit of a nothingburger. The first AM5 motherboard I'd comfortably recommend is about €95; for Intel it's about €110. So technically AM5 wins that round.
If we're talking about CPUs around that price, I'm thinking of motherboards around half that, not the motherboards you typically review.
Actually, I'd argue the opposite, considering you need a decent motherboard to truly extract all the performance of the 265K, whereas the 7800X3D can run on a potato.
 
It is super unpopular in DIY, but that's a subset of the real market.

I'm struggling to understand why anyone would go for a 9700X instead at $305, current pricing. Perhaps a 9600X at $180, but I mean, $60 more gets you more than 3x the cores, and cheaper mobos mean it's more like $30 extra for the ARL chip...
The "AMD is much better for gaming" is mostly from the X3Ds, which are almost twice the price, especially if you consider mobos. Against standard Zen 5?

In the performance metrics that matter to most customers, the 265K is slower than last gen Intel parts: https://www.techpowerup.com/review/intel-core-ultra-7-265k/13.html

If a person is price-conscious (which is the basis for even looking at the 265K here), one can pick up a 7700X for $230 and would still get better performance than the 265K in the games and applications they are likely to use. You need to specifically have a use for those extra cores on the Intel, and that's not something the vast majority of people will have.

The cost and time investment of a new motherboard any time you want a notable performance bump on Intel is also a huge negative, especially with the rising cost of motherboards. PCIe signaling requirements demand more layers and signal repeaters, thus increasing costs, and that will only get worse as the PCIe version increases. Intel should take a hint and improve the longevity of its platforms; the longer it drags this out, the more of a negative the motherboard cost becomes for them.

That's with essentially the same efficiency, with 200S (warranty) boost turned off, and with ARL running slower RAM than it's rated for. Anyone able to argue for Zen 5 in this case? I find it a weak choice except at the high end, with the 9800X3D/9950X3D. Maybe a 9600X3D at ~$250 changes things...

"essentially the same efficiency"

No, there's still a gap in favor of AMD in terms of efficiency. 200 series is better than Intel's last gen products in efficiency and only gets closer to AMD's 9000 series due to their more aggressive tuning out of the box. This is evidenced by 7000 series efficiency numbers in the charts and any power scaling comparisons between the 9000 series and 200 series.
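Worth keeping the definition straight here: "efficiency" in these charts is work done per watt, not just average package power, so two chips at similar power can still differ in efficiency if one finishes faster or scores higher. A minimal illustration with placeholder numbers (not taken from any review):

```python
# Efficiency = benchmark score per watt, not just average power draw.
# The score/power numbers below are placeholders, not figures from any review.

def points_per_watt(score, avg_power_w):
    """Higher is better: more work done for each watt consumed."""
    return score / avg_power_w

chips = {
    "chip_A": {"score": 1000, "power_w": 135},  # hypothetical
    "chip_B": {"score": 950,  "power_w": 132},  # hypothetical
}

for name, c in chips.items():
    print(f"{name}: {points_per_watt(c['score'], c['power_w']):.2f} pts/W")
# Nearly identical power draw, yet chip_A is ~3% more efficient because it scores higher.
```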

The advent of gaming performance charts/results generated with an RTX 5090 really throws off people's understanding of the actual relative performance with the GPUs they have, I think.

The advent of proper benchmarking by removing the GPU bottleneck throws people off?

C'mon, that's a load of nonsense and you know it.

Besides the whole general ignoring of "application performance" charts.

People aren't ignoring application performance, the 200 series is just worse than AMD and last gen Intel in the applications they actually use.
 
Leave Spain out of this (and maybe some of the other countries that you cited but didn't bother to actually check).
View attachment 404536 / View attachment 404537

That is still Amazon! Look, I'm not saying you're full of BS, but it seems to be very limited right now. If we actually see other retailers lowering the price to $240 or €250, that would be amazing; I love cheaper goods, I love solid value hardware that most people can afford.

If people are up for a new computer, this would be a solid deal, especially if there are further combo deals with motherboards and memory, but as I've said, most other retailers, which are more country-specific, seem to still have the old prices.

At $400 or $350 or $320 it is still a turd!!! At $250 it's amazing, at $300 it's passable, but going with AMD is likely better due to the superior longevity of their platforms; AM5 is said to be supported with new CPUs all the way to 2027.

Though if I were looking for a new computer or CPU right now and I saw the 265K for $240, I'd buy it instantly! In my region I'm not seeing the same low prices; I'm still seeing the $400 pricing.
 
What Arrow Lake has done in terms of mixed-workload power draw is absolutely insane. Most of these kinds of mixed workloads (gaming included, btw) have the 285K being as fast as or faster than the 9950X while consuming a truckload less power. Even in gaming the 9950X ends up consuming somewhere between 35% and 100% extra power. Check the following review, which has a variety of workloads and their power draw.
"Truckload less power"? Where are you getting your numbers? It's enough to look into the reviews on this very website to find out the falsehood of this narrative. 285K vs 9950X power consumption in mixed workloads is literally the same, on average, within margin or error (132W vs 135W). And 285K is sitll not able to win overall in applications against vanilla Zen5. In gaming, 9950X consumes, on average, only 10W more, and not between 35-100% extra power. Nonsense. Sure, there is an odd game like Alan Wake 2 where R9 draws significantly more, but we are not here to nitpick on game-by-game basis in order to make a big point, aren't we?

The '200S Boost' profile, aka overclocking the interconnect and memory to 8000 MT/s under warranty, has brought a few nice gains to the 285K in some games, 12% on average in der8auer's 6-7 game sample, but we are yet to see a comprehensive re-test across 40-50 games to see the bigger picture.
[attached image: Intel Core Ultra 9 285K Review - Power vs 14900K, 9950X, 9900X]


Finally, we must not forget that the 285K should be beating the 9950X across the board by a wider margin. It should be using way less power than it does. It has more cores and it is produced on the more advanced and very expensive N3 node, versus the N4 node (an N5 derivative) that Ryzen uses, which is much cheaper to produce and has higher yields. Something stinks in the kingdom of Denmark, don't you think?

The most recent 3DCenter meta-review is below. With all the advantages explained above that the 285K has over the vanilla 9950X, it should be beating it decisively across applications, games and power consumption. It just doesn't, I am afraid. Somehow, it's on average a tad slower even in applications.
I am curious as to how people explain this.
[attached image: 3DCenter Zen 5 / 9950X3D meta-review chart]


If they have a deal with TSMC that says "We guarantee that we will buy a minimum of this number of your wafers in 2025-2026", then Intel has two options: keep prices up and then pay a penalty to TSMC for underutilizing the agreed capacity, or lower prices, hope people start buying, and manage to use all of TSMC's agreed capacity before their next 18A CPUs become their main product.
Yes, this makes sense, to get rid of the volume they had contracted.
 
No, there's still a gap in favor of AMD in terms of efficiency. 200 series is better than Intel's last gen products in efficiency and only gets closer to AMD's 9000 series due to their more aggressive tuning out of the box. This is evidenced by 7000 series efficiency numbers in the charts and any power scaling comparisons between the 9000 series and 200 series.
You are wrong, Intel scales much better. And this is the only segment where AMD is actually competitive; if you try to make a power scaling comparison between a 265K and a 9700X, it will be PEGI 18. AMD's huge issue right now IS efficiency, especially in light and mixed workloads.


[attached image: power scaling graph]
 
Dunno why but these Ultra CPUs are the first to be priced that bad. 13th and 14th gen were leagues closer to MSRP even on launch, let alone some quarters into the lifecycle.
And those are better CPUs than these Ultra CPUs anyway. These Ultra CPUs have rather piss-poor performance.

I found, when looking at Russian prices, that you are just better off buying prior-gen parts than new ones. Although I have seen some prices of newer GPUs that were about as good as the prices here in Canada. And we aren't sanctioned (just yet) either.
 
[from a gamer's perspective]

What's the lifespan on LGA1851 and Z890? I get that the CPU is inexpensive, but after the refresh launch this year, it's done.
No one is expecting the refresh to radically change the performance landscape. Intel 265KF owners should not spend $400 to upgrade to a 365KF or whatever.
So what you buy now, is what you'll keep for years to come.

Example from Micro Center:
Besides Z890 with PCIe5 and saving $180... what advantage is there for a gamer to pick Intel?
The DDR5 6000 CL36 is going to further constrain 265kf.

[attached image: 265K vs 9800X3D comparison]
 
You can get it for a still excellent $260 from Amazon; it's just Micro Center for the $240 K / $230 KF deal.
I noticed, which is nice to know they're doing that, seeing as I have like $100 of gift credit at Amazon, though I was saving it for my next GPU.
Intel's barn burner CPU... you ever look at their recent CPUs? They aren't burning anything down.
It cracks me up that a 125 W CPU will pull well over 250 W... I don't like this whole PL1/PL2 crap, but both Intel and AMD are doing it, though AMD doesn't pull anywhere near as much wattage. I stayed on my 6700K as long as I did because I don't want a CPU that draws more than 125 W under any circumstance, preferably 95 W.

Maybe some people are OK with a CPU pulling upwards of 300 W and a GPU pulling well over that nowadays, but I am not one of them.
 
"Truckload less power"? Where are you getting your numbers? It's enough to look into the reviews on this very website to find out the falsehood of this narrative. 285K vs 9950X power consumption in mixed workloads is literally the same, on average, within margin or error (132W vs 135W). And 285K is sitll not able to win overall in applications against vanilla Zen5. In gaming, 9950X consumes, on average, only 10W more, and not between 35-100% extra power. Nonsense. Sure, there is an odd game like Alan Wake 2 where R9 draws significantly more, but we are not here to nitpick on game-by-game basis in order to make a big point, aren't we?

'200S Boost' profile, aka overclocking the interconnect and memory to 8000 MT/s under warranty, has brought a few nice gains to 285K in some gaming, 12% on average by DerBauer in 6-7 games, but we are yet to see a comprehensive re-testing in 40-50 games to see the bigger picture.
View attachment 404527

Finally, we must not forget that 285K should be winning with 9950X across the board by a wider margin. It should be using way less power than it does. It has more cores, it is produced on more advanced and very expensive N3 node vs N5's iteration named N4 that Ryzen uses, which is much cheaper to produce and has higher yields. Something stinks in the kingdom of Denmark, don't you think so?

Most recent 3D Centre meta-review is below. With all advantages explained above that 285K have over vanilla 9950X, it should be winning with it decisively across applications, games and power consumption. It just does not, I am afraid. Somehow, it's on average a tad slower even in applications.
I am curious as to how people explain this.
Your graphs show it quite clearly: in these lighter workloads the 9950X is blowing its socks off in power draw. Just look at those small bars that are mixed or single-core workloads.
 
Although I have seen some prices of newer GPUs
You might have a heart attack if you look at what they did to Radeon pricing. Lo and behold:
[attached image: local 9070 XT pricing]

A thousand USD 9070 XT. Sans VAT, it's 840 USD. Still an atrocity. Just FYI, they sell 5070 Ti for the same money.
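For anyone double-checking the "sans VAT" figure: you strip VAT by dividing the sticker price by (1 + rate), not by subtracting the rate from it. A one-liner sketch, with the VAT rate as an assumption (use whatever the local rate actually is):

```python
# Remove VAT from a tax-inclusive price: divide by (1 + rate).
# The rates below are assumptions; plug in the actual local VAT rate.

def ex_vat(price_incl, vat_rate):
    return price_incl / (1 + vat_rate)

print(f"{ex_vat(1000, 0.19):.0f} USD ex-VAT at 19% VAT")  # ~840 USD
print(f"{ex_vat(1000, 0.20):.0f} USD ex-VAT at 20% VAT")  # ~833 USD
```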
 
You might have a heart attack if you look at what they did to Radeon pricing. Lo and behold:
View attachment 404546
A thousand USD 9070 XT. Sans VAT, it's 840 USD. Still an atrocity. Just FYI, they sell 5070 Ti for the same money.
Yeah, dunno why they did that. I mean, the 9070 XT is a good GPU, but at the same price as the 5070 Ti, the 5070 Ti is the better buy. The 9070 XT is roughly $1,080 CAD here; the 5070 Ti is like $1,300 CAD. It would probably be cheaper to go on vacation to China and get one.

The sad part is, it's cheaper for me to drive to the States, get certain parts, and drive back home. But I think they would tax me at the border now. Not sure.
 
[from a gamer's perspective]

What's the lifespan on LGA1851 and Z890? I get that the CPU is inexpensive, but after the refresh launch this year, it's done.
No one is expecting the refresh to radically change the performance landscape. Intel 265KF owners should not spend $400 to upgrade to a 365KF or whatever.
So what you buy now, is what you'll keep for years to come.

Example from Micro Center:
Besides Z890 with PCIe5 and saving $180... what advantage is there for a gamer to pick Intel?
The DDR5 6000 CL36 is going to further constrain 265kf.

View attachment 404543
A gamer doesn't need a 265K, but he doesn't need a 9800X3D either unless he is rocking some super-high-end GPU. Get a 14600KF for $160 and you are good to go.
 
A gamer doesn't need a 265K, but he doesn't need a 9800X3D either unless he is rocking some super-high-end GPU. Get a 14600KF for $160 and you are good to go.
shit, even a 12400 works fantastic for gaming
 
shit, even a 12400 works fantastic for gaming
Indeed. My 12400F asks all sorts of questions to my 6700 XT and it never answers... Wish I had dough for an upgrade...
 
You are wrong, Intel scales much better. And this is the only segment where AMD is actually competitive; if you try to make a power scaling comparison between a 265K and a 9700X, it will be PEGI 18. AMD's huge issue right now IS efficiency, especially in light and mixed workloads.
You are talking about mixed workloads a lot, but you are posting a graph from one single, heavily loaded application. Sure, you are free to nitpick as much as possible; I am OK with that. There is no doubt that the 285K is more power efficient and faster in some workloads. It's produced on the N3 node, after all. It'd be absurd if it did not win in specific workloads, and with less power. Nobody disputed that. But the overall picture emerging from all benchmarks together does not show any decisive win overall.

In Linux, it's worse than in Windows. The CPU built on the N3 node is decisively... behind in a gigantic 400-application benchmark, while not using less power, on average, than Ryzen. It's barely faster than the 7950X.
[attached image: AMD Ryzen 9 9950X3D Delivers Excellent Performance For Linux - chart]


[attached image: AMD Ryzen 9 9950X3D Delivers Excellent Performance For Linux - chart]


How do you explain this? I am curious.
 
You are talking about mixed workloads a lot, but you are posting a graph from one single, heavily loaded application. Sure, you are free to nitpick as much as possible; I am OK with that. There is no doubt that the 285K is more power efficient and faster in some workloads. It's produced on the N3 node, after all. It'd be absurd if it did not win in specific workloads, and with less power. Nobody disputed that. But the overall picture emerging from all benchmarks together does not show any decisive win overall.

In Linux, it's worse than in Windows. The CPU built on the N3 node is decisively... behind in a gigantic 400-application benchmark, while not using less power, on average, than Ryzen. It's barely faster than the 7950X.
View attachment 404550

View attachment 404551

How do you explain this? I am curious.
The graph was a response to someone saying that AMD has a lead in efficiency.

Your link is using Linux, irrelevant for 96% of the market, BUT, even in that one, look at the distribution of power, man. Sure, on average they all consume the same power, but I assume that's because there are a lot of all-core workloads. The distribution tells the whole story: in lighter workloads Intel is significantly more efficient. There is not a single workload that the Ryzen chips manage to do without pulling circa 70 W+, while there were a lot where the 285K was pulling 40 W.

And that's easy to verify: if you have a dual-CCD Ryzen, just browse the web. You'll be casually hitting 70 W scrolling a page. Work in Excel and you'll still be seeing 60-70 watts. For these same workloads Intel is sitting at single digits.
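The "same average, different distribution" point is easy to illustrate: a few heavy all-core results can pull two means together while the light-load end of the curve looks completely different. A small sketch with made-up per-workload package-power numbers, purely to show the statistics (not measured data):

```python
# Same overall mean, very different distribution. The per-workload package-power
# samples below are made up purely to illustrate the statistics, not measurements.
from statistics import mean, median

chip_A = [40, 45, 50, 55, 200, 210, 220, 230]   # drops low in light workloads
chip_B = [70, 75, 80, 85, 160, 170, 180, 230]   # never dips below ~70 W

for name, samples in (("chip_A", chip_A), ("chip_B", chip_B)):
    light = [w for w in samples if w < 100]      # "light/mixed" workloads only
    print(f"{name}: mean={mean(samples):.0f} W, median={median(samples):.1f} W, "
          f"light-load mean={mean(light):.1f} W")
# Both means come out identical here (~131 W), yet the light-load means differ hugely
# (47.5 W vs 77.5 W), which is exactly what an average-only comparison hides.
```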
 
A gamer doesn't need a 265k, but he doesn't need a 9800x 3d either unless he is rocking some super high end gpu. Get a 14600kf for 160$ and you are good to go.

It depends on the games being played, resolution, in-game settings, and graphics card... but I'm talking about those gamers spending $1,000+ USD on a GPU.

The FPS minimums for Intel's Core Ultra 2 are embarrassing:
[attached image: minimum FPS comparison]


Of course it diminishes when the games become more GPU-bound:
[attached image: more GPU-bound results]


CUDIMM appears to help, but if you're paying more for memory than you are for the processor....
[attached image: CUDIMM results]

[attached image]


And those are 285K results, a chip which sells for more than the 9800X3D, and that DDR5 memory will hold back Core Ultra:
[attached image]
 