
Is Microsoft Deliberately Limiting AMD CPU Performance?

@Solaris17
Yeah, my bad. And it’s 288 for the 6900, if I am looking at it right. Not out yet, though. I was sure for some reason they are rolling out a hybrid this gen. I blame Intel slides for being confusing and me not paying enough attention to server chips.
 
Akchually, if the new E-cores are as good as they're touted to be, a full-on E-core-only Xeon might not be a bad shout, hypothetically. For certain workloads, at least. And Xeon 6 does have E-cores, I believe; it's the Granite Rapids/Sierra Forest combo. So it seems like Intel does see a use for them.
Oh man that depends on a lot of different things.
Call it an Atom based Xeon. Cause that's what it would be no?

Area efficient. But I suppose you and Solaris would understand the workloads needed to measure that kind of performance. I'd be at a loss there.

I get a really decent performance uplift with E-cores between 4.7 GHz and 5.0 GHz on the 14700K (current use, for fun), so that's the only personal comparison I have. They come clocked at a max of 4.3 GHz stock. I would hope the improvement comes with some extra clock speed under the hood; if they reached 4.6 GHz, that would be a start. But I believe 5 GHz should be obtainable if there's a die shrink involved.
 
"Manually" is not required.

Lasso can be configured for paths, not just single executables, so point it at your Steam library and the problem is 99% solved. For myself, out of long habit, I already install all my games to a folder separate from normal applications (and I move my Steam library there as well), so the problem is 100% solved for me.
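For anyone who would rather script the same idea than click through Lasso's UI, here is a rough Python/psutil sketch of path-based affinity. The folder path and the core list are made-up placeholders (not anything Lasso itself exposes), so adjust them for your own library and CCD layout:

```python
# Hypothetical sketch of path-based core affinity, similar in spirit to a
# Process Lasso path rule. Requires: pip install psutil.
# GAMES_DIR and PREFERRED_CORES below are placeholder examples.
import psutil

GAMES_DIR = r"D:\Games"              # assumed game/Steam library folder
PREFERRED_CORES = list(range(0, 8))  # assumed logical CPUs to pin games to

for proc in psutil.process_iter(["pid", "name", "exe"]):
    try:
        exe = proc.info["exe"] or ""
        # Pin anything launched from the games folder to the chosen cores
        if exe.lower().startswith(GAMES_DIR.lower()):
            proc.cpu_affinity(PREFERRED_CORES)
            print(f"pinned {proc.info['name']} (pid {proc.info['pid']})")
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        continue  # process exited or we lack rights; skip it
```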
That actually sounds like a cool feature for CPUs with heterogeneous cores, whether it's necessary or not.

It's pretty rare; I've actually not encountered a game that does, and honestly, either disabling the 2nd CCD (which takes a couple of minutes) or just sticking with single-CCD options would still negate that issue. Anyone buying a dual-CCD chip should understand it's not plug and play, but honestly, no high-core-count CPUs are perfect these days.
Anyone buying a dual CCD CPU should understand that gaming isn't its primary intended purpose.
 
Anyone buying a dual CCD CPU should understand that gaming isn't its primary intended purpose.

While true, I still personally liked gaming more with the 7950X3D than I did with the 7800X3D once it was properly set up.

Although, now that I think about it, it could have been a bugged Windows 11 install making the 7800X3D underwhelming, lol.
 
I find it amusing that people entertain the idea that MS conspires against AMD, with whom they have business deals worth billions (if not tens of billions) in just one niche.
Take it as you wish, but MS has screwed AMD in the past. Maybe it's coincidental, or maybe worse, but those actions did end up benefiting Intel.

A few quick ones off the top of my head:

1- The OG Xbox was developed with an AMD CPU, and AMD reps found out at the launch event that it was going to ship with an Intel CPU instead.



2- Delaying the launch of Windows XP 64-bit so Intel could launch their AMD64-compatible CPU. Remember, AMD created the 64-bit extension to the x86 ISA, since Intel's plan was to abandon x86 for Itanium so they wouldn't have to keep granting licenses to AMD.

There are less dubious explanations, like driver availability and other issues, but MS was well aware of AMD's plans and progress when AMD64 was created.

3- When Phoronix did their Zen 5 review on Linux, the results were close to what was expected, which told me that something was wrong with Windows. But of course, we always expect it to be AMD's fault, so the rest is history, as they say.
 
Oh man that depends on a lot of different things.
Call it an Atom based Xeon. Cause that's what it would be no?
Kinda. Modern E-cores are far more capable than Atom ever was. The jump with Gracemont was huge.

They come clocked at a max of 4.3 GHz stock. I would hope the improvement comes with some extra clock speed under the hood; if they reached 4.6 GHz, that would be a start. But I believe 5 GHz should be obtainable if there's a die shrink involved.
That would be somewhat missing the point of them. Clocks are nice, but most improvements in E-cores are for IPC so that they DO more at same clocks. I don’t think Intel will chase clocks with them, that’s counterproductive. They will reach higher frequencies, eventually, sure. But it won’t be desktop P-cores “feed it more power to reach max boost” thing.

@Neo_Morpheus
Is this a competition of who can find more ancient history than the other guy to “prove” their conspiracy theories? We already had a 2009 newspost, now we are digging up OG XBox and XP64. Very cool. Have anything, you know, this decade and actually relevant to what MS and AMD are now?
Again, can you explain logically why MS would want to “screw over” their major console and Azure hardware partner TODAY?
 
Is there a timeline here? My Dell has an AMD 7530U 6c/12t processor installed. lol.

This thread is hard to follow!
Please link me to a proper Dell Latitude with an AMD CPU.
 
Please link me to a proper Dell Latitude with an AMD CPU.
Sorry I didn't purchase a Latitude. Inspiron. Cute little thing. Flip it all around, works like tablet mode, back into a laptop. Great for the kids too. Not fond of the default resolution, so I have it reduced slightly to 1080P.

If I log in, it will say purchased, or is this good enough? edit in: Do look at the specs. It's got a little horsepower to it. 4267 MHz DDR4 memory :)

Kinda. Modern E-cores are far more capable than Atom ever was. The jump with Gracemont was huge.


That would be somewhat missing the point of them. Clocks are nice, but most improvements in E-cores are for IPC so that they DO more at same clocks. I don’t think Intel will chase clocks with them, that’s counterproductive. They will reach higher frequencies, eventually, sure. But it won’t be desktop P-cores “feed it more power to reach max boost” thing.

@Neo_Morpheus
Is this a competition of who can find more ancient history than the other guy to “prove” their conspiracy theories? We already had a 2009 newspost, now we are digging up OG XBox and XP64. Very cool. Have anything, you know, this decade and actually relevant to what MS and AMD are now?
Again, can you explain logically why MS would want to “screw over” their major console and Azure hardware partner TODAY?
Frequency is a huge part of IPC, or we'd all be settling for 3.5 GHz and 4.7 GHz boost clocks like the 14nm 8086K anniversary chip!

About 50% of the jump from Alder Lake to Raptor Lake was raw frequency. So yes, we definitely want fast cache in decent quantity, and IPC any way we can get it. :)
 
Sorry I didn't purchase a Latitude. Inspiron
The reason for the question was that, for whatever reason, Dell refuses to use AMD CPUs in their Latitude line, which is what all corporations buy, hence high volume/higher profits.
Same for OptiPlexes, but on those they announce machines with AMD CPUs and then you can't actually buy them, which is really weird.
Apparently, their Precision workstations also show as available with Threadrippers, but I don't know if the same shenanigans are needed to purchase them.

I have seen Epyc servers in person, though.

So, in conclusion, Dell keeps their profitable lines of business computers for their Intel buddies, except for some servers.
 
Frequency is a huge part of IPC, or we'd all be settling for 3.5 GHz and 4.7 GHz boost clocks like the 14nm 8086K anniversary chip!
That's just blatantly wrong. IPC is how much work a chip can do per clock. PER. CLOCK. It's not single-threaded Cinebench scores. It's a metric that, while nebulous and dependent on the workload, is not a result of frequency increases at all. To simplify it in the most basic way: if both a 2600K and a 14900K were locked to, say, 3 GHz and ran a single-core workload bound solely to Core 0, the latter would be significantly faster than the former. That's IPC improvement stemming from architectural advancements in action.
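To put rough numbers on that (purely illustrative, nothing measured): single-core performance can be modeled as roughly IPC times frequency, so locking both chips to the same clock isolates the architectural difference.

```python
# Illustrative numbers only (not benchmarks): performance ~= IPC x frequency,
# so at a fixed clock the remaining difference is pure IPC (architecture).
def perf(ipc, freq_ghz):
    return ipc * freq_ghz  # arbitrary "work per second" units

old_ipc, new_ipc = 1.0, 1.8  # assumed relative IPC of an old vs. new core
locked_clock = 3.0           # both chips locked to 3 GHz, as in the example

print(perf(old_ipc, locked_clock))  # 3.0
print(perf(new_ipc, locked_clock))  # 5.4 -> ~80% faster at the same clock
```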

About 50% of the jump from Alder Lake to Raptor Lake was raw frequency. So yes, we definitely want fast cache in decent quantity, and IPC any way we can get it. :)
Here you just mashed everything into one big ratatouille without actually making a point to follow up on. I have no idea how sentence two correlates to sentence one.
 
Hmm....? You mean like what, Dell not selling AMD systems cuz I dunno, they had a backdoor deal with Intel. Stuff like that never ever happens right? Dell is in the business to make money right?

What does that have to do with my comment? I didn't question how Intel would benefit from AMD getting held back; I questioned how MS would benefit from holding AMD back. Of course Intel has tried to hold AMD back before. I said nothing to deny that, and that's not even the topic of this thread. The topic is whether MS is trying to intentionally hold AMD back.
 
The reason for the question was that, for whatever reason, Dell refuses to use AMD CPUs in their Latitude line, which is what all corporations buy, hence high volume/higher profits.
Same for OptiPlexes, but on those they announce machines with AMD CPUs and then you can't actually buy them, which is really weird.
Apparently, their Precision workstations also show as available with Threadrippers, but I don't know if the same shenanigans are needed to purchase them.

I have seen Epyc servers in person, though.

So, in conclusion, Dell keeps their profitable lines of business computers for their Intel buddies, except for some servers.
Yeah, I see your point, though I've never really noticed that. But now that you mention it...

But do budget laptops sell more than high-end? It would only need to be 3-to-1 odds: three $400 laptops, or just get one $1,200 laptop and share (shrug). Seemingly in the 'burbs of Chicago, there are more mid- and low-end laptops floating around. I mean, the schools use Chromebooks, actually. No idea what's in 'em, I never bothered to look.
That's just blatantly wrong. IPC is how much work a chip can do per clock. PER. CLOCK. It's not single-threaded Cinebench scores. It's a metric that, while nebulous and dependent on the workload, is not a result of frequency increases at all. To simplify it in the most basic way: if both a 2600K and a 14900K were locked to, say, 3 GHz and ran a single-core workload bound solely to Core 0, the latter would be significantly faster than the former. That's IPC improvement stemming from architectural advancements in action.

Here you just mashed everything into one big ratatouille without actually making a point to follow up on. I have no idea how sentence two correlates to sentence one.
I figure my 14100F is about as fast as a 6 to 6.5 GHz 7700K, so IPC progress is doing fine.

Yes, I understand the generational design changes. 14nm lasted forever; the IPC increased, and so did the measured numbers, but those were minor changes to the same architecture over a long period of time. What was the main thing that changed? Frequency.
Alder Lake: a brand-new, smaller node, increased IPC from that (at the same clock as the previous gen), and the next main ingredient? Oh, it was clocked higher than previous generations. So I will take the IPC increase if clock speeds are up 25% on the new node. 13th gen clobbered Alder Lake partly through that design change, sure, but it was also something like a 400 MHz increase in clock frequency. Looking at the clock, I should have just gone to bed. lol.
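If anyone wants to untangle how much of a generational jump is clocks versus IPC, the split multiplies out like the sketch below; the numbers are placeholders, not measured Alder Lake or Raptor Lake figures.

```python
# Back-of-the-envelope decomposition of a generational gain into a frequency
# part and an IPC part. All numbers are placeholders for illustration only.
old_freq, new_freq = 5.2, 5.8  # GHz, assumed boost clocks
total_gain = 1.15              # assumed overall single-thread gain (+15%)

freq_gain = new_freq / old_freq    # ~1.115 -> ~11.5% from clocks alone
ipc_gain = total_gain / freq_gain  # remainder attributed to IPC
print(f"frequency: +{(freq_gain - 1) * 100:.1f}%, IPC: +{(ipc_gain - 1) * 100:.1f}%")
```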
 
I dunno if it's an age thing but a lot of posters are acting like shenanigans are unpossible! I thought the mere mention of an OEM not selling AMD would ring a bell with everyone but I guess not. And Microsoft has a long history of siding with Intel going back decades.

Not impossible, but thinking rationally, you pull a link from 2009, which is relevant how?

Normal person when they see a bug or something not optimised: "I guess the devs need some help here, let's file a bug report, submit a patch or whatever."
Or: "Oh, they have deliberately gimped our processors, and I am concerned for the well-being of AMD and their sales, and we must tell the world how evil this software developer is."

Also, to the earlier comment, I don't think any of us want AMD to fail. Not sure how it's relevant to the thread, but saying we think Microsoft is not playing games does not mean we want to lose a CPU vendor from the market.

I guess the thinking is that because some company may have done something in the past, they are now "guilty before proven innocent".

As has been said a few times now, Microsoft actually has a partnership with AMD for their console division. It would actually be a conflict of interest for Microsoft to deliberately gimp Windows software for AMD processors.
 
I mean, I got the "hot fix" and, well, I didn't see any losses or gains.
 
New video, more evidence
 
New video, more evidence
I don't know what the video was trying to present. I feel this kind of video should show the competitor chip as well; yes, it's more time for testing, but then it's a better, more informative video.

I assume he didn't bother installing 24H2, so probably the latest stable build of Windows 11 with the patch. There is no Intel data in the video, so he has presented the impact of different Windows builds on Ryzen chips and a single setting (VBS on/off). That's it.

In Windows 10, VBS is off by default; that also seems to be the case in 24H2. I don't know about the latest stable build of Windows 11.
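For anyone who wants to check what a given install is actually doing, VBS status is exposed through the Win32_DeviceGuard WMI class (0 = not enabled, 1 = enabled but not running, 2 = running). A small Python wrapper around PowerShell, as a sketch:

```python
# Query VBS status via the Win32_DeviceGuard WMI class (present on recent
# Windows 10/11 builds). Windows-only: this shells out to PowerShell.
import subprocess

ps_cmd = (
    "(Get-CimInstance -Namespace root\\Microsoft\\Windows\\DeviceGuard "
    "-ClassName Win32_DeviceGuard).VirtualizationBasedSecurityStatus"
)
result = subprocess.run(
    ["powershell", "-NoProfile", "-Command", ps_cmd],
    capture_output=True, text=True, check=True,
)
status = int(result.stdout.strip())
labels = {0: "VBS not enabled", 1: "VBS enabled but not running", 2: "VBS running"}
print(labels.get(status, f"unknown status ({status})"))
```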
 
VBS is on for Windows 11 by default.
 
VBS is on for Windows 11 by default.
Yeah, and I figured out I can't disable it, or else I lose my monitoring via AIDA…
 

The Win11 24H2 version (the patched one!) still gives bad performance for Ryzen 3000 & 5000 compared to Win10. So, for 3-4 years now, AMD has been getting much worse reviews than they should on their CPUs just because of that mess of an OS. And Intel stayed afloat as seemingly having comparable CPU performance, thanks to the nerfed Ryzen performance.

As for the question of whether MS does that on purpose, let me ask:

1) Which of these three companies (MS, Intel, AMD) has been found guilty at least once of anti-competitive practices? The two found guilty have collaborated for decades now. AMD was always the small company that happened to compete well for a few years with the Athlon CPUs and went down again because of years of market manipulation from Intel's side.
2) Which of the other two would MS help if Intel or AMD pushed for help?
3) Which one, between AMD and Intel, needed that help more and had more money to burn to buy it?
4) Who was so desperate to remain relevant while their products were inferior in most metrics and would remain so for the next 3 years (at least)?

The thread title is just a logical question for any sensible person with knowledge of the historical facts of the PC market. And most should answer YES given the strong evidence coming out day after day: two thirds of the games show AMD CPUs losing close to or more than 10% (a huge difference) when going from Win10 to Win11, while only a few percent of games do so for Intel.
 
The Win11 24H2 version (the patched one!) still gives bad performance for Ryzen 3000 & 5000 compared to Win10. So, for 3-4 years now, AMD has been getting much worse reviews than they should on their CPUs just because of that mess of an OS. And Intel stayed afloat as seemingly having comparable CPU performance, thanks to the nerfed Ryzen performance.
Is it really so bad, though? That 227 fps vs 213 fps I see in the thumbnail doesn't look too serious. (sorry, I can't be asked to watch a 10-minute video just to find out) :ohwell:
 
Is it really so bad, though? That 227 fps vs 213 fps I see in the thumbnail doesn't look too serious. (sorry, I can't be asked to watch a 10-minute video just to find out) :ohwell:

You have to remember fanboys on both sides act like 3-5% is absolute destruction for whatever brand they choose to simp for.

This still shouldn't be a thing though especially at a more than 5% reduction.
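For reference, taking the 227 vs 213 fps from that thumbnail at face value, the gap works out to roughly 6%:

```python
# Relative gap between the two thumbnail figures quoted above (illustrative).
win10_fps, win11_fps = 227, 213
loss_pct = (win10_fps - win11_fps) / win10_fps * 100
print(f"{loss_pct:.1f}% lower on the slower run")  # ~6.2%
```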
 
You have to remember fanboys on both sides act like 3-5% is absolute destruction for whatever brand they chose to simp for.

This still shouldn't be a thing though especially at a more than 5% reduction.
Agreed, and I blame the current crop of influencers, aka YouTubers, who love to use the terms "destroyed" and "annihilated" when obtaining such results in favor of their favorite brand.

Yep, many are also biased, be it just bribes or worse, or just because it's their favorite brand/corporation.
 
Agreed, and I blame the current crop of influencers, aka YouTubers, who love to use the terms "destroyed" and "annihilated" when obtaining such results in favor of their favorite brand.

Yep, many are also biased, be it just bribes or worse, or just because it's their favorite brand/corporation.
Robeytech got triggered when I asked him why he doesn't use AMD cards.
 
Agreed, and I blame the current crop of influencers, aka YouTubers, who love to use the terms "destroyed" and "annihilated" when obtaining such results in favor of their favorite brand.

Yep, many are also biased, be it just bribes or worse, or just because it's their favorite brand/corporation.

It's thanks to those influencers that I've been conditioned to skip/ignore videos with titles like that. The masses can't seem to get enough of those keywords though.
 