
AMD FX 8150 with Microsoft KB2592546 Put Through 'Before and After' Patch Tests

btarunr

Editor & Senior Moderator
To the surprise of many, last week Microsoft rolled out a patch (KB2592546) for Windows that it claimed would improve the performance of systems running AMD processors based on the "Bulldozer" architecture. The patch works by making the OS scheduler aware of the way Bulldozer cores are structured, so it can make effective use of the parallelism at its disposal. Sadly, a couple of days later, Microsoft pulled the patch. In the meantime, SweClockers had enough time to run a "before and after" performance test of the AMD FX-8150 processor using it.
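The scheduling idea the patch is built on can be sketched roughly: on the FX-8150, logical cores are paired into modules (cores 0-1 form module 0, and so on), and a module-aware scheduler would prefer to place threads on one core per module before doubling up, so each thread gets a module's full resources. A minimal Python sketch of that mapping; the placement policy here is a simplification for illustration, not the actual patch logic:

```python
# Sketch of the FX-8150's core-to-module layout and a simplified
# module-aware placement policy. The pairing (cores 0-1 = module 0,
# cores 2-3 = module 1, ...) matches the documented Bulldozer
# topology; the scheduling policy itself is an illustration only.

CORES_PER_MODULE = 2

def module_of(core: int) -> int:
    """Return which Bulldozer module a logical core belongs to."""
    return core // CORES_PER_MODULE

def spread_threads(n_threads: int, n_cores: int = 8) -> list[int]:
    """Place threads on one core per module first (each thread gets
    a module's full front-end and cache), then fill the second core
    of each module."""
    modules = n_cores // CORES_PER_MODULE
    first = [m * CORES_PER_MODULE for m in range(modules)]
    second = [m * CORES_PER_MODULE + 1 for m in range(modules)]
    return (first + second)[:n_threads]

# Four threads on an FX-8150 land on one core per module:
print(spread_threads(4))   # [0, 2, 4, 6]
```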

The results of SweClockers' tests are tabulated below ("tidigare" is before, "nytt" is after, and "skillnad" is difference). The reviewer put the chip through a wide range of tests, including synthetic CPU-intensive benchmarks (both single- and multi-threaded) and real-world gaming performance tests. The results are less than impressive. Perhaps that's why the patch was retracted.



View at TechPowerUp Main Site
 
Lower performance? Oh god help AMD.
 
^Might have something to do with the patch being pulled. Just one of those "duh" things.
 
Pulling the patch is a good thing; I'd rather they release it when performance is up to our expectations.
 
^But performance IS up to our expectations! LOL ;) Just not to our wishful thinking!

Actually, the fact that the results are mixed rather than consistent shows that there is something funky with the architecture. If they were using affinity to keep the code on two neighbouring cores wherever possible, to get the Turbo Boost effect to kick in, then why would the results sometimes be worse? I assume that the flip side of the turbo boost is the effect of the smaller L1 cache, and of the neighbouring cores having to share cache and pipeline, rather than letting two cores and their caches work independently. Swings and roundabouts.
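The trade-off described above can be sketched as the mirror-image policy: packing a pair of threads into one module leaves the other modules idle, which is what gives Turbo headroom, at the cost of the pair sharing a module's front-end and cache. Another hedged Python sketch under the same two-cores-per-module assumption:

```python
# Illustrative counterpart to spreading: pack threads into as few
# Bulldozer modules as possible, leaving the rest idle for Turbo.
# The two-cores-per-module layout is the documented FX topology;
# the policy itself is a simplification for illustration.

CORES_PER_MODULE = 2

def pack_threads(n_threads: int, n_cores: int = 8) -> list[int]:
    """Fill both cores of a module before touching the next one."""
    return list(range(min(n_threads, n_cores)))

def active_modules(placement: list[int]) -> int:
    """Count modules woken by a placement; fewer active modules
    generally means more Turbo headroom on Bulldozer."""
    return len({core // CORES_PER_MODULE for core in placement})

# Two threads packed into module 0 leave three modules idle:
print(active_modules(pack_threads(2)))   # 1
```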

I think the "affinity to get Turbo Boost" feature ends up being a mistaken performance tweak. It's also a worthless concept if the next iteration of BD has higher overall clocks and a smaller Turbo Boost.

Poor AMD. Hyping scheduler changes as the solution to all their woes... when they weren't.

Sounds a bit like the wife... "if, but, you, not my fault..." yadda yadda
 
Perhaps, that's why the patch was redacted.

No perhaps necessary.

We've spoken with an industry source familiar with this situation, and it appears the release of this hotfix was either inadvertent, premature, or both. There is indeed a Bulldozer threading patch for Windows in the works, but it should come in two parts, not just one. The patch that was briefly released is only one portion of the total solution, and it may very well reduce performance if used on its own. We're hearing the full Windows update for Bulldozer performance optimization is scheduled for release in Q1 of 2012. For now, Bulldozer owners, the best thing to do is to sit tight and wait.

This is why no site of substance has run these tests; it's not indicative of squat.
 
The problem with this chip is per-thread performance; it's simply slow while drawing too much power. Windows throws loads randomly across cores, yeah, but the cores are there. The CPU has the potential, it just can't deliver it. At times, Phenom chips suffer from the same issue. I mean, a decent quad losing to a dual core at this time and date?

http://www.anandtech.com/bench/Product/54?vs=88

In reality though, the quad would trump that dual in BF and such. But they need more per-clock performance. A bit too late now...
 
Well, considering that the patch has been pulled because it could actually hinder performance, I'd say this is a pretty moot benchmark.

I'll be surprised if the actual patch brings a performance increase, but let's at least wait and see if it does.
 
The whole thing is that MS pulled it because the patch consists of two parts, and MS accidentally released only half of it - whoever the webmaster of the MS website is.
 
John Doe^

You are truly an idiot. That is one benchmark on one website. The ARMA 2 benchmarks from other sites are not that poor. Anyone with common sense can see that it is merely a freakishly bizarre result, maybe due to a driver or bad configuration, which probably has nothing to do with Bulldozer.

Nope, but you're a fanboy bud. If you go over [H] to see the article, there're other similar cases. The architecture blows end of story. Grow up.
 
Looks like "a patch" for sales, not for performance. No patch is going to fix a bad product.
 
Nope, but you're a fanboy bud. If you go over [H] to see the article, there're other similar cases. The architecture blows end of story. Grow up.

You have a problem reading. We know the architecture blows in single threaded games - I get that.

But I'm saying that ONLY Hard OCP shows ARMA 2 performing a min of 9 FPS - which anyone with common sense can see is unrelated to the CPU. A fucking Athlon X2 from 2006 can get higher mins than 9 FPS. So common sense will say it's an outside influence. Software/configuration/bios etc.

For all we know, it could have been a driver conflict with the Nvidia card tested.
 
Nope, but you're a fanboy bud. If you go over [H] to see the article, there're other similar cases. The architecture blows end of story. Grow up.

So anybody that doesn't go exclusively with Nvidia and Intel is a fanboy right? That's all I've gathered from your trolling of all AMD threads. Thanks for clearing that up.
 
This isnt a bad product, and I do recall MS releasing Patches in the past for both Intel and AMD due to new CPU/Motherboard Chipset designs...
 
You have a problem reading. We know the architecture blows in single threaded games - I get that.

But I'm saying that ONLY Hard OCP shows ARMA 2 performing a min of 9 FPS - which anyone with common sense can see is unrelated to the CPU.

First of all, [H]'s article is extensive and one of the best out there. And if you look at "average", Sandy has almost double the frames. Not just in ARMA, but in some other titles as well. You also have to consider that these (games in test) are well threaded.

So anybody that doesn't go exclusively with Nvidia and Intel is a fanboy right? That's all I've gathered from your trolling of all AMD threads. Thanks for clearing that up.

Trolling? Is that what you get out of it? Take off your AMD shirt, buddy, I post facts. And no, it has nothing to do with going with whatever brand. Brand fanboyism is stupid. I recommend AMD GPUs to people more than Nvidia ones (due to their bang for buck). But I'm not sure you're grown enough to realize that.
 
Instead of posting others' results, post your own.

That's all I can say.

Don't ask me to, because I'm on 8+ year old hardware running Win 7 32-bit.

Besides, is the 2700K truly superior to the 2600K or 2500K, other than just a model number change? I noticed Intel is set to release a 2550K or 2650K shortly...
 
First of all, [H]'s article is extensive and one of the best out there. And if you look at "average", Sandy has almost double the frames. Not just in ARMA, but in some other titles as well. You also have to consider that these (games in test) are well threaded.



Trolling? Is that what you get out of it? Take off your AMD shirt, buddy, I post facts. And no, it has nothing to do with going with whatever brand. Brand fanboyism is stupid. I recommend AMD GPUs to people more than Nvidia ones (due to their bang for buck). But I'm not sure you're grown enough to realize that.

Your posts are arrogant and aggressive, so yes, you are trolling, and it's not the first time either. For someone who hasn't been here long, you won't do yourself any favours if this is your general way of communicating with others. You need some people skills. Get off your big hard Intel e-peen for a day, step out and get some sunlight instead of jerking off to the internets and acting like your opinion matters, because sorry to burst your bubble, it doesn't. You felt the need to come in here and thread-crap when it's not about BD vs. Intel, and you dare to call me an AMD fanboy? Check my specs :slap:
 
First of all, [H]'s article is extensive and one of the best out there. And if you look at "average", Sandy has almost double the frames. Not just in ARMA, but in some other titles as well. You also have to consider that these (games in test) are well threaded.

I'm not faulting Hard OCP's article as a whole. I think they are one of the best unbiased writers out there.

Compare ARMA 2's with Bulldozer result to any other hardware website and you won't see a min of 9FPS.

What I'm saying is that the ARMA 2 test is probably a bad run, influenced by a third-party factor. E.g. you can run 3DMark 1,000 times and eventually a driver, a bad configuration or some weird fault within Windows will have the result coming up short at least a few times.
 
besides is the 2700K truly superior to the 2600K or 2500K other than just a model number change? I noticed Intel is to release a 2550K or 2650K shortly...

http://www.eteknix.com/reviews/processors/intel-core-i7-2700k-flagship-showdown-revie/

http://www.eteknix.com/reviews/processors/intel-core-i7-2700k-flagship-showdown-review/3/

Your posts are arrogant and aggressive, so yes, you are trolling, and it's not the first time either. For someone who hasn't been here long, you won't do yourself any favours if this is your general way of communicating with others. You need some people skills. Get off your big hard Intel e-peen for a day, step out and get some sunlight instead of jerking off to the internets and acting like your opinion matters, because sorry to burst your bubble, it doesn't. You felt the need to come in here and thread-crap when it's not about BD vs. Intel, and you dare to call me an AMD fanboy? Check my specs :slap:

No, I'm not. You don't understand what "trolling" is. I'm being pushy, not trolling. If I'm being flat out rude, then it'd at best be my arrogance, but not trolling. I actually have been reading here far longer than your join date, so no need to belittle me. As for my Intel e-peen, sorry, not going to bother.

Why do you think I called him a fanboy? Get some reading comprehension.
 
Trolling? Is that what you get out of it? Take off your AMD shirt, buddy, I post facts. And no, it has nothing to do with going with whatever brand. Brand fanboyism is stupid. I recommend AMD GPUs to people more than Nvidia ones (due to their bang for buck). But I'm not sure you're grown enough to realize that.

You've just proven my point. You call me a fanboy for pointing out that you run into a thread, get proven wrong 90% of the time, and call people fanboys whenever you get proven wrong. All while ignoring the fact that my next CPU upgrade comes in May and doesn't have AMD written on it, and I'm unsure of what will be printed on my next GPU.

They say ignorance is bliss so you must be a very jovial person in real life.

Anyway, I'm done dragging myself down to your level to grunt in your language. Have fun.
 
You've just proven my point. You call me a fanboy for pointing out that you run into a thread, get proven wrong 90% of the time, and call people fanboys whenever you get proven wrong. All while ignoring the fact that my next CPU upgrade comes in May and doesn't have AMD written on it, and I'm unsure of what will be printed on my next GPU.

They say ignorance is bliss so you must be a very jovial person in real life.

Anyway, I'm done dragging myself down to your level to grunt in your language. Have fun.

No, actually, I'm right. The CPU sucks. You're wrong and can't take it due to your love of AMD. I'm not going to comment on those percentages you make up, or on my ignorance, because you're seeing things red and blue when they are not.
 
http://www.eteknix.com/reviews/processors/intel-core-i7-2700k-flagship-showdown-revie/

http://www.eteknix.com/reviews/processors/intel-core-i7-2700k-flagship-showdown-review/3/



No, I'm not. You don't understand what "trolling" is. I'm being pushy, not trolling. If I'm being flat out rude, then it'd at best be my arrogance, but not trolling. I actually have been reading here far longer than your join date, so no need to belittle me. As for my Intel e-peen, sorry, not going to bother.

Why do you think I called him a fanboy? Get some reading comprehension.

Troll: a troll is someone who posts inflammatory, extraneous, or off-topic messages.

Now, Mr. "I post facts", where does Intel vs. BD come into play in this thread? So yes, you are trolling; posting off-topic benches for what reason? Then you go shouting fanboy. Do me a favour: there's only one fanboy in this thread. Your posts have been reported. I suggest you wind it in unless you want to end up banned here like you have been from the EVGA forums and god knows where else.
 
I just read up on it. It seems that at stock the 2700K leads by only 0.3 points, but overclocked they're the same; not much difference at all.

I guess you got it because of the higher stock clock, which means not having to overclock it as far, to an extent.

I'm wondering if your P67 board is holding your model back vs. the Z68 series.

So how much did you drop on it vs. a 2600K, if I may ask?

 