
AMD FX-8150 3.60 GHz with Windows Patches

Meh, I'll stick to what I have. It serves me very well.
 
Thanks. Been waiting for the TPU BD review for ages...

But an outstanding review it is, so it was worth the wait.

I've been extensively testing my FX-4100 for the last couple of months, but only last night did I really start juicing it to see what it can do at 5 GHz, as I'm not exactly impressed with what it can do at 4.5 GHz (compared to my old Phenoms at 4.0 GHz).

I really had to juice it, needing nearly 1.65 V to remain stable. There was obviously quite a bit of heat, with stressed temperatures leveling out at about 65°C (4.5 GHz on 1.5 V barely hit 40°C).

What was interesting was that scaling seemed to improve dramatically at the higher clocks. I would normally get about 20 GFLOPS in LinX at 4.5 GHz, but that increased to about 28 GFLOPS at 5 GHz. The built-in benchmark in AOD soared to 8400 at 5 GHz, compared to about 7500 at 4.5 GHz. System responsiveness in Windows "felt" much faster as well.
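To put a number on that scaling, here's a quick sanity check in Python (just a sketch using the figures above; the comment at the end is my speculation, not something I measured):

```python
# Rough scaling check using the LinX numbers above: (GHz, GFLOPS).
results = {"4.5 GHz": (4.5, 20.0), "5.0 GHz": (5.0, 28.0)}

for label, (ghz, gflops) in results.items():
    print(f"{label}: {gflops / ghz:.2f} GFLOPS per GHz")

# Clock rose ~11% (4.5 -> 5.0) but throughput rose ~40% (20 -> 28):
# better-than-linear scaling, which hints that something besides the core
# clock (e.g. NB/HT or memory speed moving along with the overclock)
# changed between the two runs.
```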

So what I'm saying is that I think these BD chips were designed to run at much higher clocks than what we see right now. They seem to really like high frequency. I think that as the fab process improves we will see what these chips are really made of, without having to use obnoxious voltages and thermal solutions.

For now, I'm anxious to see what my current generation BD chip can do if I can improve upon my already excellent air cooling.

On a side note, I was also able to further tune my 2x4 GB of Ripjaws to 1800 CL7 1T last night.
 
I really hope that's 28 GFLOPS per core, as my Pentium Dual-Core E6600 @ 3.8 GHz gets ~24 GFLOPS, and my 4.5 GHz 2600K gets ~110 GFLOPS using AVX-enabled IntelBurnTest.
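For a rough per-core comparison, assuming all three figures are whole-chip totals (a sketch only; core counts are the chips' advertised ones):

```python
# Per-core GFLOPS, assuming the quoted figures are whole-chip totals.
chips = {
    "FX-4100 @ 5.0 GHz": (28.0, 4),   # 2 modules / 4 integer cores
    "E6600 @ 3.8 GHz":   (24.0, 2),   # Pentium Dual-Core, 2 cores
    "2600K @ 4.5 GHz":   (110.0, 4),  # AVX-enabled IntelBurnTest
}

for name, (gflops, cores) in chips.items():
    print(f"{name}: {gflops / cores:.1f} GFLOPS per core")
```

If that 28 GFLOPS is indeed a whole-chip number, it works out to about 7 GFLOPS per core.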
 
I'm pretty sure a Phenom II X4 can reliably beat an FX-4100, since the Phenom II X4 will never run into shared-resource contention the way the BD chip does. Per-thread performance on Phenom II always seems to beat Bulldozer.
 
That's what Bulldozer was designed for: to share resources. Now what AMD needs to do is speed it up so it won't get bogged down in performance.

Picture a four-lane freeway narrowing to two lanes without warning. That is how Bulldozer seems to work in terms of sharing its resources. I believe Piledriver will resolve this issue, perhaps not as much as we would like, but enough to give it a nice 20% to 30% performance boost over the current Bulldozer, IMO. :D

That new CEO is playing it safe and keeping everything behind closed doors. Once they achieve the desired performance, that is when they will release the big guns.
 
Great review, finally!!!

But I know what's going to happen with this thread: the expected.
 
I'm fully aware that's what it was designed for, and I consider it a flawed design. I'm also curious why people keep saying PD will be a 10/15/20/30/50% performance increase over Bulldozer. Other than AMD saying that's what they were shooting for, is there any concrete evidence or explanation as to how this will be accomplished?
 
A 32 nm process revision along with modifications to the L2 & L3 caches, branch prediction, and so on.
It's not a bad design; AMD's only mistake was relying heavily on automated design tools instead of the detailed hand work they did in the past.
 
The L2 and L3 caches on BD CPUs are, I heard, awful by today's standards, but I have doubts there will be any real improvements on the manufacturing front. Their real flaw was assuming that just because they made multithreading a huge focus, it would happen overnight. A huge number of tasks are still single-threaded, and thus most people will see a huge benefit from stronger per-thread performance. If you are never using more than 4 threads, what good is being able to run 8 when each of the 4 you do use runs at only 66%?
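A minimal sketch of that arithmetic (the 66% per-thread figure is just the illustration from above, not a measurement):

```python
# Aggregate throughput = threads the workload can use x per-thread speed.
def throughput(workload_threads, cores, per_thread):
    return min(workload_threads, cores) * per_thread

for n in (1, 2, 4, 8):
    wide = throughput(n, 8, 0.66)    # 8 weaker cores (BD-style)
    narrow = throughput(n, 4, 1.00)  # 4 stronger cores (Phenom-style)
    print(f"{n} threads: 8 x 0.66 -> {wide:.2f}, 4 x 1.00 -> {narrow:.2f}")

# Up to 4 threads the 4 stronger cores win every time; only a fully
# threaded load lets the wider chip pull ahead (5.28 vs 4.00 at 8 threads).
```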
 
I really can't understand why all those Intel fanboys use this thread for flames and beef against a totally different CPU architecture.
It's the same as benching all the Nvidia-sponsored games with an ATI card...
If there weren't any competition, there wouldn't be any price drops or new innovations...


Get a life and bitch somewhere else.

I'm running a fully AMD system, everything works fine in apps/games, and there might be a marginal performance benefit on Intel,
but you can't tell me that you feel or even see it (not bench related).

Note to the OP: thanks for the review.
 
I still don't get why you downclock the memory on the Intel side and only use 2 sticks on the 1366. It makes it less realistic, IMO.
 
I don't downclock anything. Intel supports DDR3 up to 1333 MHz and that's a fact. Anything above that is in fact overclocking and would not give valid results.

As for the LGA1366 setup, please read again:
LGA1366
3 x 2048 MB MUSHKIN BlackLine FrostByte PC3-12800 DDR3
@ 1333 MHz 7-7-7-21 (limited to 4GB)

I'm using 3 sticks of 2 GB, 6 GB in total. Since all the other platforms can't have that kind of configuration, LGA1366 is then limited in Windows to use just 4 GB, to be on the same level as the other platforms. Triple channel is still used, with all of its benefits; it just means Windows can't address more than 4 GB.
 
I'm (if this was directed towards me) not flaming AMD because it's AMD. I'm pointing out that BD has a lot of very real shortcomings. If you choose to ignore them and continue to support the product/company, more power to you, but I will always go where the best overall price/performance is. In the Athlon XP/Athlon 64 days I bought AMD because it was the best bang for the buck. These days Intel seems to offer that top-notch price/performance (in the realm of gaming and everyday use).

AMD and their marketing brought all of the criticism they receive on themselves. You don't bring back a prolific name like the FX series, known historically for being the most powerful CPUs in their class, release something that struggles to keep up with the competition, and expect people to just accept it. If you bought a season ticket from the NBA, and when you got to the stadium it was just a bunch of high school kids playing, wouldn't you be a little steamed?

I am only discussing what AMD needs to do to remain competitive, and what would benefit consumers the most. When BD was on the horizon and the FX name was announced, people started jumping for joy because it was gearing up to be a game changer. Then ES benches leaked, and they were "clearly fakes" because it performed somewhere between awful and above average. Then BD launched, and the early samples turned out to be pretty damn accurate, so it just had to be that Windows was poorly optimized, or the BIOSes were wrong, or the scheduler was broken. Then it became that BD was never really supposed to be that good; it was based on server architecture anyway. Now that all of that has been debunked and proven inaccurate, it's that Piledriver is the real product to look out for, and it's going to offer a 5-75% performance boost over its little brother!

The nonsense can go both ways.

I just hope AMD can figure it out and offer a product like Llano or Trinity that does an excellent job at what it's intended to do. I am hoping to get a Trinity laptop for a reasonable price when they come out and get some light gaming done when I'm out of the house. Should be good.

Oh, and I agree, it's an excellent review.
 
I don't buy entry-level hardware, so I don't care who offers more in that price range. I know that AMD mostly dominates the entry level and low midrange; as you go higher and higher, AMD starts to fade away quickly.

Actually, you'd be surprised by the Sandy Bridge Pentiums; think i3 minus Hyper-Threading. Great chips. Wish I had gone with one of them over the Llano A8-3850 for my girlfriend's build. Add a cheapo discrete GPU and it's still cheaper than Llano.


The only word I understood in your post. Seriously, English, dude; we speak English here.

wtf does that even mean??? :confused:

HAHAHAHAHAAA :roll::roll::roll::roll::roll:

I can genuinely tell you with 110% certainty that I see and feel the difference with a 2500K as opposed to AMD's next-best offering. Games, audio production, you name it. Not all games are GPU-restricted. *cough* Skyrim *cough*
 
I say it's not realistic because people who build their own rig are not going to buy 1333 MHz RAM, and I'm sure overclocking isn't supported by AMD or Intel, but I guess Intel does cover it now with that RMA plan.

And who still uses a 32-bit OS when 8+ GB is so cheap... I did not read what OS was used, lol.

So I'm just giving my opinion ;)
 
'Nuff said.
Don't give your opinions on something you didn't bother to read.
 
I said I didn't read which OS you used... wow, go cry to your mommy. I guess I just won't look at your lame OEM reviews.
 
Woohoo, now you're showing off with the benefit of mighty :respect: INTEL.

I wouldn't be surprised if all that stuff is based on books and tools like these:

http://www.intel.com/intelpress/programming.htm?iid=prodmap_tb+prog

or

http://software.intel.com/en-us/intel-sdp-home/


hmmmmm :D FX-8150 Gameplay

http://www.hardocp.com/article/2011/10/11/amd_bulldozer_fx8150_gameplay_performance_review/2
 
I don't like where this thread is going. The bickering and flaming stops now. Infractions will be handed out without warnings from this point forward. People getting so damn emotional over silicon. Keep it civil, people.
 
The F16 vs F22 comparison was used just to make a point. Please don't troll about it.

I put my opinion across; I'm not here to wind you or others up, unlike some in this thread.


Though I appreciate what you're saying, and I agree it's not a get-out-of-jail card for AMD, I feel the single-threaded performance is not as bad or as important as some imply. It's not got a leg blown off; more a slight limp, to me. :)

And in games, where it matters most to most people, all the BS spouted is just that: it performs close to (mostly between) Intel's 2500K and 2600K, which isn't too bad for a whole new architecture, not a million miles below as many are implying.

And as for future tech, your ideas, though reasonable, don't quite sound right to me. When PhysX was new it wasn't, and still isn't, used by many dev companies for much of anything, but that doesn't make it less valuable as an idea, or not worth putting in. Some would argue, as Intel did in the beginning, that it was pointless since CPUs did enough work (that was then, eh).

It's an actual fact that BD has ops that aren't and can't be used by most software yet, but that does not mean they are worthless or shouldn't be worked with; 256-bit AVX extensions are an example. I'm not a software tech/writer, but I can appreciate when something will have a use. Just as GPUs are now wielding their additional features, one day so might BD, and then it may gain another 2% :p or maybe even the 50% :eek: AMD stupidly claimed.
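For anyone curious which of those extensions their own chip exposes, here's a minimal sketch (Linux-only Python, my own illustration rather than anything from this thread); xop and fma4 are the Bulldozer-specific flags alongside avx, using the names the kernel reports in /proc/cpuinfo:

```python
# Read the CPU feature flags the Linux kernel reports and check for the
# instruction-set extensions discussed above.
flags = set()
with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith("flags"):
            flags = set(line.split(":", 1)[1].split())
            break

for ext in ("sse4_2", "avx", "xop", "fma4"):
    print(f"{ext}: {'present' if ext in flags else 'absent'}")
```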

Why do I always see these after I post? I did try to keep a civil head on, though. :)

If you could OC a chip to 20 GHz but it performed as well as a toaster, would you still love it? Because that's basically the logic you're implying.
No, what I implied was: Intel's chips are easy to OC, a noob can do it, which some say is good, and I see that point. But AMD's have more to mess with and fine-tune (fact), so someone who can overclock, and enjoys it, will have more fun with AMD.

Oh, and yeah, at least once, just to see. :)


As I said, I played it safe and got a 960T, but I'm not on here permanently giving AMD shit, despite them annoying me a bit with their BS PR, etc.
 
I understand your opinion and respect it. But in the end it all comes down to facts.
And the fact is, the review is about a specific product (the FX-8150), and as such it didn't live up to either needs or expectations.

If I were to review just the Bulldozer architecture, presented on a piece of paper, I'd give it a round of applause ;)

Perhaps we will see more from Piledriver; perhaps we'll be disappointed again. I think it's futile to even begin a discussion about the future and what it will bring.
 
I for one had great hopes for Bulldozer. My early builds were all AMD, but when I got back into it after some time off, I was saddened to see Intel was the better bang for the buck.

Bulldozer was my hope for a reversal of that. I was disappointed. And promises of fixes down the road were even more unpleasant.

All you need to do is look at the chart in the link below and you will see that even a 2500K is a much better buy than an FX-8150. Of course, this is according to one reviewer's results, but it pretty much agrees with what I read elsewhere.

So I went with Sandy Bridge when I needed to upgrade, and got my 2500K for $180. I think it was a really good buy considering the options.

http://techreport.com/articles.x/21813/19
 