
Hardware Unboxed benchmarks: Ryzen 3 vs. Core i5 2500K vs. FX-8370

Honestly, after watching this video I think AMD FX users should just die at this point.

Well, the FX8350 was released in 2012 and was around €150 for the most part ... Considering that in modern games it seems to be only slightly below the Ryzen 3 CPUs, which are €100-130... I would actually say the FX8350 has held up pretty well, given the general opinion of them.
 
Then it's even funnier that he spoke about the FX 8350 as an "icon". :laugh: An icon of low IPC and poor efficiency, or what?

Exactly! But seriously, this CPU does seem to be quite iconic, because so many people are using it in their gaming rigs and boasting about it like it's some kind of wonder chip! So many people are selling used computers with this CPU alongside stronger/newer components, components better suited to faster processors, and that just bothers me, because certainly not all of those people understand that this CPU cannot keep up with video cards faster than a GTX 970.

I kinda like how detailed AdoredTV tries to be in his videos, but that said, he's still AMD-biased to put it lightly, although he likes to say he isn't. Dunno how the hell he came up with the conclusion that the FX 8300 was a better buy than the 2500K in the long run because of how multithreaded games are these days. The video clearly shows the FX 8 getting beaten soundly when running a GTX 1080.

I agree with both of you about AdoredTV. He is a smart guy, but his "conclusions" go nowhere, and watching his videos I sometimes feel as if he does not know what he is trying to prove to people with all of his research and thoughts; he tries to stay unbiased, but fails.

Still playing games at 1080p/60 fps on the 5 year old FX. I will switch to Ryzen some time in the future, but not for the sake of gaining more performance in games, because really, I won't, not at that resolution and framerate.

Don't take this personally, but your CPU was a weakling 5 years ago and still is today.

This is a benchmark I made 5 years ago showing how bad the Piledriver FX-6300 is compared to the Nehalem Core i7 920 when both are clocked at 4 GHz:

https://www.techpowerup.com/forums/...vs-bulldozer-vs-piledriver-benchmarks.177251/

Take a look at the Homefront, Hard Reset, World in Conflict, Serious Sam 3 and Lost Planet 2 minimum FPS - they will tell you that you should have replaced that CPU a long time ago. But hey, if you are an AMD fan, then respect for keeping it so long, but now is really the time to move on to Ryzen!

The funny thing about FX is, they needed over 5, yes FIVE, years to reach a higher IPC with it than with Phenom II! Phenom II was released in early 2009, and only with Steamroller (2014, 3rd gen FX) did they have higher IPC. Piledriver was actually on par, and Bulldozer had LOWER IPC than Phenom II. In the end, they gambled on "coaaarss, mooooar coaars" and it was an utter failure, because almost nobody cared about having so many cores that early.
I was "forcing" this "dogma" to people for a pretty long time:toast:

Well, the FX8350 was released in 2012 and was around €150 for the most part ... Considering that in modern games it seems to be only slightly below the Ryzen 3 CPUs, which are €100-130... I would actually say the FX8350 has held up pretty well, given the general opinion of them.

What? Are you pretending or what?:confused::confused::confused:

It is not slightly below Ryzen 3, it is way worse than the Ryzen 3 processors! Honestly... the Core i5 2500K is slightly below the Ryzen 5 1400, but the FX-8350 is much worse! Did you even watch the Hardware Unboxed video? Yes, for a 5 year old CPU it held up quite well thanks to its high clock and many cores, but its IPC was weak in the first place.
 
What, did I hear that right, the FX is overclocked to 4.4 GHz? Are you serious, when the boost clock is already 4.3 GHz?

 
Certainly not all of them can do 1080p/60 on your FX. I'd wager most of them can't; you're just talking big. Hundreds of tests prove it can't keep up. It'd get destroyed in any open-world game like GTA 5 or Watch Dogs 2. Hell, my 4.9 GHz 2500K could not do a perfect 60 fps @ 1080p paired with a 290X Tri-X back in 2014.
I just disagree. I still game at 4K, mostly with settings tuned to accommodate 60 fps, and there are few games where I have to turn settings down. GTA V I can run at very high settings at 60 fps @ 4K. My 8350 does limit me for sure, I'd get more for sure, but it's no nightmare to use.
 
Don't take this personally, but your CPU was a weakling 5 years ago and still is today.

This is a benchmark I made 5 years ago showing how bad the Piledriver FX-6300 is compared to the Nehalem Core i7 920 when both are clocked at 4 GHz:

https://www.techpowerup.com/forums/...vs-bulldozer-vs-piledriver-benchmarks.177251/

Take a look at the Homefront, Hard Reset, World in Conflict, Serious Sam 3 and Lost Planet 2 minimum FPS - they will tell you that you should have replaced that CPU a long time ago. But hey, if you are an AMD fan, then respect for keeping it so long, but now is really the time to move on to Ryzen!

You and that other dude just simply don't understand what I'm saying. I game at 1080p/60 Hz. If I were to switch to Ryzen or something faster I would certainly get higher framerates, but what good would that be if, as soon as I turn V-sync on, all that performance becomes dormant and I get a marginally better result at best? There is a very good reason I stuck with this CPU for so long: it gets the job done. Hell, it was probably a better investment than every other PC part I ever bought, probably even more so than the 1060 that I have, which I'm sure is going to become useless in the next 2-3 years.

Actually, I wouldn't even call it an investment; I got it for dirt cheap in reality. And you know why that is? Because of people pushing others to buy unnecessary and overpriced CPUs at the time. So in one way I am actually thankful for people like you. Thank you for making perfectly adequate hardware cheap by underestimating it. :)

Point is, I will upgrade to Ryzen soon, but no way in hell will I do that just for gaming, because gaming is just fine; I have other uses for a faster CPU now.
 
I happen to own both an 8370 and a 2600K - both chips and setups overall do what I expect of them.
TBH I'm not much of a gamer, and frankly it's been a long time since I've even purchased a game, let alone played online, but that's just me.

For what I do the 8370 is quite adequate and serves its purpose just fine, and so does the 2600K. My purpose is competitive OC'ing, so I don't have to worry a lot about what many see as important; for me it's more along the lines of getting what I can from different pieces to score points, and yes, the 8370 chips score points too.
I'm not questioning the performance of an 8370 vs a 2600K or even a 2500K, I'm just saying that if it's working to your satisfaction then it's all good, regardless of other opinions about it.
Do as you will, because I certainly do.
 
I happen to own both an 8370 and a 2600K - both chips and setups overall do what I expect of them.
TBH I'm not much of a gamer, and frankly it's been a long time since I've even purchased a game, let alone played online, but that's just me.

For what I do the 8370 is quite adequate and serves its purpose just fine, and so does the 2600K. My purpose is competitive OC'ing, so I don't have to worry a lot about what many see as important; for me it's more along the lines of getting what I can from different pieces to score points, and yes, the 8370 chips score points too.
I'm not questioning the performance of an 8370 vs a 2600K or even a 2500K, I'm just saying that if it's working to your satisfaction then it's all good, regardless of other opinions about it.
Do as you will, because I certainly do.

I totally agree and +1 to your thought.

And I will also add price to performance: in my case I bought the FX-8350 for €100, and that is an excellent price-to-performance ratio.

PS: and at that time an i3 was €55 more expensive.
 
What? Are you pretending or what?:confused::confused::confused:

It is not slightly below Ryzen 3, it is way worse than the Ryzen 3 processors! Honestly... the Core i5 2500K is slightly below the Ryzen 5 1400, but the FX-8350 is much worse! Did you even watch the Hardware Unboxed video? Yes, for a 5 year old CPU it held up quite well thanks to its high clock and many cores, but its IPC was weak in the first place.

I was going on this review, generally. Also thank you for agreeing with the basic point. :p
 
What, did I hear that right, the FX is overclocked to 4.4 GHz? Are you serious, when the boost clock is already 4.3 GHz?

While I do agree that 4.4 GHz is a bare-minimum overclock and actually sounds like a joke of an OC, the general picture would not change even if the FX-8370 were overclocked to 5 GHz - if the architecture has weak IPC in the first place, no overclocking will lift its performance over Ryzen 3 in games. Take a look at the Ryzen 7 1700 - a 300 MHz overclock from its 3.7 GHz turbo clock yields great results, matching a stock Core i7 7820X in games (Hardware Unboxed benchmarks) - that's good IPC for you, unlike the entire FX series.
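To put some rough numbers on that clock-vs-IPC point, here is a minimal back-of-the-envelope sketch; the relative IPC figures are illustrative assumptions, not measured values:

```python
# Back-of-the-envelope: per-core game performance ~ relative IPC x clock.
# The relative IPC numbers below are illustrative assumptions, not benchmark results.
cpus = {
    "FX-8370 @ 5.0 GHz (OC)":  {"rel_ipc": 0.60, "clock_ghz": 5.0},
    "Ryzen 3 1300X @ 3.7 GHz": {"rel_ipc": 1.00, "clock_ghz": 3.7},
}

for name, spec in cpus.items():
    score = spec["rel_ipc"] * spec["clock_ghz"]
    print(f"{name}: relative per-core throughput ~ {score:.2f}")

# Even with the big clock advantage, the low-IPC chip ends up behind (3.0 vs 3.7 here),
# which is why overclocking alone can't close the gap in games.
```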

I was going on this review, generally. Also thank you for agreeing with the basic point. :p

Yes, then I will also correct myself: it is in games that Ryzen 3 is much faster than the FX-8350, not in programs. I also know that the FX-8370 is faster than a Core i5 3570K/4670K in programs.
 
While I do agree that 4.4 GHz is a bare-minimum overclock and actually sounds like a joke of an OC, the general picture would not change even if the FX-8370 were overclocked to 5 GHz - if the architecture has weak IPC in the first place, no overclocking will lift its performance over Ryzen 3 in games. Take a look at the Ryzen 7 1700 - a 300 MHz overclock from its 3.7 GHz turbo clock yields great results, matching a stock Core i7 7820X in games (Hardware Unboxed benchmarks) - that's good IPC for you, unlike the entire FX series.

Yes, then I will also correct myself: it is in games that Ryzen 3 is much faster than the FX-8350, not in programs. I also know that the FX-8370 is faster than a Core i5 3570K/4670K in programs.
There is no point in talking about IPC, as there is no reason to OC a 2500K against the FX.
 
Still playing games at 1080p/60 fps on the 5 year old FX. I will switch to Ryzen some time in the future, but not for the sake of gaining more performance in games, because really, I won't, not at that resolution and framerate.
It is a CPU suited for 60 fps, but not anything higher than that. And as the other guy tried to point out: you'd have higher minimum/average FPS by switching, even with 60 FPS as the only target.
Exactly! But seriously, this CPU does seem to be quite iconic, because so many people are using it in their gaming rigs and boasting about it like it's some kind of wonder chip! So many people are selling used computers with this CPU alongside stronger/newer components, components better suited to faster processors, and that just bothers me, because certainly not all of those people understand that this CPU cannot keep up with video cards faster than a GTX 970.
People who are still blinded by "cooooaaaars", simply by the numbers. Typically those people are also AMD fanboys, because nobody else believes the FX to be good. I mean, it's not a real 8-core CPU; it's only a real 8-core CPU for specific tasks that need integer work only, not for games. For games it is, in fact, only a real QUAD core CPU. Now, a quad core CPU coupled with low IPC - what's the use of that? Right. Core 1st gen easily had more IPC than Phenom II, Phenom II had more IPC than Bulldozer, and Sandy Bridge has 20% more IPC than Nehalem. So there's at least a 30-40% IPC disadvantage for the FX 8350 here. 30-40%! And that's me being optimistic and only talking about Nehalem and Sandy Bridge. ;)
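Chaining those per-generation gaps together shows how the deficit compounds; the individual ratios below are rough assumptions based on the figures mentioned in this thread, not measurements:

```python
# Compound the per-generation IPC gaps mentioned above.
# All ratios are rough, assumed figures for illustration only.
sandy_vs_nehalem      = 1.20  # the ~20% Sandy Bridge gain quoted above
nehalem_vs_phenom2    = 1.12  # assumed gap ("Core 1st gen easily had more IPC than Phenom II")
phenom2_vs_piledriver = 1.00  # Piledriver roughly on par with Phenom II, per the earlier post

sandy_vs_piledriver = sandy_vs_nehalem * nehalem_vs_phenom2 * phenom2_vs_piledriver
print(f"Sandy Bridge vs. Piledriver IPC: roughly {(sandy_vs_piledriver - 1) * 100:.0f}% ahead")
# With these assumptions the compounded gap lands in the ~30-40% range claimed above.
```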
I was "forcing" this "dogma" to people for a pretty long time:toast:
Sometimes people don't want to see the truth, but I guess most of the smarter people did. That said, the FX 8350 was simply a server CPU to me, not at all suited for normal users. Now take Ryzen as a counter-example - it is originally a server CPU, but because of its high IPC and real core design with SMT, it is still very well suited for e.g. gamers.

In the end, what AMD did is kind of "copy" the Intel arch, chop away some limitations like utterly big monolithic CPU designs, and increase core counts dramatically while still keeping very good yields - something Intel put down as being "glued together" because they were too stupid to think of this ingenious design first, which solves the problem of ever-growing core counts and falling yields at the same time. On top of that they added features like being able to clock the CPU in 25 MHz steps, among other things like the Neural Net Prediction stuff - and let's not forget its great efficiency, which is higher than any Intel product's. Ryzen in general is a big success.

Now, they maybe need to change something at RTG (in other words: fire some people who failed and replace them with better ones) and they will do better there as well. Ryzen is a success because genius people were at work there; I don't see the same at RTG. Either that, or RTG simply lacks the money to keep up with the competition that is Nvidia. But honestly, what's so complicated about designing a GPU Boost feature that has the lowest possible voltage in mind? Nvidia has had it since 2012, and AMD only now got a turbo function in through Vega, but it still lacks the auto voltage tuning ability, which is very important for efficiency and one of the main reasons why Nvidia has won every efficiency battle since 2012.
 
But honestly, what's so complicated about designing a GPU Boost feature that has the lowest possible voltage in mind?

A not-so-great manufacturing node prevents them from doing that. I suspect GloFo simply cannot deliver what AMD wants for their GPUs, hence they can't have fine-grained voltage/frequency control and thus have to overvolt their cards in order to get reliable characteristics out of them. Regardless of the reason, it's clear that it is a limitation of the silicon itself.
 
A not-so-great manufacturing node prevents them from doing that. I suspect GloFo simply cannot deliver what AMD wants for their GPUs, hence they can't have fine-grained voltage/frequency control and thus have to overvolt their cards in order to get reliable characteristics out of them. Regardless of the reason, it's clear that it is a limitation of the silicon itself.
It has nothing to do with that. Users can downvolt the cards by hand and achieve far greater efficiency than stock; AMD simply lacks the software to do it automatically. GloFo or Samsung 14nm LPP being bad is a myth. Nvidia produced the GTX 1050 Ti on 14nm Samsung and it's a great GPU, running perfectly fine, like all the other 1000-series GPUs produced at TSMC on 16nm.
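For a sense of why manual undervolting pays off so much, to a first order dynamic power scales with V² × f; this minimal sketch uses made-up example numbers, not measured Vega figures:

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# All numbers below are made-up examples, not measured Vega figures.
def dynamic_power(volt, clock_mhz, ref_power_w, ref_volt, ref_clock_mhz):
    """Scale a reference board power by (V/V_ref)^2 * (f/f_ref)."""
    return ref_power_w * (volt / ref_volt) ** 2 * (clock_mhz / ref_clock_mhz)

stock       = dynamic_power(1.20, 1500, ref_power_w=290, ref_volt=1.20, ref_clock_mhz=1500)
undervolted = dynamic_power(1.05, 1500, ref_power_w=290, ref_volt=1.20, ref_clock_mhz=1500)
print(f"stock: ~{stock:.0f} W, undervolted at the same clock: ~{undervolted:.0f} W")
# (1.05/1.20)^2 ~ 0.77, i.e. roughly a quarter less dynamic power at the same clock -
# the kind of gain people see when they dial the voltage down by hand.
```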
 
It's not maxed out like these seem to be, though... in that they are running out of their efficiency range and seemingly at the cusp of the point of diminishing returns.
 
It has nothing to do with that. Users can downvolt the cards by hand and achieve far greater efficiency than stock; AMD simply lacks the software to do it automatically. GloFo or Samsung 14nm LPP being bad is a myth. Nvidia produced the GTX 1050 Ti on 14nm Samsung and it's a great GPU, running perfectly fine, like all the other 1000-series GPUs produced at TSMC on 16nm.

It's just not that simple; different architectures. Nothing suggests you can make those GPUs on that node and be able to get the same characteristics out of them.

Die sizes are important too. It could be that they can achieve that quality of silicon at the expense of lower yields; in the case of GP107 it wouldn't be an issue since it's a small chip. Vega is already massive, and it could be that binning it even further would make it unfeasible.

Bottom line, this is a complicated matter. To me, the fact that they grossly overvolt their cards suggests a clear issue with the quality of the silicon they use. In order for them to create said software they need very consistent characteristics on their chips, and they are clearly not getting that.
 
The 64 Air is maxed out at what's possible with air and a 300 W TDP, and the Liquid is maxed out at a 350 W TDP - so yes indeed, it is maxed out. ;) The same is true for Hawaii, Fiji especially, and Tahiti. Pretty much any big AMD GPU ever released.

@Vya Domus : It still has nothing to do with that. Nvidia could perfectly well do the same with GloFo and Samsung nodes and already proved it via the GTX 1050 Ti. You don't seem to read my posts; I advise you to do so. I won't explain the same thing over and over and over again, I have already done this in numerous threads. AMD having no auto voltage tuning feature is their big downside compared to Nvidia. It's a fact, seeing that users can downvolt and tune AMD cards themselves. Seeing that Vega achieves very high clocks further proves that 14nm GloFo/Samsung is perfectly fine and everything else is a myth.
 
And no card will ever reach the same voltages when they are manually undervolted by the user; it's a bad way to determine whether this is possible. You need reliable characteristics of the silicon to be known beforehand and the ability to manufacture GPUs according to those characteristics; you can't just plop such software onto any GPU and have it magically work.

AMD has overvolted their cards ever since the first iteration of GCN. They're doing it to get better yields, not because they are too stupid to make such a piece of software; they simply can't do it, by the looks of things. The quality of the silicon they use is at fault for the bad power consumption and the lack of fine-grained voltage control, not software. And this is a general rule.

This matter is definitely tied to the manufacturing process and what it can do. AMD either can't afford better quality silicon or GloFo is just simply not great. I lean towards the latter. You can look up some info about this: GloFo is consistently behind everyone else, and their equivalent nodes are always worse than what the competition can offer. If GloFo's 14nm node is so good, why isn't Nvidia switching to them entirely for their consumer cards?

GloFo is a hindrance to AMD, I am convinced of it. They get penalties from GloFo for not using their dies, for Christ's sake; they're like a leech. AMD will see no true success until they get rid of them, in my opinion.

Anyway, without any sort of reliable data about this we're just guessing at why they don't/can't do more advanced boost functionality. This is off topic though, so I'll end it here.
 
:roll: You're not getting my point, buddy. I know perfectly well what AMD is doing, and I even wrote about that in the posts you ignored or handpicked things out of to attack me on. And because I know what AMD is doing - doing it the old way - I know they lack an auto voltage tuning feature, which Nvidia has had implemented since 2012 on their GTX 600 series. A feature that sets them apart from AMD and gives them superiority in efficiency with ease. I have also described how the feature works numerous times on the TPU forum in different threads, so I don't care about your ignorance too much. The difference between TSMC / GloFo / Samsung, with the latter two supposedly being too bad, is a myth, long ago debunked by Nvidia proving that their architecture does perfectly fine on those nodes as well. How many times did I say this now? Three? AMD proved it themselves, because Vega is doing perfectly fine on that "inferior" node as well. :laugh: Vega, once undervolted, or let's say properly volted, is way more efficient. The "inferior node" is also able to give Vega very high overclocks, something which would be impossible with an "inferior node".

PS. The Polaris refresh is doing perfectly fine on 14nm Samsung/GloFo as well. Clocks are pretty high, and efficiency is high as well once undervolted (properly volted).
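To make the auto voltage tuning idea mentioned above a bit more concrete, here is a purely conceptual sketch of what such a pass could look like; it is not how Nvidia actually implements GPU Boost, and every number, step size and the stability check are made-up assumptions:

```python
# Conceptual sketch: for each clock step, settle on the lowest voltage that
# still passes a stability check. Everything here is an illustrative assumption.
def is_stable(clock_mhz: int, voltage_mv: int) -> bool:
    """Placeholder for a real stress/stability test at the given operating point."""
    # Assumed toy relation: higher clocks need proportionally more voltage.
    return voltage_mv >= 700 + 0.25 * clock_mhz

def build_vf_curve(clock_steps_mhz, v_min_mv=750, v_max_mv=1200, v_step_mv=25):
    curve = {}
    for clock in clock_steps_mhz:
        for v in range(v_min_mv, v_max_mv + 1, v_step_mv):
            if is_stable(clock, v):
                curve[clock] = v  # lowest passing voltage for this clock step
                break
    return curve

print(build_vf_curve(range(1200, 1601, 100)))
```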
 
to attack me on..

o_O

Attack you? For some odd reason you seem to take this personally. I am going to back off; I didn't know I looked so menacing.

Anyway, I said I'll end it here; it's off topic and I see no point in arguing about things that are nothing more than guesses/opinions.
 
Anyway, I said I'll end it here; it's off topic and I see no point in arguing about things that are nothing more than guesses/opinions.
I don't think so. But anyway, yeah let's end it. ;)
 
Hardware Unboxed, together with Gamers Nexus and Digital Foundry, are my favorite YouTube video reviewers. TechPowerUp, however, is still my favorite website reviewer... :lovetpu:

That being said, I absolutely love benchmarks which include old stuff tested against new stuff! So Steve from Hardware Unboxed did it once again - pitting the new Ryzen 3 processors against the legendary Core i5 2500K and the iconic FX-8370.

Honestly, after watching this video I think AMD FX users should just die at this point :D


Have fun!
bb-but 2500k isn't future proof because only 4 cores...
-2011 argument
 
bb-but 2500k isn't future proof because only 4 cores...
-2011 argument

I never said it is a future-proof CPU. Even I myself upgraded it to a Core i7 3770K in my secondary rig, because I had the chance to earn some money through component sales/trades. All I said was that in games the FX-8350 was a piece of shit CPU compared to the Core i5 2500K back in 2012, and it still is now. The mere fact that it loses to the Pentium G4560 in 18 games out of 20, with exceptions like Mafia 3 and Witcher 3, should tell the whole story of how ignorant one must be to have an FX-8350 for pure gaming now in 2017, especially since the release of the Pentium G4560, and even more so since AMD Ryzen.

Single-threaded performance is total garbage, and you've made it even worse by underclocking. I've helped people upgrade from FX 8320s over to Intel i5 2500s (dirt cheap here second-hand) and seen massive gains - they really are just terrible for gaming.


Yeah, but your i5 kicks the crap out of any FX series CPU.

Upgrading from one piece of shit to something that is a slightly lesser piece of shit is not a smart solution.
 
I never said it is a future-proof CPU. Even I myself upgraded it to a Core i7 3770K in my secondary rig, because I had the chance to earn some money through component sales/trades. All I said was that in games the FX-8350 was a piece of shit CPU compared to the Core i5 2500K back in 2012, and it still is now. The mere fact that it loses to the Pentium G4560 in 18 games out of 20, with exceptions like Mafia 3 and Witcher 3, should tell the whole story of how ignorant one must be to have an FX-8350 for pure gaming now in 2017, especially since the release of the Pentium G4560, and even more so since AMD Ryzen.

I think we have already established your hatred towards the FX 8350. Now, moving on, it's obvious that you don't buy an FX CPU today, so let's focus on what I said before: it all comes down to price to performance. If I pay €105 for it

and pair it with a GTX 970, which was released in 2014, and play games in 2017 with high settings @ 1080p at an average of 60 fps, then tell me why, at that time, I should have paid €50 more for an i3 or €90+ more for an i5.
 
I had an FX6300 machine for a couple of weeks and it ran games just fine and felt snappy in applications.
And I had an i7 960 at the same time, and the FX didn't feel any slower.
And I know how much slower it is compared to the i7.
It's just strange.
 
I think we have already established your hatred towards the FX 8350. Now, moving on, it's obvious that you don't buy an FX CPU today, so let's focus on what I said before: it all comes down to price to performance. If I pay €105 for it

and pair it with a GTX 970, which was released in 2014, and play games in 2017 with high settings @ 1080p at an average of 60 fps, then tell me why, at that time, I should have paid €50 more for an i3 or €90+ more for an i5.

At that time you made the logical choice! Now the FX-8350 makes no sense for €100, when I see used Core i5 2500Ks being sold for €60 second-hand and the new Pentium G4560 being sold for €60 first-hand. Yes, the FX-8350 can play all games at 60 FPS with a GTX 970/GTX 1060, but do not forget that the minimum FPS are not guaranteed with it like they would be with a Core i5 2500K, even more so with a Ryzen 5 1600. And yes, I have hated everything AMD has put out since the beginning of the FX "Bulldozer" line. And no, I am not an Intel fanboy; I loved the S939 AMD processors before Core 2 Duo came to town, and I do like Ryzen 7 now more than its Intel counterparts!
 