
Hardware Unboxed benchmarks: Ryzen 3 vs. Core i5 2500K vs. FX-8370

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
18,928 (2.86/day)
Location
Piteå
System Name Black MC in Tokyo
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Line6 UX1 + some headphones, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
VR HMD Acer Mixed Reality Headset
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
Honestly, after watching this video I think AMD FX users should just die at this point.

Well, the FX-8350 was released in 2012 and was around €150 for the most part... Considering it seems to be only slightly below the Ryzen 3 CPUs, which are €100-130, in modern games, I would actually say the FX-8350 has held up pretty well, given the general opinion of it.
 
Joined
Feb 22, 2009
Messages
762 (0.14/day)
System Name Lenovo 17IMH05H
Processor Core i7 10750H
Video Card(s) GTX 1660 Ti
Audio Device(s) SSL2
Software Windows 10 Pro 22H2
Benchmark Scores i've got a shitload of them in 15 years of TPU membership
Then it's even funnier that he spoke about the FX-8350 as an "icon". :laugh: An icon of low IPC and poor efficiency, or what?

Exactly! But seriously, this CPU seems to be quite iconic, because so many people are using it in their gaming rigs and boasting about it like it's some kind of wonder. So many people are selling used computers with this CPU paired with stronger/newer components, components better suited to faster processors, and that just bothers me, because certainly not all of those people understand that this CPU cannot keep up with video cards faster than a GTX 970.

I kind of like how detailed AdoredTV tries to be in his videos, but that said, he's still AMD-biased, to put it lightly, although he likes to say he isn't. Dunno how the hell he came up with the conclusion that the FX 8300 was a better buy than the 2500K in the long run because of how multithreaded games are these days. The video clearly shows the FX 8-core getting beaten soundly when running a GTX 1080.

I agree with both of you about AdoredTV. He is a smart guy, but his "conclusions" go nowhere, and watching his videos I sometimes feel as if he does not know what he is trying to prove with all his research and thoughts; he tries to stay unbiased, but fails.

Still playing games at 1080p/60 fps on the 5-year-old FX. I will switch to Ryzen some time in the future, but not for the sake of gaining more performance in games, because really, I won't, not at that resolution and framerate.

Don't take this personally, but your CPU was a weakling 5 years ago and it still is today.

This is a benchmark I made 5 years ago showing how bad the Piledriver FX-6300 is compared to the Nehalem Core i7 920 when both are clocked at 4 GHz:

https://www.techpowerup.com/forums/...vs-bulldozer-vs-piledriver-benchmarks.177251/

Take a look at the minimum FPS in Homefront, Hard Reset, World in Conflict, Serious Sam 3 and Lost Planet 2 - it will tell you that you should have replaced that CPU a long time ago. But hey, if you are an AMD fan, then respect for keeping it so long, but now is really the time to move on to Ryzen!

The funny thing about FX is that they needed over 5, yes FIVE, years to get higher IPC out of it than Phenom II! Phenom II was released in early 2009, and only with Steamroller (2014, 3rd-gen FX) did they have higher IPC. "Piledriver" was actually on par, and Bulldozer had LOWER IPC than Phenom II. In the end, they gambled on "coaaarss, mooooar coaars" and it was an utter failure, because almost nobody cared about having so many cores so early.
I was "forcing" this "dogma" on people for a pretty long time :toast:

Well, the FX-8350 was released in 2012 and was around €150 for the most part... Considering it seems to be only slightly below the Ryzen 3 CPUs, which are €100-130, in modern games, I would actually say the FX-8350 has held up pretty well, given the general opinion of it.

What? Are you kidding or what? :confused::confused::confused:

It is not slightly below Ryzen 3, it is way worse than the Ryzen 3 processors! Honestly... the Core i5 2500K is slightly below the Ryzen 5 1400, but the FX-8350 is much worse! Did you even watch the Hardware Unboxed video? Yes, for a 5-year-old CPU it held up quite well due to its high clock and many cores, but its IPC was weak in the first place.
 
Joined
Apr 12, 2013
Messages
1,192 (0.30/day)
Processor 11700
Motherboard TUF z590
Memory G.Skill 32gb 3600mhz
Video Card(s) ROG Vega 56
Case Deepcool
Power Supply RM 850
What, did I hear it right, the FX is overclocked to 4.4 GHz? Are you serious, when the boost clock is 4.3?

Al-Pacino-LaughSmoking.gif
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360 EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Certainly not all of them can do 1080/60 on your FX. I'd wager most of them can't, you're just talking big. Hundreds of tests prove it can't keep up. It'd get destroyed in any open-world game like GTA 5 or Watch Dogs 2. Hell, my 4.9 GHz 2500K could not do a perfect 60 fps @ 1080p paired with a 290X Tri-X back in 2014.
I just disagree. I still game at 4K, mostly with settings adjusted to accommodate 60 fps, and there are few games I have to turn settings down in. GTA V I can run at very high settings at 60 fps @ 4K. My 8350 does limit me for sure, I'd get more for sure, but it's no nightmare to use.
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Don't take this personally, but your CPU was a weakling 5 years ago and it still is today.

This is a benchmark I made 5 years ago showing how bad the Piledriver FX-6300 is compared to the Nehalem Core i7 920 when both are clocked at 4 GHz:

https://www.techpowerup.com/forums/...vs-bulldozer-vs-piledriver-benchmarks.177251/

Take a look at the minimum FPS in Homefront, Hard Reset, World in Conflict, Serious Sam 3 and Lost Planet 2 - it will tell you that you should have replaced that CPU a long time ago. But hey, if you are an AMD fan, then respect for keeping it so long, but now is really the time to move on to Ryzen!

You and that other dude simply don't understand what I'm saying. I game at 1080p/60 Hz. If I were to switch to Ryzen or something faster I would certainly get higher framerates, but what good would that be if, as soon as I turn V-sync on, all that performance becomes dormant and I get a marginally better result at best? There is a very good reason I stuck with this CPU for so long: it gets the job done. Hell, it was probably a better investment than every other PC part I ever bought, probably even more so than the 1060 I have, which I'm sure is going to become useless in the next 2-3 years.

Actually, I wouldn't even call it an investment; I got it for dirt cheap in reality. And you know why that is? Because of people pushing others to buy unnecessary and overpriced CPUs at the time. So in one way I am actually thankful for people like you. Thank you for making perfectly adequate hardware cheap by underestimating it. :)

Point is, I will upgrade to Ryzen soon, but no way in hell will I do that just for gaming, because for gaming it's just fine; I have other uses for a faster CPU now.
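To put rough numbers on the V-sync point (the per-frame CPU times below are just example values, not measurements): at 60 Hz the budget is 1000/60 ≈ 16.7 ms per frame, so any CPU that finishes its work inside that budget looks identical once V-sync is on - the difference only shows up as headroom in the heaviest frames.

Code:
# Frame-budget sketch at 60 Hz; the per-frame CPU times are example values only.
budget_ms = 1000 / 60  # ~16.7 ms per frame at 60 fps

for name, frame_ms in [("slower CPU (example)", 14.0), ("faster CPU (example)", 9.0)]:
    capped_fps = min(60, 1000 / frame_ms)   # V-sync caps both at 60
    headroom_ms = budget_ms - frame_ms      # slack left for heavy frames
    print(f"{name}: {capped_fps:.0f} fps with V-sync, {headroom_ms:.1f} ms headroom")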
 
Joined
Dec 14, 2013
Messages
2,615 (0.69/day)
Location
Alabama
Processor Ryzen 2700X
Motherboard X470 Tachi Ultimate
Cooling Scythe Big Shuriken 3
Memory C.R.S.
Video Card(s) Radeon VII
Software Win 7
Benchmark Scores Never high enough
I happen to own both, an 8370 and a 2600K - both chips and setups overall do what I expect from them.
TBH I'm not much of a gamer, and frankly it's been a long time since I've even purchased a game, let alone played online, but that's just me.

For what I do the 8370 is quite adequate and serves its purpose just fine, and so does the 2600K. My purpose is competitive OC'ing, so I don't have to worry a lot about what many see as important; for me it's more along the lines of getting what I can from different pieces to score points, and yes, the 8370 chips score points too.
I'm not questioning the performance of an 8370 vs a 2600K or even a 2500K, I'm just saying that if it's working for you to your satisfaction, then it's all good regardless of other opinions about it.
Do as you will, because I certainly do.
 
Joined
Apr 12, 2013
Messages
1,192 (0.30/day)
Processor 11700
Motherboard TUF z590
Memory G.Skill 32gb 3600mhz
Video Card(s) ROG Vega 56
Case Deepcool
Power Supply RM 850
I happen to own both, an 8370 and a 2600K - both chips and setups overall do what I expect from them.
TBH I'm not much of a gamer, and frankly it's been a long time since I've even purchased a game, let alone played online, but that's just me.

For what I do the 8370 is quite adequate and serves its purpose just fine, and so does the 2600K. My purpose is competitive OC'ing, so I don't have to worry a lot about what many see as important; for me it's more along the lines of getting what I can from different pieces to score points, and yes, the 8370 chips score points too.
I'm not questioning the performance of an 8370 vs a 2600K or even a 2500K, I'm just saying that if it's working for you to your satisfaction, then it's all good regardless of other opinions about it.
Do as you will, because I certainly do.

I totally agree, and +1 to your thought.

And I will also add price to performance: in my case I bought the FX-8350 for €100, and that is an excellent price-to-performance ratio.

PS: and at that time an i3 was €55 more expensive.
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
18,928 (2.86/day)
Location
Piteå
System Name Black MC in Tokyo
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Line6 UX1 + some headphones, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
VR HMD Acer Mixed Reality Headset
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
What? Are you kidding or what? :confused::confused::confused:

It is not slightly below Ryzen 3, it is way worse than the Ryzen 3 processors! Honestly... the Core i5 2500K is slightly below the Ryzen 5 1400, but the FX-8350 is much worse! Did you even watch the Hardware Unboxed video? Yes, for a 5-year-old CPU it held up quite well due to its high clock and many cores, but its IPC was weak in the first place.

I was going on this review, generally. Also thank you for agreeing with the basic point. :p
 
Joined
Feb 22, 2009
Messages
762 (0.14/day)
System Name Lenovo 17IMH05H
Processor Core i7 10750H
Video Card(s) GTX 1660 Ti
Audio Device(s) SSL2
Software Windows 10 Pro 22H2
Benchmark Scores i've got a shitload of them in 15 years of TPU membership
What, did I hear it right, the FX is overclocked to 4.4 GHz? Are you serious, when the boost clock is 4.3?

While I do agree that 4.4 GHz is a minimal overclock and actually sounds like a joke OC, the general picture would not change even if the FX-8370 were overclocked to 5 GHz: if the core design has weak IPC in the first place, no overclocking will lift its gaming performance over Ryzen 3. Take a look at the Ryzen 7 1700 - a 300 MHz overclock from its 3700 MHz turbo clock yields great results, matching a stock Core i7 7820X in games (Hardware Unboxed benchmarks). That's good IPC for you, unlike the entire FX series.
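As a rough back-of-the-envelope sketch of that point (the relative IPC figure below is purely illustrative, not a measured value): effective single-thread throughput is roughly IPC x clock, so even a big clock bump cannot close a large per-clock deficit.

Code:
# Toy single-thread estimate: throughput ~ relative IPC * clock (GHz).
# The relative IPC values are assumptions for illustration, not measurements.
def throughput(rel_ipc, clock_ghz):
    return rel_ipc * clock_ghz

fx_oc  = throughput(0.65, 5.0)  # assume FX at ~65% of Ryzen's IPC, overclocked to 5 GHz
ryzen3 = throughput(1.00, 3.5)  # Ryzen 3 class clock, taken as the IPC baseline

print(f"FX @ 5 GHz       : {fx_oc:.2f}")   # ~3.25
print(f"Ryzen 3 @ 3.5 GHz: {ryzen3:.2f}")  # ~3.50, still ahead despite the lower clock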

I was going on this review, generally. Also thank you for agreeing with the basic point. :p

Yes, then I will also correct myself: it is in games that Ryzen 3 is much faster than the FX-8350, not in programs. I also know that the FX-8370 is faster than the Core i5 3570K/4670K in programs.
 
Joined
Apr 12, 2013
Messages
1,192 (0.30/day)
Processor 11700
Motherboard TUF z590
Memory G.Skill 32gb 3600mhz
Video Card(s) ROG Vega 56
Case Deepcool
Power Supply RM 850
While I do agree that 4.4 GHz is a minimal overclock and actually sounds like a joke OC, the general picture would not change even if the FX-8370 were overclocked to 5 GHz: if the core design has weak IPC in the first place, no overclocking will lift its gaming performance over Ryzen 3. Take a look at the Ryzen 7 1700 - a 300 MHz overclock from its 3700 MHz turbo clock yields great results, matching a stock Core i7 7820X in games (Hardware Unboxed benchmarks). That's good IPC for you, unlike the entire FX series.

Yes, then I will also correct myself: it is in games that Ryzen 3 is much faster than the FX-8350, not in programs. I also know that the FX-8370 is faster than the Core i5 3570K/4670K in programs.
There is no point talking about IPC, as there is no reason to OC a 2500K against the FX.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Still playing games at 1080p/60 fps on the 5-year-old FX. I will switch to Ryzen some time in the future, but not for the sake of gaining more performance in games, because really, I won't, not at that resolution and framerate.
It is a CPU suited for 60 fps, but not anything higher than that. And as the other guy tried to point out: you'd have higher minimum / average FPS by switching, even with 60 FPS as the only target.
Exactly! But seriously, this CPU seems to be quite iconic, because so many people are using it in their gaming rigs and boasting about it like it's some kind of wonder. So many people are selling used computers with this CPU paired with stronger/newer components, components better suited to faster processors, and that just bothers me, because certainly not all of those people understand that this CPU cannot keep up with video cards faster than a GTX 970.
People who are still blinded by "cooooaaaars", simply by numbers. Typically those people are also AMD fanboys, because nobody else believes the FX to be good. I mean, it's not a real 8-core CPU; it's only a real 8-core CPU for specific tasks that need nothing but integer throughput, not for games. For games it is, in fact, only a real QUAD core CPU. Now, a quad-core CPU coupled with low IPC - what's the use of that? Right. 1st-gen Core easily had more IPC than Phenom II, Phenom II had more IPC than Bulldozer, and Sandy Bridge has about 20% more IPC than Nehalem. So there's at least a 30-40% IPC disadvantage for the FX-8350 here. 30-40%! And that's me being optimistic and only talking about Nehalem and Sandy Bridge. ;)
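To show how those generational steps compound (the per-step percentages below are rough assumptions in the spirit of the argument, not measured IPC data):

Code:
# Toy compounding of the claimed per-generation IPC steps; all step sizes are assumptions.
phenom_ii  = 1.00               # baseline
piledriver = phenom_ii * 1.00   # roughly "on par" with Phenom II
nehalem    = phenom_ii * 1.10   # assume 1st-gen Core ~10% ahead of Phenom II
sandy      = nehalem * 1.20     # Sandy Bridge ~20% ahead of Nehalem

print(f"Sandy Bridge vs FX (Piledriver): {sandy / piledriver - 1:.0%} higher IPC")  # ~32%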
I was "forcing" this "dogma" to people for a pretty long time:toast:
Sometimes people don't want to see the truth, but I guess most smarter people did. That said, FX 8350 was simply a server CPU to me, not at all suited for normal users. Now take Ryzen as a counter example - it is a server CPU originally, but because of its high IPC, real core design with SMT, it is still very good suited for eg. gamers. In the end, what AMD did, they kinda "copied" Intel arch, chopped away some limitations like utterly big monolithic CPU designs, increased core amounts dramatically and still with a very good yield - something Intel put down as being "glued together" because they were too stupid to think of this ingenious design first, which solves the problem of ever growing amounts of cores and downing of yields at the same time. On top of that they added features like, being able to clock the CPU in 25 MHz steps among other things like Neural Network stuffs - and lets not forget its great efficiency that is higher than any Intel product. Ryzen in general is a big success. Now, they maybe need to change something at RTG (in words: fire some people who failed and replace them with better) and they will do better as well. Ryzen is a success because genius people were at work there, I don't see the same at RTG. Either that, or RTG simply lacks money to make up for the competition that is Nvidia. But honestly, what's so complicated about designing a GPU Boost future that has lowest voltage possible in mind? Nvidia has it since 2012, and AMD just now got a turbo function in through Vega, but still lacks the auto voltage tuning ability, which is very important for efficiency and one of the main reasons why Nvidia won every efficiency battle since 2012.
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
But honestly, what's so complicated about designing a GPU Boost feature that has the lowest possible voltage in mind?

A not-so-great manufacturing node prevents them from doing that. I suspect GloFo simply cannot deliver what AMD wants for their GPUs, hence they can't have fine-grained voltage/frequency control and thus have to overvolt their cards in order to get reliable characteristics out of them. Regardless of the reason, it's clear that it is a limitation of the silicon itself.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
A not-so-great manufacturing node prevents them from doing that. I suspect GloFo simply cannot deliver what AMD wants for their GPUs, hence they can't have fine-grained voltage/frequency control and thus have to overvolt their cards in order to get reliable characteristics out of them. Regardless of the reason, it's clear that it is a limitation of the silicon itself.
It has nothing to do with that. Users can downvolt the cards by hand and achieve far greater efficiency than stock; AMD simply lacks the software to do it automatically. GloFo or Samsung 14nm LPP being bad is a myth. Nvidia produced the GTX 1050 Ti on Samsung 14nm and it's a great GPU, running perfectly fine, like all those other 1000-series GPUs produced at TSMC on 16nm.
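For a feel of why undervolting at the same clock pays off so much, here is a simple sketch using the usual dynamic-power rule P ≈ C·V²·f; the voltages are made-up examples, not Vega's actual stock or undervolt values, and leakage is ignored.

Code:
# Dynamic switching power scales roughly with V^2 * f (leakage ignored).
# Voltages below are made-up examples, not actual Vega values.
def dynamic_power_ratio(v_new, v_old, f_new=1.0, f_old=1.0):
    return (v_new / v_old) ** 2 * (f_new / f_old)

ratio = dynamic_power_ratio(1.05, 1.20)  # undervolt 1.20 V -> 1.05 V at the same clock
print(f"Dynamic power drops to ~{ratio:.0%} of stock (~{1 - ratio:.0%} saved)")  # ~77% (~23% saved)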
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
It's not maxed out like these seem to be, though... in that they are running outside their efficiency range and seemingly at the cusp of the point of diminishing returns.
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
It has nothing to do with that. Users can downvolt the cards by hand and achieve far greater efficiency than stock; AMD simply lacks the software to do it automatically. GloFo or Samsung 14nm LPP being bad is a myth. Nvidia produced the GTX 1050 Ti on Samsung 14nm and it's a great GPU, running perfectly fine, like all those other 1000-series GPUs produced at TSMC on 16nm.

It's just not that simple. Different architectures: nothing suggests you can make those GPUs on that node and get the same characteristics out of them.

Die sizes are important too. It could be that they can achieve that quality of silicon at the expense of lower yields; in the case of GP107 it wouldn't be an issue since it's a small chip. Vega is already massive, and it could be that binning it even further would make it unfeasible.

Bottom line, this is a complicated matter. To me, the fact that they grossly overvolt their cards suggests a clear issue with the quality of the silicon they use. In order for them to create said software they need very consistent characteristics from their chips, and they are clearly not getting that.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
The Vega 64 Air is maxed out at what's possible with air and a 300 W TDP, and the Liquid is maxed out at a 350 W TDP - so yes indeed, it is maxed out. ;) The same is true for Hawaii, Fiji especially, and Tahiti - pretty much any big AMD GPU ever released.

@Vya Domus : It still has nothing to do with that. Nvidia could perfectly well do the same with GloFo and Samsung nodes, and already proved it via the GTX 1050 Ti. You don't seem to read my posts; I advise you to do so. I won't explain the same thing over and over and over again, I already did this in numerous threads. AMD having no auto voltage tuning feature is their big downside compared to Nvidia. It's a fact, seeing that users can downvolt and tune AMD cards themselves. Seeing that Vega achieves very high clocks further proves that 14nm GloFo/Samsung is perfectly fine and everything else is a myth.
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
And no two cards will ever reach the same voltages when they are manually undervolted by the user; it's a bad way to determine whether this is possible. You need to know reliable characteristics of the silicon beforehand and be able to manufacture GPUs according to those characteristics; you can't just plop such software onto any GPU and have it magically work.

AMD has always overvolted their cards, ever since the first iteration of GCN. They're doing it to get better yields, not because they are too stupid to make such a piece of software; they simply can't do it, by the looks of things. The quality of the silicon they use is at fault for the bad power consumption and the lack of fine-grained voltage control capabilities, not the software. And this is a general rule.

This matter is definitely tied to the manufacturing process and what it can do. AMD either can't afford better-quality silicon or GloFo is simply not great. I lean towards the latter. You can look up some info about this: GloFo is consistently behind everyone else, and their equivalent nodes are always worse than what the competition can offer. If GloFo's 14nm node is so good, why isn't Nvidia switching to them entirely for their consumer cards?

GloFo is a hindrance to AMD, I am convinced of it. They get penalties from GloFo for not using their dies, for Christ's sake; they're like a leech. AMD will see no true success until they get rid of them, in my opinion.

Anyway, without any sort of reliable data about this we're just guessing about why they don't/can't do more advanced boost functionality. This is off-topic though, so I'll end it here.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
:roll: You're not getting my point, buddy. I know perfectly well what AMD is doing and I even wrote about that, in the posts you ignored or handpicked things out of to attack me on. And because I know what AMD is doing - doing it the old way - I know they lack an auto-tune voltage feature, which Nvidia has implemented since 2012 on their GTX 600 series. A feature that sets them apart from AMD and gives them superiority in efficiency with ease. I have also described how the feature works numerous times on the TPU forum in different threads, so I wouldn't care about your ignorance too much. The difference between TSMC / GloFo / Samsung, with the latter two supposedly being bad, is a myth, long ago debunked by Nvidia proving that their architecture does perfectly fine on those nodes as well. How many times did I say this now? 3? AMD proved it themselves, because Vega is doing perfectly fine on that "inferior" node as well. :laugh: Vega, once undervolted, or let's say properly volted, is way more efficient. The "inferior node" is also able to give Vega very high overclocks, something that would be impossible with a truly inferior node.

PS. Polaris refresh is doing perfectly fine on 14nm Samsung/GloFo as well. Clocks are pretty high, efficiency is high as well, once undervolted (properly volted).
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
to attack me on..

o_O

Attack you? For some odd reason you seem to take this personally. I am going to back off; I didn't know I looked so menacing.

Anyway, I said I'll end it here; it's off-topic and I see no point in arguing about things that are nothing more than guesses/opinions.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Anyway, I said I'll end it here; it's off-topic and I see no point in arguing about things that are nothing more than guesses/opinions.
I don't think so. But anyway, yeah let's end it. ;)
 
Joined
Oct 21, 2005
Messages
6,879 (1.02/day)
Location
USA
System Name Computer of Theseus
Processor Intel i9-12900KS: 50x Pcore multi @ 1.18Vcore (target 1.275V -100mv offset)
Motherboard EVGA Z690 Classified
Cooling Noctua NH-D15S, 2xThermalRight TY-143, 4xNoctua NF-A12x25,3xNF-A12x15, 2xAquacomputer Splitty9Active
Memory G-Skill Trident Z5 (32GB) DDR5-6000 C36 F5-6000J3636F16GX2-TZ5RK
Video Card(s) EVGA Geforce 3060 XC Black Gaming 12GB
Storage 1x Samsung 970 Pro 512GB NVMe (OS), 2x Samsung 970 Evo Plus 2TB (data 1 and 2), ASUS BW-16D1HT
Display(s) Dell S3220DGF 32" 2560x1440 165Hz Primary, Dell P2017H 19.5" 1600x900 Secondary, Ergotron LX arms.
Case Lian Li O11 Air Mini
Audio Device(s) Audiotechnica ATR2100X-USB, El Gato Wave XLR Mic Preamp, ATH M50X Headphones, Behringer 302USB Mixer
Power Supply Super Flower Leadex Platinum SE 1000W 80+ Platinum White
Mouse Zowie EC3-C
Keyboard Vortex Multix 87 Winter TKL (Gateron G Pro Yellow)
Software Win 10 LTSC 21H2
Hardware Unboxed, together with Gamers Nexus and Digital Foundry, are my favorite YouTube reviewers. TechPowerUp, however, is still my favorite website for reviews... :lovetpu:

That being said, I absolutely love benchmarks which include old stuff tested vs. new stuff! So, Steve from Hardware Unboxed did it once again - pitting the new Ryzen 3 processors against the legendary Core i5 2500K and the iconic FX-8370.

Honestly, after watching this video I think AMD FX users should just die at this point :D


Have fun!
bb-but 2500k isn't future proof because only 4 cores...
-2011 argument
 
Joined
Feb 22, 2009
Messages
762 (0.14/day)
System Name Lenovo 17IMH05H
Processor Core i7 10750H
Video Card(s) GTX 1660 Ti
Audio Device(s) SSL2
Software Windows 10 Pro 22H2
Benchmark Scores i've got a shitload of them in 15 years of TPU membership
bb-but 2500k isn't future proof because only 4 cores...
-2011 argument

Never said it is a future-proof CPU. Even I myself upgraded it to a Core i7 3770K in my secondary rig, because I had the chance to earn some money through component sales/trades. All I said was that in games the FX-8350 was a piece of shit CPU compared to the Core i5 2500K back in 2012, and it still is now. The mere fact that it loses to the Pentium G4560 in 18 games out of 20, with exceptions like Mafia 3 and Witcher 3, should tell the whole story of how ignorant one must be to have an FX-8350 for pure gaming now in 2017, especially since the release of the Pentium G4560, and even more so since AMD Ryzen.

Single-threaded performance is total garbage, and you've made it even worse by underclocking. I've helped people upgrade from FX 8320s over to Intel i5 2500s (dirt cheap here second-hand) and seen massive gains - they really are just terrible for gaming.


Yeah, but your i5 kicks the crap out of any FX-series CPU.

Upgrading a piece of shit to a slightly less shitty piece of shit is not a smart solution.
 
Joined
Apr 12, 2013
Messages
1,192 (0.30/day)
Processor 11700
Motherboard TUF z590
Memory G.Skill 32gb 3600mhz
Video Card(s) ROG Vega 56
Case Deepcool
Power Supply RM 850
Never said it is a future-proof CPU. Even I myself upgraded it to a Core i7 3770K in my secondary rig, because I had the chance to earn some money through component sales/trades. All I said was that in games the FX-8350 was a piece of shit CPU compared to the Core i5 2500K back in 2012, and it still is now. The mere fact that it loses to the Pentium G4560 in 18 games out of 20, with exceptions like Mafia 3 and Witcher 3, should tell the whole story of how ignorant one must be to have an FX-8350 for pure gaming now in 2017, especially since the release of the Pentium G4560, and even more so since AMD Ryzen.

I think we have already established your hatred towards the FX-8350. Now, moving on, it's obvious that you wouldn't buy an FX CPU today, so let's focus on what I said before: it all comes down to price to performance. If I pay €105 for it and pair it with a GTX 970, which was released in 2014, and play games in 2017 at high settings @ 1080p with an average of 60 fps, now tell me why at that time I should have paid €50 more for an i3 or €90+ more for an i5.
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.57/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
I had an FX-6300 machine for a couple of weeks and it ran games just fine and felt snappy in applications.
And I had an i7 960 at the same time, and the FX didn't feel any slower.
And I know how much slower it is compared to the i7.
It's just strange.
 
Joined
Feb 22, 2009
Messages
762 (0.14/day)
System Name Lenovo 17IMH05H
Processor Core i7 10750H
Video Card(s) GTX 1660 Ti
Audio Device(s) SSL2
Software Windows 10 Pro 22H2
Benchmark Scores i've got a shitload of them in 15 years of TPU membership
I think we have already established your hatred towards the FX-8350. Now, moving on, it's obvious that you wouldn't buy an FX CPU today, so let's focus on what I said before: it all comes down to price to performance. If I pay €105 for it and pair it with a GTX 970, which was released in 2014, and play games in 2017 at high settings @ 1080p with an average of 60 fps, now tell me why at that time I should have paid €50 more for an i3 or €90+ more for an i5.

At that time you made the logical choice! But now the FX-8350 makes no sense at €100, when I see used Core i5 2500Ks being sold for €60 second-hand and the new Pentium G4560 being sold for €60. Yes, the FX-8350 can play all games at 60 FPS with a GTX 970/GTX 1060, but do not forget that the minimum FPS are not guaranteed with it like they would be with a Core i5 2500K, even more so with a Ryzen 5 1600. And yes, I hate everything AMD has put out since the beginning of the FX "Bulldozer" line. And no, I am not an Intel fanboy; I loved the Socket 939 AMD processors before Core 2 Duo came to town, and I do like Ryzen 7 now more than its Intel counterparts!
 