
6-core vs 8-core AMD FX?

If I owned an AMD CPU, I couldn't stop until I got a big tower cooler and overclocked it close to or over 5GHz.
What's the point of owning AMD and not overclocking it hard? :)
:)) That was so funny! And it worries me at the same time.

At this moment I really don't think that getting a dual core can be justified when there are so many cheap CPUs with more cores, even from Intel.
PS: for gaming, PCIe 3.0 doesn't seem to be needed for the foreseeable future (yet). Even the newest games will only slow down by about 2-5% in the absolute worst case when going from PCIe 3.0 to 2.0.

I am deeply pondering your statement, as I don't want to give up on an FX-6300, maybe even the 6350. I just don't have the money to go after Intel. So I won't play on ultra but on high, and with bad console ports it will be "high settings" at 50fps. I can gladly live with that. Other than gaming it will still be a fast PC for everyday use.
 
Also, when the budget is low the best advice is to go for aftermarket deals - you can find some really balanced options if you're not a noob :)
 
I've used the 8320 in gaming; it was my secondary gaming rig (my main rig is at work) for the longest time, and it had a Phenom X6 in it before the 8320. I can tell you, in almost all scenarios there was really no noticeable difference in real-world gaming experience with the 8320 compared to my 4790K. Reason being? Simple, the graphics card was always the limiting factor. Sure, you can say an i3 will give you 100FPS and the 8320 will only give 90FPS, but if your graphics card is limiting you to 60, what does it matter? And that is the problem with most CPU benchmarks when comparing games: they lower the settings as much as possible to exaggerate the difference, but that isn't what we do in the real world. So with the exception of a very select few games that are extremely single threaded and extremely CPU intensive (Starcraft II comes to mind), there really is no difference between an 8320 and an i3.

That being said, I've worked with i3 machines; I'm posting this now from an i3-4160. And I'll tell you what, for doing pretty much anything other than gaming, the 8320 machine wins hands down. Even for just doing basic functions in the OS, browsing the internet and things like that, the i3 is sluggish compared to the 8320. And both have SSDs, so it isn't an issue of the HDD slowing the machine down. To give an example, if I have a YouTube video playing in one tab and browse the internet in another, the video stutters on the i3 system and it remains smooth on the 8320. If I'm opening or creating a zip file, the video stutters on the i3 (and the zip file takes a lot longer to open/create too).

That being said, if you aren't opposed to upgrading the CPU in a short while, I'd go with an Intel i3 system until you have the cash for an i5. But if you just want to stick something in it and leave it for a few years, go with an AMD 8300.
 
I have an FX-8360 (8-core) running at 4GHz. I can play most games on ultra settings at 60fps, so when people say AMD processors fail to perform at a high level, what they say is rubbish!
So you can still build a top-of-the-range PC with an AMD processor. And I really recommend the 8350 - the extra 0.5GHz will make a difference =)

Same here !!


If you compare benchmark reviews of AMD and Intel (just have a quick google), you'll see that AMD is a lot slower, and that really matters - we're talking 30% differences in some tests.

Exactly - "tests and benchmarks" - I love real world use instead - my experience is that the AMD cores outperform Intel's - (same priced CPU's) my video converters (Handbrake and FreeVideoConverter) both are faster on AMD than on Intel (AND i am not using CUDA cores for converting)
 
@Cvrk I am running a 290X in a Gigabyte 990FXA-UD3 AM3+ board with an FX-8350 clocked at 4.4GHz. Gaming is just beautiful. I also own a 3930K i7 rig and have yet to see a need to make it my main rig. It just crunches. My main rig does all I need it to and then some. So yes, you can run that 270X in an AM3+ board with no issues, and you can game just fine. I also run all my games on ultra and they are smooth as butter.
 
One thing I can't stand about AMD is the power consumption.
 
Well, myself and a few others here have real-world experience, and I recommend the OP keep to his original plan of an FX; he will be happy and just fine. Hell, I ran mine with 7850s and it ran all games on ultra, smooth as butter. And like I said, I have not yet been convinced to make my 3930K my main rig over the FX-8350. I don't see where it would be worth the time and hassle. And we are talking about a 6c/12t i7.

One thing I can't stand about AMD is the power consumption.

Umm, under normal use you will never see the difference in power consumption. You may spend $5 to $10 more a year using an AMD over an Intel. The power consumption thing is way blown out of proportion. I run 13 rigs and my electric bill for the whole house is less than $250 a month. So running an AMD proc is not going to put you in the poor house over an Intel. In a nutshell, you will never recoup the price difference through the power savings unless you run the rig for many, many years, which we don't.
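To put rough numbers on that recoup argument, here is a minimal sketch of the break-even arithmetic. Every input (wattage gap, hours per day, electricity rate, price premium) is a placeholder assumption, not measured data:

```python
# Break-even sketch: how long the power savings of a pricier, more
# efficient CPU take to pay back its price premium.
# All numbers below are illustrative assumptions, not measurements.

watt_gap_full_load = 60   # assumed extra watts the AMD draws under load
hours_per_day = 3         # assumed hours/day spent at that load
price_per_kwh = 0.12      # assumed electricity rate in $/kWh
price_premium = 60        # assumed extra purchase cost of the Intel chip in $

extra_kwh_per_year = watt_gap_full_load / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh
years_to_break_even = price_premium / extra_cost_per_year

print(f"Extra energy cost: ${extra_cost_per_year:.2f}/year")
print(f"Years to recoup the price premium: {years_to_break_even:.1f}")
```

With those placeholder numbers the extra cost works out to roughly $8/year and the break-even sits around 7-8 years, which lines up with the $5-10/year estimate above.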
 
I don't have proof, but what I calculated was about $10-20 per year (depending on models and OC) running an overclocked AMD CPU at 100% full load vs an Intel.

PS: Annual home energy cost for the i5-3450: $18.55/year
Annual home energy cost for the FX-6300: $22.89/year
http://www.tomshardware.co.uk/answers/id-1729936/amd-6300-power-consumption.html
That's non-OC (overclocked is practically 70-85% more, so you could pay not $22.89/year but around $40/year).
After all, it's not a big difference (as I see it), and the more important point here is cooling (the TDP is much higher, so it should be cooled properly).
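If you want to redo those $/year figures for your own hardware (including the overclock penalty mentioned above), the arithmetic is just watts, then kWh over a year, then dollars. The wattage, hours, electricity rate and OC multiplier below are placeholder assumptions, not the values the linked thread used:

```python
# Yearly CPU energy cost: average watts -> kWh over a year -> dollars.
# All inputs are placeholder assumptions; swap in your own measurements.

def annual_cost(avg_watts, hours_per_day=4, dollars_per_kwh=0.15):
    """Yearly cost in $ of a part drawing avg_watts for hours_per_day."""
    return avg_watts / 1000 * hours_per_day * 365 * dollars_per_kwh

stock_watts = 95        # assumed average draw of the CPU under load
oc_multiplier = 1.75    # assumed ~75% extra draw when heavily overclocked

print(f"stock: ${annual_cost(stock_watts):.2f}/year")                  # roughly $21/year
print(f"OCed:  ${annual_cost(stock_watts * oc_multiplier):.2f}/year")  # roughly $36/year
```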
 
I think AMD is preferred in mid-range and low-end systems.
They have low prices and raw performance per dollar (but not everyone considers that you should build a well-cooled case and get at least a tower cooler for the CPU - the stock cooler can't provide enough cooling at low RPM).
Modern PCs are aimed not only at raw performance but at comfortable use too - Intel and NVIDIA try to go low-TDP and put performance in second place :)
Just give me a silent PC and it will be OK if it loses 20-30% of performance :)
Also, PCs are turning into HTPCs connected to TVs, so more than 60fps is not needed (seriously, this applies to 95% of people, not to enthusiasts).
Also, you can check monitor resolutions in the Steam user statistics - 40% of gamers have less than 1080p (and it's OK for them).

Indeed, the tradeoff between performance and noise is often made and is a fair one, too.

Yeah, any modern AMD CPU will be able to play videos smoothly, even a mobile phone can now. It's in games where you really notice the difference and that's what I'm basing my comments on here.


Same here !!

Exactly - "tests and benchmarks" - I love real world use instead - my experience is that the AMD cores outperform Intel's - (same priced CPU's) my video converters (Handbrake and FreeVideoConverter) both are faster on AMD than on Intel (AND i am not using CUDA cores for converting)
The benchmarks I'm talking about are based on real world games, not synthetics like 3DMark, so they're perfectly valid as a comparison.

Personally, when I look at reviews I don't pay much attention to 3DMark scores for either CPU or GPU.

@newtekie1 Did your i3 system have HT? The difference between 2 cores and 4 cores or 2 cores and HT is noticeable, even in general desktop usage.

@Frick My comment is based on my general experience and that of other people's that I know. PCs with Intel CPUs just seem to work that bit more reliably in that way. It's not something I can prove either way with formal tests and hard evidence, but is worth sharing.
 
@qubit don't be angry, but
I just don't have the money to go after Intel. So I won't play on ultra but on high, and with bad console ports it will be "high settings" at 50fps. I can gladly live with that.
The fact you can get 60fps in a particular scenario doesn't really mean very much. Also, I'd like to see it hit a consistent 120fps on a modern monitor with modern games, which is the new standard for framerate.
Yeah, we all want 2K (right now it's not the best option because of input lag/colors) or 1440p 144Hz monitors, but in real life we all need to eat and sleep somewhere, and not all of us can afford a GeForce Titan / i7 / water-cooling type of gaming.
So you're absolutely right that when we compare a last-gen Intel i7 (i5 or i3) to an AMD 8350 (6300) we see the difference even if the AMD is OCed (I know, AMD owners can give me their user experience that says the benches are wrong - I have an AMD card and in 95% of benches, compared to benches from well-known sources, my card shows about 20% more performance - don't know why).
But not everyone needs that ULTRA graphics at 144fps :)
So in the OP's situation we recommend saving some money for more useful things :)
All I wanted to say is that when you have a 270X and use your computer only for gaming, there's not a huge abyss (except Dying Light) between a 6300 (8300) and a 5960X.
 
My thoughts on the subject of AMD compared to Intel: I started off with an Intel Celeron, moved to an AMD Athlon 64 X2, then to an FX-6100, and now I have a Core i7-5820K. Intel SPANKS AMD's CPUs in every way. But the APUs are a different story - if they work it right and Mantle takes off, it might replace PhysX. Bang for buck, though, AMD all the way.
 
And like I said, I have not yet been convinced to make my 3930K my main rig over the FX-8350. I don't see where it would be worth the time and hassle. And we are talking about a 6c/12t i7.

But it was worth the time and hassle to remove it from the case it was in and install it in a different case. :rolleyes::fear:



Just picking on you, Shot. I'm happy to have my case back.
 
LOL bitch. :toast: I meant swapping the drives and everything over from my main rig to that one and vice versa. I also wanted that 3930K in a case that had more fans, for the bling factor.
 
Bling factor? o_O

Oh, you've got a PM btw.
 
Yeah, any modern AMD CPU will be able to play videos smoothly, even a mobile phone can now. It's in games where you really notice the difference and that's what I'm basing my comments on here.

Except, like I already said, you won't notice the difference except in extremely rare games that are very single threaded and very CPU intensive (again, Starcraft II is really the only one I can think of). This becomes even more true with the OP's 270X, which will almost always be the limiting factor.

The benchmarks I'm talking about are based on real world games, not synthetics like 3DMark, so they're perfectly valid as a comparison.

As I pointed out, being based on real-world games still doesn't mean it gives an idea of how things are in the real world. Again, most of those reviews turn the graphics settings way down, or use beefy dual-GPU graphics solutions, to exaggerate the difference the CPU makes as much as possible.

The real benchmarks to look at for this situation are the ones that use real-world settings on reasonable systems. For example, look at BF4 run at 1080p, max settings, on a single GTX 770. The FPS difference is within 1 FPS between the i3-4330 and the FX-6350. Why? Because even the GTX 770 is the limiting factor, not the CPU. You can look at Bioshock Infinite: again, a 3 FPS difference. Tomb Raider and Sleeping Dogs: less than 1 FPS difference. All of these numbers are within the margin of error for benchmarking, so you can say performance is the same.

You can set up benchmarks based on real-world games that still aren't real-world tests, that make the i3 look way better than the AMD FX parts and make it look like it would be seemingly impossible to even play games on an AMD FX. You can do stupid shit like HardOCP does and run the benchmarks at 640x480 and then say "ZOMG the Intel CPUs are getting 100FPS more!" But that is just plain stupid. The fact of the matter is, when you run almost all modern games at reasonable settings, the CPU makes essentially no difference.

@newtekie1 Did your i3 system have HT? The difference between 2 cores and 4 cores or 2 cores and HT is noticeable, even in general desktop usage.

There are i3s without HT? Anyway, yes, the i3-4160 has HT. And before anyone says it, the i3 system actually had a more powerful graphics card in it (GTX 650 Ti vs. GTX 640 in the AMD).
 
AM3+ IS A DEAD SOCKET
Save yourself some grief, spend the extra money and get an Intel-based platform; you will have enough CPU grunt for years to come. AMD has zero upgrade path, and compounding that is their sub-standard CPU architecture, which is only going to make it age faster.
 
I am keeping my 1100T another year to see what's on the market. It has been a great chip.
 
AM3+ IS A DEAD SOCKET
Save yourself some grief, spend the extra money and get an Intel-based platform; you will have enough CPU grunt for years to come, which is not the case with AMD.

I've been reading all of these comments over and over. You people know much more about computers than I do. Fact.
It's scary when people say an i3 CPU is better than an FX-6xxx hexa-core.
I don't really have the money (not without a financial effort), but still, in 2015 can I consider buying an i5 quad-core instead of an 8- or 6-core CPU? Isn't it worth ignoring those benchmark numbers when, in the real world, an 8-core CPU makes more sense?

They have phones with 8 cores now. You're sending me back to the stone age with an i3 dual core. And who cares if it's hyperthreaded; that came out 10 years ago.

I'm 70% convinced I need the FX-6xxx; 30% of me is pondering whether the 8-core will make a stronger impact in 2016.
I'm not building this computer to last forever - a maximum of 3 years before I change it. 2017 would be the deadline.
 
More threads (notice I didn't say cores) is not automatically better.
The AMD is not an 8-core, it's 8 threads - there is a huge difference.
AMD's FX chips have 4 of what you could call 'cores'; each module has two threads, for a total of 8 (think hyper-threading here).
An Intel i5 will walk all over a 2- or 8-thread AMD FX in any gaming test; about the only time AMD manages to match the Intel chips is in encoding and compression.
Asking for 3 years out of an already weak platform such as the AMD FX is taking a fair gamble.
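As an aside, if you want to see how your own OS reports that core/thread split, here is a minimal sketch (assuming Python with the third-party psutil package installed; how a module-based FX chip shows up as 'physical' cores depends on how the operating system exposes it):

```python
# Quick check of how the OS reports core vs thread counts.
# Requires the third-party psutil package (pip install psutil).
import os
import psutil

logical = psutil.cpu_count(logical=True)    # hardware threads visible to the OS
physical = psutil.cpu_count(logical=False)  # what the OS counts as physical cores

print(f"logical CPUs (threads): {logical}")
print(f"physical cores:         {physical}")
print(f"os.cpu_count() matches logical: {os.cpu_count() == logical}")
```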
 
The AMD is not an 8-core, it's 8 threads - there is a huge difference.
AMD's FX chips have 4 of what you could call 'cores'; each module has two threads, for a total of 8 (think hyper-threading here).

True. My bad.
 
Except, like I already said, you won't notice the difference except in extremely rare games that are very single threaded and very CPU intensive
It's not as rare as you might think. Speaking from personal experience, DCS, MWO, Star Citizen, ARMA 3, MGS:GS and Assetto Corsa all perform much worse IRL for me on the FX-8320 OC'd than they do on the i3/i5.

I wouldn't recommend throwing more money at the 8-core vs the 6-core for gaming; it's going to be the same. The only time you will notice a difference is when recording/encoding HD video.
 