
Is it worth upgrading to a 6 core yet?

Discussion in 'General Hardware' started by Go To Sleep, Dec 9, 2013.

  1. Maleko

    Maleko

    Joined:
    May 5, 2008
    Messages:
    62 (0.02/day)
    Thanks Received:
    53
    I have a Velociraptor drive that now serves as a data drive and got a Samsung 830 SSD for OS, best upgrade ever.
     
  2. Go To Sleep

    Go To Sleep

    Joined:
    Dec 30, 2012
    Messages:
    144 (0.15/day)
    Thanks Received:
    15
    Location:
    Melton Mowbray, UK
    Yeah, when will the 800 series cards be due out you think?
     
  3. happita

    happita

    Joined:
    Aug 7, 2007
    Messages:
    2,464 (0.84/day)
    Thanks Received:
    491
    Most likely Q2 2014 or Q3 2014 at the latest I would think. That's if AMD doesn't have any ideas up their sleeves like an R9 refresh or something.
     
  4. Cotton_Cup

    Cotton_Cup

    Joined:
    Mar 19, 2012
    Messages:
    378 (0.30/day)
    Thanks Received:
    35
    Location:
    Rizal, Philippines
My processor ran at 4.6GHz on 1.38V with a Noctua NH-D14 not too long ago, before I went to water cooling. It was stable through a 48-hour Prime95 stress test.
     
  5. Aquinus

    Aquinus Resident Wat-man

    Joined:
    Jan 28, 2012
    Messages:
    8,135 (6.21/day)
    Thanks Received:
    3,376
    Location:
    Concord, NH
Socket 2011 can go kind of high, though it depends on the motherboard and CPU. Anything above 4.2GHz requires a voltage boost on my 3820, and the amount of voltage I need to feed this thing to get anything higher than 4.5 is pretty nuts. The highest I've been able to boot the machine at is 4.96GHz. In all honesty, there comes a point of diminishing returns; I'm just as happy with 4.2 as I am with 4.5.
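Those diminishing returns show up in the dynamic-power math too. Here's a rough sketch using the usual approximation P ≈ C·V²·f; the voltage/frequency pairs below are made-up illustrative numbers, not measurements from any particular 3820:

```python
# Rough dynamic CPU power estimate: P is proportional to V^2 * f.
# Baseline and overclocked voltages here are illustrative assumptions.

def rel_power(voltage, freq_ghz, base_v=1.20, base_f=4.2):
    """Power relative to the baseline voltage/frequency operating point."""
    return (voltage / base_v) ** 2 * (freq_ghz / base_f)

print(f"4.2 GHz @ 1.20 V -> {rel_power(1.20, 4.2):.2f}x power")
print(f"4.5 GHz @ 1.35 V -> {rel_power(1.35, 4.5):.2f}x power")
```

A ~7% clock bump that needs a big voltage bump costs roughly a third more power, which is why the last few hundred MHz feel so expensive.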

    As many others have said, your 920 is still perfectly capable so if you do any upgrade, I would go with a GPU if gaming is what you're going after.
     
  6. pr0fessor

    pr0fessor

    Joined:
    Oct 29, 2013
    Messages:
    65 (0.10/day)
    Thanks Received:
    3
    Location:
    Switzerland
I would never buy a 6-core CPU. Better to wait for 8 cores from Intel or get an AMD. 4 or 8 cores ftw; 6 is irrelevant imo.
     
  7. Hilux SSRG

    Hilux SSRG

    Joined:
    May 1, 2012
    Messages:
    1,024 (0.84/day)
    Thanks Received:
    170
    Location:
    New Jersey, USA
Like the OP, I have an Intel i7 920, but OC'ed to 3.5GHz. If I had to upgrade to a Z87 LGA1150 mobo, what worthwhile Haswell CPU should I look at? The 4770K? Or the 4670? Or wait until Broadwell/Skylake?
     
  8. kn00tcn

    kn00tcn

    Joined:
    Feb 9, 2009
    Messages:
    1,053 (0.44/day)
    Thanks Received:
    239
    Location:
    Toronto
Don't get too excited about 880 (Maxwell specifically) talk; anything in early 2014 will still be 28nm, not 22nm. A 780 or 290 would be fine for most anyway. The real killer, '22nm big Maxwell', will be late 2014 or, more likely, 2015.
     
  9. Frag Maniac

    Frag Maniac

    Joined:
    Nov 9, 2010
    Messages:
    3,675 (2.09/day)
    Thanks Received:
    1,007
    For now there's no real reason, but depending on what types of games you play, whether Mantle succeeds, and how long down the road you're talking about, much of that could change.
     
  10. Melvis

    Melvis

    Joined:
    Mar 18, 2008
    Messages:
    3,621 (1.33/day)
    Thanks Received:
    538
    Location:
    Australia
Since you run an i7 that has HT, making it an 8-threaded CPU, you should be fine. I went from a Phenom II 965 to the FX 8350 and got up to double the FPS depending on the game; Metro 2033, for example, doubled in FPS. So if you had a CPU without HT, I'd say yes, but since you do, and it's clocked at 4GHz, then no, you're fine I would think.
     
  11. Aquinus

    Aquinus Resident Wat-man

    Joined:
    Jan 28, 2012
    Messages:
    8,135 (6.21/day)
    Thanks Received:
    3,376
    Location:
    Concord, NH
HT doesn't give you double the cores' worth of performance. If you do a little research, an HT thread is only something like 10-40% the power of a real core, depending on the workload. I'm willing to bet most of the FPS gain you're seeing is from the higher clock speed on the 8350. HT only uses the execution resources of each core that aren't already being utilized, and most games are pretty poor at using more than 4 threads right now.
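To see how little those percentages add up to, here's a back-of-the-envelope sketch; the 10-40% HT yield range is taken from the estimate above, and the numbers are illustrative rather than measured:

```python
# Back-of-the-envelope: effective parallel throughput with Hyper-Threading.
# Assumption: an HT sibling adds only a fraction (0.1-0.4) of a real core.

def effective_cores(physical_cores, ht_yield):
    """Physical cores plus the fractional contribution of HT siblings."""
    return physical_cores * (1 + ht_yield)

for ht_yield in (0.10, 0.25, 0.40):
    print(f"4C/8T with {ht_yield:.0%} HT yield ~ "
          f"{effective_cores(4, ht_yield):.1f} effective cores")
```

So a 4C/8T chip behaves like roughly 4.4-5.6 cores, not 8 — nowhere near "double the cores' worth of performance."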
     
  12. BiggieShady

    BiggieShady

    Joined:
    Feb 8, 2012
    Messages:
    1,552 (1.19/day)
    Thanks Received:
    839
    Location:
    Zagreb, Croatia
    Pretty much so. Hyper threading loves integer instructions with low memory traffic ... unless it's Dhrystone, for some reason :confused:
    [attached benchmark chart]
     
  13. Pill Monster

    Pill Monster

    Joined:
    Sep 6, 2013
    Messages:
    490 (0.68/day)
    Thanks Received:
    146
    Location:
    Oceania
That's rubbish.
    If you did your research you'd realize game engines are becoming more and more multi-threaded. BF4 is a prime example where cores make all the difference, not clock speed. Metro: Last Light, Far Cry 3, and ArmA III are also games which can utilize more than 6 cores.
    In any heavily threaded game an 8-core FX 83xx will stomp all over the X4s due to its multithreading capability, not clock speed.
    The 8320/8350 even outperforms the i5 in many heavily threaded scenarios.
    Have a look at any CPU load chart in BF4 and you'll see i5s have at least 85% load on every core compared to the i7 and Vishera. For this reason, enabling HT is recommended on Intel i7s.
     
    Last edited: Dec 18, 2013
  14. Aithos

    Joined:
    Sep 30, 2013
    Messages:
    175 (0.25/day)
    Thanks Received:
    35
    No they aren't. BF4 still runs better on the i5-4670k than the AMD 8350. I can go google the benchmarks if you don't want to take my word for it, I've posted several in other threads before. There is no such thing as a "heavily" threaded game, BF4 is probably the most optimized game to date and the 8350 barely breaks even with the 4670k because of it. It doesn't matter what the CPU utilization is, it matters what your framerate is, and the 4670k still wins. Here is the simple truth:

Game developers lag waaaaay behind current technology because they are developing games for the "average" user. In order for a game to sell well, it needs to run on virtually every machine from a 6-year window, because *most* people don't upgrade every couple of years. People who browse tech forums and build their own computers or stay on the top end of the curve are a fraction of the gaming market. Companies don't heavily optimize because they wouldn't recoup the development costs if they did; we just don't make up enough of the population. You can *almost* get away with a dual-core CPU today if you overclock and still get decent framerates. If the motherboards and memory were up to par, I'd wager you could; the problem is that modern 1080p games are a lot more memory- and bandwidth-intensive, and the legacy sockets wouldn't be able to keep up.

    The long and short is that it will be another 5 years *at least* before 6+ cores are common enough for developers to spend the tens of millions of dollars it would cost to do more optimization of the game engines to be multi-threaded, if ever. The second problem besides hardware capabilities is that multi-threaded development is HARD. I'm a senior systems programmer myself, trust me when I say there is a huge difference between making something that works and making something that is optimized. Games are already more demanding on the performance front, in order to be truly optimized for 6+ cores and multiple threads per core it would cost upwards of hundreds of millions in development dollars for just a handful of main game engines.
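To put a rough number on why extra cores hit diminishing returns when only part of an engine runs in parallel, here's a quick Amdahl's-law sketch; the parallel fraction p = 0.6 is an illustrative assumption, not a measured figure for any real engine:

```python
# Amdahl's law: maximum speedup of a workload with parallel fraction p
# when run on n cores. The serial fraction (1 - p) caps the gain.

def amdahl_speedup(p, n):
    """Upper bound on speedup for parallel fraction p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 6, 8):
    print(f"p=0.6, {cores} cores -> {amdahl_speedup(0.6, cores):.2f}x")
```

With 60% of the work parallelizable, going from 4 to 8 cores buys you roughly 1.8x to 2.1x — a sliver of extra performance for double the cores, which is the economic problem described above.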

    The 8350 is a terrible gaming CPU if you want to compare performance to the Intel CPUs, I'm sorry that you would rather champion a brand than performance but it's true. The 8350 only beats an i5 in video encoding, virtualization or 3d modeling applications. For gaming, daily tasks and OS use the i5 keeps up with the i7-4770k and both of those chips even beat the Ivy-E chips in those tasks. I get really tired of people who have no idea about game development, engine development or software development in general piping in talking about "trends" in the industry. If you think games are becoming heavily threaded you're very, very uninformed.

    Aquinus was correct in saying that more than 4 cores is pointless for gaming. It is.

Edit: Here is the 4960X review that shows CPU performance in games. Pay attention to the fact that the $1000 Ivy-E flagship CPU loses in many games to the 4-core, non-hyper-threaded CPU and barely wins when it does. Also note the 8350 gets stomped all around by the Intel chips.

    http://www.anandtech.com/show/7255/intel-core-i7-4960x-ivy-bridge-e-review/5
     
    Last edited: Dec 18, 2013
  15. Melvis

    Melvis

    Joined:
    Mar 18, 2008
    Messages:
    3,621 (1.33/day)
    Thanks Received:
    538
    Location:
    Australia
Huh? I didn't say HT would give you double the FPS. Where in my statement did I say anything about double the FPS with HT? You need to read my post before commenting, dude. HT will only give you a max of around 25%, not up to 40%; this has already been discussed. I said in my post that I compared my 965 to my 8350 and that in 99% of the games I tested I got an FPS increase, up to almost double; one game didn't change, and that was AvP. And no, it wasn't because the 8350 is clocked higher. That helps to a point, but remember the IPC is lower than on the Phenom IIs as far as I know (I haven't seen proof otherwise), so it would be at best maybe a 300MHz effective increase over the 965, which makes pretty much no difference in games. So in certain heavily multi-threaded games I would think HT would give you a slight performance increase; in theory it's the same as me going from 4 threads to 8 threads and seeing an increase, is it not? I personally tested both my CPUs in many gaming benchmarks to see whether this was true or false, as I held back from getting the 8-core because a lot of people on this forum said it was pointless. But for me it made a difference, so I'm glad I did.

I never did get around to testing my 8350 against my i7 940 in gaming before I sold it, but I did do a very quick test of my mate's 8120 vs the i7 940 in Metro 2033 with a GTX 295, and the game ran smoothly on the 8120 but was laggy on the i7 940. I was surprised, but it was true :S It's just one of those games I found that loves cores; like BC2 and Crysis 2, it uses all 8 of my cores.

My current Intel CPU is an i7 970, a 12-threaded monster, and I have done some benching with it against my 8350 (not in games) to see how close or far apart they are from one another. I also put the i7 940 against the 8350, again not in games, if anyone wants to know how much of a difference there really is in real-world programs.

@Aithos, no one trusts AnandTech. I wouldn't trust them as far as I can throw them; their results are consistently low. Try someone that's tested CPUs and isn't biased, like this guy >


    haters going to hate!
     
    Last edited: Dec 19, 2013
  16. Pill Monster

    Pill Monster

    Joined:
    Sep 6, 2013
    Messages:
    490 (0.68/day)
    Thanks Received:
    146
    Location:
    Oceania
    You couldn't be more wrong.
     
  17. MxPhenom 216

    MxPhenom 216 Corsair Fanboy

    Joined:
    Aug 31, 2010
    Messages:
    10,971 (6.01/day)
    Thanks Received:
    2,906
    Location:
    Seattle, WA
Even then, the GTX 800 series will come out regardless and be on the new Maxwell architecture, but still on 28nm. As far as I'm aware, 20nm will not be ready by the time Nvidia wants to release their first iteration of Maxwell.
     
  18. Frick

    Frick Fishfaced Nincompoop

    Joined:
    Feb 27, 2006
    Messages:
    11,666 (3.36/day)
    Thanks Received:
    2,975
    About BF4:

    [attached BF4 CPU benchmark chart]

    ???
     
  19. marsey99

    marsey99

    Joined:
    Jul 18, 2007
    Messages:
    1,955 (0.66/day)
    Thanks Received:
    460
The 790 is coming next; the 800 series will follow closer to Easter.
     
  20. Aquinus

    Aquinus Resident Wat-man

    Joined:
    Jan 28, 2012
    Messages:
    8,135 (6.21/day)
    Thanks Received:
    3,376
    Location:
    Concord, NH
    Your post couldn't be any more useless.

All that means is that the i5 has fewer resources left over after doing everything BF4 needs it to do. Low CPU utilization doesn't mean the game has more CPU power it can use. If an application can only effectively use 4 threads at once, an 8-core AMD CPU will never go above 50%, because the game can't use the rest of the CPU's resources in tandem. All in all, AMD is pushing the envelope with more cores; the problem is that only a handful of applications truly take advantage of that.
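That utilization ceiling is easy to sanity-check with simple arithmetic; a minimal sketch (the thread counts are illustrative, matching the 4-thread-game scenario above):

```python
# If a game can keep only `usable` threads busy, overall utilization on a
# chip with `total` hardware threads tops out around usable / total.

def utilization_ceiling(usable_threads, total_threads):
    """Maximum aggregate CPU utilization a limited-thread app can show."""
    return min(usable_threads, total_threads) / total_threads

print(f"4-thread game on an 8-thread FX:   {utilization_ceiling(4, 8):.0%} max")
print(f"4-thread game on a 4C/4T i5:       {utilization_ceiling(4, 4):.0%} max")
```

So the i5 pegging every core at 85%+ while the 8-core sits near 50% tells you about thread counts, not about which chip has performance to spare.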

As if the improved cache hit ratio, faster memory controller, and new architecture have nothing to do with the performance improvements and shortcomings.

I realize you're new here, but doing some research instead of just blurting out whatever comes to mind is usually pretty helpful in understanding what is going on.
     
  21. Pill Monster

    Pill Monster

    Joined:
    Sep 6, 2013
    Messages:
    490 (0.68/day)
    Thanks Received:
    146
    Location:
    Oceania
    You must be kidding me, that benchmark was taken from a SINGLE PLAYER game and has no relevance whatsoever to multiplayer performance.
     
  22. Pill Monster

    Pill Monster

    Joined:
    Sep 6, 2013
    Messages:
    490 (0.68/day)
    Thanks Received:
    146
    Location:
    Oceania
I may be new here, bud, but I've been a member of Guru3D for 8 years and have more than 22,000 posts over there. I'm also an engineer for Fujitsu.

    There's a lot I could teach you however I get the feeling you wouldn't listen anyway so I won't bother.... ;)

    Have a nice day.
     
  23. Aithos

    Joined:
    Sep 30, 2013
    Messages:
    175 (0.25/day)
    Thanks Received:
    35
    Umm you do realize that benchmarks are always single player right? And you do realize that multiplayer performance for framerate will mirror single player performance right? I mean you wouldn't be dumb enough to suggest that a chip that loses in a benchmark is magically going to perform better in multiplayer right? It's all the same engine, the game doesn't magically become more multi-threaded in one game mode vs another. I don't care how long you've been a member of Guru3D, I've been building computers and programming since 1996. I'm a senior systems programmer and I've worked in game development as a side job. I love how engineers think they know everything, I'm betting I could teach you a lot but I get the feeling you wouldn't listen so I won't bother.

    Oh and stop talking about the 8350, it doesn't stomp the Intel CPUs in a damn thing. It can't even beat the previous generation Intel single threaded processors in gaming because it's such a gimpy CPU that it needs a heavily threaded application just to keep up. Clock speed > multi-threading for gaming, daily tasks and OS use. The *only* thing that multi-threading is better for is encoding video and high end graphics work or virtualization in software development, that's it. If you argue differently you're an idiot and I'm going to ignore you. Besides, anything the 8350 can do, the 4770k will do better. It's too bad because I used to be a big fan of AMD, but their current CPU lineup SUCKS.

    Never mind the fact that there isn't such a thing as a heavily threaded video game. You clearly don't know anything about game development, stick to engineering for Fujitsu, you're out of your element.

    Oh, and you couldn't be more wrong. Have a nice day.
     
    Last edited: Dec 19, 2013
  24. Aithos

    Joined:
    Sep 30, 2013
    Messages:
    175 (0.25/day)
    Thanks Received:
    35
    Right, don't trust one of the biggest, most respected sites for tech in the world. Tell me again how biased they are when they publish their test methods and do everything exactly the same? Also, they used SLI Titans to completely isolate the CPUs unlike most reviews that end up with a GPU bottleneck in some games/resolutions. That article is one of the most honest comparisons that I've ever seen, the fact you're calling it garbage just shows how much your opinion is worth. Please don't bother posting again, you're not adding anything and you're clearly biased.

Edit: Oh, and that video is worthless; it's comparing the 8350 to the previous generation of Intel chips, not the current one. Way to cherry-pick and not be biased at all -sarcasm-
     
  25. niko084

    niko084

    Joined:
    Dec 5, 2006
    Messages:
    7,636 (2.39/day)
    Thanks Received:
    729
I've got a few running around 4.6-4.8GHz 24/7 doing some hard work; think 100% CPU usage, 24 hours a day, for 2 months straight.
    They have been rock solid, with safe voltages and good temps.

To the initial post... not much reason. Even on my 3770K at stock clocks, I'm often running SC2 and D3 in the background, doing some random media work, with four dozen browser tabs open, and playing another game (generally BF4 lately), and I don't notice a difference in FPS. Intel i7s are ridiculous; clock for clock, Sandy Bridge and newer are going to be slightly faster.

Now for the AMD/Intel argument... let's just drop it before it gets out of hand.
    It's like arguing diesel vs gas in racing: it's HIGHLY situational.
    ************
    For a good performance boost with i7's I recommend disabling Core Parking.
     
    Last edited: Dec 19, 2013
