Discussion in 'General Hardware' started by Go To Sleep, Dec 9, 2013.
Ah, I missed that you were talking about multiplayer. I am so sorry I have trespassed on your time.
8350 is closer in generation to the 3570k than the 4670k. You are definitely not biased at all. /sarcasm
Ugh, I really dislike coming across your posts.
Wrong. Multiplayer is more CPU-intensive in BF4.
Comparing a >$300 CPU to a <$200 CPU. Might as well say anything a Radeon 7770 will do, a 280X will do better.
I get it, you like Intel. It's just funny how blind it has made you.
I don't care when something came out, I care what is available right now and what it costs. I also don't care what brand it is, I care what performance it gets. When I come into a thread I look at what the conversation is about and what the OP is asking, and talk in that context. I've built AMD machines, I've built Intel machines. I've done both in the past 4 years. What I hate is people who refuse to look at benchmarks and data because of a blind preference. Go look up my posts and filter through them and you'll see that when I'm talking about:
1) gaming, daily use, and hobby video editing, I generally recommend Intel
2) heavy or professional video editing, software development with VMs, or heavy graphical work (3D modeling), I recommend higher core counts and multi-threading. I've said very clearly within the past few days in other threads that if you intend to do heavily threaded work on a budget, the 8350 is the best CPU for the money. Would an Intel Ivy-E be better? Sure, it's also a LOT more expensive.
People like you piss me off. I'm not biased. I just have very little patience for stupidity when it comes to measurable performance. There is no gray area, and there is nothing "subjective" about CPU performance. There are certain tasks each CPU is better suited for, and when you go to buy a CPU you should figure out what they are and then balance budget against your uses. Period.
As for the single player/multiplayer thing: stop it. Games are not more heavily threaded in multiplayer. You are implying that the engine behaves differently depending on whether the people on screen are CPU-controlled or human, and that's just plain ignorant. Single player is more controlled because the developers decide exactly how many people are on screen and what they are doing, but if you have two CPUs that get 100fps and 50fps respectively, and you take them into the same situation in multiplayer, chances are the former will still get roughly double the FPS of the latter. The numbers might be closer together because BOTH go down a lot (50fps vs 25fps), but the ratio will stay roughly the same.
Otherwise benchmarks would be WORTHLESS. There would be no point in gaming benchmarks for CPUs or GPUs. To imply that is the worst kind of stupid.
Edit: wanted to clarify something. Of course CPU usage is higher in multiplayer; the computer has additional information coming in and going out for the other players. But the engine isn't more heavily threaded, so the extra cores/threads don't make nearly the impact they would in a heavily threaded game versus one that isn't threaded at all. The point is the engine is exactly the same. BF4 multiplayer numbers, if you could reproduce the exact same setup, would be similar to what you see in the single player benchmarks: if one CPU got 50fps, another CPU that matched it in the benchmark would also get about 50fps.
The entire reason the i5-4670k is better for gaming is that it's more powerful and games aren't very heavily threaded or optimized. It just plain doesn't matter that you have extra cores to split the workload, because games don't scale well above 4 cores. It's been proven time and again by reputable reviewers. You don't have to take my word for it, go READ. I've posted this a dozen times: game developers target the average rig, not the high end. Six-core is high end and makes up a tiny part of the gaming community, so they don't optimize for it. It costs too much money for virtually no return.
I've made this comparison before, but more than 4 cores is like 3- and 4-way SLI: you get slightly better performance, but it doesn't scale for crap, and for the same reason, no one optimizes engines or drivers to deal with more than two GPUs.
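For the curious, the diminishing-returns argument above is basically Amdahl's law: if only part of each frame's work parallelizes, extra cores stop paying off quickly. Here's a minimal sketch; the 60% parallel fraction is a made-up illustrative number, not a measured property of any real engine.

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical engine where 60% of the per-frame work parallelizes:
for n in (2, 4, 6, 8):
    print(f"{n} cores -> {amdahl_speedup(0.6, n):.2f}x")
# 2 cores -> 1.43x
# 4 cores -> 1.82x
# 6 cores -> 2.00x
# 8 cores -> 2.11x
```

Under that assumption, doubling from 4 to 8 cores buys only about 16% more speed, which is why a faster quad beats a slower six- or eight-core in lightly threaded games.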
Agreed. I have a Q6600 running at stock with a GTX 570, and it loads games just as fast as the i7 920 I sold and plays them nearly as smoothly. It really isn't noticeable at 1920x1080. No point in shelling out the cash if you ask me. So no, to answer the OP's question, it is not worth upgrading.
I'm sorry if I sometimes come off as rude or abrasive. I just get really tired of writing detailed posts and backing my opinions up with reviews, benchmarks, and technical explanations, only to have people come in and be like "nuh uh, you're wrong and you're stupid". It's very, very frustrating. I try really hard not to act like a jerk, but it's hard when I see the same threads with the same misleading, or even worse completely false, information over and over again. There are a couple of people I've noticed here and on other forums who literally have no clue and say some of the most ridiculous things you could imagine... and will argue tooth and nail without offering a single fact or a shred of proof. I'm not saying that is what you're doing, I'm just trying to explain. As an example:
How would you respond if someone told you: "I can see a difference between DVD and Blu-ray on a 27" CRT over a component connection. You have to be really close, but it's there"? I sat there and explained that a 27" CRT TV over component would have been 480p, because they didn't make CRTs that size with 720p capability. So the picture couldn't have been any better, because you're looking at a standard definition picture either way. The only difference is progressive scan instead of interlaced, so it will be slightly crisper. But that has nothing to do with DVD or Blu-ray; it's the TV drawing the picture in one pass instead of two.
When someone refuses to acknowledge the FACTS of the situation, there's nothing more I can do. When they constantly make false, often baseless, opinionated statements, all I can do is try to explain why they're wrong. When I have to do it 20 times a day about the same topics, at some point I get mad. So I apologize if I came off that way in my previous posts.
Every single computer in my house is quad core: 3570k / i7 920 / Q6600 / A8-3850 / 750k X4 / A6-5200. I've gotten my hands on many other CPUs and OSes, and it's fascinating to watch both synthetic and non-synthetic benchmarking. Most notable are the AMD Piledriver and Jaguar CPUs.
Skyrim, a game I love (and you can watch this with Task Manager), will utilize core 0 and core 2 on my 750k Athlon, yet on my Jaguar laptop I see all 4 cores light up. Hmm, that's a bit mysterious. To further that point, an 8150 I play around with at work (it's a diagnostics machine) will again light up core 0 and core 6 in Skyrim; in Battlefield 3 I'll see cores 0/2/4/6 light up; and in Crysis 3 all 8 threads run some sort of load, but again cores 0/2/4/6 carry the largest load. When I let my old 920 run Crysis 3, as an example, cores 0/2/4/6 are fully utilized, cores 1 and 3 are running something, and 5 and 7 are just chillin'.
I'm not gonna get around to a point, I'm just sharing my experience with some machines. However, if even one game seems to stress a quad core enough that more cores make a difference...
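If you want to watch per-core load the way the post above describes without keeping Task Manager open, you can sample it yourself. This is a rough sketch only: it is Linux-specific (it parses /proc/stat), and the helper name `per_core_busy` is made up for illustration. On Windows you'd stick with Task Manager or a library such as psutil.

```python
import time

def per_core_busy(sample_s=0.5):
    """Approximate per-core utilization (%) by sampling /proc/stat twice (Linux)."""
    def snapshot():
        stats = {}
        with open("/proc/stat") as f:
            for line in f:
                # Per-core lines look like "cpu0 ...", "cpu1 ..."; skip the
                # aggregate "cpu " line and non-cpu counters like "intr".
                if line.startswith("cpu") and line[3].isdigit():
                    name = line.split()[0]
                    fields = [int(x) for x in line.split()[1:]]
                    idle = fields[3] + fields[4]  # idle + iowait jiffies
                    stats[name] = (sum(fields), idle)
        return stats

    before = snapshot()
    time.sleep(sample_s)
    after = snapshot()
    busy = {}
    for cpu in before:
        total = after[cpu][0] - before[cpu][0]
        idle = after[cpu][1] - before[cpu][1]
        busy[cpu] = 100.0 * (total - idle) / total if total else 0.0
    return busy

if __name__ == "__main__":
    for cpu, pct in sorted(per_core_busy().items()):
        print(f"{cpu}: {pct:5.1f}%")
```

Run it while a game (or any workload) is going and you can see exactly which cores "light up", just like the Task Manager graphs.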
You come off as arrogant. You let your opinions get in the way of the facts you point out.
He said he didn't trust them. (Dunno where garbage came from...) As a side note, I must also have a worthless opinion and shouldn't bother posting again because I also don't completely trust their benchmarks.
The petty squabbling is becoming annoying now, please keep it clean and on topic, opinions are encouraged, facts equally but lets play nice.
An SSD is the better upgrade. Your CPU speed is enough.
SSD only helps loading speed. It doesn't actually make the game perform any better once it is loaded up. Not to say that an SSD is a bad upgrade, but it won't achieve anything other than faster loading times.
Lol.. omg you're seriously full of it. I've just had a quick look through your post history, and now I see what FX-GMC meant about your arrogance.. Unbelievable. ..
A 6-core AMD isn't as good as most quad-core Intels.
Stick on 4.
Blanket statement. There are plenty of applications where an AMD 6-core could outperform an Intel 4-core. Depends what you do.
I would be rather surprised if AMD has a 6-core CPU that's faster than a modern Intel quad-core with HT. Maybe not faster than an i5, but HT gives the i7s a little bit of leverage, and it tends to be leverage in the right places when you're looking at fairly heavily threaded tasks. I agree, though, it definitely depends on what you do when you start talking about 6 or more cores.
Actually, the 8-core AMD isn't fully 8 cores...
The 8 cores are in fact mashed down into 4 modules; each module contains 2 small cores and 1 decoder shared between them. The decoder can only feed 1 core at a time, which hinders its speed.
Anyway, the top AMD 8-core CPU gets owned by any i5 (4C) in games.
There are many build variants, but only one of them should contain an AMD CPU: when you don't have the money but need the machine only for work with multithreaded applications.
Watch the language. While Intel does perform better in single-threaded applications, there is no denying the 8350 is a very good gaming processor. I wouldn't buy one myself nowadays, but it is most certainly an option for those who feel the need.
Actually, IIRC it had 2 decoders, and I think Steamroller upped it to 3, but my numbers could be wrong... maybe it was 3 to 4. AMD's 8xxx and 9xxx series FX CPUs have 8 integer cores and 4 256-bit FPUs, which can be split apart for things like FMA work. AMD is expecting developers to use the GPU for heavy floating-point math and keep the CPU good at what CPUs do best, which is integer math. AMD's 8-core CPU is much more an 8-core CPU than it isn't. One module isn't too far off from how the Q6600 was made, with two dual-core dies and a shared L2 cache between each pair of cores. The only difference is AMD added extra hardware where they felt it was needed and removed it where it wasn't. Die space is precious, because there will come a point where you simply can't make CPUs smaller.
Was there a performance hit on single threaded applications? Sure.
Are they "fucked" because of it? No way.
Bumping this post.