
10700 vs 3700X: which one, if both are priced almost the same?

1080p and below are the resolutions where you are limited by the CPU. I guess room temperature IQ would have you do CPU reviews at resolutions where the GPU is the bottleneck?
Yeah, I measure in Kelvin. :roll:
Still, if he won't play on his spanking new $1200 2080 Ti at 1080p, he'll never see those gains you're talking about.

But just to get the cognitive dissonance right, you're still comparing two same priced options, one of which is demonstrably 10% faster, and arguing for the slower. :laugh:
It's only that much faster in the benchmarks you pick.
He's not only gaming.
 
and arguing for the slower. :laugh:

I never argued for the slower, I said buy whichever is cheapest, the differences are inconsequential in the long run. But the burning fanboy inside of you thought I dared recommend something without a blue sticker.
 
I'd refer you back to the OP's actual post, where he says that they are pairing it with a next gen GPU. Hence the 2080 Ti comparisons. But that would require some reading comprehension on your part, so I'll let it slide.
 
170+ fps instead of 130 fps is not a 5% difference, buddy, and that's on current gen GPUs.

Learn to read: the guy is not changing his CPU/mobo for the next 5-7 years, and is planning on buying a next gen GPU. This means "upgrade path" is worthless.

"AMD value" both options he's looking at are the same price, and all the "advantages" you rave about are relevant to your needs not to what the OP has said. Once again, learn to read.

Fair enough, I missed the most recent post about not upgrading the CPU for 5-7 years. I'll agree to learn to read if you do as well :laugh:


His current CPU is an i7 860 (a 10-year-old CPU). Let's say it's worthless, or not up to the task anymore; his next upgrade, if any, will be when the 10700 or 3700X is worthless too, or struggles to do the job.

We didn't consider the 6-core CPUs because we want more future-proofing, and it's a one-time buy for the CPU/MB.
I don't understand why it's only a one-time buy for the CPU. If "future proofing" is a concern, then locking yourself into one hardware config for the next 5-7 years seems like a strange way to go about it. Regardless, if that's your/his approach and the 3700X costs $400+ where you're from, you can basically kiss everything good about AMD goodbye. At that point, the choice really doesn't even matter, does it? Intel for slightly better gaming, AMD for slightly better workloads, since they both cost the same in your neck of the woods.
 
I'd refer you back to the OP's actual post, where he says that they are pairing it with a next gen GPU. Hence the 2080 Ti comparisons. But that would require some reading comprehension on your part, so I'll let it slide.

Who the hell cares? The CPUs perform the same irrespective of the GPU used. The "next-gen" GPUs aren't going to be orders of magnitude faster, as you seem to believe for some inexplicable reason.
 
[Attached: three benchmark charts (1.png, 2.png, Screenshot_20200525-132445.png)]

The 9900K is the closest analogue to the 10700K (the 10700K actually has a slight frequency advantage): on average around 30 more FPS on the high end and 20 more on the 1% minimums, which matter a lot, since your 1% lows are when you notice your nice average framerate start to stutter.

Next gen GPUs will have 2080 Ti-level performance at a price point of around $450, so it's a very relevant comparison, despite your attempts to dissuade comparisons where the CPU is the bottleneck and not the GPU.
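
On the 1% lows point, for anyone unfamiliar: they're typically derived from a frametime capture by looking at the slowest 1% of frames. A minimal sketch of one common convention, with frametime values invented purely for illustration:

```python
# Minimal sketch: average FPS and "1% low" FPS from a frametime capture.
# The frametimes (in milliseconds) are invented for illustration.
frametimes_ms = [6.5, 7.1, 6.8, 25.0, 7.0, 6.9, 7.2, 30.5, 6.7, 7.0] * 100

# Average FPS: total frames over total time.
avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)

# One common convention: the 1% low is the average FPS of the slowest 1% of frames.
slowest_1pct = sorted(frametimes_ms, reverse=True)[: max(1, len(frametimes_ms) // 100)]
low_1pct_fps = 1000 * len(slowest_1pct) / sum(slowest_1pct)

print(f"average: {avg_fps:.0f} fps, 1% low: {low_1pct_fps:.0f} fps")
```

Two systems with identical averages can have very different 1% lows, and the lows are what you feel as stutter.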
 
What's laughable and pathetic is that those charts don't even contain a 10700. What the hell are you even trying to do? This is embarrassing. Did you really try to disprove those TPU percentages with this? Holy crap. :roll:

Someone stop this spamming troll.
 
^ I don't see the 10700 anywhere on those graphs?

I see the 3700x though
 
"Who the hell cares", and "cpu's perform the same". Nice logic.
 
Your graph with a bunch of Intel CPUs that nobody was talking about is somehow better logic though, right? :slap:
 
We seem to have dropped into a strange alternate universe where someone pairs a $1320 graphics card with an $85 monitor and a $280 CPU. :confused:
 
^ I don't see the 10700 anywhere on those graphs?

I see the 3700x though
The 10700 is a 9900 with updated solder and a thinner die for better cooling, so anything the 9900 does, the 10700 does slightly better. I wouldn't complain; using the 9900 instead of the 10700 is giving your AMD preference an even easier ride. It's why most reviewers didn't even test the 10700: it effectively already exists as the 9900. They all did 10600 and 10900 reviews, since those are both new.

We seem to have dropped into a strange alternate universe where someone pairs a $1320 graphics card with an $85 monitor and a $280 CPU. :confused:
How about a universe where someone is going with either a 10700 or a 3700X and pairing it with a next gen GPU? Is that simple enough and easy enough for you to understand?
 
TPU has their own reviews, you know.
He is in denial, forget about it. He searched for games like Far Cry that have the worst engines to try and somehow convince us that this is the norm and those TPU charts must be garbage. Next level braindead.
 
TPU has their own reviews, you know.
[attached chart]
TPU doesn't compare 1% lows yet; they're looking into getting the equipment.

He is in denial, forget about it. He searched for games like Far Cry that have the worst engines to try and somehow convince us that this is the norm and those TPU charts must be garbage. Next level braindead.
Coming from the guy who can't count or apparently read. :laugh:
 
1080p and below are the resolutions where you are limited by the CPU. I guess room temperature IQ would have you do CPU reviews at resolutions where the GPU is the bottleneck?
You know, I'm pretty sure the OP's friend will play games, not review and benchmark them. Letting the CPU be the bottleneck when gaming sounds like a bad idea.

It's a whole different thing when doing reviews tho.
 
Coming from the guy who can't count or apparently read

I said I had them mixed up; that doesn't compare to your 4D-hyperspace-chess attempt to prove a 10700 is so much better with charts that don't contain a 10700. That's staggeringly bad and hilarious.
 
TPU doesn't compare 1% lows yet; they're looking into getting the equipment.
Well, you're not looking into the 10700, so I can't see how that matters. You know there are 10700 reviews out there, right? Yeah, you have to look for them yourself; don't ask me.
 
You know, I'm pretty sure the OP's friend will play games, not review and benchmark them. Letting the CPU be the bottleneck when gaming sounds like a bad idea.

It's a whole different thing when doing reviews tho.
CPU bottleneck tests are useful for determining a long-term CPU choice. They artificially stress the CPU, which is objectively the point of the testing: you're using high refresh rate gaming to find out where each CPU tested runs out of steam. Once again, why would you be interested in a review where the part in question is being bottlenecked by some other part of the system? :kookoo:
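
To make the bottleneck logic concrete: in a simplified serial model, each frame takes as long as the slower of the CPU work and GPU work for that frame, so lowering the resolution shrinks the GPU time and exposes the CPU's ceiling. A toy sketch, with all per-frame timings invented for illustration:

```python
# Toy bottleneck model: the slower of CPU and GPU sets the frame pace.
# All per-frame timings below are invented for illustration.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

CPU_MS = 5.9  # suppose this CPU needs ~5.9 ms per frame -> ~170 fps ceiling

print(fps(CPU_MS, gpu_ms=12.0))  # heavy 4K-class GPU load: ~83 fps, GPU-bound
print(fps(CPU_MS, gpu_ms=4.0))   # light 1080p GPU load: ~169 fps, CPU-bound
```

That's exactly why CPU reviews test at 1080p: a faster GPU, current or next gen, just moves more games into the CPU-bound case.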
 
We seem to have dropped into a strange alternate universe where someone pairs a $1320 graphics card with an $85 monitor and a $280 CPU. :confused:
Don't bother.
It's the only combination that makes it possible to pick the right winner. :D

CPU bottleneck tests are useful for determining a long-term CPU choice. They artificially stress the CPU, which is objectively the point of the testing: you're using high refresh rate gaming to find out where each CPU tested runs out of steam. Once again, why would you be interested in a review where the part in question is being bottlenecked by some other part of the system? :kookoo:
You missed my second line.
It's a whole different thing when doing reviews tho.
You're saying one CPU is 10% faster; I'm saying that's just for reviews, and realistically he'll never see that gain when gaming.
 
You're right, two 14nm++ Skylake chips that have the same core count, speed, and architecture, with the updated version having slightly better cooling characteristics due to a revised IHS design, are incomparable, and any testing is irrelevant. Nice logic. But comparing an i3-10320 to a 6700K is totally fine, right? Same CPU.
 
It's the only combination that makes it possible to pick the right winner. :D
If he buys something mid range, then he's potentially looking at 2070-2080 performance at best for about $300. All of these benchmarks are done with a 2080 Ti though, so in practice the differences are going to be even lower.
 
You're saying one CPU is 10% faster; I'm saying that's just for reviews, and realistically he'll never see that gain when gaming.

"realistically" he will see the benefits of a CPU that is 10% faster minimum in CPU bottlenecked games when he is going to keep the system for 5-7 years. The 140-160FPS on current gen game benchmarks will drop to half that in 7 years time, and in that case, running at 60fps instead of 45fps will certainly be noticeable even if you don't have a high refresh rate monitor by then.

It's like the haswell i5 vs i7 all over again. At the time the games didn't really need 8 threads and the gains were minimal, but 5 years later, the i5's are worthless due to most AAA games running like shit on a 4t cpu, whereas the i7s can actually hold their own and sustain above 60fps most of the time. Saying a clear performance advantage is pointless because there's no way to notice higher FPS "realistically" is moronic.
 
60/45 = 1.33. I may not know how to count, but you can't figure out division and percentages.

That's a 33% gap, not even close to 10%, but nice try. If these CPUs maintained that 10% (which isn't accurate or realistic even today), which they probably wouldn't, you'd be looking at 60 vs 54 worst case. Wow, colossal.

I bet you'll say to yourself, "man, am I glad I made the right decision 7 years ago, now I'm getting 5 more frames".
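
For what it's worth, the percentages being thrown around are easy to check. A quick sketch using only the fps figures already quoted in this thread:

```python
# Percentage gaps between the fps figures quoted in this thread.
def pct_faster(fast_fps: float, slow_fps: float) -> float:
    """Percentage by which fast_fps exceeds slow_fps."""
    return (fast_fps / slow_fps - 1) * 100

print(pct_faster(170, 130))  # ~30.8% -> the earlier 170 vs 130 comparison
print(pct_faster(60, 45))    # ~33.3% -> the 60 vs 45 example above
print(60 / 1.10)             # ~54.5  -> where a true 10% gap lands at 60 fps
```

So 60 vs 45 is a 33% gap, while a genuine 10% gap at 60 fps works out to roughly 60 vs 54-55.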
 