
Intel Core i7-10700K and i5-10600K Geekbenched, Inch Ahead of 3800X and 3600X

In the same bench, my default settings with XMP on get 5654 single-core and 34240 multi-core, so yeah, well done Intel?! :P :D Ryzen 3800X rocks.

 
Well, this is not going to convince anyone who bought into Ryzen to change back or over to Intel. Not that the CPU(s) aren't impressive (for gaming), but that continued conceit that you must buy a new motherboard is even more pronounced in a world where you can (for the most part) use a motherboard from 2017 with a CPU from 2019, or vice versa.



Well, it is a 12-core processor... I guess.
The i9-10xxx series are 10-core parts being compared to AMD's 8-core chips.
 
Core count is no longer relevant in 2020. It's all about efficiency, IPC & all-core boost duration (in regards to thermals, workloads, etc.). Also, I wouldn't want to spend $1k or more on just the processor & mobo alone each time a new generation is released, leaving my RAM, SSD, cooler & GPU out in the cold. No freaking way. Sad to say, but Intel has come to the point where I see AMD as the better option, because I know I don't need to spend a ton of money each time a new processor is released that requires a new board. Also, Intel's top-of-the-line Z490 is still using PCIe Gen 3 while AMD's mid-range B550 has PCIe Gen 4 support.
 
Not mentioned here, it seems, is price. It looks like it is significantly faster than a 3600X, but if it's priced like a 3700X or 3800X, that would be the proper comparison.
 
Not mentioned here, it seems, is price. It looks like it is significantly faster than a 3600X, but if it's priced like a 3700X or 3800X, that would be the proper comparison.

Lol, exactly. The performance doesn't mean anything without dollar amounts. Price to performance ratio is what most people care about.
 
Lol, exactly. The performance doesn't mean anything without dollar amounts. Price to performance ratio is what most people care about.

That's exactly what I care about. I love AMD for the competition they've provided, which keeps driving that price-to-performance ratio in the right direction.

But that said, look at most benchmarks and Intel sits at the top. I don't buy top-end CPUs, but it's a fact. With 6 cores / 12 threads, the gen-10 Intel i5 chips do look promising, in particular the i5-10400 through the i5-10600K. And I couldn't give a rat's about an extra 30 or 50 W of power draw on a desktop chip.
 
If so, it'll be at considerably more power used, a point that's missing from this but wasn't missing from similar reports on the FX series back in the day.
If you care for the earth you can't buy Intel. :p

No, I jest; at least Intel are competing on some points.
We wouldn't want a one-horse race again.
Use your head for a bit
Because FX was proper shiett!!
 
Mmmm, Daddy can smell 5 GHz, dead silent.
 
Use your head for a bit
Because FX was proper shiett!!
Consider that there was sarcasm in my post, while I'm not sure there is in yours; yours sounds like baiting.
For me it was £159 over five years, and when retired it was within a margin of the same FPS as an i7 at 4K.
Some of us upgraded from peasant 1080p years ago. :p :D

You like hot and hungry? Then go for it.
Buy Intel, save on heating this Christmas.
 
Yep... like a stock Civic and a Ferrari both rolling down a hill. :p

It's too bad an overwhelming majority are at 1080p and less.
Less? No one seriously games on a fat-back TV anymore, and any laptop with less than 1080p is not really a gaming laptop.

You did have to tune the snot out of that Civic, to be fair. :)
 
Less? No one seriously games on a fat-back TV anymore, and any laptop with less than 1080p is not really a gaming laptop.

You did have to tune the snot out of that Civic, to be fair. :)
Steam stats, my homie. You'd be shocked what it pulls... (please spare me the disclaimers, we all know :) ).

Tuned? No. I said rolling down a hill. The point was that it takes the engine out of the equation, so they're similar in speed... akin to running that potato of a CPU at 4K... it works, but it's still a Civic. :p

Though it's funny, because few can afford the graphical horsepower required in the first place... typically a 4K gamer isn't going to pair a 2080+ with Bulldozer. :)
 
Steam stats, my homie. You'd be shocked what it pulls... (please spare me the disclaimers, we all know :) ).

Tuned? No. I said rolling down a hill. The point was that it takes the engine out of the equation, so they're similar in speed... akin to running that potato of a CPU at 4K... it works, but it's still a Civic. :p

Though it's funny, because few can afford the graphical horsepower required in the first place... typically a 4K gamer isn't going to pair a 2080+ with Bulldozer. :)
His point was that FX was shit. As I was pointing out, on release the FX series was initially hammered on its performance per watt; its actual performance against its contemporaries wasn't that bad, but it came at way more power. Roll on to nowadays and, as I implied, power draw is glossed over (hypocrisy by some, not all), so his "FX is shit" statement is laughable if he believes these new chips won't be seen the same way in ten years.

I'd say 5.
 
His point was that FX was shit. As I was pointing out, on release the FX series was initially hammered on its performance per watt; its actual performance against its contemporaries wasn't that bad, but it came at way more power. Roll on to nowadays and, as I implied, power draw is glossed over (hypocrisy by some, not all), so his "FX is shit" statement is laughable if he believes these new chips won't be seen the same way in ten years.

I'd say 5.


FX was at least 10 years ahead of its time. What would you think if, right now, AMD released a CPU with 64 small cores in place of the Ryzen 7 3700X?
It only needs developer support and it would be top notch.
 
FX was at least 10 years ahead of its time. What would you think if, right now, AMD released a CPU with 64 small cores in place of the Ryzen 7 3700X?
It only needs developer support and it would be top notch.
The conversation should be brought to bear on this chip really, not AMD's. I was comparing the presentation; I would prefer the same level of attention on its power use, is all.

We already have up to 64 big cores, thanks, and I'm fine with those; all the small-core stuff I've used was very limited in use and remains so.

But we'll see how it pans out.
 
His point was that FX was shit. As I was pointing out, on release the FX series was initially hammered on its performance per watt; its actual performance against its contemporaries wasn't that bad, but it came at way more power. Roll on to nowadays and, as I implied, power draw is glossed over (hypocrisy by some, not all), so his "FX is shit" statement is laughable if he believes these new chips won't be seen the same way in ten years.

I'd say 5.
That's a rosy outlook. These things didn't compare favorably to Sandy Bridge in a lot of tests (and that's a 2500K vs an 8350, mind you), let alone Ivy Bridge... and that is raw performance across everything, not just zip files, x264 and Cinebench multi, lol.


All Piledriver/BD/Vishera had going for them was price.
 
That's a rosy outlook. These things didn't compare favorably to Sandy Bridge in a lot of tests (and that's a 2500K vs an 8350, mind you), let alone Ivy Bridge... and that is raw performance across everything, not just zip files, x264 and Cinebench multi, lol.


All Piledriver/BD/Vishera had going for them was price.
I agree, but as I said, back then power use and heat were its main detractors in forums etc., which is not the case with these new chips; it should be, at least more than it is, though no mega drama is required.

Poorly optimised hardware with poorly optimised software will do that, and that's mostly what it was back then, IMHO.
 
The conversation should be brought to bear on this chip really, not AMD's. I was comparing the presentation; I would prefer the same level of attention on its power use, is all.

We already have up to 64 big cores, thanks, and I'm fine with those; all the small-core stuff I've used was very limited in use and remains so.

But we'll see how it pans out.


We have, but most applications are still limited to using 4 or 6 of them, and rarely more.

That's a rosy outlook. These things didn't compare favorably to Sandy Bridge in a lot of tests (and that's a 2500K vs an 8350, mind you), let alone Ivy Bridge... and that is raw performance across everything, not just zip files, x264 and Cinebench multi, lol.


All Piledriver/BD/Vishera had going for them was price.

Back in 2011, most applications used only one or two cores at most. Because of this, most of the FX's 8 threads were not utilised optimally.
 
I agree, but as I said, back then power use and heat were its main detractors in forums etc., which is not the case with these new chips; it should be, at least more than it is, though no mega drama is required.

Poorly optimised hardware with poorly optimised software will do that, and that's mostly what it was back then, IMHO.
Back in 2011, most applications used only one or two cores at most. Because of this, most of the FX's 8 threads were not utilised optimally.
True... but that bar was equal for both Intel and AMD CPUs; both had 8T. ;)

There are newer tests that show the same story. It didn't age well either. Slow is slow, my dude. What made these so attractive was the price... that's about it. It surely wasn't single-threaded performance or IPC, nor did it do well gaming at the resolutions even more common back then (and lower).

Edit: but I digress, this thread isn't about Piledriver/BD/Vishera. ;)
 
I agree, but as I said, back then power use and heat were its main detractors in forums etc., which is not the case with these new chips; it should be, at least more than it is, though no mega drama is required.

Poorly optimised hardware with poorly optimised software will do that, and that's mostly what it was back then, IMHO.

FWIW, I had an FX (an 8300, I think). I did not care about power use on a desktop then, nor do I now.

I have never understood the focus on that aspect of desktop chips for typical users at home. For mobile, insofar as it affects thermals and battery life, yes; for business-class PCs (usually SFF), for servers, or for workstation farms where the system remains under high load much of the time, sure.

But at home, which is what 99% of folks here are talking about?

Doing the math on typical workloads, and based on my own experience with power-meter measurements, you might be talking about +40 W for 4 or 5 hours a day from a high-power-draw CPU, all else being equal, and assuming you put your system under heavy load for 4-5 hours per day, every day of the year (which is a lot to average, even for power users and the most avid of gamers). That comes out to about 200 Wh per day, or 1.4 kWh per week, which over 52 weeks is about 73 kWh per year.
The average electricity rate in the USA is about $0.12/kWh, so the cost here works out to roughly 73 kWh × $0.12/kWh ≈ $8.74 per year.
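
If anyone wants to sanity-check that math, here's a minimal Python sketch. The 40 W delta, the 5 hours of heavy load per day and the $0.12/kWh rate are just the assumptions from the paragraph above, not measured figures:

Code:
# Rough annual cost of ~40 W of extra CPU draw, following the steps above.
extra_watts = 40            # assumed extra draw under heavy load (W)
hours_per_day = 5           # assumed hours of heavy load per day
rate_usd_per_kwh = 0.12     # assumed average US electricity rate ($/kWh)

wh_per_day = extra_watts * hours_per_day          # 200 Wh/day
kwh_per_week = wh_per_day * 7 / 1000              # 1.4 kWh/week
kwh_per_year = kwh_per_week * 52                  # ~72.8 kWh/year
cost_per_year = kwh_per_year * rate_usd_per_kwh   # ~$8.74/year

print(f"~{kwh_per_year:.0f} kWh/year, roughly ${cost_per_year:.2f}/year")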

That isn't even worth anyone's time to discuss.

I mean, if you tell me chip A draws 250W vs chip B drawing 65W with the same performance, I might listen just a little. But 95W vs 135W? 65W vs 95W? No man, who gives a rat?

Now if I were buying 3000 PCs for my employer, I would care, but I'm not doing that and very few here are.
 
FWIW, I had an FX (an 8300, I think). I did not care about power use on a desktop then, nor do I now.

I have never understood the focus on that aspect of desktop chips for typical users at home. For mobile, insofar as it affects thermals and battery life, yes; for business-class PCs (usually SFF), for servers, or for workstation farms where the system remains under high load much of the time, sure.

But at home, which is what 99% of folks here are talking about?

Doing the math on typical workloads, and based on my own experience with power-meter measurements, you might be talking about +40 W for 4 or 5 hours a day from a high-power-draw CPU, all else being equal, and assuming you put your system under heavy load for 4-5 hours per day, every day of the year (which is a lot to average, even for power users and the most avid of gamers). That comes out to about 200 Wh per day, or 1.4 kWh per week, which over 52 weeks is about 73 kWh per year.
The average electricity rate in the USA is about $0.12/kWh, so the cost here works out to roughly 73 kWh × $0.12/kWh ≈ $8.74 per year.

That isn't even worth anyone's time to discuss.

I mean, if you tell me chip A draws 250W vs chip B drawing 65W with the same performance, I might listen just a little. But 95W vs 135W? 65W vs 95W? No man, who gives a rat?

Now if I were buying 3000 PCs for my employer, I would care, but I'm not doing that and very few here are.
We're all different, and so are our uses and reasons. My PC is on all day, every day, for example, with as high a load as my cooling system will support, and others are similar; the large majority of PC users don't care, I agree.
But as far as simplifying everyone into one bracket goes, that's a stretch, IMHO.

I've paid about £30-50 for PC power alone for years, and at one point 20 times that amount.

But indeed, some do not care.
 