
7800X3D vs 14900K video by HWUB. What would you choose for gaming?

That's a big ask.. lol

I think everyone knows AMD is more efficient than Intel; that comes with the smaller node.
It's the out-of-the-box settings that make it look bad, though (motherboards are somewhat to blame for this too).

It's really easy to cap power to something that makes more sense; my 14900K runs mid-50s (°C) in games, and that's on air (a Noctua NH-U12A), and mid-70s under a full Cinebench load.

That said, you shouldn't HAVE to cap power; it should be less than 253 W out of the box.
To be fair, my 7950X was also power capped and undervolted with Curve Optimizer.
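If anyone wants to script that cap on Linux instead of setting it in the BIOS, here's a minimal sketch using the kernel's intel-rapl powercap interface. The 125 W target is just an example, it needs root, and constraint numbering can vary by platform (check the *_name files first):

```python
# Minimal sketch: cap the CPU package power limit through Linux's
# intel-rapl powercap interface. Run as root. constraint_0 is usually
# the long-term (PL1) limit, but verify via constraint_0_name first.
RAPL = "/sys/class/powercap/intel-rapl:0"

def set_long_term_limit(watts: int) -> None:
    # powercap expresses limits in microwatts
    with open(f"{RAPL}/constraint_0_power_limit_uw", "w") as f:
        f.write(str(watts * 1_000_000))

def read_long_term_limit() -> float:
    with open(f"{RAPL}/constraint_0_power_limit_uw") as f:
        return int(f.read()) / 1_000_000

set_long_term_limit(125)  # example: cap PL1 to 125 W
print(f"PL1 now set to {read_long_term_limit():.0f} W")
```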
Watch the review; they create controlled scenarios where power is capped and it's still down on the fps/watt metric. It's like Steve knew fanboys would take issue and was prepared.
 
Watch the review; they create controlled scenarios where power is capped and it's still down on the fps/watt metric. It's like Steve knew fanboys would take issue and was prepared.

I may watch if I get bored. I think everyone knows AMD wins in perf/watt, though; or they should know. It shouldn't be big news to anyone.
 

I don't think I have to say much here :laugh:.
The other Steve screwed up big time by completely ignoring frequencies. You have to limit the frequency (and the voltage, which depends on the frequency) to increase efficiency. It would be unfair to compare a 7800X3D running at 5.0 GHz with one at 5.5 GHz; the latter would always have worse efficiency (fps per watt) results.
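To put rough numbers on that: CMOS dynamic power scales roughly as C·V²·f, so the last few hundred MHz (and the extra voltage they demand) cost disproportionately. A toy calculation; the voltage/frequency pairs are made up purely for illustration:

```python
# Toy model of why chasing clocks hurts fps/watt: dynamic power
# scales roughly as P ~ C * V^2 * f. The two operating points below
# are illustrative guesses, not measured values for any real chip.
def rel_power(volts: float, ghz: float) -> float:
    return volts ** 2 * ghz  # capacitance folded into the relative unit

lo_v, lo_f = 1.05, 5.0  # hypothetical "efficient" point
hi_v, hi_f = 1.25, 5.5  # hypothetical "pushed" point

gain = hi_f / lo_f - 1                                    # ~10% clock
cost = rel_power(hi_v, hi_f) / rel_power(lo_v, lo_f) - 1  # ~56% power
print(f"~{gain * 100:.0f}% more clock for ~{cost * 100:.0f}% more power")
# fps/watt falls even if fps scaled perfectly with frequency,
# which it doesn't in games.
```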

EDIT: A 7800X3D at 5.5 GHz would also overheat and peel off its secret-sauce topping... :D It would become a 7800X-lost-3D.
 
The other Steve screwed up big time by completely ignoring frequencies. You have to limit the frequency (and the voltage, which depends on the frequency) to increase efficiency. It would be unfair to compare a 7800X3D running at 5.0 GHz with one at 5.5 GHz; the latter would always have worse efficiency (fps per watt) results.

EDIT: A 7800X3D at 5.5 GHz would also overheat and peel off its secret-sauce topping... :D It would become a 7800X-lost-3D.
So all this raises some questions.
  • If X3D cache weren't so power-constraining on the CPU, would AMD still have produced such a power-efficient CPU, or is this a technical coincidence that happened to fall in AMD's favor?
  • If X3D cache gets better at handling higher power consumption in the future, will AMD fall into the over-juicing trap and lose its out-of-the-box power-efficiency advantage?
 
I would choose the one that functions as well in summer as it does in winter.
 
If X3D cache weren't so power-constraining on the CPU, would AMD still have produced such a power-efficient CPU, or is this a technical coincidence that happened to fall in AMD's favor?
No, I don't think so.

If X3D cache gets better at handling higher power consumption in the future, will AMD fall into the over-juicing trap and lose its out-of-the-box power-efficiency advantage?
Probably not.
 
I simply want a CPU that I can cool without having to resort to exotic cooling methods, and AMD has done just that.
 
The other Steve screwed up big time by completely ignoring frequencies. You have to limit the frequency (and the voltage, which depends on the frequency) to increase efficiency. It would be unfair to compare a 7800X3D running at 5.0 GHz with one at 5.5 GHz; the latter would always have worse efficiency (fps per watt) results.

EDIT: A 7800X3D at 5.5 GHz would also overheat and peel off its secret-sauce topping... :D It would become a 7800X-lost-3D.

Stock to stock is how every review collects and presents its data. Supplementary OC/UV data may be provided, but stock data always takes precedence. I would not call it a screw-up to follow generally accepted best practices, presenting the data that is most applicable to the vast majority of PC users.

Setting/limiting a CPU to an arbitrary frequency like 5 GHz would not yield useful efficiency data relevant to anyone, for multiple reasons:

1) Setting/limiting the frequency hampers the CPU's boost algorithm. By capping the frequency, you are not allowing the CPU full control of one of the most crucial levers governing its efficiency.
2) It leaves performance/efficiency on the table. If you were to set a CPU's frequency to 5 GHz, for example, and that CPU could have boosted to 5.1 GHz within the same power envelope, you have in effect just reduced its efficiency.
3) There's really no rationale for selecting 5 GHz specifically, other than perhaps it being the 7800X3D's max clock. One could make an argument for testing at any frequency from 0.1 GHz to 5 GHz to compare the 7800X3D to the 14900K, with each frequency producing different results depending on where it lands on each processor's voltage curve and how well the architecture operates there. One could easily argue that it's biased to run the 7800X3D at its max clock, which is naturally at the more aggressive end of its voltage curve. Again, an argument can be made for any frequency.
4) No one looking for efficiency is setting/limiting the frequency directly. They set the TDP/power draw of the CPU, as that allows the CPU to dynamically scale performance within the selected limit.

In essence, you'd be providing data that is worthless to both regular consumers and enthusiasts. Frequency limiting is typically done for IPC comparisons, and even then the selected frequency is usually much lower, to ensure cross-comparability with older CPU generations.
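On that note, if you want real efficiency numbers rather than a frequency-pegged run, sampling the package energy counter over a benchmark pass is enough. A rough sketch, assuming Linux's intel-rapl interface and ignoring counter wraparound for brevity:

```python
import time

# Sketch: average package power over a measurement window, read from
# the RAPL energy counter (microjoules). Divide your benchmark's
# average fps by this wattage to get fps/watt.
ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_uj() -> int:
    with open(ENERGY) as f:
        return int(f.read())

def avg_package_watts(seconds: float = 30.0) -> float:
    e0, t0 = read_uj(), time.time()
    time.sleep(seconds)  # run the game/benchmark during this window
    e1, t1 = read_uj(), time.time()
    return (e1 - e0) / 1e6 / (t1 - t0)

print(f"avg package power: {avg_package_watts():.1f} W")
```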

The TL;DR of this entire thread is that the 14900K can get to reasonable power consumption levels when UV'd. It's not as efficient as the 7800X3D, as multiple videos have demonstrated, but does it really need to be? No. This thread was great for a few pages when people were talking about tweaking their 14900K; I liked reading that. Most of it, though, has been non-productive for the people dragged into the convo, but to be fair, given the opening post, most people should have been aware from the start of the quality of discussion to be had.
 
The TL;DR of this entire thread is that the 14900K can get to reasonable power consumption levels when UV'd.
True, but most people simply build and forget, or they're not fully knowledgeable in the arcane magic of tweaking UEFI, so those people are running their processors bone stock. In that condition, Intel chips are going to gulp down power and output a hell of a lot of heat, while a comparable AMD chip in a bone-stock config is going to run with less power and less heat output.
 
I feel like a lot of the power issues could be mooted if we could run an app that sets a power profile per game/app.

Maybe a safe tune that works on all chips (per app), with an option to tune further for silicon-lottery winners.
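Something like that is scriptable today, at least crudely. A rough sketch of the watcher idea, assuming Linux, the third-party psutil library, and a made-up profile table; the process names and wattages are placeholders, and writing the limit needs root:

```python
import time
import psutil  # third-party: pip install psutil

# Hypothetical per-app power profiles: process name -> package watts.
PROFILES = {"cyberpunk2077.exe": 125, "cinebench": 180}
DEFAULT_W = 253
LIMIT = "/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw"

def apply(watts: int) -> None:
    with open(LIMIT, "w") as f:  # needs root
        f.write(str(watts * 1_000_000))

while True:
    # collect the names of everything currently running
    running = {(p.info["name"] or "").lower()
               for p in psutil.process_iter(["name"])}
    target = next((w for app, w in PROFILES.items() if app in running),
                  DEFAULT_W)
    apply(target)
    time.sleep(5)
```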
 
  • If X3D cache weren't so power-constraining on the CPU, would AMD still have produced such a power-efficient CPU, or is this a technical coincidence that happened to fall in AMD's favor?
Yes. Running a CPU at a lower frequency and voltage always worsens performance and improves efficiency. AMD was forced to run the CPU efficiently even if they did not want to.
  • If X3D cache gets better at handling higher power consumption in the future, will AMD fall into the over-juicing trap and lose its out-of-the-box power-efficiency advantage?
I have no idea what technology the two competing brands will use for their future chips or what is going to happen, but Intel is currently in the worse position, running CPUs made on an older, less efficient process too fast and at too high voltages.

BTW, it would be interesting to match the gaming efficiency of the 14900K and the 7950X (no extra cache employed, so no special advantage) by picking the frequencies these CPUs run at. I wonder how much lower the 14900K's performance would be at that frequency, whether really bad or not that much worse.
 
I read W1z's reviews of the 14th-gen CPUs, and all I got out of them was that Intel added more E-cores.

Yep - kind of a worthless upgrade, particularly for gaming.
 
Stock to stock is how every review collects and presents its data.
How is this relevant to a special video that uses lowered power limits?
Setting/limiting a CPU to an arbitrary frequency like 5 GHz would not yield useful efficiency data relevant to anyone, for multiple reasons:

1) Setting/limiting the frequency hampers the CPU's boost algorithm. By capping the frequency, you are not allowing the CPU full control of one of the most crucial levers governing its efficiency.
By limiting the frequency, you prevent the CPU from running inefficiently (and hot).
2) It leaves performance/efficiency on the table. If you were to set a CPU's frequency to 5 GHz, for example, and that CPU could have boosted to 5.1 GHz within the same power envelope, you have in effect just reduced its efficiency.
No, you increased energy efficiency.
3) There's really no rationale for selecting 5 GHz specifically, other than perhaps it being the 7800X3D's max clock. One could make an argument for testing at any frequency from 0.1 GHz to 5 GHz to compare the 7800X3D to the 14900K, with each frequency producing different results depending on where it lands on each processor's voltage curve and how well the architecture operates there.
I do not believe that comparing the 7800X3D with the 14900K is fair; that is what the whole thread is about. Having efficiency data from the CPUs running at different frequencies is good in any case, though I doubt anyone would really be interested in the lower half or two-thirds of the range, because performance suffers too much there.
4) No one looking for efficiency is setting/limiting the frequency directly. They set the TDP/power draw of the CPU, as that allows the CPU to dynamically scale performance within the selected limit.
In essence, you'd be providing data that is worthless to both regular consumers and enthusiasts.
I think the regular consumer is worried about the total power draw and what they are going to cool the CPU with. Enthusiasts can very well be interested in energy efficiency too.
The TL;DR of this entire thread is that the 14900K can get to reasonable power consumption levels when UV'd. It's not as efficient as the 7800X3D, as multiple videos have demonstrated, but does it really need to be? No. This thread was great for a few pages when people were talking about tweaking their 14900K; I liked reading that. Most of it, though, has been non-productive for the people dragged into the convo, but to be fair, given the opening post, most people should have been aware from the start of the quality of discussion to be had.
I do not agree with the TL;DR, and I also have no idea what is supposed to be wrong with the opening post. :D
 
The 14900 series are just boring CPUs...

 
It's good if you live in Alaska or Siberia; no more cold feet. :D
 
If X3D cache weren't so power-constraining on the CPU, would AMD still have produced such a power-efficient CPU, or is this a technical coincidence that happened to fall in AMD's favor?
Based on what I've seen with my 7700X and 7800X3D, the problem is the voltage-frequency curve, which is brilliant on the X3D but pure horseshit on the X. If AMD could tell me why the X needs to run 0.3 or even 0.4 V higher (edit: higher than the X3D) to gain 300-500 MHz on its boost frequency, thus maxing out its 142 W PPT in all-core workloads while the X3D is doing just fine at 80 W, I'd be very happy.

If X3D cache gets better at handling higher power consumption in the future, will AMD fall into the over-juicing trap and lose its out-of-the-box power-efficiency advantage?
I don't think that'll ever happen. The X3D's problem is heat dissipation through the extra cache layer, not power.
 
If AMD could tell me why the X needs to run 0.3 or even 0.4 V higher to gain 300-500 MHz on its boost frequency
Because that's how semiconductors work outside of abnormally low (cooling) temps ~ so basically anything except superconductors.
 
Stock to stock is how every review collects and presents its data. Supplementary OC/UV data may be provided, but stock data always takes precedence. I would not call it a screw-up to follow generally accepted best practices, presenting the data that is most applicable to the vast majority of PC users.

Setting/limiting a CPU to an arbitrary frequency like 5 GHz would not yield useful efficiency data relevant to anyone, for multiple reasons:

1) Setting/limiting the frequency hampers the CPU's boost algorithm. By capping the frequency, you are not allowing the CPU full control of one of the most crucial levers governing its efficiency.
2) It leaves performance/efficiency on the table. If you were to set a CPU's frequency to 5 GHz, for example, and that CPU could have boosted to 5.1 GHz within the same power envelope, you have in effect just reduced its efficiency.
3) There's really no rationale for selecting 5 GHz specifically, other than perhaps it being the 7800X3D's max clock. One could make an argument for testing at any frequency from 0.1 GHz to 5 GHz to compare the 7800X3D to the 14900K, with each frequency producing different results depending on where it lands on each processor's voltage curve and how well the architecture operates there. One could easily argue that it's biased to run the 7800X3D at its max clock, which is naturally at the more aggressive end of its voltage curve. Again, an argument can be made for any frequency.
4) No one looking for efficiency is setting/limiting the frequency directly. They set the TDP/power draw of the CPU, as that allows the CPU to dynamically scale performance within the selected limit.

In essence, you'd be providing data that is worthless to both regular consumers and enthusiasts. Frequency limiting is typically done for IPC comparisons, and even then the selected frequency is usually much lower, to ensure cross-comparability with older CPU generations.

The TL;DR of this entire thread is that the 14900K can get to reasonable power consumption levels when UV'd. It's not as efficient as the 7800X3D, as multiple videos have demonstrated, but does it really need to be? No. This thread was great for a few pages when people were talking about tweaking their 14900K; I liked reading that. Most of it, though, has been non-productive for the people dragged into the convo, but to be fair, given the opening post, most people should have been aware from the start of the quality of discussion to be had.
I agree. Every architecture is different, and boost algorithms work differently even among different models of the same architecture, so comparing with a frequency limit is only ever good for measuring IPC, never for a consumer scenario.

Because that's how semiconductors work outside of abnormally low (cooling) temps ~ so basically anything except superconductors.
I mean, why does the 7700X need to run at 1.4 V to achieve a max boost of 5.4 GHz when the 7800X3D can do 5.05 GHz at 1.1 V? Are the 7700X's clock bins so far into diminishing-returns territory, or is it just unreasonably overvolted by default? Does the chip actually need that much voltage for so little difference in clock speed compared to the X3D?
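Plugging those exact figures into the rough P ∝ V²·f approximation shows how lopsided the trade is (leakage and per-chip capacitance ignored, so ballpark only):

```python
# The trade described above, via the rough P ~ V^2 * f model:
# 7700X at 1.40 V / 5.4 GHz vs 7800X3D at 1.10 V / 5.05 GHz.
x_v, x_f = 1.40, 5.4        # 7700X
x3d_v, x3d_f = 1.10, 5.05   # 7800X3D

clock_gain = x_f / x3d_f - 1                          # ~7%
power_cost = (x_v**2 * x_f) / (x3d_v**2 * x3d_f) - 1  # ~73%
print(f"~{clock_gain * 100:.0f}% more clock for "
      f"~{power_cost * 100:.0f}% more dynamic power")
```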
 
The last few hundred MHz always cost the most; I guess that's a limitation of our current understanding of physics. You see this everywhere in nature ~ sports, automobiles, space, et al.
 
If AMD could tell me why the X needs to run 0.3 or even 0.4 V higher to gain 300-500 MHz on its boost frequency, thus maxing out its 142 W PPT in all-core workloads while the X3D is doing just fine at 80 W, I'd be very happy.
Well, my old 13600K needed 0.2 V more than my current 14900K to run at the same frequency. If I wanted the 13600K to run 500 MHz faster, it could have easily needed 0.35 V more.

It's silicon quality. AMD chooses better silicon for the X3D CPUs; they need them to be able to run at lower voltages to produce less heat.
 
A 400 A / 250 W CPU; that's really ridiculous in my eyes... You need a compressor cooling solution to keep them cool enough. Cool CPU, if you ask me.
Do you really need that to play a game at home or send an email? It's just over-engineered, like today's cars. All that misery for a shameless 100 MHz more...

Hopefully Intel will now follow AMD's example: more powerful and at the same time more economical; that is what we really want. Less heat, less cooling needed, less noise. Even better for the environment and our children in the long run. Just my two cents... :)

On the other side: a 500 W video card, I still can't believe my eyes; melting connectors and so on. Will we soon need a 2000 W power supply?
 
I see the children got excited by the new troll GN video.
 
Hopefully Intel will now follow AMD's example
What do you mean? BOTH manufacturers now run their CPUs as quickly and inefficiently as they can, and the only limit they employ is the thermal limit. The only difference here is that AMD CPUs hit this limit more easily due to their thick heatspreaders.
 